As part of our audit of the fiscal years 2005 and 2004 CFS, we evaluated Treasury's financial reporting procedures and related internal control, and we followed up on the status of Treasury and OMB corrective actions to address open recommendations regarding the process for preparing the CFS that were in our prior years' reports. In our disclaimer of opinion on the fiscal year 2005 CFS, which is included in the fiscal year 2005 Financial Report of the United States Government, we discussed material deficiencies relating to Treasury's preparation of the CFS. These material deficiencies contributed to our disclaimer of opinion on the CFS and also constitute a material weakness in internal control, which contributed to our adverse opinion on internal control. We performed sufficient audit procedures to provide the disclaimer of opinion in accordance with U.S. generally accepted government auditing standards. This report provides the details of the additional weaknesses we identified in performing our fiscal year 2005 audit procedures related to the process for preparing the CFS and our recommendations to correct those weaknesses, as well as the status of corrective actions taken by Treasury and OMB to address recommendations in our prior reports. We requested comments on a draft of this report from the Director of OMB and the Secretary of the Treasury or their designees. OMB provided oral comments, which are discussed in the Agency Comments and Our Evaluation section of this report. Treasury's comments are reprinted in appendix II and are also discussed in the Agency Comments and Our Evaluation section. As discussed in our fiscal year 2005 audit report, fiscal year 2005 was the second year that Treasury used GFRS to collect agency financial statement information taken directly from federal agencies' audited financial statements. 
The goal of GFRS is to be able to directly link information from federal agencies' audited financial statements to amounts reported in the consolidated financial statements and resolve many of the weaknesses we previously identified in the process for preparing the consolidated financial statements, a goal we strongly support. For both the fiscal year 2005 and 2004 reporting processes, GFRS was able to capture agency financial information submitted to Treasury, but GFRS is still under development and not yet at the stage where it could be used to fully compile the consolidated financial statements from the information captured. As we have reported in the past, Treasury's process for compiling the CFS does not yet fully ensure that financial information from federal agencies' audited financial statements and other financial data directly link to amounts reported in the CFS. In our fiscal year 2005 audit report, we noted that Treasury made progress in demonstrating that amounts in the Balance Sheet and the Statement of Net Cost were consistent with federal agencies' audited financial statements prior to eliminating intragovernmental activity and balances. However, about 25 percent of the significant federal agencies' auditors reported internal control weaknesses related to the processes the agencies perform to provide financial statement information to Treasury for preparing the consolidated financial statements. 
In our prior report, we recommended that as Treasury continues to design and further implement its new process for compiling the CFS, the Secretary of the Treasury should direct the Fiscal Assistant Secretary, in coordination with the Controller of OMB, to modify Treasury's closing package to (1) require federal agencies to directly link their audited financial statement notes to the CFS notes and (2) provide the necessary information to demonstrate that all five principal consolidated financial statements are consistent with the underlying information in federal agencies' audited financial statements and other financial data. Progress was made during fiscal year 2005: Treasury has continued to design and further implement its new process for compiling the CFS with the development of GFRS. We continue, though, to be concerned that the disciplined processes necessary to reduce risks to acceptable levels have not yet been effectively implemented. For example, Treasury moved forward with the project before ensuring that certain key elements, such as a concept of operations, were developed and before defining and documenting the financial reporting weaknesses that the system was expected to address. Not effectively implementing such disciplined processes creates an unnecessary risk that the system will cost more and take longer than expected to deploy, while not providing all of the intended system functionality. The implementation of any major system, such as GFRS, is not without risk; however, organizations that follow and effectively implement accepted best practices in systems development and implementation have been shown to reduce these risks to acceptable levels. A more detailed discussion of our assessment of Treasury's ongoing effort to develop and implement GFRS, along with recommendations to reduce the risk noted above, can be found in a separate report. The CFS includes 2 years of financial information. 
Because comparative financial statements are intended to furnish useful data about the differences in activity and balances between the 2 years shown, consistency in how amounts are reported for the 2 years is a major factor in creating comparability. We found that Treasury lacked a process to ensure that the consolidated financial statements and notes for fiscal years 2005 and 2004 were consistently reported and therefore comparable. During fiscal year 2005, Treasury requested that agencies resubmit fiscal year 2004 financial information along with their fiscal year 2005 financial information. Some agencies resubmitted fiscal year 2004 amounts in fiscal year 2005 that differed from what Treasury published in fiscal year 2004. Also, certain information reported for fiscal year 2004 may have required reclassification to be comparable to the fiscal year 2005 amounts. Treasury did not analyze the fiscal year 2004 information submitted in fiscal year 2005 or reclassify amounts within various financial statement line items and notes to achieve comparability; instead, it chose to continue to report what was published for fiscal year 2004. For example, the Reconciliations of Net Operating Cost and Unified Budget Deficit showed $47.8 billion and $0.2 billion for property, plant, and equipment disposals and revaluations for fiscal years 2005 and 2004, respectively. However, based on the audited financial information provided by agencies to Treasury in GFRS in fiscal year 2005, the fiscal year 2004 amount should be $25.4 billion, rather than $0.2 billion. The difference should have been reclassified from the Net Amount of All Other Differences line item on the Reconciliations of Net Operating Cost and Unified Budget Deficit. 
We recommend that the Secretary of the Treasury direct the Fiscal Assistant Secretary to develop a process to help ensure that, for each reporting year, the 2 years of consolidated financial statements and note information presented are consistently and comparably reported, in all material respects. Treasury and OMB did not require closing packages from 4 of the 35 verifying agencies to be audited. Specifically, the Treasury Financial Manual (TFM) states that the Inspector General or a contracted independent public accountant for each federal verifying agency--except those agencies whose fiscal year ends on a date other than September 30--must opine on the closing package data entered by the Chief Financial Officer into GFRS. Because of these year-end differences, the TFM does not require the Federal Deposit Insurance Corporation's Funds, National Credit Union Administration, and Farm Credit System Insurance Corporation--all of which have a year end other than September 30--to have their closing package data audited. In addition, for fiscal years 2004 and 2005, OMB waived the closing package audit requirement for the Tennessee Valley Authority (TVA), which does have a September 30 fiscal year end. In these four cases, Treasury and OMB did not develop any alternative solutions that include the requirement for adequate audit procedures to be performed over significant information included in the CFS. As a result, unaudited September 30 information was included in the CFS for four agencies that Treasury and OMB consider to be significant. Treasury, therefore, has less assurance that the information included in the CFS for these agencies is fairly stated and directly links to the agencies' audited financial statements. 
We recommend that the Director of OMB direct the Controller of the Office of Federal Financial Management, in coordination with the Treasury Fiscal Assistant Secretary, to develop an alternative solution for obtaining audit assurance related to the Federal Deposit Insurance Corporation's Funds, National Credit Union Administration, and Farm Credit System Insurance Corporation, one that includes the requirement for adequate audit procedures to be performed over significant information included in the CFS for these agencies. We also recommend that the Director of OMB direct the Controller of the Office of Federal Financial Management to consider not waiving the closing package audit requirement in future years for any verifying agency, such as TVA. GAO's Standards for Internal Control in the Federal Government states that internal control is a major part of managing an organization and should include monitoring. Monitoring of internal control should include assessing the quality of performance over time and implementing policies and procedures for the timely follow-up and resolution of findings of audits and other reviews. The goal of these policies and procedures is to ensure that managers (1) promptly evaluate findings from audits and other reviews, including those showing deficiencies and recommendations reported by auditors and others who evaluate agencies' operations; (2) determine proper actions in response to findings and recommendations from audits and reviews; and (3) complete, within established time frames, all actions that correct or otherwise resolve the matters brought to management's attention. 
However, Treasury, in coordination with OMB, had not developed policies and procedures for monitoring internal control or provided us with adequate documentation evidencing an executable plan of action and milestones for short-term and long-range solutions for certain internal control weaknesses we have previously reported regarding the process for preparing the CFS. Without effective monitoring of internal control, audit findings may not be resolved timely and properly. We recommend that the Secretary of the Treasury direct the Fiscal Assistant Secretary, in coordination with the Controller of OMB, to develop (1) policies and procedures for monitoring internal control to help ensure that audit findings are promptly evaluated, that proper actions, such as a documented plan of action with milestones for short-term and long-range solutions, are determined in response to audit findings and recommendations, and that all actions that correct or otherwise resolve the audit findings are completed within established time frames; and (2) an executable plan of action and milestones for short-term and long-range solutions for certain internal control weaknesses we have previously reported regarding the process for preparing the CFS. The TFM prescribes how federal agencies are to submit financial information to Treasury to be used in compiling the CFS. Although our planned audit procedures did not include a review of the entire TFM to determine whether its guidance to agencies was clear, we found several areas where the TFM did not give clear guidance to federal agencies about the information that they were required to provide to Treasury, GAO, and OMB. 
Specifically, we found that the TFM did not give clear guidance for (1) reporting note disclosures for restricted cash, (2) reporting note disclosures for accounts payable, (3) preparing summaries of unadjusted misstatements to be included with federal agencies' closing package management representation letters, and (4) certain information to be reported by OPM to Treasury that is used to allocate costs on the Statement of Net Cost. For example, the TFM defines restricted cash as "amounts of cash that an entity holds and does not have authority to spend" and cash that is not restricted as "amounts of cash that an entity holds for which it has the authority to spend." Although these definitions are accurate at the agency level, they are not accurate at the CFS level. For instance, an agency may hold cash that it does not have the authority to spend because of a certain law or regulation, but when this cash is consolidated at the governmentwide level, the federal government as a whole may have the authority to spend it. Therefore, this cash would appropriately be restricted at the agency level, but not at the governmentwide level. As a result of the unclear guidance, agencies reported certain financial information inconsistently, which increases the risk of incomplete and inaccurate summarization of data in the CFS. We recommend that the Secretary of the Treasury direct the Fiscal Assistant Secretary to ensure that the TFM and any other guidance to federal agencies provide clear instructions for providing reliable data to Treasury in the following specific areas: summaries of unadjusted misstatements and certain information reported by OPM that is used to allocate costs on the Statement of Net Cost. OMB and Treasury require federal agencies to reconcile selected intragovernmental activity and balances with their "trading partners" and report on the extent and results of the reconciliation efforts to Treasury. 
As part of the reconciliation report, federal agencies were required to categorize any material differences, as determined by Treasury, with their trading partners at fiscal year end within five categories: (1) confirmed reporting; (2) accounting methodology differences; (3) accounting or reporting errors; (4) timing differences--current year and prior year; and (5) unknown/unreconciled. According to Treasury, confirmed reporting, the first category listed above, is intended to indicate that the agency has verified that the amount it has reported is accurate. The TFM requires a federal agency that selects the "confirmed reporting" category to provide a detailed explanation to support its response. However, we found that in many cases where a federal agency selected the "confirmed reporting" category, the agency did not provide a detailed explanation. We also found cases where both trading partners selected "confirmed reporting" for the same material difference and neither agency provided a detailed explanation of how both trading partners' amounts could be accurate when the material difference remained. We found that when this situation occurs, Treasury and OMB do not have an effective process for obtaining clarification of the inconsistent explanations provided, and agencies may be unclear as to when to select this category. Incorrect use of the confirmed reporting category and the lack of detailed explanations may hinder efforts to identify and correct problems that federal agencies are experiencing in reconciling with their trading partners. Further, Treasury received the closing packages that contained each agency's intragovernmental activity and balances on November 18, 2005, and provided agencies with reconciliation reports that showed material differences with their trading partners on November 21, 2005. 
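The two TFM rules just described--that a "confirmed reporting" selection must carry a detailed explanation, and that both trading partners confirming the same unresolved material difference is internally inconsistent--can be expressed as a simple automated check. The sketch below is illustrative only; the record layout is an assumption, not Treasury's actual GFRS format.

```python
# Illustrative consistency check for intragovernmental reconciliation
# reports. The record fields are hypothetical; the two rules come from
# the TFM requirements described in the text.

def flag_exceptions(records):
    """Return reconciliation records that need follow-up.

    records: list of dicts with keys 'agency', 'partner',
    'difference_id', 'category', and 'explanation'.
    """
    exceptions = []
    by_difference = {}
    for r in records:
        # Rule 1: "confirmed reporting" must carry a detailed explanation.
        if r["category"] == "confirmed reporting" and not r["explanation"].strip():
            exceptions.append((r["difference_id"], r["agency"],
                               "confirmed reporting without explanation"))
        by_difference.setdefault(r["difference_id"], []).append(r)

    # Rule 2: both trading partners cannot plausibly "confirm" the same
    # material difference -- if both amounts were accurate, no material
    # difference would remain.
    for diff_id, rs in by_difference.items():
        confirmers = [r for r in rs if r["category"] == "confirmed reporting"]
        if len(confirmers) >= 2:
            exceptions.append((diff_id, "both partners",
                               "both trading partners selected confirmed reporting"))
    return exceptions
```

A check of this kind would not resolve the differences, but it would give Treasury and OMB a systematic way to identify which explanations require clarification before the CFS is compiled.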
Treasury and OMB also require federal agencies' IGs to annually perform agreed-upon procedures on the intragovernmental activity and balances reported in the closing package. For fiscal year 2005, Treasury required agency IGs for the 35 verifying agencies to complete and report on these agreed-upon procedures by December 2, 2005. The timing of these procedures did not optimize their value because (1) this reporting date is over 2 weeks after federal agencies' audited financial statements were required to be issued to OMB and (2) it did not allow Treasury sufficient time to review the results and make any necessary adjustments to the CFS. We recommend that the Secretary of the Treasury direct the Fiscal Assistant Secretary, working in coordination with the Controller of OMB, to (1) provide clear guidance to federal agencies as to when the "confirmed reporting" category in the intragovernmental reconciliation report should be selected, (2) develop an effective process for obtaining clarification from federal agencies of inconsistent or incomplete explanations provided in all material difference categories, and (3) accelerate the due date for IGs to complete and report on the results of agreed-upon procedures on the intragovernmental activity and balances or develop an alternative solution that would allow Treasury sufficient time to review the results and make any necessary adjustments to the CFS. In oral comments on a draft of this report, OMB stated that it generally agreed with the new findings and related recommendations in this report. In addition, OMB provided some technical comments, which we have incorporated as appropriate. In written comments on a draft of this report, which are reprinted in appendix II, Treasury stated that it agrees that the preparation process still needs improvement and that it is addressing many of the recommendations in our previous reports. 
Treasury also stated that it concurs with all of the new recommendations in this report except for the recommendation to accelerate the due date for IGs to complete the agreed-upon procedures on the intragovernmental activity and balances. For fiscal year 2006, Treasury does not plan to accelerate the due date for completing these intragovernmental agreed-upon procedures. However, Treasury stated that for fiscal year 2006, it plans to expand the audit coverage for intragovernmental activity and balances by requiring the IGs to opine on such information in their audit of the closing package, which is due to Treasury by November 17, 2006. This is an appropriate alternative solution to accelerating the due date for the IGs to complete the intragovernmental agreed-upon procedures. We have modified our recommendation to also include developing an alternative solution to address this finding. This report contains recommendations to the Secretary of the Treasury and the Director of OMB. The head of a federal agency is required by 31 U.S.C. 720 to submit a written statement on actions taken on these recommendations. You should submit your statement to the Senate Committee on Homeland Security and Governmental Affairs and the House Committee on Government Reform within 60 days of the date of this report. A written statement must also be sent to the House and Senate Committees on Appropriations with the agency's first request for appropriations made more than 60 days after the date of the report. We are sending copies of this report to the Chairmen and Ranking Minority Members of the Senate Committee on Homeland Security and Governmental Affairs; the Subcommittee on Federal Financial Management, Government Information, and International Security, Senate Committee on Homeland Security and Governmental Affairs; the House Committee on Government Reform; and the Subcommittee on Government Management, Finance, and Accountability, House Committee on Government Reform. 
In addition, we are sending copies to the Fiscal Assistant Secretary of the Treasury and the Deputy Director for Management of OMB. Copies will be made available to others upon request. This report is also available at no charge on GAO's Web site at http://www.gao.gov. We acknowledge and appreciate the cooperation and assistance provided by Treasury and OMB during our audit. If you or your staff have any questions or wish to discuss this report, please contact Jeffrey C. Steinhoff, Managing Director, Financial Management and Assurance, on (202) 512-2600, or Gary T. Engel, Director, Financial Management and Assurance, on (202) 512-3406. Staff contacts and other key contributors to this report are listed in appendix II. This appendix includes open recommendations from three of our prior reports: Financial Audit: Process for Preparing the Consolidated Financial Statements of the U.S. Government Needs Improvement, GAO-04-45 (Washington, D.C.: Oct. 30, 2003); Financial Audit: Process for Preparing the Consolidated Financial Statements of the U.S. Government Needs Further Improvement, GAO-04-866 (Washington, D.C.: Sept. 10, 2004); and Financial Audit: Process for Preparing the Consolidated Financial Statements of the U.S. Government Continues to Need Improvement, GAO-05-407 (Washington, D.C.: May 4, 2005). Recommendations that were closed in prior reports are not included in this appendix. This appendix includes the status of the recommendations according to the Department of the Treasury (Treasury) and the Office of Management and Budget (OMB) as well as our own assessments. Explanations are included in the status of recommendations per GAO when Treasury and OMB disagreed with our recommendation. Of the 154 recommendations regarding the process for preparing the CFS that are listed in this appendix, 131 remained open as of December 2, 2005, the end of GAO's fieldwork for the audit of the fiscal year 2005 CFS. 
Of these 131 recommendations, 76 relate to specific disclosures required under U.S. generally accepted accounting principles (GAAP). Treasury has submitted a proposal to the Federal Accounting Standards Advisory Board (FASAB) seeking to amend previously issued standards and eliminate or lessen the disclosure requirements for the consolidated financial statements so that GAAP would no longer require certain of the information Treasury has not been reporting. Comments on the exposure draft of a proposed FASAB standard, based on the Treasury proposal, are due March 1, 2006. Treasury stated that it is waiting for FASAB approval and issuance of this proposed standard to determine the disclosures that will be required in future consolidated financial statements.

1. See "Agency Comments and Our Evaluation" section.

2. We continue to believe that our recommendations relating to the Statement of Changes in Cash Balance from Unified Budget and Other Activities, the Reconciliations of Net Operating Cost and Unified Budget Deficit, and the adjustment process are sound. Our recommendations are intended to allow flexibility in developing viable solutions to address the issues. We will consider any alternative action that Treasury may take to satisfactorily address the recommendations with which it has disagreed. See appendix I for the status of related recommendations.

In addition to the above contact, the following individuals made key contributions to this report: Lynda Downing, Assistant Director; Keith Kronin; Katherine Schirano; and Taya Tasse.

For the past 9 years, since our first audit of the consolidated financial statements of the U.S. government (CFS), certain material weaknesses in internal control and in selected accounting and financial reporting practices have resulted in conditions that prevented GAO from expressing an opinion on the CFS. Specifically, GAO has reported that the U.S. 
government did not have adequate systems, controls, and procedures to properly prepare the CFS. Included with GAO's December 2005 disclaimer of opinion on the fiscal year 2005 CFS was its discussion of continuing weaknesses relating to the Department of the Treasury's (Treasury) preparation of the CFS. The purpose of this report is to (1) provide details of those additional weaknesses, (2) recommend improvements, and (3) describe the status of corrective actions on GAO's previous 154 recommendations. GAO identified weaknesses during its tests of Treasury's process for preparing the fiscal year 2005 CFS. Such weaknesses in the CFS preparation process impair the U.S. government's ability to ensure that the CFS is consistent with the underlying audited agency financial statements, properly balanced, and in conformity with U.S. generally accepted accounting principles. The weaknesses GAO identified during the fiscal year 2005 CFS audit involved the following areas: (1) directly linking audited federal agency financial statements to the CFS, (2) comparability of financial statements, (3) audit assurance over certain federal agencies' closing packages, (4) internal control monitoring, (5) consolidated reporting guidance to federal agencies, (6) reconciling intragovernmental activity and balances, and (7) various other internal control weaknesses that were identified in previous years' audits but remained in fiscal year 2005. Of the 154 recommendations GAO reported in May 2005 regarding the process for preparing the CFS, 131 remained open as of December 2, 2005, when GAO completed its fieldwork for the audit of the fiscal year 2005 CFS. However, 76 of these 131 recommendations relate to specific disclosures required under U.S. generally accepted accounting principles. 
Treasury has submitted a proposal to the Federal Accounting Standards Advisory Board (FASAB) seeking to amend previously issued standards and eliminate or lessen the disclosure requirements for the consolidated financial statements so that U.S. generally accepted accounting principles would no longer require certain of the information Treasury has not been reporting. Comments on the exposure draft of a proposed FASAB standard, based on the Treasury proposal, were due March 1, 2006. GAO will continue to monitor the status of corrective actions to address open recommendations during its fiscal year 2006 audit of the CFS. 
Dual-eligible beneficiaries are a particularly vulnerable population. These individuals are typically poorer, tend to have far more extensive health care needs, have higher rates of cognitive impairments, and are more likely to be disabled than other Medicare beneficiaries. About three out of four dual-eligible beneficiaries live in the community and typically obtain drugs through retail pharmacies. Other dual-eligible beneficiaries reside in long-term care facilities and obtain drugs through pharmacies that specifically serve these facilities. In general, individuals become dual-eligible beneficiaries in two ways. One way is when Medicare-eligible individuals subsequently become Medicaid eligible. This typically occurs when income and resources of beneficiaries fall below certain levels and they enroll in the Supplemental Security Income (SSI) program, or they incur medical costs that reduce their income below Medicaid eligibility thresholds. If these Medicare beneficiaries did not sign up for a Part D plan on their own, they have no drug coverage until they are enrolled in a PDP by CMS. CMS data show that this group represented about two-thirds of new dual-eligible beneficiaries the agency enrolled in PDPs in 2006. According to CMS, it is not possible for it to predict which Medicare beneficiaries will become Medicaid eligible in any given month because Medicaid eligibility determinations are a state function. Another way individuals become dually eligible is when Medicaid beneficiaries subsequently become eligible for Medicare by reaching 65 years of age or by completing the 24-month disability waiting period. Once they become dual-eligible beneficiaries, they can no longer receive coverage from state Medicaid agencies for their Part D-covered prescription drugs. In 2006, this group represented approximately one-third of the new dual-eligible beneficiaries enrolled in PDPs by CMS. 
CMS can generally learn from states when these individuals will become dually eligible. For dual-eligible beneficiaries, Medicare provides a low-income subsidy that covers most of their out-of-pocket costs for Part D drug coverage. This subsidy covers the full amount of the monthly premium that non-subsidy-eligible beneficiaries normally pay, up to the low-income benchmark premium. The subsidy also covers most or all of a dual-eligible beneficiary's prescription copayments. In 2007, these beneficiaries are responsible for copayments that range from $1 to $5.35 per prescription, depending on their income and asset levels, with the exception of those in long-term care facilities, who pay no copayments. Given the number of entities, information systems, and administrative steps involved, it takes a minimum of 5 weeks for CMS to identify and enroll a new dual-eligible beneficiary in a PDP. As a result, two out of three new dual-eligible beneficiaries--generally those who are Medicare eligible and then become Medicaid eligible--may experience difficulties obtaining their prescription drugs under Part D during this interval. For other new dual-eligible beneficiaries--those switching from Medicaid to Medicare drug coverage--CMS instituted a prospective enrollment process in late 2006 that enrolls these individuals before their date of Medicare eligibility and offers a seamless transition to Part D coverage. Multiple parties and information systems are involved in identifying and enrolling dual-eligible beneficiaries in PDPs. As shown in figure 1, CMS, the Social Security Administration (SSA), state Medicaid agencies, and PDP sponsors play key roles in providing information needed to ensure that new dual-eligible beneficiaries are identified and enrolled properly. SSA maintains information on Medicare eligibility that is used by CMS and some states. 
State Medicaid agencies are responsible for forwarding to CMS lists of beneficiaries whom the state believes to be eligible for both Medicare and Medicaid. CMS is then responsible for making plan assignments and processing enrollments. PDP sponsors maintain information systems that are responsible for exchanging enrollment and billing information with CMS. The process of enrolling dual-eligible beneficiaries requires several steps. It begins when state Medicaid agencies identify new dual-eligible beneficiaries and ends when PDPs make billing information available to pharmacies and send enrollment information to dual-eligible beneficiaries. We estimate that it takes at least 5 weeks to complete the process under current procedures. During this interval, pharmacies may not have up-to-date PDP enrollment information on new dual-eligible individuals. This may result in beneficiaries having difficulty obtaining Part D-covered drugs at their pharmacies. To illustrate why this occurs, we present the hypothetical example of Mr. Smith, who as a Medicare beneficiary did not sign up for the Part D drug benefit and, therefore, upon becoming Medicaid eligible, was enrolled in a PDP by CMS. (Fig. 2 shows the steps in Mr. Smith's enrollment process.) From the time Mr. Smith applies for his state's Medicaid program on August 11, it takes about 1 month for him to receive notification from the state that he is eligible for Medicaid, thus beginning the enrollment process. From there, Mr. Smith's new status is submitted by his state to CMS in a monthly file transmittal. Once CMS receives the lists of dual-eligible beneficiaries from all of the states, it verifies eligibility for Medicare and sets each beneficiary's cost-sharing level. Then, around October 8, CMS assigns Mr. Smith to a PDP randomly, based on the premium level and the geographic area served by the PDP. 
CMS next notifies the PDP sponsor, which then has to enroll him in its plan and assign the necessary billing information. This billing information, such as a member identification number, is necessary for pharmacies to correctly bill the PDP for Mr. Smith's prescriptions. The PDP also has to inform Mr. Smith of his enrollment information. By the time this process is completed, it is the middle of October. CMS has developed some contingency measures to help individuals like Mr. Smith during the processing interval. However, we found that these measures have not always worked effectively. For instance, CMS designed an enrollment contingency option to ensure that dual-eligible beneficiaries who were not yet enrolled in a PDP could get their medications covered under Part D, while also providing assurance that the pharmacy would be reimbursed for those medications. However, representatives of pharmacy associations we spoke with reported problems with reimbursements after using this option, which has led some pharmacies to stop using it. To avoid a gap in coverage for beneficiaries transitioning from Medicaid to Medicare prescription drug coverage, CMS has implemented a prospective enrollment process. Because states can predict and notify CMS which Medicaid beneficiaries will become new dual-eligible beneficiaries and when, CMS begins the enrollment process for these individuals 2 months before their anticipated dual-eligible status is attained. By conducting the processing steps early, the prospective enrollment used for this group of new dual-eligible beneficiaries should ensure a seamless transition from Medicaid drug coverage to Medicare Part D coverage. Fully implemented in November 2006, prospective enrollment applies to about one-third of the new dual-eligible beneficiaries enrolled in PDPs by CMS. For the majority of new dual-eligible beneficiaries, CMS requires PDPs to provide drug coverage retroactively, typically by several months. 
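Mr. Smith's enrollment timeline can be reconstructed roughly as follows. The August 11 application date, the approximately 1-month state notification lag, and the October 8 plan assignment come from the example above; the 1-week allowance for the PDP to set up billing and notify the beneficiary is an assumption used only to illustrate the 5-week minimum.

```python
from datetime import date, timedelta

# Rough reconstruction of Mr. Smith's hypothetical timeline. The year
# and the final step duration are assumptions for illustration.
applied = date(2006, 8, 11)                       # applies for Medicaid
state_notifies = applied + timedelta(days=30)     # ~1 month later, per the example
cms_assigns = date(2006, 10, 8)                   # plan assignment date from the example
pdp_completes = cms_assigns + timedelta(days=7)   # assumed ~1 week for billing setup

# Interval from state identification to a pharmacy-ready enrollment.
elapsed = pdp_completes - state_notifies
print(elapsed.days)  # 35 days, i.e. the 5-week minimum described in the text
```

Under these assumptions the interval works out to exactly 5 weeks, which matches the minimum cited; in practice the monthly file transmittal cadence can stretch it further.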
During 2006, Medicare paid PDPs millions of dollars to provide coverage to dual- eligible beneficiaries for drug costs that may have been incurred during the retroactive coverage period. However, we found that CMS did not fully implement or monitor the impact of this policy. CMS made the effective date of Part D drug coverage for Medicare beneficiaries who become Medicaid eligible coincide with the effective date of their Medicaid eligibility. Under this policy, Part D coverage for these beneficiaries is effective the first day of the month that Medicaid eligibility is effective, which generally occurs 3 months prior to the date an individual's Medicaid application was submitted to the state, if the individual was eligible for Medicaid during this time. Thus, the Part D coverage period can extend retroactively back several months from when the actual PDP enrollment takes place. Medicare makes payments to the PDPs for providing drug coverage retroactively. Specifically, PDPs are paid approximately $90 per month for the retroactive coverage period. PDPs, in turn, are responsible for reimbursing their members (or another payer) for Part D drug costs incurred during the retroactive months. For instance, in the case of Mr. Smith, while he applied for Medicaid in August and learned of his PDP assignment for Part D in October, his coverage was effective May 1. If Mr. Smith incurred any costs for Part D-covered prescription drugs from May--when he became eligible for Medicaid--through October, he could submit his receipts to his assigned PDP and be reimbursed by the PDP, less the copayments he would pay as a dual-eligible beneficiary. We found that CMS's implementation of this policy in 2006 was incomplete. While dual-eligible beneficiaries were entitled to reimbursement by their PDPs in 2006, neither CMS nor PDPs notified dual- eligible beneficiaries of this right. 
The model letters used until March 2007 to inform dual-eligible beneficiaries of their PDP enrollment did not include any language concerning reimbursement of out-of-pocket costs incurred during retroactive coverage periods. In response to a recommendation in our report, CMS modified the model letters that the agency and PDPs use to notify dual-eligible beneficiaries about their PDP enrollment. The revised letters let beneficiaries know that they may be eligible for reimbursement of some prescription costs incurred during retroactive coverage periods. Given the vulnerability of this population, it seems unlikely that many dual-eligible beneficiaries would have contacted their PDPs for reimbursement if they were not clearly informed of their right to do so and given information about how to file for reimbursement, neither would they likely have retained proof of their drug expenditures. Mr. Smith, for example, would need receipts for drug purchases made during a 5-month period preceding the date he was notified of his PDP enrollment--at a time when he could not foresee the need for doing so. Further, CMS did not monitor how many months of retroactive coverage PDPs provided, nor did it monitor PDP reimbursements to beneficiaries for costs incurred during retroactive coverage periods. Based on data provided by CMS, we estimate that Medicare paid about $100 million to PDP sponsors in 2006 for retroactive coverage. CMS does not know what portion of this $100 million PDPs paid to dual-eligible beneficiaries to reimburse them for drug costs. If Mr. Smith's PDP did not reimburse Mr. Smith for any prescription drugs purchased during the retroactive coverage period, the PDP retained Medicare's payments for that time period. Given the time it takes to complete the enrollment process, CMS has taken action to ensure ready access to Part D for some new dual-eligible beneficiaries, but difficulties remain for others. 
For the one-third of new dual-eligible beneficiaries whose eligibility can be predicted, CMS's decision to implement prospective enrollment should eliminate the coverage gap in transitioning from Medicaid to Medicare drug coverage. However, because of inherent processing lags, most new dual-eligible beneficiaries may continue to experience difficulties obtaining their drugs for at least 5 weeks after being notified of their dual-eligible status. In addition, CMS's incomplete implementation of its retroactive coverage policy in 2006 means that CMS paid PDPs millions of dollars for coverage during periods for which dual-eligible beneficiaries may not have sought reimbursement for their drug costs. Without routine monitoring of this policy, the agency remains unaware of what portion of these funds was subsequently reimbursed to beneficiaries and, therefore, cannot ensure the efficient use of program funds. Our report contains several recommendations. We recommend that CMS require PDPs to notify beneficiaries of their right to reimbursement and monitor implementation of its retroactive payment policy. We also recommend that CMS take other steps to improve the operational efficiency of the program. Although the agency did not agree with all of them, it has already taken steps to implement some of our recommendations. As of March 2007, CMS has modified its letters to dual- eligible beneficiaries to include language informing them of their right to reimbursement for drug costs incurred during retroactive coverage periods and required PDP sponsors to do the same. In addition, CMS officials told us that they plan to analyze data to determine the magnitude of payments made to PDPs for retroactive coverage and the amounts PDPs have paid to beneficiaries. We hope that CMS will use this information to evaluate the effectiveness of its retroactive coverage policy. 
If, after conducting the analysis, CMS determines that it is paying PDPs substantial amounts of money and dual-eligible beneficiaries are not requesting reimbursements, the agency may want to rethink its policy in light of pursuing the most efficient use of Medicare funds. Mr. Chairman, this concludes my prepared remarks. I would be pleased to respond to any questions that you or other members of the committee may have at this time. For further information regarding this testimony, please contact Kathleen King at (202) 512-7119 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this statement. Contributors to this testimony include Rosamond Katz, Assistant Director; Lori Achman; and Samantha Poppe. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. | Under the Medicare Prescription Drug, Improvement, and Modernization Act of 2003 (MMA), dual-eligible beneficiaries--individuals with both Medicare and Medicaid coverage--have their drug costs covered under Medicare Part D rather than under state Medicaid programs. The MMA requires the Centers for Medicare & Medicaid Services (CMS) to enroll these beneficiaries in a Medicare prescription drug plan (PDP) if they do not select a plan on their own. CMS enrolled about 5.5 million dual-eligible beneficiaries in late 2005 and about 634,000 beneficiaries who became dually eligible during 2006. 
GAO was asked to testify on (1) CMS's process for enrolling new dual-eligible beneficiaries into PDPs and its effect on access to drugs and (2) how CMS set the effective coverage date for certain dual-eligible beneficiaries and its implementation of this policy. This testimony is based on a GAO report that is being released today, Medicare Part D: Challenges in Enrolling New Dual-Eligible Beneficiaries (GAO-07-272). CMS's process for enrolling new dual-eligible beneficiaries who have not yet signed up for a PDP involves many parties, information systems and administrative steps, and takes a minimum of 5 weeks to complete. For about two-thirds of these individuals--generally Medicare beneficiaries who subsequently qualify for Medicaid--pharmacies may not have up-to-date PDP enrollment information needed to bill PDPs appropriately until the beneficiaries' data are completely processed. As a result, these beneficiaries may have difficulty obtaining their Part D-covered prescription drugs during this interval. CMS has created contingency measures to help individuals obtain their new Medicare benefit, but these measures have not always worked effectively. For the other one-third of new dual-eligible beneficiaries--Medicaid enrollees who become Medicare-eligible because of age or disability--CMS eliminated the impact of processing time by enrolling them in PDPs just prior to their attaining Medicare eligibility. This prospective enrollment, implemented in late 2006, offers these dual-eligible beneficiaries a seamless transition to Medicare Part D coverage. CMS set the effective Part D coverage date for Medicare-eligible beneficiaries who subsequently become eligible for Medicaid to coincide with the date their Medicaid coverage becomes effective. 
Under this policy, which was designed to provide drug coverage for dual-eligible beneficiaries as soon as they attain dual-eligible status, the start of their Part D coverage can extend retroactively for several months before the date beneficiaries are notified of their PDP enrollment. GAO found that CMS did not fully implement or monitor the impact of this policy. Although beneficiaries are entitled to reimbursement for covered drug costs incurred during this retroactive period, CMS did not begin informing them of this right until March 2007. Given their vulnerability, it is unlikely that these beneficiaries would have sought reimbursement or retained proof of their drug purchases if they were not informed of their right to do so. Also, CMS made monthly payments to PDPs for providing drug coverage during retroactive periods, but did not monitor PDPs' reimbursements to beneficiaries during that time period. GAO estimated that in 2006, Medicare paid PDPs millions of dollars for coverage during periods for which dual-eligible beneficiaries may not have sought reimbursement for their drug costs. | 2,897 | 710 |
The United States has assisted the Mexican government in its counternarcotics efforts since 1973, providing about $350 million in aid. Since the later 1980s, U.S. assistance has centered on developing and supporting Mexican law enforcement efforts to stop the flow of cocaine from Colombia, the world's largest supplier, into Mexico and onward to the United States. In January 1993, the government of Mexico initiated a new drug policy under which it declined U.S. counternarcotics assistance and assumed responsibility for funding its own counternarcotics efforts. This policy remained in effect until 1995 when, according to the State Department, economic conditions and the growing drug-trafficking threat prompted the Mexican government to again begin accepting U.S. counternarcotics assistance for law enforcement organizations. Among other things, the Foreign Assistance Act of 1961, as amended, requires the President to certify annually that major drug-producing and -transit countries are fully cooperating with the United States in their counternarcotics efforts. As part of this process, the United States has established specific objectives for evaluating the performance of these countries. In 1997, the United States set the following objectives for evaluating Mexico's counternarcotics cooperation as part of the 1998 certification process: (1) reducing the flow of drugs into the United States from Mexico, (2) disrupting and dismantling narco-trafficking organizations, (3) bringing fugitives to justice, (4) making progress in criminal justice and anticorruption reform, (5) improving money laundering and chemical diversion control, and (6) continuing improvement in cooperating with the United States. In February 1998, the President certified Mexico as fully cooperating with the United States. Since our 1996 report, Mexico has undertaken actions intended to enhance its counternarcotics efforts and improve law enforcement and other capabilities. 
The results of these actions are yet to be realized because (1) many of them are in the early stages of implementation and (2) some are limited in scope. According to U.S. and Mexican officials, it may take several years or more before the impact of these actions can be determined. Some of the actions include (1) increasing counternarcotics cooperation with the United States; (2) initiating efforts to extradite Mexican criminals to the United States; (3) passing an organized crime law that enhanced the government's authority against money laundering and illegal use and diversion of precursor and essential chemicals; and (4) implementing measures aimed at reducing corruption, such as increasing the role of Mexico's military forces in law enforcement activities. With respect to U.S.-Mexico counternarcotics cooperation, since we reported on these matters in 1996 additional activities have taken place. For example, the High-Level Contact Group on Drug Control, comprised of senior officials from both governments responsible for drug control, has met several times. Results of these meetings include the following: A U.S.-Mexico Binational Drug Threat Assessment was issued in May 1997, which addressed illegal drug demand and production, drug trafficking, money laundering, and other drug-related issues. A joint U.S.-Mexico Declaration was issued in May 1997 that includes pledges from both governments to work toward reducing illegal drug demand, production, and distribution; improving interdiction capacity; and controlling essential and precursor chemicals, among other issues. On February 6, 1998, a joint U.S.-Mexico binational drug strategy was issued. Mexican executive and legislative actions include instituting extradition efforts, passing various laws to address illegal drug-related activities, and passing several anticorruption measures. The United States and Mexico have had a mutual extradition treaty since 1980. 
Although no Mexican national has ever been surrendered to the United States on drug-related charges, since 1996 Mexico has approved the extradition of 4 of 27 Mexican nationals charged with drug-related offenses. Two are currently serving criminal sentences in Mexico, and two are appealing their convictions in Mexico. The remaining drug-related extradition requests include 5 persons currently under prosecution in Mexico and 14 persons still at large. It is not clear whether any Mexican national will be surrendered on such charges before the end of 1998. Another example of increased cooperation is the November 1997 signing of a joint United States and Mexico "temporary extradition protocol." This protocol allows suspected criminals who are charged in both countries to be temporarily surrendered for trial while evidence is current and witnesses are available. The protocol is not yet in effect because it requires legislative approval in the United States and Mexico, and it has not been submitted to either body. In November 1996, Mexico passed an organized crime law that provides authority for Mexican law enforcement organizations to employ modern techniques to combat crime. These include authority to use plea bargaining and confidential informants, establish a witness protection program, and conduct controlled deliveries and court-authorized wiretaps. The law also has provisions for asset seizures and forfeitures. U.S. embassy officials stated that the passage of the organized crime law represents a major advancement in Mexico's law enforcement capabilities. According to U.S. and Mexican officials, the impact of the organized crime law is not likely to be fully evident for some time. For example, Mexican and U.S. officials told us that the process of conducting investigations is inherently lengthy and that the capabilities of many Mexican personnel who are implementing and enforcing the law are currently inadequate. 
Mexican agencies are investigating a number of drug-related cases. U.S. embassy officials stated that, although some guidelines and policies have been established, additional ones still need to be developed, including the use of wiretaps and the witness protection program. While this law provides the law enforcement community with the necessary tools to fight organized crime, including drug trafficking, ONDCP reported in September 1997 that the law still lacks some important elements needed to meet the 1988 United Nations (U.N.) Vienna convention and other international agreements. For example, according to ONDCP, the law lacks provisions allowing the seizure of assets of a suspected criminal who has either died or fled Mexico. Furthermore, according to U.S. and Mexican officials, Mexico also needs to develop a cadre of competent and trustworthy judges and prosecutors that law enforcement organizations can rely on to effectively carry out the provisions of the organized crime law. Several U.S. agencies are assisting Mexico in this area. In May 1996, money laundering was made a criminal offense, with penalties of up to 22 years in prison. The law requires banks and other financial institutions to report transactions over $10,000 U.S. dollars and to obtain and retain customer account information. Under the prior law, money laundering was a tax offense, there were no reporting requirements, and violators were only subject to a fine. However, U.S. and Mexican officials are concerned that the new law does not cover so called "structuring"--intentionally making transactions just below the $10,000 reporting threshold. In addition, there is no reporting requirement on currency leaving the country. Between May and December 1997, the Mexican government initiated 27 money laundering cases. To date, one case has been prosecuted, and the remaining 26 cases are still under investigation. 
In the one case that was prosecuted, the charges were dismissed because a federal judge ruled that no link could be established between an illegal activity and the money. The Mexican government has appealed the judge's decision. In May 1996, trafficking in drug precursor and essential chemicals was made a criminal offense. Although some chemicals that the United Nations recommends be controlled were not included in the law, Mexico passed additional legislation in December 1997 that included all chemicals, thus bringing Mexico into full compliance with U.N. and other international agreements. In addition, Mexico has taken further action to control chemicals by limiting the legal importation of precursor and essential chemicals to eight ports of entry and by imposing regulatory controls over the machinery used to manufacture drug tablets or capsules. The impact of the new chemical control law is not yet evident. Currently, the development of an administrative infrastructure for enforcing it is under way. Various U.S. agencies including the Departments of Justice and State have provided technical assistance and training to help Mexico carry out the law. It is well established and the President of Mexico acknowledges that narcotics-related corruption is pervasive and entrenched within the criminal justice system, and he has made rooting it out a national priority. Beginning in 1995, the President of Mexico expanded the role of the Mexican military in counternarcotics activities. The Mexican military, in addition to eradicating marijuana and opium poppy, has also taken over some law enforcement functions. For example, airmobile special forces units have been used to search for drug kingpins and detain captured drug traffickers until they can be handed over to civilian law enforcement agencies. In September 1996, the President of Mexico publicly acknowledged that corruption is deeply rooted in the nation's institutions and general social conduct. 
He added that the creation of a new culture of respect for law must start with public officials and affirmed his administration's intent to gradually eliminate official corruption. To do so, the President began to initiate law enforcement reforms. First, the primary Mexican government agency involved in counternarcotics-related activities has been reorganized. In 1996 the Attorney General's office, commonly called the PGR, began a reorganization connected to a long-term effort to clean up and professionalize federal law enforcement agencies. As part of this action, the State Department reported that over 1,250 officials were dismissed for incompetence and/or corruption. U.S. and Mexican officials stated that about 200 of these officials have subsequently been rehired by the PGR because Mexico's labor laws prevented the PGR from removing some of these personnel. Further, in February 1997, the Mexican military arrested General Jesus Gutierrez Rebollo, the head of the National Institute for Combat Against Drugs--the Mexican equivalent of the Drug Enforcement Administration--for corruption. In April 1997, the Attorney General dissolved the Institute and dismissed a number of its employees. A new organization, known as the Special Prosecutor for Crimes Against Health, was established to replace the Institute. This organization includes two special units: The Organized Crime Unit, with an authorized strength of 300, was established under the organized crime law to conduct investigations and prosecutions aimed at criminal organizations, including drug trafficking activities. The Bilateral Task Forces, with an authorized strength of 70, are responsible for investigating and dismantling the most significant drug-trafficking organizations along the U.S.-Mexican border. Finally, in 1997, the Attorney General instituted a screening process that is supposed to cover all PGR personnel including those who work for the special units. 
This process consists of personal background and financial checks, medical and psychological screening, urinalysis, and regular polygraph testing. However, U.S. embassy officials stated that the screening requirements do not apply to judges, most units of the military, and other key law enforcement organizations engaged in drug control activities. U.S. agencies are supporting this initiative by providing equipment, training, and technical assistance. Moreover, U.S. embassy personnel are concerned that Mexican personnel who failed the screening process are still working in the Special Prosecutor's office and the special units. Although all of Mexico's actions are positive steps to reducing drug-related activities, there are still many issues that need to be resolved. For example, U.S. and Mexican officials indicated that personnel shortages exist in the Special Prosecutor's office and the special units; the special units face operational and support problems, including inadequate Mexican government funding for equipment, fuel, and salary supplements for personnel assigned to the units, and the lack of standard operating procedures; U.S. law enforcement agents assigned to the Bilateral Task Forces cannot carry arms in Mexico; and Mexico continues to have difficulty building competent law enforcement institutions because of low salaries and little job security. U.S.-provided assistance has enhanced the counternarcotics capabilities of Mexico's military. However, the effectiveness and usefulness of some equipment provided or sold to Mexico is limited due to inadequate planning and coordination among U.S. agencies, particularly military agencies within DOD. In October 1995, the U.S. Secretary of Defense visited Mexico in an effort to strengthen military-to-military relationships between the two countries. As a result of this visit, the Mexican military agreed to accept U.S. counternarcotics assistance. 
Table 1 shows DOD's counternarcotics assistance provided to the Mexican military during fiscal years 1996-97. All of the helicopters and the C-26 aircraft were delivered to the Mexican military during 1996 and 1997. According to DOD officials, Mexico has also received some logistics and training support; however, they could not provide us with the exact level of support given because this data was not readily available. DOD plans to provide about $13 million worth of counternarcotics assistance under section 1004 of the Defense Authorization Act of 1989 to Mexico's military in fiscal year 1998. Furthermore, the Mexican military used its own funds to purchase two Knox-class frigates from the U.S. Navy through the Foreign Military Sales Program. These two frigates were valued at about $7 million and were delivered to Mexico in 1997. While some of the equipment has helped improve Mexico's capabilities, some has been of limited usefulness. Additionally, inadequate logistics support to the Mexican military has hindered its efforts to reduce drug-related activities in Mexico. The following examples illustrate some of the problems. The U.S. embassy has reported that the UH-1H helicopters provided to Mexico to improve the interdiction capability of Mexican army units are of little utility above 5,000 feet, where significant drug-related activities, including opium poppy cultivation, are occurring. The average operational rates for the UH-1H helicopters have remained relatively low, averaging between 35 and 54 percent, because of inadequate logistics support such as delays in the delivery of spare parts. The four C-26 aircraft were provided to Mexico without the capability to perform the intended surveillance mission. U.S. embassy officials stated that the Mexican military has not decided how many of the aircraft will be modified to perform the surveillance mission, but modifying each aircraft selected for surveillance will cost at least $3 million. 
Regarding the two Knox-class frigates, when they were delivered in August 1997, the ships lacked the equipment needed to ensure the safety of the crew, thus rendering the ships inoperable. The U.S. Navy estimated that it will cost the Mexican Navy about $400,000 to procure this equipment and that it will be at least 2 years before the ships will be operational. Even though the U.S. Navy knew that the ships would not be operational when they were delivered, DOD began providing the Mexican Navy with about $1.3 million worth of training to 110 personnel related to the two Knox-class frigates. U.S. embassy officials stated that this training will be completed in March 1998. The Mexican Navy will reassign these personnel until the ships can be used. According to DOD officials, they approved the training because they were not informed by the U.S. Navy that the ships would not be operational. We believe that planning and coordination of U.S. counternarcotics assistance to Mexico could be improved. Thus, we believe that the Secretary of State, in close consultation with the Secretary of Defense and the National Security Council, should take steps to ensure that future assistance is, to the maximum extent possible, compatible with the priority requirements identified in U.S. counternarcotics programs and that adequate support resources are available to maximize the benefits of the assistance. Without measures of effectiveness, it is difficult for U.S. decisionmakers to evaluate the progress that the United States and Mexico are making to reduce the flow of illegal drugs into the United States. We have previously noted the need for ONDCP to develop drug control plans that include performance measures to allow it to assess the effectiveness of antidrug programs. 
In February 1997, we recommended that ONDCP complete its long-term drug control plan, including quantifiable performance measures and multiyear funding needs linked to the goals and objectives of the international drug control strategy. Subsequently, in February 1998, ONDCP issued a national drug control strategy covering a 10-year period. In March 1998, ONDCP issued general performance measures, but they do not include targets and milestones for specific countries, such as Mexico. As I noted earlier, the United States and Mexico issued a joint U.S.-Mexico binational drug strategy in February 1998. Although the binational strategy is indicative of increased U.S.-Mexico cooperation, it does not contain critical performance measures and milestones for assessing performance. State Department officials stated that the bilateral process of establishing performance measures and milestones is incremental and will be addressed during 1998. ONDCP officials told us that they plan to issue specific targets and milestones for the binational strategy by the end of this year. This concludes my prepared remarks. I would be happy to respond to any questions you may have. Drug Control: Observations on Counternarcotics Activities in Mexico (GAO/T-NSIAD-96-239, Sept. 12, 1996). Drug Control: Counternarcotics Efforts in Mexico (GAO/NSIAD-96-163, June 12, 1996). Drug Control: Observations on Counternarcotics Efforts in Mexico (GAO/T-NSIAD-96-182, June 12, 1996). Drug War: Observations on U.S. International Drug Control Efforts (GAO/T-NSIAD-95-194, Aug. 1, 1995). Drug War: Observations on the U.S. International Drug Control Strategy (GAO/T-NSIAD-95-182, June 27, 1995). Drug Control: Revised Drug Interdiction Approach Is Needed in Mexico (GAO/NSIAD-93-152, May 10, 1993). Drug Control: U.S.-Mexico Opium Poppy and Marijuana Aerial Eradication Program (GAO/NSIAD-88-73, Jan. 11, 1988). Gains Made in Controlling Illegal Drugs, Yet the Drug Trade Flourishes (GAO/GGD-80-8, Oct. 25, 1979). 
Opium Eradication Efforts in Mexico: Cautious Optimism Advised (GAO/GGD-77-6, Feb. 18, 1977).

Pursuant to a congressional request, GAO discussed its work on the counternarcotics efforts of the United States in Mexico, focusing on the: (1) nature of the drug threat from Mexico and results of efforts to address this threat; (2) planning and coordination of U.S. counternarcotics assistance to the Mexican military; and (3) need to establish performance measures to assess the effectiveness of U.S. and Mexican counternarcotics efforts. GAO noted that: (1) Mexico is the principal transit country for cocaine entering the United States and, despite U.S. and Mexican counternarcotics efforts, the flow of illegal drugs into the United States from Mexico has not significantly diminished; (2) no country poses a more immediate narcotics threat to the United States than Mexico, according to the Department of State; (3) the 2,000-mile U.S.-Mexican border and the daunting volume of legitimate cross-border traffic provide near-limitless opportunities for smuggling illicit drugs, weapons, and proceeds of crime, and for escape by fugitives; (4) Mexico, with U.S. assistance, has taken steps to improve its capacity to reduce the flow of illegal drugs into the United States; (5) among other things, the Mexican government has taken action that could potentially lead to the extradition of drug criminals to the United States and passed new laws on organized crime, money laundering, and chemical control; (6) it has also instituted reforms in law enforcement agencies and expanded the role of the military in counternarcotics activities to reduce corruption--the most significant impediment to successfully diminishing drug-related activities; (7) while Mexico's actions represent positive steps, it is too early to determine their impact, and challenges to their full implementation remain; (8) no Mexican national has actually been surrendered to the United States on drug charges, new laws are not fully implemented, and building competent judicial and law enforcement institutions continues to be a major challenge; (9) since fiscal year 1996, the Department of Defense (DOD) has provided the Mexican military with $76 million worth of equipment, training, and spare parts; (10) the Mexican military has used this equipment to improve its counternarcotics efforts; (11) however, due, in part, to inadequate planning and coordination within DOD, the assistance provided has been of limited effectiveness and usefulness; (12) improved planning and coordination could improve Mexico's counternarcotics effectiveness; (13) although the Mexican government has agreed to a series of actions to improve its counternarcotics capacity, and the United States has begun to provide a larger level of assistance, at the present time there is no system in place to assess their effectiveness; and (14) even though the United States and Mexico have recently issued a binational drug control strategy, it does not include performance measures.
To help federal agencies manage their respective travel programs and achieve travel cost savings, GSA issues and revises the FTR. According to the FTR website, GSA promulgates the FTR to: (1) interpret statutory and other policy requirements in a manner that balances the need to ensure that official travel is conducted responsibly with the need to minimize administrative costs, and (2) clearly communicate the resulting requirements to federal agencies and employees. Formal changes to the FTR are identified as amendments and published in the Federal Register in accordance with the rulemaking provision of the Administrative Procedure Act. GSA officials stated that, while the agency develops and promulgates the rules and amendments that comprise the FTR, it does not have enforcement authority over agencies' compliance with FTR requirements. In addition to FTR amendments, GSA also issues travel bulletins (nonbinding guidance) that GSA officials said can typically be issued within a shorter timeframe than final rules published in the Federal Register. According to GSA officials, the travel bulletins are generally issued to remind agencies of existing FTR requirements. Administration actions have also encouraged federal agencies to develop mechanisms to reduce travel. For example, EO 13589, "Promoting Efficient Spending," called for agencies and their components to: 1) devise strategic alternatives to government travel, including local or technological alternatives, such as teleconferencing and video conferencing; 2) conduct business and host or sponsor conferences in space controlled by the federal government, wherever practicable and cost effective; and 3) designate a senior official responsible for developing and implementing policies and controls to ensure efficient spending on travel- and conference-related activities.
Following the issuance of EO 13589, OMB issued a supporting memorandum on Promoting Efficient Spending to Support Agency Operations. To support the cost-saving goals of the executive order, the memorandum explained the role that travel plays in supporting agency missions and local economies. At the same time, the memorandum required that each agency reduce its spending on travel costs, and provided that specific travel policies be established or clarified to manage travel budgets more efficiently and to reduce reliance on travel. The reductions were time limited through fiscal year 2016, but the memorandum explained that the intent was, among other things, to make the reductions in travel budgets sustainable. In 2012, GSA formed the Government-wide Travel Advisory Committee (GTAC) to: 1) review existing travel policies, processes, and procedures; 2) ensure that the policies and procedures are accountable and transparent; and 3) help federal agencies achieve their missions effectively and efficiently at the lowest logical travel cost. In 2015, GTAC issued a report that provided advice and recommendations to GSA to, among other things, incorporate industry best practices. In 2015, GSA also established the Senior Travel Official Council to assist in the administration's efforts to promote efficient spending. Data used by the agencies to prepare Travel Reporting Information Profile (TRIP) reports for GSA are maintained within each agency's travel system, which is part of the E-Gov Travel Service 2 (ETS2).
The TRIP report requires agencies to provide aggregate travel cost data (transportation, lodging, and meals and incidentals) for five travel categories:

- employee emergency--travel related to an unexpected occurrence/event or injury/illness that affects the employee personally and/or directly and that requires immediate action/attention;
- mission (operational)--travel to a particular site to perform operational or managerial activities;
- special agency mission--travel to carry out a special agency mission and/or perform a task outside the agency's normal course of day-to-day business activities that is unique or distinctive. These special missions are defined by the head of agency and are normally not programmed in the agency annual funding authorization;
- nontraining conference--travel performed in connection with a prearranged meeting, retreat, convention, seminar, or symposium for consultation or exchange of information or discussion; and
- training--travel in conjunction with educational activities to become proficient or qualified in one or more areas of responsibility.

GSA encourages federal agencies to use ETS2 reporting capabilities as a means to track, monitor, and report on costs related to travel spending. ETS2 is a comprehensive travel services program that brings a reporting capability to the agencies' travel programs. In addition, GSA created ETS2 with the expectation that the services available under ETS2 would: 1) enable federal agencies to further consolidate travel services, platforms, and channels; 2) improve the leverage of government travel spending; 3) increase transparency for improved accountability; and 4) reduce waste. According to GSA officials, ETS2 provides the travel reporting capability that aligns with and supports OMB Memorandum M-12-12. ETS2 can generate reports related to an agency's travel costs and other travel-related activities.
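As a rough illustration of the aggregation the TRIP report describes, the sketch below rolls voucher-level costs up into the five categories; the field names and voucher structure are hypothetical, not GSA's actual schema:

```python
# Hypothetical sketch of a TRIP-style aggregation: summing transportation,
# lodging, and meals-and-incidentals (M&IE) costs per travel category.
# Category labels come from the report; field names are illustrative.
from collections import defaultdict

TRIP_CATEGORIES = (
    "employee emergency",
    "mission (operational)",
    "special agency mission",
    "nontraining conference",
    "training",
)

def aggregate_trip(vouchers):
    """Roll up per-voucher costs into aggregate totals by TRIP category."""
    totals = defaultdict(lambda: {"transportation": 0.0, "lodging": 0.0, "mie": 0.0})
    for voucher in vouchers:
        if voucher["category"] not in TRIP_CATEGORIES:
            raise ValueError(f"unknown TRIP category: {voucher['category']!r}")
        bucket = totals[voucher["category"]]
        for field in ("transportation", "lodging", "mie"):
            bucket[field] += voucher[field]
    return dict(totals)

# Example: two training trips roll up into one aggregate line.
vouchers = [
    {"category": "training", "transportation": 400.0, "lodging": 300.0, "mie": 120.0},
    {"category": "training", "transportation": 250.0, "lodging": 150.0, "mie": 60.0},
]
print(aggregate_trip(vouchers)["training"])
# {'transportation': 650.0, 'lodging': 450.0, 'mie': 180.0}
```

The point of the sketch is only that aggregate reporting of this kind is straightforward once voucher data share a common structure, which is the premise behind GSA's push for consistent ETS2 reporting.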
For example, through ETS2 an agency can generate operational reports to monitor day-to-day travel services and travel management, as well as regulatory reports that can help an agency's managers make informed decisions. By 2013, GSA had awarded the development of ETS2 to two contractors, which allowed agencies to design a travel cost data system that best meets their respective needs. Officials at each of the six selected agencies stated that, while their respective agencies pursued a wide range of cost-saving efforts to address the GSA cost-saving provisions, all of them had policies in place that addressed these provisions prior to GSA's issuance of either an amendment or travel bulletin. Some of these cost-saving efforts involved updating internal policy statements, issuing internal guidance, configuring ETS2 to require justifications for making a policy exception, and providing in-person and web-based trainings. For example, Department of Defense (DOD) officials stated that for half of the travel bulletin provisions (10 of 20), DOD's related actions had been in effect since at least 1998. In one instance, DOD included a provision in the December 1998 Joint Travel Regulations (JTR) that limited reimbursement for rental car insurance purchased by employees, including the collision damage waiver adjustment and theft protection. This aligned with 2014 GSA Bulletin FTR 14-05's guidance recommending that travelers decline additional insurance when renting vehicles. Similarly, the Department of Agriculture's (USDA) travel system contained a restriction prompting travelers to provide a justification for not using a CPP fare, as well as confirmation that the alternative airfare is greater than the "least logical airfare plus 75.00 dollars."
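USDA's threshold check, as described, can be sketched as a simple comparison; the function name and exact trigger logic below are assumptions for illustration, not USDA's actual ETS2 configuration:

```python
# Illustrative sketch of USDA's fare-justification restriction: a traveler
# bypassing the CPP fare is prompted for a justification when the chosen
# alternative fare exceeds the "least logical airfare plus 75.00 dollars."
# Function name and trigger logic are assumptions, not USDA's system logic.

JUSTIFICATION_THRESHOLD = 75.00  # dollars above the least logical airfare

def requires_justification(alternative_fare: float, least_logical_fare: float) -> bool:
    """Return True when the traveler must confirm and justify the fare choice."""
    return alternative_fare > least_logical_fare + JUSTIFICATION_THRESHOLD

# Example: a $500 alternative fare against a $400 least logical fare
# exceeds the $75 threshold and would trigger the justification prompt.
print(requires_justification(500.00, 400.00))  # True
print(requires_justification(450.00, 400.00))  # False
```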
In addition, USDA's internal travel policies state that each agency and staff office is expected to use the method of travel most advantageous to the government, including lower-cost airfares. According to officials at the six selected agencies, they were already taking action to address the issues raised by FTR amendments and GSA travel bulletins. However, because the bulletins served to reinforce existing policy, these agencies' officials took further actions--either developing new travel policy or issuing a memorandum to staff reminding them of existing agency travel policy--to highlight the policy. For example, according to officials at the Department of State (State), while the agency had codified the requirement for using contract air carriers in its own Foreign Affairs Manual (FAM) at least as early as 2005, the more recent GSA travel bulletin on contract and noncontract airfares influenced State to reinforce the policy by issuing departmental notices and cables to all diplomatic and consular posts outlining the parameters for the use of commercial airfares. Similarly, DOD officials said that the agency addressed two cost-saving provisions within the same GSA travel bulletin by taking a new action to augment an existing policy. One provision addressed reviewing internal policies to ensure that the agency's use of CPP contract and non-contract air carriers resulted in overall cost savings. The other provision addressed agencies' management of their internal procedures to assess risk associated with using non-contract fares, which officials said they addressed through the same cost-saving effort. DOD officials stated that they had already taken a related action prior to the issuance of the bulletin.
However, the travel bulletin resulted in DOD updating the JTR by strengthening the language related to using restricted fares and requiring the use of a Decision Support Tool to assist in determining if a restricted fare may be advantageous to the government. In other cases, FTR amendments or GSA travel bulletins prompted the six agencies to take new actions. In these instances, these agencies took action as a direct result of the recently issued GSA provisions for which they had not previously pursued a related cost-saving effort. This usually happened through the development of a new agency-specific travel policy or the issuance of a memorandum to staff reminding them of existing agency travel policy. For example, one FTR amendment addressing the use of rental cars resulted in the Department of Justice (DOJ) issuing a memorandum advising travelers that pre-paid fueling options for rental vehicles are not authorized according to the amendment. This was a cost-saving effort that the agency had not pursued prior to the amendment. Officials at five of the six selected agencies described a few cases where their respective agencies took no specific policy action, but either: 1) advised employees to follow the FTR; 2) asked individual components to create unique policies that ensured FTR compliance; or 3) provided approving officials with the discretion to oversee employees' compliance with the FTR as appropriate, and determine whether or how to adopt promising practices from the GSA travel bulletins. For example, according to Department of Homeland Security (DHS) officials, employees were reminded to follow the FTR. However, if any of the agency's components needed to clarify or address parts of the FTR not specifically covered by the agency's Financial Management Policy Manual (FMPM), DHS empowered them to do so.
For example, FMPM does not have a specific policy prohibiting reimbursement for purchasing pre-paid fueling options for rental cars, a GSA requirement. However, DHS officials stated that employees were asked to "exercise the same care in incurring expenses that a prudent person would exercise if using his or her personal funds while on personal business." According to officials, components and travelers were still responsible for adjusting their travel actions to remain compliant with the FTR even when DHS policy could not be promptly updated. According to DHS officials, "as questions are raised by the travelers, the policy staff at each component provides the guidance necessary to maintain FTR compliance." In another example, officials at State refrained from taking a policy action to address two provisions in GSA Bulletin FTR 13-03 because the department wanted to provide approving officials with some discretion over travel decisions on individual trips and vouchers, such as determining whether a rental car is a better option than public transportation for travelers, and whether travelers should be required to share rental cars and taxis while on official travel in groups when public transportation is not a better option. The cost of transportation is among many factors approving officials weigh when deciding. Given the wide diversity across global locations in safe, secure, and available local transportation options, State officials told us that at that time they did not dictate a centralized policy on employee sharing of rental cars and taxis, in order to leave this discretion to the traveler's approving official. The use of public transportation rather than a rental car likewise remains at the discretion of the traveler's approving official. State subsequently updated its FAM on December 7, 2015, with specific provisions for rental cars on official travel.
They said this allows approving officials the discretion to apply the prudent traveler rule, which in application "should clearly require employees to share rental cars while on official travel in groups." We also found that the selected agencies initiated travel cost-saving efforts beyond the provisions recommended by GSA. For example, a September 2015 Defense Travel Management Office (DTMO) report--DOD Travel Reform--said that DTMO tracked numerous cost-saving travel reform initiatives for policy simplification that it pursued outside of the actions taken related to GSA's bulletins and FTR amendments. These initiatives include: 1) standardization of reimbursement rates for privately-owned vehicles into a single rate; 2) creation of a standard travel rate that strictly limits per diem for trips in which it takes a day to travel to a temporary duty (TDY) location; and 3) expansion of the definition of incidental expenses to include miscellaneous expenses. In another example, officials at the Department of Veterans Affairs (VA) stated that in addition to the web-based training material providing for reduced per diem for long-term TDY, VA had additional requirements that were beyond the scope of GSA's bulletins and FTR amendments and that targeted additional cost savings. These requirements stated that travelers must stay in weekly or monthly rentals during extended assignments whenever possible and reduce their meals and incidental expenses when the traveler is able to obtain lodging or meals at lower costs. The Senior Travel Official Council (STOC) brings travel officials from all federal agencies together to share information and best practices to further cost-saving efforts. GSA established the STOC in 2015 to identify consistencies and best practices in the areas of travel policy, programs, and procurement.
According to GSA officials, STOC is designed to help agencies make better use of their travel cost data to make informed decisions about internal policymaking and to replicate actions taken by other agencies that implemented successful cost-saving policies and practices. According to STOC meeting minutes, the council has taken some steps toward information-sharing efforts that could benefit all federal agencies. For example, in 2015, the council initiated a pilot program involving eight agencies to share promising practices in five areas: online booking, airfare savings, hotel reservations, car rentals, and SmartPay usage. Based on the pilot, STOC plans to develop policies and processes that other agencies can choose to implement. In a December 2015 STOC meeting, officials from DOJ and the National Science Foundation provided information on actions they took prior to the pilot, and how those actions resulted in cost savings at their agencies. DOJ noted that its cost-savings programs had top-down agency support and that agency officials implemented a policy change requiring online booking and the lowest logical airfare. By implementing this policy, DOJ officials claimed savings of more than $9.2 million from using lowest logical airfare (non-refundable tickets) and an online booking rate of 68 percent in fiscal year 2015. According to GSA officials, the STOC plans to encourage agencies to pursue these policies beyond the pilot program. However, according to GSA officials and STOC meeting minutes, STOC members have yet to take full advantage of the STOC to network and learn about other agency-initiated policies that could lead to potential cost savings. According to GSA officials, the STOC members had not yet formally shared much information about other promising practices for tracking and monitoring savings that could be replicated to benefit other federal agencies.
However, these officials said that the STOC is still in the early stages, and opportunities for agencies to share information have been limited. Such practices could help agencies develop and implement cost-saving efforts, and quantify those efforts when possible. Without using the STOC meetings to engage in information sharing about these practices, agencies have a limited ability to learn from and apply other agencies' methods for using E-Gov Travel Service 2 (ETS2) data reports to track and monitor the impact and effectiveness of their policies on travel spending reductions. Only four cost-saving efforts at two of the selected agencies--DOJ and DOD--could be quantified. DOJ officials were able to quantify their estimated cost savings for implementing three cost-saving measures: 1) a savings of more than $15 million due to the use of non-contract, non-refundable fares from fiscal year 2015 through the first quarter of fiscal year 2016; 2) an increase in the use of video conferencing that saved an estimated $16.3 million during fiscal year 2010; and 3) a requirement to use online booking for travel reservations whenever possible, which saved nearly $3.4 million in transaction fees in fiscal year 2015. DOD officials also reported quantifiable cost savings related to a GSA provision that promoted a per diem reduction for travel over 30 days. DOD adopted this provision by mandating a per diem reduction to 75 percent of the locality rate for TDY at a single location lasting between 31 and 180 days, and to 55 percent for TDY of 181 or more days, which resulted in savings of over $56 million between November 2014 and December 2015. Although most cost-saving efforts at the selected agencies were not quantifiable, agency officials described how cost savings were likely achieved and what data limitations existed.
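The DOD per diem reduction described above lends itself to a simple worked sketch; the percentages come from the report, while the function name and flat-rate treatment are illustrative assumptions:

```python
# Minimal sketch of DOD's long-term TDY per diem reduction as described:
# full locality rate through day 30 at one location, 75 percent of the
# rate for TDY of 31-180 days, and 55 percent for TDY of 181 or more days.
# Function name and flat-rate treatment are illustrative assumptions.

def reduced_per_diem(locality_rate: float, tdy_days: int) -> float:
    """Return the daily per diem after the long-term TDY reduction."""
    if tdy_days <= 30:
        return locality_rate          # no reduction for short TDY
    if tdy_days <= 180:
        return locality_rate * 0.75   # 31-180 days at a single location
    return locality_rate * 0.55       # 181 or more days

# Example: a $200 locality rate on a 90-day TDY yields $150 per day.
print(reduced_per_diem(200.00, 90))   # 150.0
```

Applied across thousands of long-term TDY vouchers, a flat reduction of this kind is the sort of policy change whose savings can be quantified directly, which is consistent with DOD being one of the two agencies able to report a dollar figure.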
Officials said that attempting to quantify many of the cost-saving efforts would be labor intensive and would require documenting the cost of decisions that are not made at the individual travel voucher level. For example, at VA, a travel policy stipulated that in selecting a particular local transportation method, the agency should consider, among other things, the accessibility and availability of public transportation at the TDY location. Therefore, VA officials said they did not approve car rentals in areas where public transportation is accessible and available, such as Washington, D.C. While VA officials stated that this approach likely resulted in cost savings by having travelers avoid expensive hotels and city parking fees, they could not quantify these cost savings. In other instances, the selected agencies could not provide evidence of quantifiable cost savings at the aggregate level because travel data were not available from certain components within their agencies. For example, many sub-agencies within DHS kept track of and reported some cost savings through the elimination of non-mission-critical travel, and by maximizing the use of conference calls and web-based training and meetings. However, DHS did not track such savings at the department level and could not quantify them for us. According to DHS officials, they were unable to determine or quantify cost savings at an aggregate level for most measures because DHS components had a wide variety of ways to avoid costs that were most appropriate for their unique missions, and the agency did not have a tracking mechanism to identify costs avoided. In addition, State officials said that there may be some limitations in their ability to collect and report data since travel at State is decentralized, with each office authorizing, conducting, and collecting its own travel data.
Generally, it was not possible for most of the selected agencies to associate any one description of cost savings with a specific GSA provision. Officials at the agencies we interviewed told us that a wide range of factors influenced cost-saving efforts, including, but not limited to, GSA provisions contained within FTR amendments or GSA travel bulletins. For example, in addition to responding to GSA provisions at DHS, officials credited the Secretary's initiative to cut costs and improve overall operational efficiency as a reason for taking action on some of the same issues later recommended by GSA. Officials also cited other government-wide factors, including requirements under Executive Order 13589 and OMB Memorandum M-12-12, that influenced cost-saving efforts at agencies at the same time officials were responding to GSA's cost-saving provisions. According to officials at five of the six selected agencies, a number of limitations in the travel data systems designed by GSA and maintained by the agencies affected their ability to identify cost savings related to implementation of cost-saving provisions in FTR amendments and GSA travel bulletins. GSA officials stated that while most agencies have the capability via ETS2 to track, monitor, and report on cost savings, this reporting capability is not being leveraged consistently across the federal agencies to manage their travel costs. However, because agencies can customize ETS2 to fit their particular needs, these advances may still not provide for common reporting of travel information across federal agencies. Further, officials at GSA confirmed that while ETS2 offers better tracking and monitoring of travel costs compared to its predecessor, it still cannot provide a central means of collecting and reporting data that would be reliable for the purposes of comparison across agencies. Access to quality travel data is essential to performing key travel management tasks.
The 2015 Government-wide Travel Advisory Committee (GTAC) final report stated that access to quality data requires maintaining data from multiple sources in an integrated framework--whether a system, database, or data management tool--that would allow federal agencies to compile and maintain enterprise-level travel data sufficient to support business decisions, respond to government-wide data calls, leverage sourcing strategies, and comply with the Government Performance and Results Act of 1993. However, GTAC found that the federal government still had not started maintaining data from multiple sources in a single, centralized, and integrated framework, whether a system, database, or data management tool. The report's findings supported the GSA officials' statements. According to GSA officials, there is a need for more standardization in travel management reports to give GSA the ability to compare travel cost data between agencies that use either of the two ETS2 vendors and to report government-wide trends. Agencies' ability to customize their reporting options without also meeting standard reporting requirements hinders GSA's ability to establish a common metric for tracking and monitoring federal travel spending. Agencies are also unable to fully assess the travel costs incurred by their staff, and thus unable to fully identify areas for potential cost savings. ETS2 allows agencies the option to select between two vendors who can assist them with the tracking and monitoring of travel cost data. Agencies can customize and configure their travel data systems to meet their travel needs in alignment with their policies and business needs. According to GSA officials, because agencies can customize and configure their travel data systems, and because of the lack of government-wide standards on cost-savings metrics, it is difficult for GSA to facilitate peer-to-peer comparisons across the federal agencies.
For example, officials said that while both ETS2 vendors have a standard report to help agencies determine their hotel attachment rates within the system, one vendor's report collects information on hotels booked within the travel data system, and the other vendor's report collects information on hotels booked outside of the travel data system. As a result, it is difficult for agencies and GSA to determine if travelers at all agencies are taking advantage of the reduced rates available when booking their lodging along with transportation. Additionally, one of the goals of ETS2 is to help agencies achieve greater data transparency. The required TRIP reports provide travel spending information that was previously not tracked in ETS, including breaking out the travel and lodging costs into the five different categories of federal travel. Because of a standardized template used to generate the TRIP reports and the increased capacity of ETS2, agencies are now able to report travel spending in more consistent formats. While this indicates the potential for a greater level of tracking and monitoring of travel spending by agencies, GSA officials believe that this may not be achieved unless agencies adopt common reporting practices. According to GSA officials, in late 2015, they began to plan a shared services model to help agencies better manage their travel programs. This model would allow agencies to share a wide range of travel services with each other to reduce both administrative costs and burden to the government, and enable data-driven decision making. According to GSA officials, based on the draft business plan, the shared services model would be able to offer government-wide data collection, benchmarking, and reporting standards that agency managers can access and use to inform decisions.
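The hotel attachment rate discussed above is, in essence, a simple ratio. The sketch below (illustrative, not either vendor's actual report logic; all counts are hypothetical) shows why the same formula produces non-comparable numbers when the hotel bookings are drawn from different sources:

```python
# Hedged sketch of a hotel attachment rate: the share of trips with an air
# booking that also include a hotel booking. One ETS2 vendor counts hotels
# booked inside the travel system and the other counts hotels booked
# outside it, so identical formulas can yield rates that are not
# comparable across agencies. All counts below are hypothetical.

def hotel_attachment_rate(trips_with_air: int, trips_with_air_and_hotel: int) -> float:
    """Return the fraction of air trips that also carry a hotel booking."""
    if trips_with_air == 0:
        return 0.0
    return trips_with_air_and_hotel / trips_with_air

# The same agency's travel, measured against two different hotel data sources:
in_system = hotel_attachment_rate(200, 120)    # hotels booked in the system
any_source = hotel_attachment_rate(200, 170)   # hotels booked anywhere
print(in_system, any_source)  # 0.6 0.85
```

The two rates differ even though the underlying travel is identical, which is the comparability problem GSA officials describe.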
According to officials, under the shared services model, while the agencies would still be the decision makers regarding agency-specific travel policies, GSA would be able to advise them on which system configuration would allow the agencies to obtain the lowest travel costs relative to mission objectives. The six selected agencies that accounted for more than three-quarters of federal travel dollars (Agriculture, Defense, Homeland Security, Justice, State, and Veterans Affairs) pursued a variety of efforts aimed at reducing travel costs that generally aligned with GSA's amendments to the FTR and its travel bulletins. However, these agencies generally lacked data to track these efforts. GSA's efforts to track and monitor travel costs across federal agencies are similarly limited by a lack of standardized data as reported by individual agencies. More standardized data reporting could help GSA advise agencies on how to limit travel costs while still achieving their missions. GSA created the STOC to identify efficiencies and discuss practices for achieving travel cost savings. However, STOC members have not yet taken full advantage of this opportunity. As the STOC moves forward, opportunities exist to help agencies share information. Additional attention to these issues by the STOC can help agencies develop, implement, and share their travel cost-saving efforts, and could in turn help the STOC promote the more efficient use of travel funds across the federal government without imposing additional requirements on agencies. The Administrator of General Services, in consultation with the STOC, should develop a travel data management approach, including common reporting formats that would provide GSA with more consistent travel cost data, allowing GSA to compare travel costs across federal agencies.
GSA could also include in this data management approach the planned implementation of the shared services model that would allow agencies to share a wide range of travel services with each other. This process could reduce both administrative costs and burden to the government and enable data-driven decision making. The Administrator of GSA, as chair of the STOC, should work with the STOC to identify promising opportunities and implement leading practices to help agencies leverage their travel resources and implement travel cost-saving efforts. We provided a draft of this report to the Administrator of GSA and the Secretaries of Agriculture, Defense, Homeland Security, Justice, State, and Veterans Affairs for review and comment. GSA agreed with both recommendations, as discussed below. The Departments of Defense, Justice, State, and Veterans Affairs provided technical comments, which we incorporated as appropriate. The Departments of Agriculture and Homeland Security had no comments on our report. In written comments received on June 30, 2016, GSA agreed with the two recommendations in this report and agreed to take the following actions to address them. In response to the first recommendation, GSA officials stated that they will conduct a test program to determine the opportunities and barriers of creating a reliable standardized data repository containing government-wide travel spending data. To address the second recommendation, GSA officials agreed to establish an STOC working group to implement a process of documenting promising travel management and cost-saving practices, which could be used by Senior Travel Officials at their respective agencies. We are sending copies of this report to the appropriate congressional committees and the aforementioned agencies. In addition, the report is available at no charge on the GAO website at http://www.gao.gov.
If you or your staff have any questions about this report, please contact me at (202) 512-6806 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made key contributions to this report are listed in appendix III. 76 Fed. Reg. 63844, (Oct. 14, 2011) (codified at 41 C.F.R. SS 301-11) Agency must not reimburse real estate expenses or the lodging portion of per diem for the purchase or sale of a personal residence or recreational vehicle at the temporary duty (TDY) travel location. Agency must not reimburse the lodging portion of per diem to travelers who lodge at their personal residences while on TDY. Removal of Conference Lodging Allowance Provisions 78 Fed. Reg. 65210, (Oct. 31, 2013) (codified at 41 C.F.R. SSSS 301-11 and 301-70) Agency must no longer use the conference lodging allowance reimbursement option for employees on TDY. This refers to the allowance for travelers to exceed the lodging rate by up to 25 percent. If per diem lodging rates are unavailable at conference location, travelers should construct a cost comparison to decide whether to find lodging within per diem that is away from the conference location or reimburse actual expenses for lodging at the conference location that does not have a per diem rate. 80 Fed. Reg. 27259, (May 13, 2015) (codified at 41 C.F.R. SSSS 300-3, 301-10 and 301-70) Travelers must use the least expensive compact car available unless an exception is approved. Travelers will not be reimbursed for purchasing pre-paid fuel for rental cars. Travelers should refuel prior to returning the vehicle to the rental car company. Travelers will not be reimbursed for fees associated with rental car loyalty points or transfer of points charged by car companies. FTR 13-03, Dec. 21, 2012 Agency should justify that employee travel is necessary to accomplish the mission. Agency should consider technological alternatives to travel. 
Agency should consider all viable lowest-cost transportation options, such as selecting a non-contract airfare. Agency should have controls in place to collect all refunds for unused or partially used airline tickets. Agency should encourage employees to use public transportation as the first option for local transportation when on TDY. Agency should increase employee sharing of rental cars and taxis. Agency should encourage employees to evaluate all lodging options that are within per diem. Agency should evaluate reduced per diem for TDY assignments that last more than 30 days and a Temporary Change of Station for TDY assignments that last more than 180 days.

FTR 13-07, June 4, 2013: Agency should review internal policies to ensure that use of City Pair Program contract and non-contract air carriers results in overall cost savings. Agency may authorize use of non-contract airfares in three scenarios, including when a non-contract fare, if used, would result in a lower total trip cost to the government. Agency must consider (1) all direct costs, including per diem and actual transportation cost, and (2) indirect costs, including overtime and lost work time, when authorizing a method of transportation. Agency must assess risk associated with using non-contract airfares, which includes ensuring that the traveler reasonably anticipates using the ticket. Agency should configure the E-Gov Travel Service (ETS) to reflect an airfare selection policy that is designed to achieve the lowest total trip cost. Agency must book all travel airfare through appropriate ETS booking channels regardless of fare type.

FTR 14-05, Jan. 16, 2014: Agency should review agency policy to verify that it is abundantly clear that all rental cars must be authorized only when in the best interest of the government. Agency should ensure travelers book rental car reservations through ETS where available or arrange car rentals through the agency's Travel Management Center (TMC).
No other methods may be used. Agency should ensure travelers are familiar with the Defense Travel Management Office (DTMO) U.S. Government Rental Car Agreement and encourage them to rent cars from participating vendors. Agency should educate travelers to decline additional insurance, such as collision damage waiver or theft insurance, for which travelers generally may not be reimbursed.

FTR 14-08, May 13, 2014: Agencies are strongly encouraged to notify GSA's Office of Government-wide Policy of the name and contact information of the employee selected to be responsible on an agency-wide basis (i.e., the "Senior Travel Official") for ensuring efficient travel spending. Agencies should consider including certain major responsibilities as part of the Senior Travel Official position, such as researching best practices and recommending actions to improve the efficiency and effectiveness of travel programs, and directing and managing agency travel programs to obtain economy and efficiency.

In addition to the contact name above, Tara Carter (Assistant Director) and Joseph Santiago (analyst-in-charge) supervised the development of this report. Jehan Chase, Kelvin Dawson, Keith Logan, Michael O'Neill, Laurel Plume, Silvia Porres-Hernandez, Steven Putansu, Wesley Sholtes, and Stewart Small made key contributions to this report.

Federal agencies rely on travel to achieve a broad range of missions. GSA helps agencies develop travel policy by providing guidance to agencies, including issuing and revising the FTR. The administration and GSA have encouraged agencies to take steps to adopt cost-saving efforts and promote efficient travel spending. House Report 112-136 included a provision for GAO to report on whether FTR revisions resulted in measurable reductions in travel costs.
This report: 1) describes selected agencies' actions taken to address FTR revisions; 2) determines the extent to which FTR revisions led to cost savings; and 3) determines any cost savings achieved during fiscal years 2012 to 2015. GAO reviewed information from six selected federal agencies with the largest amount of travel spending in fiscal year 2015. GAO also reviewed how these agencies responded to GSA's FTR amendments and travel bulletins to achieve cost savings. The Departments of Agriculture, Defense, Homeland Security, Justice, State, and Veterans Affairs, the six federal agencies with the largest travel spending in fiscal year 2015, pursued a variety of cost-saving efforts that generally aligned with regulations and guidance issued in either Federal Travel Regulation (FTR) amendments or General Services Administration (GSA) travel bulletins from fiscal year 2011 to fiscal year 2015. GSA administers and revises the FTR, which interprets statutory and other policy requirements to ensure that official travel is conducted responsibly and that administrative costs are minimized. Although GSA does not have the authority to enforce the FTR, it issues FTR amendments and travel bulletins to help federal agencies manage their respective travel programs and achieve travel cost savings through the provisions contained in the amendments and travel bulletins. GSA FTR amendments and travel bulletins issued between fiscal years 2011 and 2015 contained a total of 27 cost-saving provisions. Agency officials at each of the six selected agencies stated that their respective agencies either had policies in place that already addressed the cost-saving provisions; developed new travel policies or issued guidance that reinforced the provisions or updated existing policies related to the provisions; or advised employees to follow the FTR without implementing an agency-specific policy.
The six agencies reported that GSA's review of the FTR to revise obsolete and outdated policies influenced their actions and resulted in cost savings. However, most of these savings could not be quantified. Only four cost-saving efforts at two agencies--the Departments of Defense and Justice--could be quantified. These agencies reported that a wide range of factors influenced their cost-saving efforts. In addition to FTR-compliance efforts, these agencies reported that administration actions on reducing travel costs, cutting waste, and promoting efficient spending influenced their approaches to managing travel costs. Agency officials reported that broader efforts to improve operational efficiency and to use resources responsibly also influenced their agency-specific policies and practices to promote efficient travel spending. According to GSA and officials from the six selected agencies, data limitations existed both within the selected agencies in terms of their ability to quantify travel-related cost savings, and government-wide in terms of comparing and aggregating travel data across agencies. Without standardized reporting practices, the federal government lacks common metrics for identifying, comparing, and evaluating travel spending across federal agencies. The Senior Travel Official Council (STOC) was formed in 2015 to identify efficiencies and discuss best practices related to travel cost savings. According to its charter, the STOC allows agencies to work toward more consistent reporting of travel data and share information on cost-saving efforts. Although the STOC has taken some initial action to bring agencies together, additional efforts to facilitate agencies' information sharing and identification of promising practices could further enhance efforts to encourage and achieve travel cost savings across the federal government.
GAO recommends that the Administrator of GSA, as chair of the STOC, work with the council to: 1) develop a travel data management approach that would provide GSA with more consistent travel cost data; and 2) identify and implement promising practices to help agencies leverage travel resources and achieve cost savings.
To respond to the Gulf Coast devastation, the federal government has committed an historically high level of resources--over $110 billion--through an array of grants, loan subsidies, and tax relief and incentives. The bulk of this assistance was provided between September 2005 and June 2006 through four emergency supplemental appropriations. A substantial portion of this assistance was directed to emergency assistance and meeting short-term needs arising from these hurricanes, such as relocation assistance, emergency housing, immediate levee repair, and debris removal efforts. Consequently, a relatively small portion of federal assistance is available for longer-term rebuilding activities such as the restoration of the region's housing and infrastructure. Later in this statement, I will discuss in greater detail the two programs that the federal government has used so far to provide assistance to the Gulf Coast for longer-term rebuilding. It is useful to view the federal assistance provided to the Gulf Coast within the context of the overall costs of the damages incurred by the region and the resources necessary to rebuild. Although there are no definitive or authoritative estimates of these costs, the various estimates of aspects of these costs offer a sense of their magnitude. For example, early damage estimates from the Congressional Budget Office (CBO) put capital losses from Hurricanes Katrina and Rita at a range of $70 billion to $130 billion, while another estimate put losses solely from Hurricane Katrina--including capital losses--at over $150 billion. Further, the state of Louisiana has estimated that the economic impact on its state alone could reach $200 billion. While the exact costs of damages and rebuilding the Gulf Coast may never be known, they will likely surpass those from the three other costliest disasters in recent history--Hurricane Andrew, the September 2001 terrorist attacks, and the 1994 Northridge earthquake.
These estimates raise important questions regarding additional assistance that will be needed to help the Gulf Coast rebuild in the future--including how the assistance will be provided and by whom. The federal government has so far used two key programs--the Federal Emergency Management Agency's (FEMA) Public Assistance program and the Department of Housing and Urban Development's (HUD) Community Development Block Grant (CDBG) program--to provide long-term rebuilding assistance to the Gulf Coast states. These two programs follow different funding models. Public Assistance provides funding on a project-by-project basis, involving an assessment of specific proposals to determine eligibility, while CDBG--a block grant--affords broad discretion and flexibility to states and localities. FEMA's Disaster Relief Fund (DRF) supports a range of grant programs in providing federal assistance to state and local governments, nongovernment organizations, and individuals when a disaster occurs. One of its largest programs--Public Assistance--provides assistance primarily to state and local governments to repair and rebuild damaged public infrastructure and includes activities such as removing debris, repairing roads, and reconstructing government buildings and utilities. Pursuant to the Robert T. Stafford Disaster Relief and Emergency Assistance Act (Stafford Act), this assistance is limited to either a fixed-dollar amount or a percentage of costs for restoring damaged facilities. Specifically, applicants submit requests for work, which are considered for eligibility and subsequent funding. FEMA obligates funds for approved projects, providing specific amounts to complete discrete work segments on projects, while state and local governments pay the remainder based on the state's cost share agreement with FEMA. As of March 16, 2007, FEMA has obligated about $4.6 billion to Louisiana and about $2 billion to Mississippi through its Public Assistance program.
HUD's Community Development Block Grant program--so far, the largest federal provider of long-term rebuilding assistance--received $16.7 billion in supplemental appropriations to help the Gulf Coast states rebuild damaged housing and other infrastructure. As shown in figure 1, Louisiana and Mississippi were allocated the largest shares of the CDBG appropriations, with $10.4 billion allocated to Louisiana and another $5.5 billion allocated to Mississippi. Florida, Alabama, and Texas received the remaining share of CDBG funds. These formula-based grants afford states and local governments a great deal of discretion in designing activities directed at neighborhood revitalization, housing rehabilitation, and economic development. In some instances, Congress has provided even greater flexibility when allocating additional CDBG funds to affected communities and states to help them recover from presidentially declared disasters, such as the Gulf Coast hurricanes. The Federal Coordinator for Gulf Coast Rebuilding has said that the CDBG program allows state leaders "who are closest to the issues" to make decisions regarding how the money should be spent. To receive CDBG funds, HUD required that each state submit an action plan describing how the funds would be used, including how the funds would address long-term "recovery and restoration of infrastructure." This process afforded the states broad discretion in deciding how to allocate their funding and for what purposes. To coordinate and oversee the state's rebuilding efforts, Louisiana created the Louisiana Recovery Authority (LRA) within the state's executive branch. As part of its responsibility, the LRA was also charged with establishing spending priorities and plans for the state's share of CDBG funds, subject to the approval of Louisiana's state legislature.
Mississippi developed its spending plans through the Mississippi Development Authority (MDA)--the state's lead economic and community development agency within its executive branch--and the Governor's Office of Recovery and Renewal. In contrast to Louisiana, Mississippi's state legislature was not involved in the approval process for these state funding decisions. Consistent with HUD requirements, both Louisiana and Mississippi published their action plans to solicit public input within their states regarding the planned use of CDBG funds. As shown in figure 2, each state allocated the majority of its share of CDBG funding to housing priorities. The remaining funds were allocated primarily to economic development and infrastructure priorities. With the vast number of homes that sustained damage in Louisiana and Mississippi, each state opted to direct the vast majority of its housing allocation to homeowners, although each state tailored its program to address its particular conditions. A portion of these allocations also was directed to other housing programs such as rental housing and public housing, as well as to projects that will alleviate costs associated with housing, such as utility and insurance costs. The Louisiana and Mississippi homeowner assistance programs are similar in that each is designed to compensate homeowners whose homes were damaged or destroyed by the storms. In each program, the amount of compensation that homeowners receive depends on the value of their homes before the storms and the amount of damage that was not covered by insurance or other forms of assistance. However, these programs differ in their premise and eligibility requirements. Louisiana witnessed a significant population loss in the wake of the Gulf Coast hurricanes, with many residents living in other states and debating whether to return to Louisiana.
The LRA, in consultation with state and federal agencies, developed a program to restore the housing infrastructure in Louisiana, using CDBG funds from supplemental appropriations, as described earlier. Referred to as the Road Home, this program is designed to encourage homeowners to return to Louisiana and begin rebuilding. Under the program, homeowners who decide to stay in the state and rebuild in Louisiana are eligible for the full amount of grant assistance--up to $150,000--while those leaving the state will receive a lesser share. Accordingly, aside from the elderly, residents who choose to sell their homes and leave the state will have their grant awards reduced by 40 percent. Residents who do not have insurance will have their grant awards reduced by 30 percent. Further, to receive compensation, homeowners must comply with applicable code and zoning requirements and FEMA advisory base flood elevations when rebuilding and agree to use their home as a primary residence at some point during a 3-year period after closing. As of March 28, 2007, the Road Home program had received 119,945 applications, of which 60,675 had been verified and an award amount had been calculated. Applicants were then asked to decide how they wanted to proceed (for example, whether to rebuild or sell). As of that date, 25,597 applicants notified the program of their decision. Of those, the program awarded payments to 4,808 homeowners with an average award amount of $74,250. In Mississippi, Katrina's storm surge destroyed tens of thousands of homes, many of which were located outside FEMA's designated flood plain and not covered by flood insurance. Mississippi developed a two-phase program to target homeowners who suffered losses due to the storm surge. Accordingly, Phase I of the program is designed to compensate homeowners whose properties were located outside the floodplain and were otherwise fully insured.
Eligible for up to $150,000 in compensation, these homeowners are not subject to a requirement to rebuild. Phase II of the program, on the other hand, is designed to award grants to uninsured and underinsured homeowners with incomes at or below 120 percent of the Area Median Income (AMI). Eligible for up to $100,000 in grant awards, these homeowners must demonstrate that they meet current building codes and standards as a condition to receiving their grants. While they are required to rebuild in south Mississippi, they are not required to stay in their homes once they have been rebuilt. In addition, homeowners who do not have insurance will have their grant reduced by 30 percent, although this penalty does not apply to the "special needs" populations as defined by the state (i.e., elderly, disabled, and low income). As of March 28, 2007, Mississippi had received 18,465 applications for Phase I of its program, of which 14,974 were determined eligible for consideration. Of those, Mississippi awarded payments to 11,894 homeowners with an average award amount of $69,669. Mississippi has yet to complete processing applications for any of the more than 10,000 uninsured and underinsured homeowners in Phase II of the program. It is clear that Louisiana's and Mississippi's homeowner assistance programs are proceeding at different paces. While we did not assess the causes for these differences, we have begun work as requested by the Senate Homeland Security and Governmental Affairs Committee to examine particular aspects of the CDBG program that may provide important insights into these issues. Restoring the region's housing and infrastructure is taking place in the context of broader planning and coordination activities; in Louisiana and Mississippi, state and local governments are engaged in both short- and long-term planning efforts. 
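Before turning to planning activities, the grant arithmetic described above for Louisiana's Road Home program can be illustrated with a rough calculation. This is a simplified, hypothetical sketch based only on the cap and percentage reductions stated here; actual awards depended on pre-storm home value, insurance proceeds, and other assistance, and whether the two reductions combine is an assumption for illustration:

```python
def road_home_award(uncompensated_loss, stays_in_state, insured, elderly=False):
    """Illustrative Road Home grant calculation (simplified sketch).

    Reflects only the rules stated in the text: awards are capped at
    $150,000, reduced 40 percent for non-elderly homeowners who sell
    and leave the state, and reduced 30 percent for homeowners without
    insurance. Combining both reductions is an assumption.
    """
    award = min(uncompensated_loss, 150_000)   # statutory-style cap
    if not stays_in_state and not elderly:
        award *= 0.60                          # 40 percent reduction for leaving
    if not insured:
        award *= 0.70                          # 30 percent reduction if uninsured
    return round(award, 2)

# e.g., an insured homeowner with $120,000 in uncompensated damage who
# rebuilds in state keeps the full $120,000; one who sells and leaves
# would receive $72,000 under these simplified rules:
# road_home_award(120_000, stays_in_state=False, insured=True) -> 72000.0
```

Mississippi's program follows the same pattern with a $100,000 cap for its Phase II grants and the same 30 percent uninsured reduction, waived for the state-defined special needs populations.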
The federal government--specifically, the Coordinator of Federal Support for the Recovery and Rebuilding of the Gulf Coast Region--is responsible for coordinating the activities of the numerous federal departments and agencies involved in rebuilding as well as supporting rebuilding efforts at the state and local level. Based on our preliminary work, I would like to describe some of these activities being undertaken in Louisiana and Mississippi as well as the activities of the federal government. What will be rebuilt in many areas of Louisiana remains uncertain, as a number of planning efforts at the state and local levels are still evolving. At the state level, the LRA has coordinated a statewide rebuilding planning effort that included retaining professional planners and moving towards a comprehensive rebuilding plan. To facilitate this effort, the LRA endorsed Louisiana Speaks--a multifaceted process for helping the LRA develop a comprehensive rebuilding plan for Southern Louisiana and for providing rebuilding planning resources to homeowners, businesses, communities, and parishes. For example, Louisiana Speaks developed and distributed a pattern book for homeowners, architects, and permitting officials about how to redesign and rebuild commercial and residential buildings. Through this process, local design workshops--called charrettes--have been developed to guide neighborhood planning efforts in the impacted areas, while teams of professional planners, FEMA officials, and LRA officials and representatives work with affected local parishes to develop long-term parish recovery plans. Through extensive public input, Louisiana Speaks also seeks to develop a regional plan for Southern Louisiana, focusing on a number of critical challenges for the state's redevelopment. 
The regional plan will evaluate economic, environmental, and social issues that affect Southern Louisiana and explore alternative ways that growth and development can be accommodated in the context of varying environmental, economic, and cultural changes. The state of Louisiana will then use the regional plan to help direct rebuilding policy and Louisiana's long-term spending over the next 30 years. Given the central importance of the city to Louisiana's overall economy, I would like to highlight planning efforts in New Orleans. After several attempts to develop a rebuilding plan for New Orleans--including the Bring New Orleans Back Commission, efforts initiated by the city council, Urban Land Institute, and others--in August 2006, New Orleans embarked on a comprehensive rebuilding planning process, which continues to date. Referred to as the Unified New Orleans Plan (UNOP), this effort was designed as a grassroots approach to planning to incorporate the vision of neighborhoods and districts into multiple district-level plans and one citywide plan that establishes goals and priorities for rebuilding the city. In particular, the citywide plan will include priority programs and projects for repairing and rebuilding the city over a 5- to 10-year period and will help to inform critical funding and resource allocation decisions by state and federal agencies. The citywide plan is currently under review by the New Orleans Planning Commission. Mississippi created an overall plan to serve as a framework for subsequent planning efforts in affected areas of the state. More specifically, in September 2005--within days of the hurricanes' landfall--Governor Barbour created the Governor's Commission on Recovery, Rebuilding and Renewal to identify rebuilding and redevelopment options for the state. 
Composed of over 20 committees, the Commission held numerous public forums across multiple counties in an effort to solicit input and public participation from residents throughout the state. In December 2005, the commission's work culminated in a final report containing 238 policy recommendations aimed at addressing a range of rebuilding issues and concerns across the state, from infrastructure and economic development to human services and finance. The report also addressed potential financing mechanisms, identifying state, local, private, and federal sources. Further, the recommendations identified parties responsible for implementing the recommendations, including the creation of new state and regional entities to oversee selected recommendations. In addition, Governor Barbour created the Office of Recovery and Renewal to oversee and coordinate implementation of these recommendations. Also charged with identifying funding for rebuilding projects, the office continues to work with public and private entities as well as state and local governments. Local governments in south Mississippi are also engaged in rebuilding planning activities. For example, modeled after the Governor's Commission on Recovery, Rebuilding and Renewal, the city of Biloxi established a volunteer steering committee to develop a rebuilding plan for the city. Biloxi's final rebuilding plan resulted in 162 recommendations to address core issues affecting the city, such as infrastructure, economic development, human services, and finance. In addition, the steering committee commissioned a separate rebuilding plan for East Biloxi--a low-lying area that had been heavily damaged by Hurricane Katrina--that included 27 recommendations for addressing this area of the city. A number of other impacted communities in south Mississippi have undertaken planning initiatives as well.
In light of the magnitude of the Gulf Coast hurricanes, the administration recognized the need to provide a mechanism to coordinate with--and support rebuilding activities at--the federal, state, and local levels. More specifically, in November 2005, the President issued executive orders establishing two new entities to help provide a governmentwide response to federal rebuilding efforts. The first of these orders created the position of Coordinator of Federal Support for the Recovery and Rebuilding of the Gulf Coast Region within the Department of Homeland Security. Accordingly, the Federal Coordinator is responsible for developing principles and goals, leading the development of federal recovery activities, and monitoring the implementation of designated federal support. The Coordinator also serves as the administration's focal point for managing information flow, requests for actions, and discussions with Congress, state, and local governments, the private sector, and community leaders. Our discussions with state and local officials in Louisiana revealed a largely positive disposition towards the Federal Coordinator and his role in support of the Gulf Coast. During our field work, for example, Louisiana state and local officials said the Coordinator had played an integral role in helping to identify and negotiate an appropriate level of CDBG funding for the state. The second executive order established a Gulf Coast Recovery and Rebuilding Council within the Executive Office of the President for a period of 3 years. Chaired by the Assistant to the President for Economic Policy, the council includes most members of the Cabinet and is charged with examining issues related to the furtherance of the President's policy on recovery and rebuilding of the Gulf Coast. 
Rebuilding efforts in the Gulf Coast are at a critical turning point--a time when decisions now being made in community rooms, city halls, and state houses will have a significant impact on the complexion and future of the Gulf Coast. As states and localities begin to assume responsibility for developing plans for rebuilding, there are difficult policy decisions Congress will need to make about the federal government's contribution to the rebuilding effort and the role it might play over the long term in an era of competing priorities. Based on the preliminary work I have discussed today, the Subcommittee may wish to consider the following questions as it continues to carry out its critical oversight function in reviewing Gulf Coast rebuilding efforts: How much will it cost to rebuild the Gulf Coast, and how much of this cost should the federal government bear? How effective are current funding delivery mechanisms--such as Public Assistance and CDBG--and should they be modified or supplemented by other mechanisms? How can the federal government further partner with state and local governments and the nonprofit and private sectors to leverage the public investment in rebuilding? Madam Chair and Members of the Subcommittee, this concludes my statement. I would be happy to respond to any questions you or other members of the Subcommittee may have at this time. For information about this testimony, please contact Stanley J. Czerwinski, Director, Strategic Issues, at (202) 512-6806 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this statement. Individuals making key contributions to this testimony include Charlesetta Bailey, Dean Campbell, Roshni Dave, Peter Del Toro, Laura Kunz, Brenda Rabinowitz, Michael Springer, and Diana Zinkl.

Hurricane Katrina: Allocation and Use of $2 Billion for Medicaid and Other Health Care Needs. GAO-07-67. February 28, 2007.
Disaster Assistance: Better Planning Needed for Housing Victims of Catastrophic Disasters. GAO-07-88. February 28, 2007.
Small Business Administration: Additional Steps Needed to Enhance Agency Preparedness for Future Disasters. GAO-07-114. February 14, 2007.
Small Business Administration: Response to the Gulf Coast Hurricanes Highlights Need for Enhanced Disaster Preparedness. GAO-07-484T. February 14, 2007.
Hurricanes Katrina and Rita: Federal Actions Could Enhance Preparedness of Certain State-Administered Federal Support Programs. GAO-07-219. February 7, 2007.
Hurricanes Katrina and Rita Disaster Relief: Prevention is the Key to Minimizing Fraud, Waste, and Abuse in Recovery Effort. GAO-07-418T. January 29, 2007.
Hurricane Katrina: Status of Hospital Inpatient and Emergency Departments in the Greater New Orleans Area. GAO-06-1003. September 29, 2006.
Catastrophic Disasters: Enhanced Leadership, Capabilities, and Accountability Controls Will Improve the Effectiveness of the Nation's Preparedness, Response, and Recovery System. GAO-06-618. September 6, 2006.
Disaster Relief: Governmentwide Framework Needed to Collect and Consolidate Information to Report on Billions in Federal Funding for the 2005 Gulf Coast Hurricanes. GAO-06-834. September 6, 2006.
Coast Guard: Observations on the Preparation, Response, and Recovery Missions Related to Hurricane Katrina. GAO-06-903. July 31, 2006.
Hurricane Katrina: Improving Federal Contracting Practices in Disaster Recovery Operations. GAO-06-714T. May 4, 2006.
Hurricane Katrina: Planning for and Management of Federal Disaster Recovery Contracts. GAO-06-622T. April 10, 2006.
Hurricane Katrina: Status of the Health Care System in New Orleans and Difficult Decisions Related to Efforts to Rebuild It Approximately 6 Months after Hurricane Katrina. GAO-06-576R. March 28, 2006.
Hurricane Katrina: GAO's Preliminary Observations Regarding Preparedness, Response, and Recovery. GAO-06-442T. March 8, 2006.
Hurricanes Katrina and Rita: Preliminary Observations on Contracting for Response and Recovery Efforts. GAO-06-246T. November 8, 2005.
Hurricanes Katrina and Rita: Contracting for Response and Recovery Efforts. GAO-06-235T. November 2, 2005.
Hurricane Katrina: Providing Oversight of the Nation's Preparedness, Response, and Recovery Activities. GAO-05-1053T. September 28, 2005.
Biscuit Fire Recovery Project: Analysis of Project Development, Salvage Sales, and Other Activities. GAO-06-967. September 18, 2006.
September 11: Overview of Federal Disaster Assistance to the New York City Area. GAO-04-72. October 31, 2003.
Disaster Assistance: Information on FEMA's Post 9/11 Public Assistance to the New York City Area. GAO-03-926. August 29, 2003.
Small Business Administration: Response to September 11 Victims and Performance Measures for Disaster Lending. GAO-03-385. January 29, 2003.
September 11: Small Business Assistance Provided in Lower Manhattan in Response to the Terrorist Attacks. GAO-03-88. November 1, 2002.
Los Angeles Earthquake: Opinions of Officials on Federal Impediments to Rebuilding. GAO/RCED-94-193. June 17, 1994.
Hurricane Iniki Expenditures. GAO/RCED-94-132R. April 18, 1994.
Time-Critical Aid: Disaster Reconstruction Assistance--A Better Delivery System Is Needed. GAO/NSIAD-87-1. October 16, 1986.
Guidelines for Rescuing Large Failing Firms and Municipalities. GAO/GGD-84-34. March 29, 1984.
Rebuilding Iraq: More Comprehensive National Strategy Needed to Help Achieve U.S. Goals. GAO-06-788. July 11, 2006.
Foreign Assistance: USAID Completed Many Caribbean Disaster Recovery Activities, but Several Challenges Hampered Efforts. GAO-06-645. May 26, 2006.
Foreign Assistance: USAID Has Begun Tsunami Reconstruction in Indonesia and Sri Lanka, but Key Projects May Exceed Initial Cost and Schedule Estimates. GAO-06-488. April 14, 2006.
Foreign Assistance: USAID's Earthquake Recovery Program in El Salvador Has Made Progress, but Key Activities Are Behind Schedule. GAO-03-656. May 15, 2003.
Foreign Assistance: Disaster Recovery Program Addressed Intended Purposes, but USAID Needs Greater Flexibility to Improve Its Response Capability. GAO-02-787. July 24, 2002.
Foreign Assistance: Implementing Disaster Recovery Assistance in Latin America. GAO-01-541T. March 21, 2001.

This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.

The size and scope of the devastation caused by the 2005 Gulf Coast hurricanes present unprecedented rebuilding challenges. Today, more than a year and a half since the hurricanes made landfall, rebuilding efforts are at a critical turning point. The Gulf Coast must face the daunting challenge of rebuilding its communities and neighborhoods--some from the ground up. This testimony (1) places the federal assistance provided to date in the context of the resources likely needed to rebuild the Gulf Coast, (2) discusses key federal programs currently being used to provide rebuilding assistance, with an emphasis on the Department of Housing and Urban Development's (HUD) Community Development Block Grant (CDBG) program, (3) describes Louisiana's and Mississippi's approach to using CDBG funds, and (4) provides observations on planning activities in Louisiana and Mississippi and the federal government's role in coordinating rebuilding efforts. GAO visited the Gulf Coast region, reviewed state and local documents, and interviewed federal, state, and local officials.
While the federal government has provided billions of dollars in assistance to the Gulf Coast, a substantial portion was directed to short-term needs, leaving a smaller portion for longer-term rebuilding. It may be useful to view this assistance in the context of the costs of damages incurred by the region and the resources necessary to rebuild. Some damage estimates have put capital losses at a range of $70 billion to over $150 billion, while the State of Louisiana estimated that the economic impact on its state alone could reach $200 billion. Such estimates raise important questions regarding additional assistance that will be needed to help the Gulf Coast rebuild in the future. To date, the federal government has provided long-term rebuilding assistance to the Gulf Coast through two key programs, which follow different funding models. The Federal Emergency Management Agency's public assistance program provides public infrastructure funding for specific projects that meet program eligibility requirements. HUD's CDBG program, on the other hand, provides funding for neighborhood revitalization and housing rehabilitation activities, affording states broad discretion and flexibility. To date, the affected states have received $16.7 billion in CDBG funding from supplemental appropriations--so far, the largest share of funding targeted to rebuilding. With the vast number of homes that sustained damage in Louisiana and Mississippi, each state allocated the bulk of its CDBG funds to homeowner assistance. Louisiana developed an assistance program to encourage homeowners to return to Louisiana and begin rebuilding, while Mississippi developed a program to target homeowners who suffered losses due to Katrina's storm surge that were not covered by insurance. As of March 28, 2007, Louisiana has awarded 4,808 grants to homeowners with an average award amount of $74,250. Mississippi has awarded 11,894 grants with an average award amount of $69,669.
Restoring the region's housing and infrastructure is taking place in the context of broader planning and coordination activities. In Louisiana and Mississippi, state and local governments are engaged in both short- and long-term planning efforts. Further, the President established a position within the Department of Homeland Security to coordinate and support rebuilding activities at the federal, state, and local levels. As states and localities begin to develop plans for rebuilding, there are difficult policy decisions Congress will need to make about the federal government's contribution to the rebuilding effort and the role it might play over the long term in an era of competing priorities. Based on our work, we raise a number of questions the Subcommittee may wish to consider in its oversight of Gulf Coast rebuilding. Such questions relate to the costs for rebuilding the Gulf Coast--including the federal government's share, the effectiveness of current funding delivery mechanisms, and the federal government's efforts to leverage the public investment in rebuilding.
Under Presidential Decision Directive (PDD) 39 (U.S. Policy on Counterterrorism, June 1995), the National Security Council (NSC) is to coordinate interagency terrorism policy issues and review ongoing crisis operations and activities concerning foreign terrorism and domestic terrorism with significant foreign involvement. An NSC-chaired coordinating group is to ensure the PDD is implemented but does not have authority to direct agencies' activities. Among its general mission responsibilities, the Office of Management and Budget (OMB) is to evaluate the effectiveness of agency programs, policies, and procedures; assess competing funding demands among agencies; set funding priorities; and develop better performance measures and coordinating mechanisms. Further, according to PDD 39, OMB is to analyze the adequacy of funding for terrorism-related programs and ensure the adequacy of funding for research, development, and acquisition of counterterrorism-related technology and systems on an ongoing basis. Under PDD 39, the State Department and the Department of Justice, through the Federal Bureau of Investigation (FBI), have lead federal agency responsibility for dealing with terrorist incidents overseas and domestically, respectively. Numerous federal departments, agencies, bureaus, and offices also have terrorism-related programs and activities that are funded through annual and supplemental appropriations. (See app. I for a list of federal entities with terrorism-related programs and activities.) Terrorism-related funding requests include nearly $290 million provided under the 1995 Emergency Supplemental Appropriations Act (P.L. 104-19) in the aftermath of the domestic terrorist attack in Oklahoma City and $1.1 billion proposed for counterterrorism programs within a number of agencies in fiscal year 1996 supplemental appropriations and fiscal year 1997 budget amendments. 
The Government Performance and Results Act (Results Act) of 1993 is intended to improve the management and accountability of federal agencies. The Results Act seeks to shift the focus of federal management and decision-making from activities that are undertaken to the results of activities as reflected in citizens' lives. Specifically, it requires federal agencies to prepare multiyear strategic plans and annual performance plans, establish program performance measures and goals, and provide annual performance reports to the Congress. Agencies submitted the first strategic plans to OMB and the Congress by September 30, 1997; the first annual performance plans, covering fiscal year 1999, are to be submitted to the Congress after the President's budget submission in 1998. In recent years, several efforts have been undertaken to coordinate federal programs that cut across agencies to help ensure that national needs are being effectively targeted. These efforts have shown that coordinating crosscutting programs takes time and sustained attention and, because of the statutory bases of crosscutting programs, may require congressional involvement to integrate the federal response to national needs. With the large number of government entities involved, the federal effort to combat terrorism is one example of a crosscutting program to which Results Act principles and measures might be applied. Federal agencies are not required to account separately for their terrorism-related programs and activities. Because most federal agencies do not isolate or account specifically for terrorism-related funding, it is difficult to determine how much the government budgets and spends to combat terrorism. Key agencies provided us their estimates of terrorism-related spending, using their own definitions. 
These estimates totaled nearly $7 billion for unclassified programs and activities for fiscal year 1997, and should be considered a minimum estimate of federal spending for unclassified terrorism-related programs and activities. The amounts for governmentwide terrorism-related funding and spending are uncertain because (1) definitions of antiterrorism and counterterrorism vary from agency to agency; (2) in most cases agencies do not have separate budget line items for terrorism-related activities; (3) some agency functions serve more than one purpose, and it is difficult to allocate costs applicable to terrorism alone (e.g., U.S. embassy security measures protect not only against terrorism but also against theft, compromise of classified documents, and violent demonstrations); (4) some agencies, such as the Departments of Energy and Transportation, have decentralized budgeting and accounting functions and do not aggregate terrorism-related funding agencywide; (5) programs and activities may receive funding from more than one appropriation within a given agency, which makes it difficult to track collective totals; and (6) appropriations legislation often is not clear regarding which amounts are designated to combat terrorism. At our request, the primary agencies leading or supporting operational crisis response and management activities under PDD 39 provided spending data for fiscal years 1994 to 1996 (not all agencies were able to provide historical data prior to fiscal year 1996) and estimates for fiscal year 1997 (see table 1). Figure 1 indicates that DOD spent the largest share of estimated terrorism-related funds for fiscal year 1997, followed by the Department of Energy. While DOD and the Department of Energy estimated spending accounted for 76 percent of the unclassified fiscal year 1997 terrorism-related funds, other agencies' resources dedicated to combating terrorism have significantly increased in recent years. 
For example, FAA resources tripled (in current dollars) during fiscal years 1994-97, and FBI resources increased five-fold. FAA increased equipment purchases and aviation security operations, and the FBI nearly tripled the authorized staffing level dedicated to combating terrorism, with the largest staff increase occurring in fiscal year 1997. There is no interagency mechanism to centrally manage funding requirements and requests to ensure an efficient, focused governmentwide application of federal funds to numerous agencies' programs designed to combat terrorism. Given the high national priority and magnitude of this nearly $7-billion federal effort, sound management principles dictate that (1) governmentwide requirements be prioritized to meet the objectives of national policy and strategy and (2) spending and program data be collected from the federal agencies involved to conduct annual, crosscutting evaluations of their funding requests based on the threat and risk of terrorist attack and to avoid duplicated efforts or serious funding gaps. Neither NSC nor OMB currently performs these functions for the governmentwide program to combat terrorism. Rather, each agency is responsible for identifying and seeking funding for its priorities within its own budget allocation, and OMB reviews the budget requests on an agency-by-agency basis. Because individual agencies continue to propose new programs, activities, and capabilities to combat terrorism, annual crosscutting evaluations of agency budget requests for such programs would be prudent to help avoid duplicated efforts. Under PDD 39, NSC is to ensure the federal policy and strategy for combating terrorism is implemented. 
Although PDD 39 establishes interagency coordinating and working groups under the auspices of NSC to handle policy and operational issues related to combating terrorism, these groups operate on a consensus basis, do not have decision-making authority, and do not establish governmentwide resource priorities for combating terrorism. Moreover, PDD 39 does not assign responsibility to NSC to ensure that terrorism-related requirements and related funding proposals (1) are analyzed and reviewed to ensure they are based on a validated assessment of the terrorism threat and risks of terrorist attack, (2) provide a measured and appropriate level of effort across the federal government, (3) avoid duplicative efforts and capabilities, and (4) are prioritized governmentwide in a comprehensive strategy to combat the terrorist threat. PDD 39 requires OMB to analyze the adequacy of funding for terrorism-related programs, technology, and systems. Further, OMB's general mission responsibilities include evaluating the effectiveness of federal programs and policies, assessing competing funding demands, and setting funding priorities. However, PDD 39 does not specifically require OMB to prioritize terrorism-related requirements governmentwide or to gather funding data across agencies and perform the crosscutting analyses of agencies' funding proposals necessary to ensure the efficient use of federal resources. OMB examiners who review individual agencies' terrorism-related funding requests explained that although they do not review activities and programs to combat terrorism on a crosscutting basis as such, they often discuss funding issues with each other during their reviews. Further, they bring issues they identify during their reviews to the attention of senior OMB officials. For example, OMB said it reviewed the FBI's funding requests for a hazardous materials laboratory capability and for increased staffing to combat terrorism. 
However, because OMB did not provide evidence of its reviews, we could not verify the extent to which OMB considered the capabilities of other federal laboratories or analyzed the FBI's request for increased staffing based on workload data and on the threat and risk of terrorism. Further, because terrorism-related funding requirements and proposals have not been prioritized across agencies, OMB could not have fully considered tradeoffs among competing demands. For this reason, it is unclear, for example, whether OMB's denial of an FBI request for an aircraft that the FBI said was required for counterterrorism and other operations was based on an assessment of terrorism-related priorities across the government or of only the FBI's funding requests. OMB stated that in addition to its examination of agencies' funding requests, it has met its responsibilities under PDD 39 by reviewing DOD's counterterrorism program baseline funding and program submission, participating in interagency meetings designed to better identify terrorism-related budget functions that are embedded in broader funding accounts, and reviewing specific technology proposals (such as FAA proposals for explosives detection technology). Also, consistent with its role, OMB prepared the President's $1.1-billion request for terrorism-related programs and activities. We submitted a letter of inquiry to OMB to obtain information about OMB's role in reviewing federal agencies' budget requests and spending to combat terrorism. Our questions and OMB's written response appear in appendixes II and III, respectively. While OMB said that it analyzes individual agencies' funding requests--and some examiners say they share information during their examinations--OMB does not regularly perform crosscutting analyses of requirements, priorities, and funding for the overall federal effort to combat terrorism.
Consequently, OMB cannot provide reasonable assurance that specific federal activities and programs to combat terrorism (1) are required based on a full assessment of the threat and risk involved, (2) avoid unnecessary duplication of effort or capability with other agencies, and (3) meet governmentwide priorities for effectively and efficiently implementing the national strategy on combating terrorism. Section 1501 of the recently enacted National Defense Authorization Act for Fiscal Year 1998 requires OMB to establish a reporting system for executive agencies on the budgeting and expenditure of funds for counterterrorism and antiterrorism programs and activities. The section also requires OMB, using the reporting system, to collect agency budget and expenditure information on these programs and activities. Further, the President is required to submit an annual report to the Congress containing agency budget and expenditure information on counterterrorism and antiterrorism programs and activities. The report is also to identify any priorities and any duplication of efforts with respect to such programs and activities. The Results Act requires each executive branch agency to define its mission and desired outcomes, measure performance, and use performance information to ensure that programs meet intended goals. However, the national policy, strategy, programs, and activities to combat terrorism cut across agency lines. The act's emphasis on results implies that federal programs contributing to the same or similar outcomes should be closely coordinated to ensure that goals are consistent and that program efforts are mutually reinforcing. Effective implementation of the act governmentwide should eventually help prevent uncoordinated crosscutting program efforts that can waste funds and limit the overall effectiveness of the federal effort. 
The principles underlying the Results Act provide guidance that the many federal agencies responsible for combating terrorism can use to develop coordinated goals, objectives, and performance measures and to improve the management of individual agency and overall federal efforts to combat terrorism. For example, the act focuses on clarifying missions, setting program goals, and measuring performance toward achieving those goals. In our work examining implementation of the Results Act, we identified several critical issues that need to be addressed if the act is to succeed in improving management of crosscutting program efforts by ensuring that those programs are appropriately and substantively coordinated. As their implementation of the Results Act continues to evolve, agencies with terrorism-related responsibilities may become more aware of the potential for and desirability of coordinating performance plans, goals, and measures for their crosscutting activities and programs. The next phase of implementation of the Results Act requires agencies to develop annual performance plans that are linked to their strategic plans. These plans are to contain annual performance goals, performance measures to gauge progress toward achieving the goals, and the resources agencies will need to meet their goals. The development of annual plans may provide the many federal agencies responsible for combating terrorism the next opportunity to develop coordinated goals, objectives, and performance measures for programs and activities that combat terrorism and to articulate how they plan to manage this crosscutting program area. The Economy Act of 1932 (31 U.S.C. 1535, as amended) generally requires federal agencies to reimburse other federal agencies that provide them with support. However, PDD 39 states that federal agencies providing support to lead agencies' counterterrorist operations or activities must bear the cost unless otherwise directed by the President. 
Because the Economy Act and PDD 39 differ in their treatment of reimbursement, DOD and the FBI have disagreed on whether the FBI must reimburse DOD for its support of counterterrorist operations. Primary examples of DOD support involve air transportation to return terrorists from overseas locations or other deployments of FBI personnel and equipment for special events or for the investigation of terrorist incidents. DOD officials stated that PDD 39 does not have the force of statutory authority regarding whether or not DOD's support to another agency is reimbursable. These officials believe the Economy Act requires DOD to provide the requested support on a reimbursable basis unless another statute allows for nonreimbursable support. Every request for DOD support requires a legal determination of which statutes are applicable and whether the Economy Act applies. DOD believes that PDD 39 does not control the legal determination of reimbursement. The issue of reimbursement has caused two concerns within the FBI: (1) the potential impairment of its operations under PDD 39 or other authorities and (2) the availability of funding for operations under PDD 39 if DOD does not provide nonreimbursable support. According to the FBI, DOD ultimately provides nonreimbursable support in most cases, but delays and uncertainties involved in DOD's decision process on reimbursement frequently threaten timely FBI deployments. DOD officials cited an example of the process it follows when the FBI, through the Attorney General, requests support under PDD 39. In response to an Attorney General request that DOD provide air transportation for FBI personnel and equipment to prepare for the June 1997 Summit of the Eight in Denver, Colorado, DOD identified a statute that allowed nonreimbursable support regarding the provision of security to foreign dignitaries. Otherwise, the Economy Act would have required the FBI to reimburse DOD for the transportation costs. 
In an attempt to alleviate concern and confusion over reimbursement of support activities, NSC tasked a special working group on interagency operations to explore solutions. According to NSC, possible solutions include legislation to provide DOD with special authority to provide nonreimbursable support or to set aside contingency funds for domestic emergency support team activities. The Department of Justice commented that DOD-provided transportation services and assistance provided in response to terrorist activities involving a weapon of mass destruction should be exempt from the requirements of the Economy Act. DOD commented that it is also considering various legislative options to permit nonreimbursable support for counterterrorism operations. At the time of our review, the issue remained unresolved. Billions of dollars are being spent by numerous agencies with roles or potential roles in combating terrorism, but because no federal entity has been tasked to collect such information across the government, the specific amount is unknown. Further, no governmentwide spending priorities for the various aspects of combating terrorism have been set, and no federal entity manages the crosscutting program to channel resources where they are most needed in consideration of the threat and the risk of terrorist attack and to prevent wasteful spending that might occur from unnecessary duplication of effort. Recent legislation requires that OMB establish a reporting system for executive agencies on the budgeting and expenditure of funds for counterterrorism and antiterrorism programs and activities and that the President report this information annually to the Congress, along with program priorities and any duplication of effort. 
We recommend that consistent with the responsibility for coordinating efforts to combat terrorism, the Assistant to the President for National Security Affairs, NSC, in consultation with the Director, OMB, and the heads of other executive branch agencies, take steps to ensure that (1) governmentwide priorities to implement the national counterterrorism policy and strategy are established; (2) agencies' programs, projects, activities, and requirements for combating terrorism are analyzed in relation to established governmentwide priorities; and (3) resources are allocated based on the established priorities and assessments of the threat and risk of terrorist attack. To ensure that federal expenditures for terrorism-related activities are well-coordinated and focused on efficiently meeting the goals of U.S. policy under PDD 39, we recommend that the Director, OMB, use data on funds budgeted and spent by executive departments and agencies to evaluate and coordinate projects and recommend resource allocation annually on a crosscutting basis to ensure that governmentwide priorities for combating terrorism are met and programs are based on analytically sound threat and risk assessments and avoid unnecessary duplication. In a draft of this report we also recommended that the Director, OMB, establish a governmentwide mechanism for reporting expenditures to combat terrorism. We deleted that recommendation in view of the requirements of the recently enacted legislation. Our remaining recommendations are consistent with and complement this legislation. In written comments on a draft of this report, the Department of Defense concurred with our findings. DOD noted that we identified a significant issue involving reimbursement for and providing DOD support to other federal agencies under PDD 39. 
DOD commented that although PDD 39 states that support provided by a federal agency to the lead federal agency in support of counterterrorist operations is borne by the providing agency, PDD 39 is not a statute, and does not provide authority to waive reimbursement that is required by the Economy Act. DOD also discussed in its comments specific legislative options it is considering to resolve the issue. (DOD's comments and our response are in app. IV.) In its written comments, the State Department pointed out that, although interagency funding requirements for combating terrorism are not managed by any single mechanism, overall counterterrorism and antiterrorism spending is discussed by NSC's Coordinating Sub-Group and interagency coordination occurs in other contexts. We agree that interagency coordination occurs at various forums in the counterterrorism community but such coordination mechanisms do not perform the functions we are recommending to NSC and OMB. State also highlighted the difficulties of determining the amount of funds spent to combat terrorism with a certain level of precision. We agree that it would be difficult and possibly not cost-effective to account for programs and activities that combat terrorism with a high degree of precision. Nevertheless, at the time of our review, information on federal spending to combat terrorism had not been gathered in any form or at any level of specificity, and we believe that a reasonable methodology could be devised to allow OMB to capture this data governmentwide. State also noted that efforts to coordinate programs and activities and prevent duplication are further complicated by the authorization and appropriations process in the Congress, because various committees have jurisdiction over the federal agencies involved in combating terrorism. State finally noted that it is important to have good working relations with other countries to effectively counter international terrorism. 
(State's comments and our response are in app. V.) OMB noted in its written comments that although our recommendations are consistent with policies and responsibilities established by statute and the President, the budget process would not be improved by mandating annual, formal crosscutting reviews of budget requests and spending for federal programs that combat terrorism. OMB also stated that, because of the significant investment in combating terrorism over the past few years, it will include a crosscutting review of these programs in the formulation of the fiscal year 1999 budget. We are encouraged by OMB's crosscutting evaluation of programs to combat terrorism for the fiscal year 1999 budget submission. Because of the high national priority, the significant federal resources allocated, and the numerous federal agencies, bureaus, and programs involved, we continue to believe that annual crosscutting reviews would provide a mechanism for OMB to better assure that federal resources are aligned with governmentwide program priorities and that funds are not allocated to duplicative activities and functions to combat terrorism. Annual reviews would be particularly important because federal agencies continue to propose funding of new programs, activities, and capabilities to combat terrorism. OMB expressed concern that our report suggests that there currently is no effective process to review spending for combating terrorism. We acknowledge OMB's reviews of individual agencies' funding requests, but as noted in our report, OMB did not provide evidence of its reviews, in particular of the $1.1-billion fiscal year 1997 amended budget request for combating terrorism. OMB also commented that it carefully considers funding levels for activities to combat terrorism. During the course of our review, OMB could not provide data on funding levels across the federal government for combating terrorism. 
During the agency comment period on a draft of this report, officials from the Treasury and Justice Departments noted that OMB recently issued a budget data request to gather budgetary and expenditure data from executive agencies for fiscal years 1996-99, which in part satisfies our recommendation to OMB. OMB would not provide a copy of the budget data request because we are not part of the executive branch and it was in the process of being implemented. As a result, we could not verify that the request was issued or determine its content. (OMB's written comments are in app. VI.) The Departments of Treasury; Justice, including the FBI; and Transportation provided technical comments, which we have reflected in our report, as appropriate. NSC and the Departments of Energy and Health and Human Services did not comment on the draft report. We reviewed PDD 39 to determine agencies' roles and responsibilities in managing and coordinating resources for combating terrorism. Because data on agencies' spending for U.S. efforts to combat terrorism are not available from a central source, we obtained from the Departments of Defense; Energy; Justice, including the FBI; State; Transportation (FAA); Treasury; and Health and Human Services data on spending that the agencies categorized as related to their unclassified efforts to combat terrorism. We did not verify the data for accuracy, completeness, or consistency. We discussed with NSC and OMB their respective roles in managing the crosscutting federal effort to combat terrorism, and we also submitted questions to the Director, OMB, on OMB's role under PDD 39. We discussed reimbursement issues with the FBI and DOD. We conducted our work from November 1996 to October 1997 in accordance with generally accepted government auditing standards. As agreed with your offices, unless you publicly announce the contents of this report earlier, we plan no further distribution of this report until 7 days after its issue date. 
At that time, we will send copies to the appropriate congressional committees; the Director, Office of Management and Budget; other federal agencies discussed in the report; and other interested parties. If you have any questions about this report, please contact me at (202) 512-3504. Major contributors to this report were Davi M. D'Agostino, Richard A. McGeary, H. Lee Purdy, and Raymond J. Wyrsch. The following is GAO's comment on DOD's letter dated November 7, 1997. 1. We did not evaluate DOD's options for proposed legislative changes that would permit nonreimbursable support to law enforcement agencies. The following are GAO's comments on the Department of State's letter dated November 3, 1997. 1. While we acknowledge the existence of various interagency coordinating mechanisms within the NSC structure, these mechanisms do not perform the functions we are recommending to NSC and OMB. For example, the interagency Technical Support Working Group coordinates only certain terrorism-related research and development projects, and it does not function to eliminate duplicative or redundant terrorism-related research and development across government agencies. 2. We modified the text to reflect the Department's point that embassy guards help protect against a variety of threats. 3. We agree that it would be difficult and possibly not cost-effective to account for spending to combat terrorism with a high degree of precision. Our report discusses this matter on p. 14. 4. The Department's concern about reimbursement for the cost of facilities security in U.S. missions abroad was not brought to our attention during our review of funding issues for combating terrorism. As a result, we are not in a position to comment on this matter. 5. The report discusses the State Department position on p. 14. The following are GAO's comments on OMB's letter dated November 18, 1997. 1. 
The report acknowledges that OMB reviews agencies' individual budget requests, and suggests that this process would be enhanced if federal funding proposals were reviewed on a crosscutting, governmentwide basis. The report also points out that additional steps could be taken to prioritize federal programs and activities to combat terrorism at a strategic level to better ensure priority programs are funded and avoid duplicative and overlapping activities.

2. As discussed on p. 14 of the final report, we are encouraged by OMB's crosscutting review of programs to combat terrorism as part of the fiscal year 1999 budget process.

3. As discussed on pp. 14-15, in view of the national importance and priority, the significant federal resources allocated, and the numerous federal agencies, bureaus, and programs involved, we continue to believe that governmentwide priorities should be set and annual crosscutting reviews be performed on programs to combat terrorism. As agencies continue to propose new programs, activities, and capabilities, priorities and annual crosscutting reviews are particularly important to better assure that funds are not allocated to duplicative activities and functions to combat terrorism.
Pursuant to a congressional request, GAO reviewed interagency processes intended to ensure the efficient allocation of funding and resources for the federal government's efforts to combat terrorism, focusing on: (1) federal funding for unclassified programs and activities to combat terrorism; (2) whether any agency or entity has been designated to coordinate budget proposals, establish priorities, manage funding requirements, and help ensure the efficient allocation of federal resources for combating terrorism across federal agencies; (3) opportunities for agencies to expand coordination of terrorism-related programs and activities under the Government Performance and Results Act (GPRA) principles and framework; and (4) issues concerning the reimbursement of support provided to agencies with lead counterterrorism responsibilities. GAO noted that: (1) the amount of federal funds being spent on combating terrorism is unknown and difficult to determine; (2) identifying and tracking terrorism-related governmentwide spending with precision is difficult for several reasons; (3) information from key agencies involved in combating terrorism shows that nearly $7 billion was spent for unclassified terrorism-related programs and activities during fiscal year (FY) 1997; (4) the Department of Defense budgeted about $3.7 billion in FY 1997, or about 55 percent of the estimated spending; (5) although the National Security Council (NSC) is to coordinate counterterrorism policy issues and the Office of Management and Budget (OMB) is to assess competing funding demands, neither agency is required to regularly collect, aggregate, and review funding and spending data relative to combating terrorism on a crosscutting, governmentwide basis; (6) neither agency establishes funding priorities for terrorism-related programs across agencies' budgets or ensures that individual agencies' stated requirements have been validated against threat and risk criteria before budget requests are
submitted to Congress; (7) because governmentwide priorities for combating terrorism have not been established and funding requirements have not necessarily been validated based on an analytically sound assessment of the threat and risk of a terrorist attack, there is no basis to have reasonable assurance that: (a) agencies' requests are funded through a coordinated and focused approach to implement national policy and strategy; (b) the highest priority requirements are being met; (c) terrorism-related activities and capabilities are not unnecessarily duplicative or redundant; and (d) funding gaps or misallocations have not occurred; (8) GPRA principles and framework can provide guidance and opportunities for the many federal agencies involved in the crosscutting program to combat terrorism to develop coordinated goals, objectives and performance measures, and to enhance the management of individual agency and overall federal efforts; (9) Presidential Decision Directive (PDD) 39 directs that agencies will provide support for terrorism-related activities at their own expense unless the President directs otherwise; (10) the Economy Act generally requires reimbursement for goods and services provided to another agency; and (11) the difference between PDD 39 and the Economy Act concerning reimbursement has caused disagreements between agencies in some cases. | 6,840 | 626 |
According to the Financial Literacy Act, the purpose of the Financial Literacy and Education Commission is to improve financial literacy and education through the development of a national strategy to promote them. The act defines the composition of the Commission--the Secretary of the Treasury and the heads of 19 other federal departments and agencies--and allows the President to appoint up to five additional members. The Commission must hold one public meeting at least every 4 months. It held its first meeting in January 2004 and nine subsequent meetings, most recently in January 2007. The act requires the Commission to undertake certain activities, including (1) developing a national strategy to promote financial literacy and education for all Americans; (2) establishing a financial education Web site to provide information about federal financial literacy education programs and grants; (3) establishing a toll-free hotline; (4) identifying areas of overlap and duplication among federal activities and coordinating federal efforts to implement the national strategy; (5) assessing the availability, utilization, and impact of federal financial literacy and education materials; and (6) promoting partnerships among federal, state, and local governments, nonprofit organizations, and private enterprises. The act requires that the national strategy be reviewed and modified as deemed necessary at least once a year. It also requires the Secretary of the Treasury to develop, implement, and conduct a pilot national public service multimedia campaign to enhance the state of financial literacy and education in the United States. The Treasury Department's Office of Financial Education provides primary support to the Commission and coordinates its efforts. 
As of April 2007, the office had assigned the equivalent of about 3 full-time professional staff to handle work related to the Commission and in the past also has received assistance from staff detailed from other federal agencies. The Commission has no independent budget. The act authorized appropriations to the Commission of amounts necessary to carry out its work, and for fiscal year 2005 Congress specified that $1 million should be used for the development and implementation of the national strategy. To develop the National Strategy for Financial Literacy, the Commission formed a national strategy working group of 13 member agencies, issued a call for public comment in the Federal Register, and held six public meetings--five organized around the commercial, government, nonprofit, education, and banking sectors and one for individual consumers. Although the Financial Literacy Act required the Commission to adopt the strategy within 18 months of enactment, or June 2005, the strategy was not publicly released until April 2006. The Commission sought unanimous consent on the national strategy, and Commission members told us that the Treasury Department faced a significant challenge in trying to get 20 federal agencies--each with its own mission and point of view--to unanimously agree to a strategy. A particular source of disagreement involved whether nonfederal entities should be cited by name as illustrative examples in the strategy. The Commission ultimately agreed that it would not name these organizations in the national strategy, but cite them in a separate document issued by Treasury, called the Quick Reference Guide to the strategy. The content of the National Strategy for Financial Literacy largely consists of a comprehensive overview of issues related to financial literacy and examples of ongoing initiatives. 
It describes many major problems and challenges that relate to financial literacy in the United States, identifies key subject matter areas and target populations, and describes what it believes to be illustrations of potentially effective practices in financial education across a broad spectrum of subjects and sectors. As such, the strategy represents a useful first step in laying out key issues and highlighting the need for improved financial literacy. At the same time, as some representatives of the Commission told us, the strategy is fundamentally descriptive rather than strategic. It provides information on disparate issues and initiatives but is limited in presenting a long-term plan of action for achieving its goal. Most notably, the strategy's recommendations are presented as "calls to action," defined as concrete steps that should be taken for improving financial literacy and education. Sixteen of these 26 calls to action are addressed to federal entities, 5 to private or nonprofit organizations, and 5 to the public. However, many of these calls to action are very general and do not discuss an implementation strategy, and others describe initiatives that already exist. For example, one call to action states, "Investors should take advantage of the wealth of high quality, neutral, and unbiased information offered free of charge," but does not lay out a plan for helping ensure that investors will do so. We have previously identified a set of desirable characteristics for any effective national strategy. While national strategies are not required to contain a single, consistent set of attributes, we found six characteristics that can offer policymakers and implementing agencies a management tool to help ensure accountability and more effective results. As shown in the table below, we found that the National Strategy for Financial Literacy generally addresses the first of these characteristics and partially addresses the other five. 
The six characteristics we considered follow:

Clear Purpose, Scope, and Methodology. An effective strategy describes why the strategy was produced, the scope of its coverage, and how it was developed. The National Strategy for Financial Literacy generally addresses this characteristic. For example, it cites the legislative mandate that required the strategy, the overall purpose, and subsidiary goals such as making it easier for consumers to access financial education materials. At the time of our review, the strategy did not specifically define "financial literacy" or "financial education" and we noted that doing so could provide additional benefit in helping define the scope of the Commission's work. In its April 2007 report to Congress, the Commission provided definitions of these terms that it said would guide its work.

Detailed Discussion of Problems and Risks. A strategy with this characteristic provides a detailed discussion or definition of the problems the strategy intends to address, their causes, and the risks of not addressing them. Based on our review, the National Strategy for Financial Literacy partially addresses this characteristic. It identifies specific problems that indicate a need for improved financial literacy and often discusses the causes of these problems. However, it might benefit further from a fuller discussion of the long-term risks--to the well-being of individuals, families, and the broader national economy--that may be associated with poor financial literacy. As we have reported in the past, a clear understanding of our nation's overall financial condition and fiscal outlook is an indispensable part of true financial literacy. Due to current demographic trends, rising health care costs, and other factors, the nation faces the possibility of decades of mounting debt, which left unchecked will threaten our economic security and adversely affect the quality of life available to future generations.
One element of financial literacy is ensuring that Americans are aware of these potential developments in planning for their own financial futures since, for example, we can no longer assume that current federal entitlement programs will continue indefinitely in their present form.

Desired Goals, Objectives, Activities and Performance Measures. The National Strategy for Financial Literacy partially addresses this characteristic, which deals not only with developing goals and strategies to achieve them, but also the milestones and outcome measures needed to gauge results. The strategy does identify key strategic areas and includes 26 calls to action that, although often lacking detail, provide a picture of the types of activities the strategy recommends. However, in general, the strategy neither sets clear and specific goals and objectives, nor does it set priorities or performance measures for assessing progress. Several stakeholders in the financial literacy community that we spoke with noted that the strategy would have been more useful if it had set specific performance measures. The Commission might also have set measurable goals for changing consumer behavior, such as seeking to reduce the number of Americans without bank accounts or increase the number saving for their retirement to a specified figure in the next 5 or 10 years. Without performance measures or other evaluation mechanisms, the strategy lacks the means to measure progress and hold relevant players accountable.

Description of Future Costs and Resources Needed. Effective national strategies should include discussions of cost, the sources and types of resources needed, and where those resources should be targeted. The National Strategy for Financial Literacy discusses, in general terms, the resources that are available from different sectors and its Quick Reference Guide provides a list of specific organizations.
However, the strategy does not address fundamental questions about the level and type of resources that are needed to implement the national strategy. The strategy does little to acknowledge or discuss how funding limitations could be a challenge to improving financial literacy and offers little detail on how existing resources could best be leveraged. Neither does it provide cost estimates nor does it discuss specifically where resources should be targeted. For example, it does not identify the sectors or populations most in need of additional resources. The strategy also might have included more discussion of how various "tools of government" such as regulation, standards, and tax incentives might be used to stimulate nonfederal organizations to use their unique resources to implement the strategy. Without a clear description of resource needs, policymakers lack information helpful in allocating resources and directing the strategy's implementation.

Organizational Roles, Responsibilities, and Coordination. Effective national strategies delineate which organizations will implement the strategy and describe their roles and responsibilities, as well as mechanisms for coordinating their efforts. The National Strategy for Financial Literacy partially addresses these issues. For example, it discusses the involvement of various governmental and nongovernmental sectors in financial education and identifies in its calls to action which agencies will or should undertake certain tasks or initiatives. However, the strategy is not specific about roles and responsibilities and does not recommend changes in the roles of individual federal agencies. Addressing these issues more fully is important given our prior work that discussed the appropriate federal role in financial literacy in relation to other entities and the potential need to streamline federal efforts in this area.
In addition, the strategy is limited in identifying or promoting specific processes for coordination and collaboration between sectors and organizations.

Description of Integration with Other Entities. This characteristic addresses how a national strategy relates to other federal strategies' goals, objectives, and activities. The National Strategy for Financial Literacy does identify and describe a few plans and initiatives of entities in the federal and private sectors, and it includes a chapter describing approaches within other nations and international efforts to improve financial education. However, the strategy is limited in identifying linkages with these initiatives, and it does not address how it might integrate with the overarching plans and strategies of these state, local, and private-sector entities.

Because the National Strategy for Financial Literacy is more of a description of the current state of affairs than an action plan for the future, its effect on public and private entities that conduct financial education may be limited. We asked several major financial literacy organizations how the national strategy would affect their own plans and activities, and the majority said it would have no impact at all. Similarly, few federal agencies with which we spoke could identify ways in which the national strategy was guiding their work on financial literacy. Most characterized the strategy as a description of their existing efforts.
Our report recommended that the Secretary of the Treasury, in concert with other agency representatives of the Financial Literacy and Education Commission, incorporate into the national strategy (1) a concrete definition for financial literacy and education to help define the scope of the Commission's work; (2) clear and specific goals and performance measures that would serve as indicators of the nation's progress in improving financial literacy and benchmarks for the Commission; (3) actions needed to accomplish these goals, so that the strategy serves as a true implementation plan; (4) a description of the resources required to help policymakers allocate resources and direct implementation of the strategy; and (5) a discussion of appropriate roles and responsibilities for federal agencies and others, to help promote a coordinated and efficient effort. In commenting on our report, Treasury, in its capacity as chair of the Commission, noted that the National Strategy for Financial Literacy was the nation's first such effort and, as such, was designed to be a blueprint that provides general direction while allowing diverse entities the flexibility to participate in enhancing financial education. The department said that the strategy's calls to action are appropriately substantive and concrete--setting out specific issues for discussion, conferences to be convened, key constituencies, and which Commission members should be responsible for each task. As noted earlier, in its April 2007 report to Congress, the Commission provided definitions for "financial literacy" and "financial education" to help guide its work. We acknowledge that the national strategy represents the nation's first such effort, but continue to believe that future iterations of the strategy would benefit from inclusion of the characteristics cited in our report. 
The Financial Literacy Act required the Commission to establish and maintain a Web site to serve as a clearinghouse and provide a coordinated point of entry for information about federal financial literacy and education programs, grants, and materials. With minor exceptions, the Commission did not create original content for its Web site, which it called My Money. Instead, the site serves as a portal that consists largely of links to financial literacy and education Web sites maintained by Commission member agencies. According to Treasury representatives, the English-language version of the My Money site had more than 290 links as of April 2007, organized around 12 topics. A section on federal financial education grants was added to the site in October 2006, which includes links to four grant programs. Many representatives of private and nonprofit financial literacy initiatives and organizations with whom we spoke were generally satisfied with the Web site, saying that it provided a clear and useful portal for consumers to federal financial education materials. From its inception in October 2004 through March 2007, the My Money Web site received approximately 1,454,000 visits. The site received an average of 35,000 visits per month during the first 6 months after its introduction in October 2004. Use of the site has increased since that time and reached 78,000 visits in April 2006, when the Commission and the Web site received publicity associated with the release of the national strategy. From October 2006 through March 2007, the site averaged about 69,000 visits per month. The number of visits to the My Money Web site has been roughly comparable to some recently launched private Web sites that provide financial education. Some representatives of financial literacy organizations with whom we spoke said the Commission should do more to promote public awareness of the Web site.
Commission representatives, however, noted to us several steps that have been taken to promote the site, including, for example, a promotional effort in April 2006 that printed the My Money Web address on envelopes containing federal benefits and tax refunds. However, the Commission has not yet conducted usability tests or measured customer satisfaction for the My Money Web site. The federal government's Web Managers Advisory Council provides guidance to help federal Web managers implement recommendations and best practices for their federal sites. The council recommends testing usability and measuring customer satisfaction to help identify improvements and ensure that consumers can navigate the sites efficiently and effectively. Representatives of the General Services Administration (GSA), which operates the site, acknowledged that these steps are standard best practices that would be useful in improving the site. They said they had not yet done so due to competing priorities and a lack of funding. Without usability testing or measures of customer satisfaction, the Commission does not know whether the Web site's content is organized in a manner that makes sense to the public, or whether the site's visitors can readily find the information for which they are looking. Our report recommended that the Commission (1) conduct usability testing to measure the quality of visitors' experience with the site; and (2) measure customer satisfaction with the site, using whatever tools deemed appropriate, such as online surveys, focus groups, or e-mail feedback. In its April 2007 report to Congress, the Commission said it would conduct usability testing of, and measure customer satisfaction with, its Web site by the second quarter of 2009. In addition to a Web site, the Financial Literacy Act also required that the Commission establish a toll-free telephone number for members of the public seeking information related to financial literacy. 
The Commission launched the telephone hotline, 1-888-My Money, simultaneously with the My Money Web site in October 2004. The hotline supports both English- and Spanish-speaking callers. A private contractor operates the hotline's call center and GSA's Federal Citizen Information Center oversees the operation and covers its cost. According to GSA, the cost of providing telephone service for the hotline was about $28,000 in fiscal year 2006. The hotline serves as an order line for obtaining a free financial literacy "tool kit"--pamphlets and booklets from various federal agencies on topics such as saving and investing, deposit insurance, and Social Security. The tool kit is available in English and Spanish versions, and consumers can also order it via the My Money Web site. The volume of calls to the My Money telephone hotline has been limited--526 calls in March 2007 and an average of about 200 calls per month between February 2005 and February 2006. As part of the national strategy, the Financial Literacy Act required the Secretary of the Treasury to develop, implement, and conduct a pilot national public service multimedia campaign to enhance the state of financial literacy in the United States. The department chose to focus the multimedia campaign on credit literacy among young adults. It contracted with the Advertising Council to develop and implement the multimedia campaign, which is expected to be advertised--using donated air time and print space--on television and radio, in print, and online. According to the Commission's April 2007 report to Congress, the launch of the campaign is scheduled for the third quarter of 2007. The Financial Literacy Act required that the Commission develop a plan to improve coordination of federal financial literacy and education activities and identify areas of overlap and duplication in these activities. 
The Commission created a single focal point for federal agencies to come together on the issue of financial literacy and education. Some Commission members told us that its meetings--including formal public, working group, and subcommittee meetings--have helped foster interagency communication and information sharing that had previously been lacking. In addition, the Commission's Web site, hotline, and tool kit have helped centralize federal financial education resources for consumers. Further, the national strategy includes a chapter on federal interagency coordination and several of the strategy's calls to action involve interagency efforts, including joint conferences and other initiatives. However, the Commission has faced several challenges in coordinating the efforts of the 20 federal agencies that form the Commission. Each of the Commission's participating federal agencies has different missions and responsibilities and thus different perspectives and points of view on issues of financial literacy. The agencies also differ in their levels of responsibility for and expertise on financial literacy and education. Further, because agencies tend to be protective of their resources, it might be very difficult to recommend eliminating individual agencies' programs. Moreover, the Commission's ability to coordinate such major structural change, if it chose to do so, would be constrained by its limited resources in terms of staff and funding. In addition, the Commission has no legal authority to compel an agency to take any action, but instead must work through collaboration and consensus. Given these various constraints, a Treasury official told us that the Commission saw its role as improving interagency communication and coordination rather than consolidating federal financial education programs or fundamentally changing the existing federal structure. 
To meet a requirement of the Financial Literacy Act that the Commission identify and propose means of eliminating areas of overlap and duplication, the Commission asked federal agencies to provide information about their financial literacy activities. After reviewing these resources, the Commission said it found minimal overlap and duplication among federal financial literacy programs and did not propose the elimination of any federal activities. Similarly, to meet a requirement of the act that it assess the availability, utilization, and impact of federal financial literacy materials, the Commission asked each agency to evaluate the effectiveness of its own materials and programs--and reported that each agency deemed its programs and resources to be effective and worthy of continuance. In both cases, we believe that the process lacked the benefit of independent assessment by a disinterested party. Our report recommended that the Secretary of the Treasury, in conjunction with the Commission, provide for an independent third party to carry out the review of duplication and overlap among federal financial literacy activities as well as the review of the availability, utilization, and impact of federal financial literacy materials. In response to these recommendations, the Commission reported in its April 2007 report to Congress that it would identify an independent party to conduct assessments on both of these matters, with the first series of independent assessments to be completed in 2009. The Financial Literacy Act also charged the Commission with promoting partnerships between federal agencies and state and local governments, nonprofit organizations, and private enterprises. 
Partnerships between federal agencies and private sector organizations are widely seen as essential to making the most efficient use of scarce resources, facilitating the sharing of best practices among different organizations, and helping the federal government reach targeted populations via community-based organizations. Treasury officials have cited several steps the Commission has taken to promote such partnerships. These have included calls to action in the Commission's national strategy that encouraged partnerships; community outreach and events coordinated by Treasury and other agencies; and public meetings designed to gather input on the national strategy from various stakeholders. In general, the private and nonprofit financial literacy organizations with which we spoke said that these steps had been useful, but that their relationships with federal agencies and other entities have changed little overall as a result of the Commission. Several private and nonprofit national organizations have extensive networks that they have developed at the community level across the country, and some of these organizations suggested the Commission could do more to mobilize these resources as part of a national effort. Some stakeholders told us they also felt the Commission could do more to involve state and local governments. Greater collaboration by the Commission with state and local governments may be particularly important given the critical role that school districts can play in improving financial literacy. The Commission might consider how the federal government can influence or incentivize states or school districts to include financial education in school curriculums, which many experts believe is key to improving the nation's financial literacy. 
Given the wide array of state, local, nonprofit, and private organizations providing financial literacy programs, the involvement of the nonfederal sectors is important in supporting and expanding Commission efforts to increase financial literacy. Thus far, the Commission has taken some helpful steps to promote partnerships, consisting mainly of outreach and publicity. As the Commission continues to implement its strategy, we believe it could benefit from further developing mutually beneficial and lasting partnerships with nonprofit and private entities that will be sustainable over the long term. Our report recommended that the Commission consider ways to expand upon current efforts to cultivate sustainable partnerships with nonprofit and private entities. As part of these efforts, we recommended that the Commission consider additional ways that federal agencies could coordinate their efforts with those of private organizations that have wide networks of resources at the community level, as well as explore additional ways that the federal government might encourage and facilitate the efforts of state and local governments to improve financial literacy. In commenting on our report, Treasury noted that it had a long history of partnerships with nonfederal entities and would consult with the Commission about how to work more closely with the types of organizations described in our report. On April 17, 2007, the Commission held the inaugural meeting of the National Financial Education Network, which it said was intended to create an open dialogue and advance financial education at the state and local level. In conclusion, in the relatively short period since its creation, the Commission has played a helpful role by serving as a focal point for federal efforts and making financial literacy a more prominent issue among the media, policymakers, and consumers. 
We recognize the significant challenges confronting the Commission--most notably, the inherent difficulty of coordinating the efforts of 20 federal agencies. Given the small number of staff devoted to operating the Commission and the limited funding it was provided to conduct any new initiatives, we believe early efforts undertaken by the Commission represent some positive first steps. At the same time, more progress is needed if we expect the Commission to have a meaningful impact on improving the nation's financial literacy. Mr. Chairman, this concludes my prepared statement. I would be happy to answer any questions at this time. For further information on this testimony, please contact Yvonne D. Jones at (202) 512-8678, or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this statement. Individuals making key contributions to this testimony include Jason Bromberg, Assistant Director; Nima Patel Edwards; Eric E. Petersen; William R. Chatlos; Emily R. Chalmers; and Linda Rego. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. | The Financial Literacy and Education Improvement Act created, in December 2003, the Financial Literacy and Education Commission. This statement is based on a report issued in December 2006, which responded to the act's mandate that GAO assess the Commission's progress in (1) developing a national strategy; (2) developing a Web site and hotline; and (3) coordinating federal efforts and promoting partnerships among the federal, state, local, nonprofit, and private sectors. 
To address these objectives, GAO analyzed Commission documents, interviewed its member agencies and private financial literacy organizations, and benchmarked the national strategy against GAO's criteria for such strategies. The National Strategy for Financial Literacy serves as a useful first step in focusing attention on financial literacy, but it is largely descriptive rather than strategic and lacks certain key characteristics that are desirable in a national strategy. The strategy provides a clear purpose, scope, and methodology and comprehensively identifies issues and challenges. However, it does not serve as a plan of action designed to achieve specific goals, and its recommendations are presented as "calls to action" that generally describe existing initiatives and do not include plans for implementation. The strategy also does not fully address some of the desirable characteristics of an effective national strategy that GAO has previously identified. For example, it does not set clear and specific goals and performance measures or milestones, address the resources needed to accomplish these goals, or fully discuss appropriate roles and responsibilities. As a result of these factors, most organizations that GAO spoke with said the strategy was unlikely to have a significant impact on their financial literacy efforts. The Commission has developed a Web site and telephone hotline that offer financial education information provided by numerous federal agencies. The Web site generally serves as an effective portal to existing federal financial literacy sites. Use of the site has grown, and it averaged about 69,000 visits per month from October 2006 through March 2007. The volume of calls to the hotline--which serves as an order line for a free tool kit of federal publications--has been limited. The Commission has not tested the Web site for usability or measured customer satisfaction with it; these are recommended best practices for federal public Web sites. 
As a result, the Commission does not know if visitors are able to find the information they are looking for efficiently and effectively. The Commission has taken steps to coordinate the financial literacy efforts of federal agencies and has served as a useful focal point for federal activities. However, coordinating federal efforts has been challenging, in part because the Commission must achieve consensus among 20 federal agencies, each with its own viewpoints, programs, and constituencies, and because of the Commission's limited resources. A survey of overlap and duplication and a review of the effectiveness of federal activities relied largely on agencies' self-assessments rather than the independent review of a disinterested party. The Commission has taken steps to promote partnerships with the nonprofit and private sectors through various public meetings, outreach events, and other activities. The involvement of state, local, nonprofit, and private organizations is important in supporting and expanding Commission efforts to increase financial literacy, and our report found that the Commission could benefit from further developing mutually beneficial and lasting partnerships with these entities that will be sustainable over the long term. | 5,263 | 686 |
The Spallation Neutron Source Project is, according to DOE and its scientific advisers, vitally important to the nation's scientific community. DOE estimates that as many as 2,000 scientists from universities, industries, and federal laboratories will use this facility, which is scheduled to be completed in December 2005. The five DOE national laboratories collaborating on the project are the Lawrence Berkeley National Laboratory in California, Los Alamos National Laboratory in New Mexico, Brookhaven National Laboratory in New York, Argonne National Laboratory in Illinois, and Oak Ridge National Laboratory in Tennessee. Each of the five participating laboratories is responsible for designing, building, and assembling separate components of the project. Oak Ridge National Laboratory's current operating contractor is Lockheed Martin Energy Research Corporation, which serves as the project's overall manager. Several advisory committees provide scientific advice, and a DOE review process gives technical and managerial advice. According to current estimates, the facility will take 7-1/4 years to complete and will cost $1.36 billion. DOE approved the conceptual design for the project in June 1997 and has spent about $39 million on the project through fiscal year 1998. The Congress approved the start of the construction phase in fiscal year 1999 and provided $130 million for this purpose. DOE expects actual construction to begin in mid-2000. We reviewed the project in the context of our past experiences in examining large DOE construction projects. As this Subcommittee is well aware, DOE has not always managed large projects successfully. Our 1996 report on DOE's management of major system acquisitions (defined as projects costing about $100 million and more) found that many of DOE's large projects have cost more and taken longer to complete than planned. In the past, many were terminated before they were completed, and others never performed as expected. 
One reason for the cost and schedule problems associated with these projects was the lack of sufficient DOE personnel with the appropriate skills to oversee contractors' operations. Most recently, we examined DOE's efforts to clean up large concentrations of radioactive waste at the Department's Hanford Site in southeast Washington State. Although DOE is making changes to improve its management of this project, we found early indications that DOE may be having difficulty ensuring that the proper expertise is in place. In a 1997 review, DOE reported that the success of the project depends on having a project director skilled in accelerator science and in the management of large construction projects. "It is critical that the permanent leadership for the project be named as soon as possible," the review said. "It will also be a mark of the laboratory's ability to execute this project that key scientific, technical, and management leadership, committed to making the project succeed, can be successfully recruited to the project before the project is funded by Congress." Despite this recognized need and the Congress's approval of the project's construction phase 5 months earlier (the Congress provided funding for design activities beginning in fiscal year 1996), Oak Ridge National Laboratory has just announced the hiring of an experienced project director. In the interim, the laboratory's associate director has been serving as the project director. This announcement came shortly after DOE's internal review committee and an independent review team strongly recommended that a project director with the right skills be recruited as quickly as possible. Other key positions remain unfilled. The project is still without a technical director, and DOE's review committee recently concluded that there was still "an inadequate level of technical management at the laboratory." 
This committee also noted that a full-time operations manager should be appointed and that a manager is needed to oversee the construction of the facilities that will house the equipment and instruments being built by the individual laboratories. In addition, the committee reported that the slow progress in the facilities portion of the project is due in large part to the relative inexperience of the project facilities staff. DOE also found that the designs of each of the collaborating laboratories' component parts have not effectively been integrated into the total project, primarily because Oak Ridge National Laboratory's project office lacks the appropriate technical expertise to integrate the designs and to plan for commissioning and operating the facility. Several other key project officials were hired later than originally planned. For example, a manager for environment, safety, and health was hired in December 1998, and the architect-engineering/construction management contractor was hired in November 1998. DOE had hoped to fill these important positions before the construction phase began in October 1998. Because of these delays in hiring staff, the project is underspending its appropriation. Obligations and costs are currently running at about 60 percent of the planned budget (through 4 months of the project's 87-month schedule). A major reason for the slow pace of spending is that Los Alamos National Laboratory only recently (Nov. 1998) hired a permanent team leader and consequently is behind the other laboratories in completing several project tasks. In addition, the architect-engineering/construction management contract was finalized later than originally planned. DOE officials told us they are confident, however, that the current spending pace will not affect the project's overall schedule and that the current spending patterns represent the prudent use of funds. 
The project's cost and schedule estimates are not fully developed and thus do not yet represent a reliable estimate (baseline). According to a senior DOE official, the current project team does not have the expertise to develop a detailed cost estimate, preferring instead to accept laboratories' cost estimates that lack supporting detail. This shortfall in expertise has delayed the development of an accurate estimate of the project's total cost. DOE's independent reviewer expressed a similar concern, noting that the project's cost estimate is based on its design and that "higher quality estimates are needed for a credible baseline." Of particular concern are the inadequate allowances for contingencies (unforeseen costs and delays) built into the project's current cost and schedule estimates. The project's cost estimate allows 20 percent for contingencies, well below the 25- to 30-percent allowance that DOE and contractor officials believe is necessary for a project of this scope and complexity. Concerned about the low contingency allowance, DOE's independent review team reported that the project will not be completed at the current cost estimate. The project's contingency allowance for delays is also too low, according to current project officials. The project allows about 6 months for delays, well below the 9 to 12 months desired by project managers. DOE and laboratory project managers told us they are confident that they can increase these contingency allowances without jeopardizing the project's overall cost and schedule. The complex management approach that DOE has devised for the project creates a need for the strongest possible leadership. In particular, integrating the efforts of five national laboratories on a project of this scope requires an unprecedented level of collaboration. While staff from multiple laboratories collaborate on other scientific programs, DOE has never attempted to manage a multilaboratory effort as large and complex as this one. 
According to DOE, a multilaboratory structure was chosen to take advantage of the skills offered by the individual laboratories. Although Oak Ridge National Laboratory serves as the project's overall manager, staff at each of the participating laboratories do not report to Lockheed Martin Energy Research Corporation, the current Oak Ridge contractor that is managing the project. Instead, the collaborating laboratory staff report to their respective laboratory contractors--the educational institutions or private enterprises that operate the laboratories. In addition, the five laboratories participating in the project are overseen by four separate DOE operations offices. Further complicating this reporting structure, four of the five laboratories receive most of their program funding from DOE's Office of Science, under whose leadership the project is funded and managed. Los Alamos, however, is primarily funded by DOE's Defense Programs, a different component within DOE's complex organizational structure. Achieving a high level of collaboration among the diverse cultures, systems, and processes that characterize the participating laboratories, operations offices, and headquarters program offices is widely recognized as the project's biggest management challenge. To facilitate collaboration among the laboratories, DOE has developed memorandums of agreement between and among the laboratories and with the four DOE operations offices that oversee the laboratories. These agreements articulate each cooperating laboratory's role and expectation for its component of the project. However, these agreements are not binding and represent the laboratory director's promise to support the project and cooperate with Oak Ridge in ensuring that required tasks at each laboratory are completed on time and within cost. 
DOE told us that only two of the laboratories--Los Alamos National Laboratory and Argonne National Laboratory--have the project as a performance element in their contracts with DOE. "A construction project of this scale and complexity needs a single, experienced individual in charge of all aspects of the project. This individual must have the responsibility and the full authority needed to direct all aspects of the project. Because of the multi-laboratory collaborative nature of the project, the project leader must be able to directly access the management of the collaborating laboratories at the highest level." DOE's management approach for this project raises several risks. The new project director will remain an employee of Argonne National Laboratory (operated by the University of Chicago), but will work directly with Lockheed Martin Energy Research Corporation. The project director will not have direct authority over other laboratories' staff and will, in our opinion, be handicapped by having to work through many other officials to achieve results on a day-to-day basis. Senior DOE officials responded to our concerns by noting that the project director approves all work packages authorizing funding to the participating laboratories, and thereby exercises direct control over the project. DOE officials told us that the participating laboratory directors are highly committed to the project and that senior DOE managers will not hesitate to intervene to resolve disputes. Finally, DOE officials observed that the DOE review committee and the independent reviewer have praised the level of collaboration already achieved on the project. We agree that the laboratories appear to be collaborating on the project at this very early stage, but we remain concerned about DOE's reliance on memorandums of agreement in the absence of direct control. 
In commenting on the collaboration achieved to date, the independent reviewer also noted that "the laboratories have traditionally operated in an independent and decentralized manner which contributes to the Team's concern in this area." The independent reviewers also said that there is not a clear chain of command in the project's current organizational structure. Contributing to our concerns is well-documented evidence of problems in the laboratories' chain of command. We, along with many other reviewers, have reported that the Department lacks an effective organizational structure for managing the laboratories as a system. We noted that the absence of a senior official in the Department with program and administrative authority over the operations of all the laboratories prevents effective management of the laboratories on a continuing basis. DOE officials told us that the Under Secretary is paying close attention to the project and will intervene as necessary to resolve disputes. DOE officials have also told us that the many advisory committees created to provide technical and managerial assistance serve to enhance the laboratories' collaboration. DOE and laboratory officials have cited several instances in which the laboratories have worked together in a highly effective manner, citing, for example, the recent completion of the Advanced Photon Source at Argonne National Laboratory. These achievements, however, are not representative of the current challenges facing DOE and its laboratories and do not resolve management problems inherent in the project's current organizational structure and reporting relationships. Mr. Chairman, this concludes our statement. We would be happy to respond to any questions from you or Members of the Subcommittee. 
| Pursuant to a congressional request, GAO discussed the Department of Energy's (DOE) management of the Spallation Neutron Source Project, focusing on the: (1) project's cost and schedule; and (2) effectiveness of the collaborating laboratories' coordination. 
GAO noted that: (1) the project is not currently in trouble, but warning signs in three key areas raise concerns about whether it will be completed on time and within budget; (2) DOE has not assembled a complete team with the technical skills and experience needed to properly manage the project; (3) a permanent project director was just hired last week, 5 months after Congress approved the start of construction and over a year after the project's design was approved; (4) other important positions remain unfilled, including those of a technical director and an operations manager; (5) cost and schedule estimates for the project have not been fully developed; (6) furthermore, the project's contingency allowances for unforeseen costs and delays are too low for a project of this size and scope, according to project managers and DOE; (7) DOE's approach to managing the project requires an unprecedented level of collaboration among five different laboratories, managed through DOE's complex organizational structure; and (8) coupled with DOE's history of not successfully completing large projects on time and within budget, these warning signs make the Spallation Neutron Source project a significant management challenge for DOE and suggest a need for continued close oversight. | 2,683 | 308 |
On January 1, 2000, computer systems worldwide could malfunction or produce inaccurate information simply because the century has changed. Unless corrected, such failures could have a costly, widespread impact. The problem is rooted in how dates are recorded and computed. For the past several decades, systems have typically used two digits to represent the year--such as "97" for 1997--to save electronic storage space and reduce operating costs. In such a format, however, 2000 is indistinguishable from 1900. Software and systems experts nationwide are concerned that this ambiguity could cause systems to malfunction in unforeseen ways, or to fail completely. As we reported to you and testified to Congress earlier this year, the public faces a risk that critical services could be severely disrupted by the Year 2000 computing crisis. Financial transactions could be delayed, airline flights grounded, and national defense affected. Also, the many interdependencies that exist among governments and within key economic sectors could cause the failure of a single system to have adverse repercussions across the nation and internationally. While managers in the government and the private sector are taking many actions to mitigate these risks, a significant amount of work remains, and time frames are unrelenting. One key concern in addressing the Year 2000 problem is the availability of trained information technology personnel. We reported in April 1998 that while various agencies stated that they or their contractors had problems in obtaining or retaining information technology personnel, no governmentwide strategy existed to address recruiting and retaining the personnel with the appropriate skills for Year 2000-related work. 
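The two-digit date convention described above can be sketched in a few lines. The following is a hypothetical illustration in Python, not code from any agency system: a simple elapsed-time calculation works through 1999 and silently breaks at the century change, and a common remediation technique ("windowing," which interprets two-digit years relative to a pivot) resolves the ambiguity.

```python
# Hypothetical sketch of the Year 2000 two-digit year problem;
# not code from any actual agency system.

def years_elapsed(start_yy, end_yy):
    # Legacy convention: store only the last two digits of the year,
    # with the century "19" implicit -- so 2000 is stored as "00",
    # indistinguishable from 1900.
    return end_yy - start_yy

# Works while both dates fall in the 1900s: 1997 -> 1999 is 2 years.
assert years_elapsed(97, 99) == 2

# Breaks at the century change: 1997 -> 2000 computes as -97, not 3.
assert years_elapsed(97, 0) == -97

def expand_year(yy, pivot=50):
    # One common remediation ("windowing"): interpret two-digit years
    # below a chosen pivot as 20xx and the rest as 19xx. The pivot of
    # 50 here is an illustrative choice, not a standard.
    return 2000 + yy if yy < pivot else 1900 + yy

# With expanded years, the century change computes correctly.
assert expand_year(99) - expand_year(97) == 2   # 1999 - 1997
assert expand_year(0) - expand_year(97) == 3    # 2000 - 1997
```

Windowing avoided rewriting stored data, which is why it was widely used in remediation, but it only defers the ambiguity until dates reach the pivot year; expanding stored dates to four digits was the more durable fix.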
We recommended that the Chairman of the President's Council on Year 2000 Conversion develop a personnel strategy that includes (1) determining the need for various information specialists, (2) identifying any administrative or statutory changes that would be required to waive reemployment penalties for former federal employees, and (3) identifying ways to retain key Year 2000 staff in agencies through the turn of the century. We reemphasized the need for such a strategy in a June 1998 testimony. Within the executive branch, several executive councils and agencies are responsible for the human resources aspect of the Year 2000 effort. These councils and agencies, and their respective Year 2000 responsibilities, are described below: The President's Council on Year 2000 Conversion is chaired by an Assistant to the President and is comprised of one representative from each of the executive departments and from other federal agencies as determined by the chair. The chair of the Conversion Council is tasked with the following Year 2000 roles: (1) overseeing federal agencies' Year 2000 activities, (2) acting as chief spokesperson in national and international forums, (3) providing policy coordination of executive branch activities with state, local, and tribal governments, and (4) promoting appropriate federal roles with respect to private sector activities. To date, the Conversion Council has established several working groups to address Year 2000 concerns. One of these is the Workforce Issues working group, which began meeting in May 1998. The working group is chaired by the Deputy Secretary of the Department of Labor, and includes officials from the Departments of Labor, Education, Housing and Urban Development, Commerce, and Defense, as well as OPM and the Small Business Administration. 
The group's main objective is to determine what the federal government can do to help meet the country's needs for technically skilled personnel to address the Year 2000 problem, with special attention to small businesses, local governments, and organizations in rural areas. The CIO Council is comprised of CIOs and Deputy CIOs from 30 federal departments and agencies, representatives from OMB, and the chairs of the Government Information Technology Services Board and Information Technology Resources Board. It is the principal interagency forum for improving the design, modernization, use, sharing, and performance of information technology resources. The Council's role includes: (1) developing recommendations for information technology management policy, procedures, and standards, (2) identifying opportunities to share information resources, and (3) assessing and addressing the needs of the federal government for an information technology workforce. One of the committees reporting to the CIO Council, the Education and Training Committee, is charged with addressing issues in hiring, training, and maintaining an effective federal information technology workforce. OMB is responsible for overseeing federal agencies' responses to the Year 2000 problem. In early 1997, OMB issued a broad Year 2000 strategy for the federal government and required that 24 major agencies submit quarterly reports on their Year 2000 progress. On January 20, 1998, OMB added new quarterly reporting requirements, specifically asking agencies to provide a narrative description of progress, including a description of any problems affecting progress and, in particular, any problems in acquiring or retaining skilled personnel. In March and April 1998, OMB requested that an additional 31 small agencies report their progress to OMB by April 30, 1998, and another 10 small agencies and other entities, such as the Tennessee Valley Authority and the United States Postal Service, report by May 15, 1998. 
Most recently, in July 1998, OMB revised its earlier reporting requirements and asked that nine small and independent agencies begin providing quarterly reports on progress in addressing difficulties relating to the Year 2000 problem. OPM, the federal government's human resources agency, provides federal agencies with personnel services and policy leadership including staffing tools, guidance on labor-management relations, preparation of government's future leaders, compensation policy development, and programs to improve workforce performance. OPM is responsible for helping agencies to equip themselves with the systems they need to manage their human resources effectively, and in light of the Year 2000 problem, is providing tools that agencies may use to help recruit and retain information technology professionals. Of the 24 large agencies reporting to OMB, 13 expressed concerns about the availability of information technology personnel. Also, 10 of the 41 small agencies and independent entities expressed these concerns. These organizations' concerns generally fall into the categories of difficulty in recruiting and retaining internal staff and in obtaining contractor support. Appendix II identifies the organizations reporting workforce issues and summarizes them. It also identifies the organizations with no reported concerns. Both large and small agencies and entities reported that retaining key information technology staff and recruiting new staff were among their greatest concerns in addressing the Year 2000 problem. Agencies indicated that they had lost skilled information technology employees through retirements and through increased recruitment by the private sector. For example, in May 1998, the Department of Agriculture reported that several of its agencies expressed particular concern that the loss of staff would affect their ability to meet Year 2000 deadlines. 
Specifically, the Farm Services Agency reported that it lost 28 (7 percent) of its 403 information technology staff in the first 6 months of fiscal year 1998. Agencies have also reported that recruiting replacements for information technology personnel is difficult. The Department of Veterans Affairs noted that recruiting is very competitive for Year 2000 professionals in some geographic areas, and expressed concern that with lucrative finders' fees being advertised, government employees may leave for the private sector. Also, the National Credit Union Administration reported that it had experienced difficulties in hiring senior Year 2000 program officials. Among the various types of information technology workers needed, computer programmers are reported to be in the shortest supply. For example, in February 1998, the Justice Department reported that it had difficulty hiring and retaining skilled COBOL programmers. In August, the Department reported that it continues to encounter these problems. As another example, in its February, May, and August reports, the Environmental Protection Agency stated that it experienced problems finding PL/I programmers. Recruiting and retaining qualified contract personnel is another issue frequently mentioned by agencies reporting staffing problems. Once again, the specific type of information technology worker most often mentioned as being in short supply is programmers. For example, the Department of Justice reported in February, May, and again in August that it is continuing to encounter problems in obtaining contractor support with the necessary programming skills. Agencies' concerns also include the high turnover rate of contractor staff and the time it takes to recruit contractor staff. For example, the State Department indicated in May 1998 that the recruitment cycle for replacing contract systems programmers took more time than in past years despite the use of professional recruitment services by the contract vendors.
Also, several agencies noted problems with increasing contract labor wage rates. These agencies reported that they have had to increase the hourly rates they pay for contractor staff because contractors are increasing their own staff salaries. For example, the Department of Agriculture indicated that its agencies are experiencing contracting delays as vendors find it increasingly difficult to bring on more contract employees without substantial increases in contract dollars. In addition, the Federal Deposit Insurance Corporation reported that some of its contractors are losing personnel to higher salaries with other contractors. When replacing these personnel, the contractors are increasing their hourly rates. Four agencies reported delays on six system development efforts because of problems they had encountered in obtaining contractors to address the Year 2000 problem. For example, the State Department reported that the loss of key contractor personnel had delayed the completion of one of its mission-critical systems, the Management, Policy and Planning Information System, by 3 months. The Department of Commerce also reported that its Patent and Trademark Office experienced a 3-month delay on one of its mission-critical system development efforts, the Classified Search and Image Retrieval system, when the contractor was unable to place qualified staff on the task. The task order was terminated with that contractor, a new task order was awarded to another contractor, and work is now proceeding. Although a significant number of agencies are reporting concerns with the availability of qualified Year 2000 staff, it is not possible to determine the full extent or severity of personnel shortages from these concerns because they are often anecdotal. For example, one department notes that one of its agencies "has had difficulty" hiring a particular type of programmer, while another reports that it is encountering "some problems" hiring personnel.
Also, only six mission-critical systems were reported to have "experienced delays" in reaching Year 2000 compliance because of personnel issues. Without more detailed information on the nature and extent of personnel issues, it is difficult to determine how best to address them. OPM, the Conversion Council, and the CIO Council have various initiatives underway to address Year 2000 personnel issues: OPM has provided tools to assist agencies in dealing with Year 2000 workforce issues; the Conversion Council is identifying solutions to personnel shortages in both the government and the private sector; and the CIO Council has initiated a broad study of information technology workforce issues in the government and private sector. Agencies currently have a number of aids they can use to help recruit and retain needed personnel. Some have been available for years. Others are new. These aids are summarized below:

Recruitment and relocation bonuses: Federal agencies have the authority to make a lump-sum payment of up to 25 percent of basic pay to a newly appointed employee, or to an employee who must relocate, in cases in which the agency determines that the position would otherwise be difficult to fill. In return for this lump-sum payment, the employee must sign a service agreement with the agency to complete a specified period of employment.

Superior qualifications appointments: Agencies have the authority to set pay for new appointments or reappointments of individuals to General Schedule (GS) positions above step 1 of the grade on the basis of the candidate's superior qualifications or the agency's special need.

Pay at highest previous rate: Upon reemployment, transfer, reassignment, promotion, or change in type of appointment, agencies can set an employee's basic pay by taking into account the employee's previous pay rate while employed in another civilian federal position (with certain exceptions).
Temporary and term appointments: Agencies can use temporary appointments in the competitive service for positions not expected to last longer than 1 year, but which can be extended for 1 additional year. Agencies can use term appointments when positions are expected to last longer than 1 year but not more than 4 years.

Retention allowances for individual employees: Agencies have the authority to make continuing payments of up to 25 percent of basic pay if the agency determines that (1) the unusually high or unique qualifications of the employee or the agency's special need for the employee's services makes it essential to retain the employee and (2) the employee would be likely to leave federal service in the absence of a retention allowance. Retention allowances must be paid in accordance with the agency's previously established retention allowance plan and must be reviewed and certified annually.

Performance and incentive awards: Agencies can provide employees a lump-sum cash award on the basis of a fully successful or better rating of record or in recognition of accomplishments that contribute to the improvement of government operations. Awards based on the rating of record can be up to 10 percent of salary, or up to 20 percent for exceptional performance, provided the award does not exceed $10,000 per employee. With OPM review and approval, agencies can grant awards over $10,000, up to $25,000. Any award that would grant over $25,000 to an individual employee must be reviewed by OPM for submission to the President for approval.

Quality step increases: Agencies have the authority to increase an employee's pay by providing an additional step increase to an employee who has received the highest rating of record available in the agency's performance appraisal program.

Training and education costs reimbursement: Agencies can pay or reimburse an employee for all or part of the necessary expenses for training and education, including the costs for college tuition.
Agencies may require service agreements for training of long duration or of high cost.

Advance payments for new appointees: Agencies may advance a new hire up to two paychecks so that a new employee can meet living and other expenses.

Special salary rates: Agencies may request a higher salary rate for an occupation or group of occupations nationwide or in a local area based on a finding that the government's recruitment or retention efforts are, or would likely become, significantly handicapped without those higher rates. The minimum of a special rate range may exceed the maximum of the corresponding grade by as much as 30 percent. However, no special rate may exceed the rate for Executive Level V (currently $110,700). A special rate request must be submitted to OPM by department headquarters and must be coordinated with other federal agencies with employees in the same occupational group and geographic area.

Dual compensation waivers for retirees: On March 30, 1998, OPM issued a memorandum announcing that agencies could reemploy federal retirees (civilian and military) to work specifically on the Year 2000 conversion without the usually required reduction in the retiree's salary or military annuity. With OPM's determination that the Year 2000 computer conversion problem is an "unusual circumstance," agencies can request delegated authority from OPM to rehire former federal personnel (up to a maximum number of individual exceptions approved by OPM) on a temporary basis through March 31, 2000.

Premium pay for employees performing emergency work: Agencies have authority under the law and OPM regulations to make exceptions to the biweekly limitation on premium pay (including overtime, night, and holiday pay) when the head of an agency or his or her designee determines that an emergency involving a direct threat to life or property exists.
In its March memorandum, OPM also encouraged agency heads to exercise this authority in the case of any employee who performs emergency work to resolve a direct threat to property (including monetary errors or cost) in connection with updating computer systems to prevent malfunction, erroneous computations, or other problems associated with the Year 2000. By exercising this authority, agencies will be able to compensate employees who perform significant amounts of overtime work related to the Year 2000 problem, as long as the total of their basic pay and premium pay does not exceed a certain rate.

Exclusions from early retirement programs: On June 15, 1998, OPM issued interim regulations allowing agencies, with OPM approval, to limit the scope of voluntary early retirement offers when separating or downgrading employees in a major reorganization, a major reduction in force, or a major transfer of functions. Agencies can limit their retirement offers on the basis of (1) occupational series or level, (2) organizational unit, (3) geographic area, (4) specific window periods, (5) other similar nonpersonal factors, or (6) any combination of these factors that the agency determines appropriate. Using this tool, agencies can exclude critical Year 2000 positions from any voluntary early retirement programs they offer.

Retention allowances for groups or categories of employees: On June 23, 1998, OPM issued interim regulations allowing agencies to authorize a retention allowance of up to 10 percent of an employee's rate of basic pay (or up to 25 percent with OPM approval) for a group or category of employees such as computer programmers and system engineers.
Retention allowances authorized for a category of employees must be based on a written determination that (1) the category of employees has unusually high or unique qualifications, or (2) the agency has a special need for the employees' services that makes it essential to retain the employees in that category, and (3) it is reasonable to presume that there is a high risk that a significant number of employees in the targeted category are likely to leave federal service in the absence of the allowance. The Conversion Council's Year 2000 Workforce Issues working group began meeting in May 1998 to address some of the Year 2000 workforce issues in both the government and private sector, focusing on three areas: (1) raising awareness of the Year 2000 problem, (2) helping managers assess their particular situations, and (3) connecting managers with solution-providers, including programmers, project managers, and those familiar with embedded systems. In July 1998, the working group released a draft sector action plan which lists key activities that the group is undertaking. Specifically, the group established an Internet site to link information technology workers with the companies that need them to solve the Year 2000 problem; is attempting to determine the effect of workforce issues on local communities by surveying community colleges; and is exploring outreach activities, such as having community colleges raise awareness of the Year 2000 problem within their communities and assist in solving the problem. Although some of these initiatives may benefit the government, the working group is clearly adopting a nationwide focus and is not solely targeting the federal workforce issue. In March 1998, the CIO Council tasked its Education and Training Committee with crafting recommendations and legislation by May 1998 to help agencies recruit and retain information technology personnel.
However, the committee found that in order to develop recommendations, it first needed more information about the problem. In May, the committee formed five focus teams to study the federal information technology workforce challenge. The focus teams will address the following areas: (1) national workforce strategies, (2) federal workforce planning, (3) recruitment, (4) retention, and (5) executive development. Each team will present its findings at a forum of CIO Council members, Human Resource Council members, and OPM personnel currently planned for November 1998. The committee expects to prepare a final conference report to the CIO Council after the November 1998 forum. According to the Co-Chair of the Committee, part of the study will be focused specifically on the Year 2000 personnel issue and determining the extent and scope of the personnel problems that exist for the Year 2000 problem. It is unclear, however, whether this committee will produce timely recommendations because the final report is scheduled to be issued after November 1998, which may be too late to address the Year 2000 workforce issues. While the executive councils and OPM have initiatives underway to help resolve Year 2000 workforce issues, it is not yet known if these efforts will effectively address federal agencies' concerns. OPM has developed new human resources management flexibilities, the Conversion Council working group is identifying solutions that are applicable to private industry, and the CIO Council is studying the problem. No organization, however, is working with individual agencies to determine how significant their personnel concerns are and whether they can be adequately resolved through existing human resources management tools. Given that the potential consequences of having an inadequate workforce to tackle critical Year 2000 conversions are severe, such an endeavor seems worthwhile.
A significant number of agencies have reported concerns about the availability of information technology personnel needed to address their Year 2000 problems. Also, the executive councils and OPM have a number of initiatives underway to address perceived personnel shortages. However, it is not yet known whether these efforts will ensure an adequate supply of qualified personnel to solve the government's Year 2000 problem. Various organizations have responsibilities in this arena. While individual agencies are in the best position to determine if current tools adequately resolve their own Year 2000 personnel issues, OMB is responsible for overseeing federal agencies' responses to the Year 2000 problem, and OPM has both the knowledge of existing personnel management options and, in some cases, the authority to waive existing rules or develop new approaches. The workforce issue could quickly become more complicated. As awareness of the criticality of the Year 2000 problem grows throughout government and industry, there is a chance that competition for limited skilled personnel will increase. If this more vigorous competition occurs, the government may find it increasingly difficult to obtain and retain the skilled personnel needed to correct its mission-critical systems in time. Given the adverse consequences of any staffing shortages, it is critical that agencies be able to quickly determine whether mechanisms currently exist to resolve personnel issues or whether additional solutions are needed. Given the likelihood that critical government operations will cease if key systems are not made Year 2000 compliant, we recommend that the Director of the Office of Management and Budget, as part of the agency's monitoring responsibilities for the government's Year 2000 program, determine whether recent OPM initiatives have satisfactorily addressed agencies' reported personnel problems.
If these problems have not been addressed by existing OPM tools, the Director of the Office of Management and Budget should designate an OMB official who, together with OPM and the CIO Council, would proactively and quickly help individual agencies resolve their Year 2000 workforce concerns. We also recommend that the Director of the Office of Management and Budget work with the CIO Council to expedite the portions of its ongoing study that are relevant to the Year 2000 problem, with a goal of issuing its Year 2000-related recommendations as soon as possible. The Chairman of the President's Council on Year 2000 Conversion, as well as officials representing the CIO Council, OMB, and OPM, provided oral comments on a draft of this report. These officials concurred with the report and our recommendations. They also offered several technical suggestions, which we have incorporated as appropriate. We are sending copies of this report to the Chairmen and Ranking Minority Members of the Senate and House Committees on Appropriations and the House Committee on Government Reform and Oversight; the Ranking Minority Member of the House Committee on Banking and Financial Services; the Co-Chairs of the House Year 2000 Task Force; the Chairman of the President's Council on Year 2000 Conversion; the Director of the Office of Management and Budget; and the Director of the Office of Personnel Management. Copies will also be made available to others upon request. If you have any questions about this report, please contact us at (202) 512-6253 and (202) 512-8676, respectively. We can also be reached by e-mail at [email protected] and [email protected]. Major contributors to this report are listed in appendix III.
To determine the nature and extent of the Year 2000 personnel issues being reported by federal agencies, we reviewed and analyzed the Year 2000 progress reports submitted to OMB by 24 large agencies in February, May, and August 1998, by 40 of the 41 small agencies and entities in April and May 1998, and by 9 of those same small agencies and entities that were requested to report in August 1998. In addition, when personnel issues were not specifically addressed by these agencies in their progress reports or when the progress reports were not submitted to OMB, we conducted telephone interviews with agency officials to determine if the agencies were experiencing personnel problems related to the Year 2000 problem. Further, after reviewing the August 1998 reports, we conducted telephone interviews with agency officials when an agency was newly reporting it had no personnel problems. We did this to determine if prior concerns had been resolved. We did not independently assess the reliability of the information provided by the agencies. To identify what is being done to address personnel shortages related to the Year 2000 problem, we evaluated the Year 2000 personnel efforts of OPM, the Human Resources Technology Council, and the CIO Council's Education and Training and Year 2000 Committees. We also reviewed the efforts of the Workforce Issues Working Group of the President's Council on Year 2000 Conversion. In addition, we interviewed officials from each of these groups. We conducted our review from May 1998 through August 1998 in accordance with generally accepted government auditing standards. We provided a draft of this report to the Chair of the President's Council on Year 2000 Conversion, the Chair of the CIO Council, and OPM and OMB management and incorporated their comments as appropriate. Table II.1 summarizes the concerns identified by the various agencies and other entities in their reports to OMB. 
In cases where the agencies and entities did not specifically report on personnel issues, we interviewed agency officials to determine if the agencies were experiencing personnel problems related to the Year 2000 problem. Also, in cases where agencies newly reported that they had no personnel problems in August 1998, we interviewed agency officials to determine if prior concerns had been resolved. We did not independently assess the reliability of the information provided by the agencies. Table II.2 lists the agencies and entities that identified no concerns with the availability of personnel to address the Year 2000 problem.

Large departments and agencies (13): The agency is experiencing increased attrition, and reported concern with the short supply of human resources and the upward pressure on salaries of key personnel. A contractor hired to perform legacy system maintenance and Year 2000 compliance services did not supply the key officials as provided in the contract for 5 weeks. The department is encountering high turnover and is finding it difficult to compete with higher salaries being offered by private industry. Vendors are finding it increasingly difficult to bring on contract employees without substantial increases in contract dollars and have lost contract employees who have left for better paying positions. The department reported that it continues to experience difficulties in finding and hiring qualified information technology personnel. A key contractor was unable to provide qualified staff, causing contract delays. The department reported that one facility has experienced difficulty in retaining and recruiting programming resources for the Year 2000 effort. Efforts to contract for mainframe systems programmers at one facility have not been successful. The agency experienced problems in finding programmers needed to fix key payroll systems. While the agency has located contractor resources, the costs are higher than the agency has historically paid.
Description of in-house personnel issue: The department reported concern that unforeseen retirements could affect its Year 2000 efforts. The department is experiencing difficulty acquiring and retaining skilled personnel, particularly COBOL programmers. The department is encountering problems in obtaining contractor support with necessary programming skills. The department is experiencing problems in acquiring and retaining skilled programmers. The department reported concerns with the availability of trained contractor staff and that the turnover rate tends to be high due to current market conditions. The Ames Research Center, located in the Silicon Valley area, is finding it a challenge to hire qualified contract programmers. The department experienced high turnover in systems support personnel, and is now facing severe staffing shortages. The department reported that recruiting to replace contract systems programmers has taken more time than in past years and has resulted in a labor rate increase. The department reported that one agency's contractors have experienced problems in finding qualified programmers. The department encountered an increased rate of attrition of its information systems workforce. It reported that skilled programmers, especially those with skills in legacy platforms, are in strong demand with the private sector, which can pay significantly higher salaries than the government. The department reported that recruiting is very intensive for Year 2000 professionals in some geographic areas and expressed the concern that government employees may leave for the private sector because of the lucrative "finders' fees" being advertised. Contractors are having difficulty finding and retaining personnel.
Small agencies and other entities (10): The agency reported that retention and recruitment could become key issues if key computer programmers and/or network personnel decide to leave the agency. The agency expressed concern about retaining sufficient qualified contractors to carry out needed work as the demand for Year 2000 programmers increases. The agency has encountered problems with contractors who are losing personnel for higher salaries at other contractors. When replacing the contractor staff, they are increasing the hourly rate. The agency reported that it has insufficient personnel resources to accomplish all Year 2000 renovation, testing, and implementation work in-house. The agency experienced delays in scheduling the conversion of its systems due to competing workloads. Also, it has experienced difficulties in hiring senior Year 2000 program officials. The agency has encountered increased competition for skilled Year 2000 personnel, and has had to settle for Year 2000 contractors with fewer skills than needed because the contractors that possessed all the desired skills generally cost too much. Also, the agency reported that its consultants are paying their staffs more to retain them, and these costs are being passed on to the agency. The office reported that the Year 2000 program is severely straining the workload of existing information systems and technology personnel, and that any diversion of personnel to the Year 2000 program creates a potential support problem for ongoing or day-to-day operations. The office reported that finding contractor personnel with the appropriate skill level to analyze legacy systems and to recommend alternatives continues to be a problem. The corps reported its concerns with retaining information resources management staff and filling vacancies.
The agency has encountered problems matching information technology skill sets with specific Year 2000 needs and has found that there is a strong employment market for information technology skills. It also reported that salaries have increased for all information technology skills, not just for Year 2000 staff. The service reported that retaining skilled resources needed for remediation and testing continues to be a challenge which is exacerbated by a limited labor pool. The service reported that retaining skilled contractor staff continues to be a challenge.

Large Departments and Agencies (11)

Small Agencies and Other Entities (31): Armed Forces Retirement Home Board; Corporation for National and Community Service; Defense Nuclear Facilities Safety Board; Export-Import Bank of the United States; Federal Home Loan Mortgage Corporation; Federal Retirement Thrift Investment Board; John F. Kennedy Center for the Performing Arts; U.S. Arms Control and Disarmament Agency; U.S. Trade Representative, Executive Office of the President.

Four large and three small agencies did not state concerns in their most recent reports to OMB, but told us of them in subsequent discussions. These agencies are not included in this list.

Glenda C. Wright, Senior Information Systems Analyst
Pursuant to a congressional request, GAO reviewed workforce issues associated with the year 2000 computing crisis, focusing on: (1) the nature and extent of year 2000 personnel issues being reported by federal agencies; and (2) what is being done by the government to address reported federal personnel shortages related to the year 2000 problem. GAO noted that: (1) about half of the 24 large agencies and a quarter of the 41 small agencies and independent entities reporting to the Office of Management and Budget (OMB) expressed concerns that the personnel needed to resolve the year 2000 problem would not be available; (2) generally, these concerns fall into the categories of difficulty in finding and keeping qualified government personnel, and difficulty in obtaining contractors; (3) while a significant number of agencies are raising these concerns, their comments are largely anecdotal and a comprehensive analytical assessment of the issue has not yet been made; (4) as a result, the full extent and severity of the year 2000 workforce issue across the government is not known; (5) the President's Council on Year 2000 Conversion, the Chief Information Officers (CIO) Council, and the Office of Personnel Management (OPM) have various initiatives under way to address reported year 2000 personnel issues; (6) for example, OPM has recently developed additional human resources management aids to assist agencies in dealing with year 2000 workforce issues; (7) while such initiatives have provided agencies with important options to help address reported year 2000 personnel problems, it is not yet clear that recent actions have enabled agencies to successfully resolve all perceived personnel issues; (8)
accordingly, it is essential that OMB, as part of its monitoring responsibilities for the government's year 2000 program, continue to solicit from agencies whether they have any remaining year 2000 personnel problems and to help provide specific assistance to individual agencies; and (9) moreover, OMB should work with the CIO Council to expedite evaluations of the full extent and scope of information technology personnel issues to help formulate effective solutions. | 6,756 | 414 |
As you know, Mr. Chairman, for over two decades, we have reported on problems with DOD's personnel security clearance program as well as the financial costs and risks to national security resulting from these problems (see Related GAO Reports at the end of this statement). For example, at the turn of the century, we documented problems such as incomplete investigations, inconsistency in determining eligibility for clearances, and a backlog of overdue clearance reinvestigations that exceeded 500,000 cases. More recently in 2004, we identified continuing and new impediments hampering DOD's clearance program and made recommendations for increasing the effectiveness and efficiency of the program. Also in 2004, we testified before this committee on clearance-related problems faced by industry personnel. A critical step in the federal government's efforts to protect national security is to determine whether an individual is eligible for a personnel security clearance. Specifically, an individual whose job requires access to classified information must undergo a background investigation and adjudication (determination of eligibility) in order to obtain a clearance. As with federal government workers, the demand for personnel security clearances for industry personnel has increased during recent years. Additional awareness of threats to our national security since September 11, 2001, and efforts to privatize federal jobs during the last decade are but two of the reasons for the greater number of industry personnel needing clearances today. As of September 30, 2003, industry personnel held about one-third of the approximately 2 million DOD-issued clearances. DOD's Office of the Under Secretary of Defense for Intelligence has overall responsibility for DOD clearances, and its responsibilities also extend beyond DOD.
Specifically, that office's responsibilities include obtaining background investigations and adjudicating clearance eligibility for industry personnel in more than 20 other federal agencies, as well as the clearances of staff in the federal government's legislative branch. Problems in the clearance program can negatively affect national security. For example, delays reviewing security clearances for personnel who are already doing classified work can lead to a heightened risk of disclosure of classified information. In contrast, delays in providing initial security clearances for previously noncleared personnel can result in other negative consequences, such as additional costs and delays in completing national security-related contracts, lost-opportunity costs, and problems retaining the best qualified personnel. Longstanding delays in completing hundreds of thousands of clearance requests for servicemembers, federal employees, and industry personnel as well as numerous impediments that hinder DOD's ability to accurately estimate and eliminate its clearance backlog led us to declare the program a high-risk area in January 2005. The 25 areas on our high-risk list at that time received their designation because they are major programs and operations that need urgent attention and transformation in order to ensure that our national government functions in the most economical, efficient, and effective manner possible. Shortly after we placed DOD's clearance program on our high-risk list, a major change in DOD's program occurred. In February 2005, DOD transferred its personnel security investigations functions and about 1,800 investigative positions to OPM. Now DOD obtains nearly all of its clearance investigations from OPM, which is currently responsible for 90 percent of the personnel security clearance investigations in the federal government. DOD retained responsibility for adjudication of military personnel, DOD civilians, and industry personnel. 
Other recent significant events affecting DOD's clearance program have been the passage of the Intelligence Reform and Terrorism Prevention Act of 2004 and the issuance of the June 2005 Executive Order No. 13381, Strengthening Processes Relating to Determining Eligibility for Access to Classified National Security Information. The act included milestones for reducing the time to complete clearances, general specifications for a database on security clearances, and requirements for greater reciprocity of clearances (the acceptance of a clearance and access granted by another department, agency, or military service). Among other things, the executive order resulted in the Office of Management and Budget (OMB) taking a lead role in preparing a strategic plan to improve personnel security clearance processes governmentwide. Using this context for understanding the interplay between DOD and OPM in DOD's personnel security clearance processes, my statement addresses two objectives: (1) key points of a billing dispute between DOD and OPM and (2) some of the major impediments affecting clearances for industry personnel. As requested by this committee, we have an ongoing examination of the timeliness and completeness of the processes used to determine the eligibility of industry personnel to receive top secret clearances. We expect to present the results of this work in the fall. My statement today, however, is based primarily on our completed work and our institutional knowledge from our prior reviews of the steps in the clearance processes used by DOD and, to a lesser extent, other agencies. In addition, we used information from the Intelligence Reform and Terrorism Prevention Act of 2004; executive orders; and other documents, such as a memorandum of agreement between DOD and OPM. We conducted our work in accordance with generally accepted government auditing standards in May 2006.
DOD stopped processing applications for clearance investigations for industry personnel on April 28, 2006, despite an already sizeable backlog. DOD attributed its actions to an overwhelming volume of requests for industry personnel security investigations and funding constraints. We will address the issue of workload projections later when we discuss impediments that affect industry personnel as well as servicemembers and federal employees, but first we would like to talk about the issue of funding. An important consideration in understanding the funding constraints that contributed to the stoppage is a DOD-OPM billing dispute, which has resulted in the Under Secretary of Defense for Intelligence requesting OMB mediation. The dispute stems from the February 2005 transfer of DOD's personnel security investigations function to OPM. The memorandum of agreement signed by the OPM Director and the DOD Deputy Secretary prior to the transfer lists many types of costs that DOD may incur for up to 3 years after the transfer of the investigations function to OPM. One cost, an adjustment to the rates charged to agencies for clearance investigations, provides that "OPM may charge DOD for investigations at DOD's current rates plus annual price adjustments plus a 25 percent premium to offset potential operating losses. OPM will be able to adjust, at any point of time during the first three year period after the start of transfer, the premium as necessary to cover estimated future costs or operating losses, if any, or offset gains, if any." The Under Secretary's memorandum says that OPM has collected approximately $50 million in premiums in addition to approximately $144 million for other costs associated with the transfer. The OPM Associate Director subsequently listed costs that OPM has incurred. To help resolve this billing matter, DOD requested mediation from OMB, in accordance with the memorandum of agreement between DOD and OPM. 
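The rate provision quoted above amounts to simple arithmetic, though the memorandum leaves the exact calculation open. The sketch below shows one plausible reading, in which the 25 percent premium is applied on top of DOD's rate plus the annual adjustment; the function name and all dollar figures are hypothetical, not taken from the agreement.

```python
def opm_rate_to_dod(dod_base_rate, annual_adjustment, premium=0.25):
    """One plausible reading of the memorandum's rate formula:
    DOD's current rate plus the annual price adjustment, with a
    25 percent premium applied to offset potential operating losses.
    Illustrative only; the agreement does not spell out the arithmetic."""
    return (dod_base_rate + annual_adjustment) * (1 + premium)

# Hypothetical figures -- not taken from the agreement:
print(opm_rate_to_dod(300.00, 15.00))  # 393.75
```

Under this reading, the premium compounds any annual price adjustment, which is one reason the cumulative premium collections cited by the Under Secretary could grow with investigation volume.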
Information from the two agencies indicates that in response to DOD's request, OMB has directed them to continue to work together to resolve the matter. The DOD and OPM offices of inspector general are currently investigating all of the issues raised in the Under Secretary's and Associate Director's correspondences and have indicated that they intend to issue reports on their reviews this summer. Some impediments, if not effectively addressed, could hinder the timely determination of clearance eligibility for servicemembers, civilian government employees, and industry personnel; whereas other impediments would mainly affect industry personnel. The inability to accurately estimate the number of future clearance requests and the expiration of the previously mentioned executive order that resulted in high-level involvement by OMB could adversely affect the timeliness of eligibility determinations for all types of employee groups. In contrast, an increased demand for top secret clearances for industry personnel and the lack of reciprocity would primarily affect industry personnel. A major impediment to providing timely clearances is the inaccurate projections of the number of requests for security clearances DOD-wide and for industry personnel specifically. As we noted in our May 2004 testimony before this committee, DOD's longstanding inability to accurately project its security clearance workload makes it difficult to determine clearance-related budgets and staffing requirements. In fiscal year 2001, DOD received 18 percent (about 150,000) fewer requests than it expected, and in fiscal years 2002 and 2003, it received 19 and 13 percent (about 135,000 and 90,000) more requests than projected, respectively. In 2005, DOD was again uncertain about the number and level of clearances that it required, but the department reported plans and efforts to identify clearance requirements for servicemembers, civilian employees, and contractors. 
For example, in response to our May 2004 recommendation to improve the projection of clearance requests for industry personnel, DOD indicated that it was developing a plan and computer software that would enable the government's contracting officers to (1) authorize a certain number of industry personnel clearance investigations for any given contract, depending on the number of clearances required to perform the classified work on that contract, and (2) link the clearance investigations to the contract number. Another potential impediment that could slow improvements in personnel security clearance processes in DOD--as well as governmentwide--is the July 1, 2006, expiration of Executive Order No. 13381. Among other things, this executive order delegated responsibility for improving the clearance process to the Director of OMB for about 1 year. We have been encouraged by the high level of commitment that OMB has demonstrated in the development of a governmentwide plan to address clearance-related problems. Also, the OMB Deputy Director met with GAO officials to discuss OMB's general strategy for addressing the problems that led to our high-risk designation for DOD's clearance program. Demonstrating strong management commitment and top leadership support to address a known risk is one of the requirements for removing DOD's clearance program from GAO's high-risk list. Because there has been no indication that the executive order will be extended, we are concerned about whether such progress will continue without OMB's high-level management involvement. While OPM has provided some leadership in assisting OMB with the development of the governmentwide plan, OPM may not be in a position to assume additional high-level commitment for a variety of reasons. 
These reasons include (1) the governmentwide plan lists many management challenges facing OPM and the Associate Director of its investigations unit, such as establishing a presence to conduct overseas investigations and adjusting its investigative workforce to the increasing demand for clearances; (2) adjudication of personnel security clearances and determination of which organizational positions require such clearances are outside the current emphases for OPM; and (3) agencies' disputes with OPM--such as the current one regarding billing--may require a high-level third party to mediate a resolution that is perceived to be impartial. As we have previously identified, an increase in the demand for top secret clearances could have workload and budgetary implications for DOD and OPM if such requests continue to occur. In our 2004 report, we noted that the proportion of requests for top secret clearances for industry personnel increased from 17 to 27 percent from fiscal years 1995 through 2003. This increase has workload implications because top secret clearances (1) must be renewed every 5 years, compared to every 10 years for secret clearances, and (2) require more information about the applicant than secret clearances do. Our 2004 analyses further showed that the 10-year cost to the government was 13 times higher for a person with a top secret clearance ($4,231) relative to a person with a secret clearance ($328). Thus, if clearance requirements for organizational positions are set higher than needed, the government's capacity to decrease the clearance backlog is reduced while the cost of the clearance program is increased. When the reciprocity of clearances or access is not fully utilized, industry personnel are prevented from working. 
In addition to having a negative effect on the employee and the employer, the lack of reciprocity has adverse effects for the government, including an increased workload for the already overburdened staff who investigate and adjudicate security clearances. Problems with reciprocity of clearances or access, particularly for industry personnel, have continued to occur despite the establishment in 1997 of governmentwide investigative standards and adjudicative guidelines. The Reciprocity Working Group, which helped to prepare information for the governmentwide plan to improve the security clearance process, noted that "a lack of reciprocity often arises due to reluctance of the gaining activity to inherit accountability for what may be an unacceptable risk due to poor quality investigations and/or adjudications." Congress enacted reciprocity requirements in the Intelligence Reform and Terrorism Prevention Act of December 2004, and OMB promulgated criteria in December 2005 for federal agencies to follow in determining whether to accept security clearances from other government agencies. Because of how recently these changes were made, their impact is unknown. We will continue to assess and monitor DOD's personnel security clearance program at your request. We are conducting work on the timeliness and completeness of investigations and adjudications for top secret clearances for industry personnel and we will report that information to this committee this fall. Also, our standard steps of monitoring programs on our high-risk list require that we evaluate the progress that agencies make toward being removed from the list. Lastly, we monitor our recommendations to agencies to determine whether steps are being taken to overcome program deficiencies. For further information regarding this testimony, please contact me at (202)512-5559 or [email protected]. Individuals making key contributions to this testimony include Jack E. 
Edwards, Assistant Director; Jerome Brown; Kurt A. Burgeson; Susan C. Ditto; David Epstein; Sara Hackley; James Klein; and Kenneth E. Patton.

Managing Sensitive Information: Departments of Energy and Defense Policies and Oversight Could Be Improved. GAO-06-369. Washington, D.C.: March 7, 2006.
Managing Sensitive Information: DOE and DOD Could Improve Their Policies and Oversight. GAO-06-531T. Washington, D.C.: March 14, 2006.
GAO's High-Risk Program. GAO-06-497T. Washington, D.C.: March 15, 2006.
Questions for the Record Related to DOD's Personnel Security Clearance Program and the Government Plan for Improving the Clearance Process. GAO-06-323R. Washington, D.C.: January 17, 2006.
DOD Personnel Clearances: Government Plan Addresses Some Long-standing Problems with DOD's Program, But Concerns Remain. GAO-06-233T. Washington, D.C.: November 9, 2005.
Defense Management: Better Review Needed of Program Protection Issues Associated with Manufacturing Presidential Helicopters. GAO-06-71SU. Washington, D.C.: November 4, 2005.
DOD's High-Risk Areas: High-Level Commitment and Oversight Needed for DOD Supply Chain Plan to Succeed. GAO-06-113T. Washington, D.C.: October 6, 2005.
Questions for the Record Related to DOD's Personnel Security Clearance Program. GAO-05-988R. Washington, D.C.: August 19, 2005.
Industrial Security: DOD Cannot Ensure Its Oversight of Contractors under Foreign Influence Is Sufficient. GAO-05-681. Washington, D.C.: July 15, 2005.
DOD Personnel Clearances: Some Progress Has Been Made but Hurdles Remain to Overcome the Challenges That Led to GAO's High-Risk Designation. GAO-05-842T. Washington, D.C.: June 28, 2005.
Defense Management: Key Elements Needed to Successfully Transform DOD Business Operations. GAO-05-629T. Washington, D.C.: April 28, 2005.
Maritime Security: New Structures Have Improved Information Sharing, but Security Clearance Processing Requires Further Attention. GAO-05-394. Washington, D.C.: April 15, 2005.
DOD's High-Risk Areas: Successful Business Transformation Requires Sound Strategic Planning and Sustained Leadership. GAO-05-520T. Washington, D.C.: April 13, 2005.
GAO's 2005 High-Risk Update. GAO-05-350T. Washington, D.C.: February 17, 2005.
High-Risk Series: An Update. GAO-05-207. Washington, D.C.: January 2005.
Intelligence Reform: Human Capital Considerations Critical to 9/11 Commission's Proposed Reforms. GAO-04-1084T. Washington, D.C.: September 14, 2004.
DOD Personnel Clearances: Additional Steps Can Be Taken to Reduce Backlogs and Delays in Determining Security Clearance Eligibility for Industry Personnel. GAO-04-632. Washington, D.C.: May 26, 2004.
DOD Personnel Clearances: Preliminary Observations Related to Backlogs and Delays in Determining Security Clearance Eligibility for Industry Personnel. GAO-04-202T. Washington, D.C.: May 6, 2004.
Security Clearances: FBI Has Enhanced Its Process for State and Local Law Enforcement Officials. GAO-04-596. Washington, D.C.: April 30, 2004.
Industrial Security: DOD Cannot Provide Adequate Assurances That Its Oversight Ensures the Protection of Classified Information. GAO-04-332. Washington, D.C.: March 3, 2004.
DOD Personnel Clearances: DOD Needs to Overcome Impediments to Eliminating Backlog and Determining Its Size. GAO-04-344. Washington, D.C.: February 9, 2004.
Aviation Security: Federal Air Marshal Service Is Addressing Challenges of Its Expanded Mission and Workforce, but Additional Actions Needed. GAO-04-242. Washington, D.C.: November 19, 2003.
Results-Oriented Cultures: Creating a Clear Linkage between Individual Performance and Organizational Success. GAO-03-488. Washington, D.C.: March 14, 2003.
Defense Acquisitions: Steps Needed to Ensure Interoperability of Systems That Process Intelligence Data. GAO-03-329. Washington, D.C.: March 31, 2003.
Managing for Results: Agency Progress in Linking Performance Plans With Budgets and Financial Statements. GAO-02-236. Washington, D.C.: January 4, 2002.
Central Intelligence Agency: Observations on GAO Access to Information on CIA Programs and Activities. GAO-01-975T. Washington, D.C.: July 18, 2001.
Determining Performance and Accountability Challenges and High Risks. GAO-01-159SP. Washington, D.C.: November 2000.
DOD Personnel: More Consistency Needed in Determining Eligibility for Top Secret Clearances. GAO-01-465. Washington, D.C.: April 18, 2001.
DOD Personnel: More Accurate Estimate of Overdue Security Clearance Reinvestigations Is Needed. GAO/T-NSIAD-00-246. Washington, D.C.: September 20, 2000.
DOD Personnel: More Actions Needed to Address Backlog of Security Clearance Reinvestigations. GAO/NSIAD-00-215. Washington, D.C.: August 24, 2000.
Security Protection: Standardization Issues Regarding Protection of Executive Branch Officials. GAO/T-GGD/OSI-00-177. Washington, D.C.: July 27, 2000.
Security Protection: Standardization Issues Regarding Protection of Executive Branch Officials. GAO/GGD/OSI-00-139. Washington, D.C.: July 11, 2000.
Computer Security: FAA Is Addressing Personnel Weaknesses, But Further Action Is Required. GAO/AIMD-00-169. Washington, D.C.: May 31, 2000.
DOD Personnel: Weaknesses in Security Investigation Program Are Being Addressed. GAO/T-NSIAD-00-148. Washington, D.C.: April 6, 2000.
DOD Personnel: Inadequate Personnel Security Investigations Pose National Security Risks. GAO/T-NSIAD-00-65. Washington, D.C.: February 16, 2000.
DOD Personnel: Inadequate Personnel Security Investigations Pose National Security Risks. GAO/NSIAD-00-12. Washington, D.C.: October 27, 1999.
Background Investigations: Program Deficiencies May Lead DEA to Relinquish Its Authority to OPM. GAO/GGD-99-173. Washington, D.C.: September 7, 1999.
Department of Energy: Key Factors Underlying Security Problems at DOE Facilities. GAO/T-RCED-99-159. Washington, D.C.: April 20, 1999.
Performance Budgeting: Initial Experiences Under the Results Act in Linking Plans With Budgets. GAO/AIMD/GGD-99-67. Washington, D.C.: April 12, 1999.
Military Recruiting: New Initiatives Could Improve Criminal History Screening. GAO/NSIAD-99-53. Washington, D.C.: February 23, 1999.
Executive Office of the President: Procedures for Acquiring Access to and Safeguarding Intelligence Information. GAO/NSIAD-98-245. Washington, D.C.: September 30, 1998.
Inspectors General: Joint Investigation of Personnel Actions Regarding a Former Defense Employee. GAO/AIMD/OSI-97-81R. Washington, D.C.: July 10, 1997.
Privatization of OPM's Investigations Service. GAO/GGD-96-97R. Washington, D.C.: August 22, 1996.
Cost Analysis: Privatizing OPM Investigations. GAO/GGD-96-121R. Washington, D.C.: July 5, 1996.
Personnel Security: Pass and Security Clearance Data for the Executive Office of the President. GAO/NSIAD-96-20. Washington, D.C.: October 19, 1995.
Privatizing OPM Investigations: Implementation Issues. GAO/T-GGD-95-186. Washington, D.C.: June 15, 1995.
Privatizing OPM Investigations: Perspectives on OPM's Role in Background Investigations. GAO/T-GGD-95-185. Washington, D.C.: June 14, 1995.
Security Clearances: Consideration of Sexual Orientation in the Clearance Process. GAO/NSIAD-95-21. Washington, D.C.: March 24, 1995.
Background Investigations: Impediments to Consolidating Investigations and Adjudicative Functions. GAO/NSIAD-95-101. Washington, D.C.: March 24, 1995.
Managing DOE: Further Review Needed of Suspensions of Security Clearances for Minority Employees. GAO/RCED-95-15. Washington, D.C.: December 8, 1994.
Personnel Security Investigations. GAO/NSIAD-94-135R. Washington, D.C.: March 4, 1994.
Classified Information: Costs of Protection Are Integrated With Other Security Costs. GAO/NSIAD-94-55. Washington, D.C.: October 20, 1993.
Nuclear Security: DOE's Progress on Reducing Its Security Clearance Work Load. GAO/RCED-93-183. Washington, D.C.: August 12, 1993.
Personnel Security: Efforts by DOD and DOE to Eliminate Duplicative Background Investigations. GAO/RCED-93-23. Washington, D.C.: May 10, 1993.
Administrative Due Process: Denials and Revocations of Security Clearances and Access to Special Programs. GAO/T-NSIAD-93-14. Washington, D.C.: May 5, 1993.
DOD Special Access Programs: Administrative Due Process Not Provided When Access Is Denied or Revoked. GAO/NSIAD-93-162. Washington, D.C.: May 5, 1993.
Security Clearances: Due Process for Denials and Revocations by Defense, Energy, and State. GAO/NSIAD-92-99. Washington, D.C.: May 6, 1992.
Due Process: Procedures for Unfavorable Suitability and Security Clearance Actions. GAO/NSIAD-90-97FS. Washington, D.C.: April 23, 1990.
Weaknesses in NRC's Security Clearance Program. GAO/T-RCED-89-14. Washington, D.C.: March 15, 1989.
Nuclear Regulation: NRC's Security Clearance Program Can Be Strengthened. GAO/RCED-89-41. Washington, D.C.: December 20, 1988.
Nuclear Security: DOE Actions to Improve the Personnel Clearance Program. GAO/RCED-89-34. Washington, D.C.: November 9, 1988.
Nuclear Security: DOE Needs a More Accurate and Efficient Security Clearance Program. GAO/RCED-88-28. Washington, D.C.: December 29, 1987.
National Security: DOD Clearance Reduction and Related Issues. GAO/NSIAD-87-170BR. Washington, D.C.: September 18, 1987.
Oil Reserves: Proposed DOE Legislation for Firearm and Arrest Authority Has Merit. GAO/RCED-87-178. Washington, D.C.: August 11, 1987.
Embassy Blueprints: Controlling Blueprints and Selecting Contractors for Construction Abroad. GAO/NSIAD-87-83. Washington, D.C.: April 14, 1987.
Security Clearance Reinvestigations of Employees Has Not Been Timely at the Department of Energy. GAO/T-RCED-87-14. Washington, D.C.: April 9, 1987.
Improvements Needed in the Government's Personnel Security Clearance Program. Washington, D.C.: April 16, 1985.
Need for Central Adjudication Facility for Security Clearances for Navy Personnel. GAO/GGD-83-66. Washington, D.C.: May 18, 1983.
Effect of National Security Decision Directive 84, Safeguarding National Security Information. GAO/NSIAD-84-26. Washington, D.C.: October 18, 1983.
Faster Processing of DOD Personnel Security Clearances Could Avoid Millions in Losses. GAO/GGD-81-105. Washington, D.C.: September 15, 1981.

This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.

The Department of Defense (DOD) is responsible for about 2 million active personnel security clearances. About one-third of the clearances are for industry personnel working on contracts for DOD and more than 20 other executive agencies. Delays in determining eligibility for a clearance can heighten the risk that classified information will be disclosed to unauthorized sources and increase contract costs and problems attracting and retaining qualified personnel. On April 28, 2006, DOD announced it had stopped processing security clearance applications for industry personnel because of an overwhelming volume of requests and funding constraints. GAO has reported problems with DOD's security clearance processes since 1981. In January 2005, GAO designated DOD's program a high-risk area because of longstanding delays in completing clearance requests and an inability to accurately estimate and eliminate its clearance backlog. For this statement, GAO addresses: (1) key points in the billing dispute between DOD and OPM and (2) some of the major impediments affecting clearances for industry personnel. The costs underlying a billing dispute between DOD and OPM are contributing to further delays in the processing of new security clearance requests for industry personnel.
The dispute stems from the February 2005 transfer of DOD's personnel security investigations function to OPM and associated costs for which DOD agreed to reimburse OPM. Among other things, the two agencies' memorandum of agreement for the transfer allows OPM to charge DOD annual price adjustments plus a 25 percent premium, in addition to the rates OPM charges to other federal government agencies. A January 20, 2006, memorandum from the Under Secretary of Defense for Intelligence to the Office of Management and Budget (OMB) questioned the continued need for the premiums and requested mediation from OMB. According to DOD and OPM, OMB has directed the two agencies to continue to work together to resolve the matter. The inspectors general for both DOD and OPM are expected to report on the results of their investigations into the dispute this summer. Other impediments, if not effectively addressed, could negatively affect the timeliness of clearance-eligibility determinations for one or more of the following employee groups: industry personnel, servicemembers, and civilian government employees. All three groups are affected by DOD's longstanding inability to accurately estimate the size of its security clearance workload. Inaccurate estimates of the volume of clearances needed make it difficult to determine clearance-related budgets and staffing requirements. Similarly, the July 1, 2006, expiration of Executive Order 13381, which delegated responsibility for improving the clearance process to OMB, could potentially slow improvements in personnel security clearance processes DOD-wide as well as governmentwide. GAO has been encouraged by OMB's high level of commitment to activities such as the development of a government plan to improve personnel security clearance processes governmentwide but is concerned about whether such progress will continue after the executive order expires. 
In contrast, demand for top secret clearances for industry personnel and the lack of reciprocity (the acceptance of a clearance and access granted by another department, agency, or military service) are impediments that mainly affect industry personnel. A previously identified increase in the demand for top secret clearances for industry personnel has workload and budgetary implications for DOD and OPM if such requests continue to occur. Finally, the lack of reciprocity has a negative effect on employees and employers, and increases the workload for already overburdened investigative and adjudicative staff. Reciprocity problems have occurred despite the issuance of governmentwide investigative standards and adjudicative guidelines in 1997.
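The workload implication of top secret demand follows from two figures in GAO's 2004 analysis cited earlier: a top secret clearance must be renewed every 5 years versus every 10 for a secret clearance, and its 10-year cost to the government was $4,231 versus $328. A short sketch (illustrative only; the constant names are ours) reproduces the roughly 13-to-1 cost ratio:

```python
# Figures from GAO's 2004 analysis cited above.
TOP_SECRET_COST_10YR = 4231   # dollars per person over 10 years
SECRET_COST_10YR = 328

TOP_SECRET_RENEWAL_YRS = 5    # reinvestigation required every 5 years
SECRET_RENEWAL_YRS = 10       # versus every 10 years for secret

ratio = TOP_SECRET_COST_10YR / SECRET_COST_10YR
print(f"cost ratio: {ratio:.1f}x")  # about 13x

# Over a 10-year window, a top secret holder generates twice as many
# reinvestigations as a secret holder.
print(10 // TOP_SECRET_RENEWAL_YRS, 10 // SECRET_RENEWAL_YRS)  # 2 1
```

This is why setting position requirements higher than needed both reduces capacity to work down the backlog and raises program cost.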
Since 1955, federal agencies have been encouraged to obtain commercially available goods and services from the private sector if doing so is cost-effective. In 1966, OMB issued Circular A-76, which established federal policy for the government's performance of commercial activities and set forth the procedures for studying them for potential contracting. In 1979, OMB issued a supplemental handbook to the circular that included cost comparison procedures for determining whether commercial activities should be performed in-house, by another federal agency through an interservice support agreement, or by the private sector. OMB updated this handbook in 1983 and again in March 1996. The March 1996 Revised Supplemental Handbook clarified numerous areas, including the application of the A-76 cost comparison requirements. The handbook's introduction describes a wide range of options government officials must consider as they contemplate reinventing government operations. They include "the consolidation, restructuring or reengineering of activities, privatization options, make or buy decisions, the adoption of better business management practices, the development of joint ventures with the private sector, asset sales, the possible devolution of activities to state and local governments and the termination of obsolete services or programs." The introduction also explains that "in the context of this larger reinvention effort, the scope of the Supplemental Handbook is limited to conversion of recurring commercial activities to or from in-house, contract or interservice support agreement performance." Where A-76 cost comparison procedures apply, the initial step is to develop a performance work statement describing what is needed to perform the activity. That statement is used as the technical performance section of a solicitation for private-sector offers. 
The government also develops a management plan that describes the most efficient organization for in-house performance of the activity described in the performance work statement. The cost of performance by the government in accordance with the most efficient organization is compared to the cost proposed by the private-sector source selected pursuant to the solicitation. The activity will be converted to performance by the private sector if the private sector's offer represents a reduction of at least 10 percent of direct personnel costs or $10 million over the performance period. Further information about the A-76 process is included in appendix I. In addition to A-76, the Department of Defense (DOD) must consider the effect of 10 U.S.C. 2461 when it plans changes to an industrial or commercial type function performed by its civilian employees. Section 2461, as amended by the Strom Thurmond National Defense Authorization Act for Fiscal Year 1999, Public Law 105-261, requires an analysis of the activity, including a comparison of the cost of performance by DOD civilian employees and by a contractor, to determine whether contractor performance could result in a savings to the government. It also requires DOD to notify Congress of the analysis and to provide other information prior to instituting a change in performance. The 38th EIW is an active component Air Force unit with a wartime support mission that has been greatly diminished since the end of the Cold War. Deactivation of the 38th EIW will involve multiple actions to realign the wartime mission and reassign other peacetime roles. The 38th EIW provides engineering and installation (E&I) services in support of the Air Force's communications needs. It supports flight facilities, intrusion detection, ground radio, wideband/satellite systems, local area networks, cable/fiber optic distribution systems, switching systems, and other communications systems. 
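The A-76 conversion threshold described above can be read as a simple decision rule: the activity converts to the private sector only if the private-sector offer undercuts the in-house cost by at least 10 percent of direct personnel costs or $10 million over the performance period (applied here as the lesser of the two). A minimal sketch, with hypothetical dollar figures and a function name of our own choosing:

```python
def converts_to_private_sector(in_house_cost, contractor_offer,
                               direct_personnel_costs):
    """Simplified sketch of the A-76 handbook's conversion test:
    savings must meet 10 percent of direct personnel costs or
    $10 million over the performance period (lesser of the two)."""
    savings = in_house_cost - contractor_offer
    threshold = min(0.10 * direct_personnel_costs, 10_000_000)
    return savings >= threshold

# Hypothetical contract: $50M in-house cost, of which $30M is direct
# personnel cost, against a $44M offer ($6M savings vs a $3M threshold).
print(converts_to_private_sector(50e6, 44e6, 30e6))  # True
```

The threshold exists so that marginal savings do not trigger a conversion whose transition costs would erase them.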
The 38th EIW is an Air Force Materiel Command unit headquartered at Tinker AFB, Oklahoma, with squadrons at Keesler AFB, Mississippi; Kelly AFB, Texas; and McClellan AFB, California. In addition, an active duty military advisor is stationed at each of the 19 ANG units--units that also provide engineering and installation services. Currently, the 38th EIW consists of 2,343 personnel (1,358 military and 985 civilian) at these bases and various ANG locations. Table 1 shows the active component military and civilian personnel authorized for the 38th EIW at each location. The squadrons at Keesler, Kelly, and McClellan AFBs are composed primarily of military personnel. About a third of the total EIW authorized personnel (726 military and 40 civilian) perform installation services. The remainder of the military and civilian personnel perform engineering; logistics; and other support functions. The 19 ANG units noted above have 2,314 authorized guard personnel: they perform peacetime installation services as part of their training. Further, the Air Force relies on the private sector to provide E&I services using approximately 40 different indefinite delivery/indefinite quantity contracts. The 38th EIW's structure was premised on its cold war mission of reconstituting damaged fixed communications systems (radars, phone lines, cables, etc.) at overseas bases. However, under the new Air Expeditionary Force concept, existing military forces will go into bare bases and use tactical, or mobile, communications gear. Consequently, the need to repair these fixed communications is reduced and there is greater reliance on tactical communications. Based on the reassessment of its wartime mission requirements and the Quadrennial Defense Review process (which recommended DOD improve the efficiency and performance of support activities by reengineering), the Air Force decided that the wartime E&I mission could be transferred to the ANG. 
At the same time, the Air Force would retain a minimal active-duty capability, provided by a new rapid response squadron at Keesler AFB. Since there will no longer be a need for the 38th EIW to supply the Air Force's peacetime E&I needs in order to maintain wartime skills, the Air Force no longer has a requirement to maintain the large E&I infrastructure of the 38th EIW. As currently proposed, the deactivation of the 38th EIW would eliminate 1,200 of its 1,358 military positions and 552 of its 985 civilian positions. After the wing is deactivated, the remaining 158 military personnel and 433 civilian personnel will be reassigned to existing or new organizations, located principally at Tinker and Keesler AFBs. With the deactivation of the 38th EIW and transfer of the wartime mission to the ANG, other actions will also occur: The Kelly and McClellan squadrons will be disestablished concurrent with the realignment and closure actions being implemented as part of the 1995 base realignment and closure decision. All 19 active-duty authorizations at the ANG units will be eliminated. The squadron at Keesler AFB will become a rapid response squadron whose mission would be wartime deployment, and also provide a quick reaction E&I capability for emergency needs, and provide specialized engineering. A portion of the positions formerly with the 38th EIW will be reassigned to a new organizational unit at Tinker AFB that will become a base communication and information infrastructure planning and program management office. Fifty civilian authorizations which are being eliminated at Tinker will be transferred to the Air Force Communications Agency at Scott AFB, Illinois, to more closely align their telecommunications sustainment workload with the Air Force unit responsible for telecommunications policy. The wartime E&I mission will be substantially transferred to the existing ANG E&I units without an increase in authorized positions. Figure 1 portrays the planned actions. 
Viewed another way, of 1,358 authorized military positions, over 88 percent would be eliminated and out of the 985 authorized civilian positions, 56 percent would be eliminated, while the remainder would be shifted to other organizations. Table 2 shows the number of 38th EIW military and civilian positions that would be reduced at affected bases and the numbers reassigned to other organizations. As a result of the deactivation and restructuring, 1,752, or 75 percent, of the unit's 2,343 positions would be eliminated, while 591 would be reassigned elsewhere. Following the deactivation of the 38th EIW, the responsibility for obtaining peacetime E&I services will be transferred to the individual major commands. These commands may acquire such services from (1) contracts, (2) the ANG E&I units, or (3) the rapid response squadron at Keesler, based on availability. The military units will need to perform some of this peacetime work to maintain their wartime skills. OMB Circular A-76 and the cost comparison requirements of its accompanying handbook apply to the conversion of the performance of a commercial activity from government civilian employees to the private sector. According to the Air Force, A-76 does not apply to its plan because the deactivation of the 38th EIW does not constitute a conversion of the performance of an activity by civilian DOD employees as envisioned under the circular. The Air Force's changed wartime requirements have caused it to propose a realignment of the responsibilities and missions of the 38th EIW. Consequently, the original function of the 38th EIW has been fundamentally altered and the need for civilian employee support is significantly reduced. We find the Air Force's conclusion that A-76 does not apply to be reasonable. The Air Force made a reasonable judgment in deciding that its deactivation of the 38th EIW and the restructuring of the delivery of E&I services is not subject to the requirements of 10 U.S.C. 
2461 since the plan does not constitute a change from performance of a particular workload by DOD civilian employees to private sector performance. The handbook does not provide detailed guidance as to what constitutes a conversion of a commercial activity for purposes of A-76. Between 1979 and 1994, DOD conducted over 2,000 competitions using the A-76 process. Most of these involved activities, such as groundskeeping, laundry, and food service, where the conversions proposed were straightforward exchanges of a government employee workforce for a contractor workforce to perform a particular service. An agency must base its judgment about whether A-76 applies on the individual facts of each initiative. As each case usually involves a unique situation, an agency has the discretion to determine the applicability of A-76 to its particular initiative as long as the agency has exercised its judgment reasonably. The handbook introduction explains that a commercial activity is a process resulting in a product or service that may be obtained from the private sector and that some management initiatives, such as "reengineering," "privatization," or "restructuring," involving such activities are beyond conversions and are not subject to the cost-comparison requirements of A-76. Therefore, it is reasonable to interpret the guidance to mean that A-76 conversions are not intended to encompass every initiative that results in the loss of civilian government jobs. Further, the handbook provides that it is not to apply to the conversion of activities performed by uniformed military personnel. The Air Force plan to deactivate the 38th EIW and transfer its E&I activities to other organizations within the Air Force or to the ANG is a comprehensive change to the missions and responsibilities of the 38th EIW. The Air Force has decided that the 38th EIW's wartime mission should be transferred to the ANG. 
As a result, it appears that the Air Force no longer has a requirement to maintain a large, centralized E&I infrastructure to train personnel to meet this mission. The peacetime E&I work was performed by the 38th EIW, in large part, to maintain its skills and capabilities to perform its wartime mission. This included a large civilian workforce performing peacetime E&I work to support the wartime mission of the uniformed military personnel. Now that this wartime mission has been transferred to the ANG, which does not need this civilian support, there is no longer a requirement to maintain the infrastructure. The type of E&I work being impacted by the Air Force plan would generally fit within the definition of commercial activity for A-76 purposes. However, the Air Force plan is not simply a changeover of this commercial activity from performance by civilian employees to private sector workers. In fact, the majority of positions affected are uniformed military personnel, which are not subject to A-76. Of 1,358 military personnel assigned to the 38th EIW, only 158 will remain. The civilians in the 38th EIW were primarily performing commercial E&I activities to provide continuity during contingencies and support for military personnel to enhance their wartime skills. Absent the military requirement, the E&I services could have been supplied by contract with the private sector. Under the restructuring, civilian positions will be lost and the different Air Force units could meet some of their new responsibilities by obtaining E&I services through contractors. However, this is an incidental result of a plan that primarily involves the reassignment of uniformed military personnel and the transfer of their responsibilities to other organizations. The civilian performance of the commercial E&I activity was essentially an adjunct of the military mission. The civilians who remain will be reassigned to different organizations and locations. 
Thus, we find reasonable the Air Force decision that its plan to change the wartime mission of the 38th EIW is not the type of management initiative that is subject to A-76. We believe that the Air Force made a reasonable judgment in deciding that its deactivation of the 38th EIW and the restructuring of the delivery of E&I services which that necessitates is not subject to the requirements of section 2461 since the plan does not constitute a change from performance of a particular workload by DOD civilian employees to private-sector performance. Section 2461 requires that before any commercial or industrial type function is changed from performance by DOD civilian employees to private-sector performance, DOD must report to Congress and perform an analysis showing that private sector performance will result in a savings to the government over the life of the contract. As under A-76, the cost of performance of the function by the government employees is to be based on an estimate of their most cost-effective manner for performance of the function. Section 2461 applies to initiatives that result in functions performed by DOD civilian employees being changed to performance by private-sector employees. As discussed earlier, the Air Force proposal is more than just a change of the 38th EIW function from DOD civilian employees to contractors. Rather, it is a transfer of the E&I wartime mission to the ANG units which primarily affects uniformed military personnel who are not subject to 10 U.S.C. 2461. Once this occurs, there will no longer be a need for the 38th EIW, which was designed to support the military personnel and their wartime mission. The action being taken in this case is not a change of the kind contemplated by section 2461. While neither an A-76 cost comparison nor a section 2461 cost study was required, the Air Force nevertheless did complete a business case analysis to estimate the cost-effectiveness of restructuring the 38th EIW. 
That analysis showed an estimated annual recurring savings of approximately $28 million, based on reported fiscal year 1997 costs and projected costs (including contract costs) of the restructured organizations. The estimated contract costs were based on existing negotiated contract rates for an equivalent level of effort. The analysis showed that most of the recurring savings would result from engineer and installer manpower cuts and unit operations and maintenance reductions. Also, the business case analysis found that the Air Force will realize an estimated one-time savings of $33 million, of which $28 million is due to the cancellation of planned Base Realignment and Closure construction projects associated with the future realignment of Kelly AFB and the closure of McClellan AFB. (These construction projects were planned at other bases in order to accommodate the E&I workload being transferred from the squadrons at McClellan and Kelly AFBs as the result of 1995 base realignment and closure decisions.) The study also found that another $5 million will be saved due to the cancellation of building construction projects at Tinker AFB. The Air Force Audit Agency performed a management advisory review of the 38th EIW business case analysis. The Audit Agency sampled two of the five wing functions, representing 78 percent of the wing's total functions. It concluded that the methodology the Air Force had used for its analysis was sound and that the analysis was materially correct and well documented. It also concluded that the estimate of expected savings was conservative because the Air Force used the most conservative rates in place. We also found the analysis to be reasonable based on the cost factors and type of methodology the Air Force used. 
The Air Force's proposal concerning the 38th EIW is a comprehensive change to the missions and responsibilities performed by the 38th EIW and does not constitute a conversion of civilian to contractor personnel as envisioned under A-76. Thus, the Air Force was reasonable in concluding that it did not have to undergo the A-76 process in this instance. Similarly, the planned action is not a change of the kind contemplated by 10 U.S.C. 2461. Accordingly, the Air Force was not required to perform the cost study and provide congressional notification under that provision. At the same time, the Air Force's business case analysis supports the cost-effectiveness of the proposed action, with the reduction of a significant number of personnel. We requested comments on a draft of this report from the Secretary of Defense or his designee. On February 11, 1999, DOD officials concurred with the report findings. They also provided technical comments which have been incorporated as approriate. To determine whether the planned action was subject to the requirements of OMB Circular A-76, we reviewed the Air Force's programming and implementation plans and reviewed and analyzed Circular A-76. Also, we interviewed senior officials at Air Force Headquarters, Washington, D.C.; the 38th EIW, Tinker AFB, Oklahoma; and the Office of Management and Budget, Washington, D.C. We also reviewed our prior work reviewing A-76 issues. To determine whether the Air Force action was subject to the requirements of 10 U.S.C. 2461, we identified and reviewed relevant legislation and discussed the applicability of section 2461 with senior officials of the Office of Management and Budget and the Air Force's Office of General Counsel. To determine whether the Air Force analyzed the cost-effectiveness of the proposed action, we reviewed its business case analysis and discussed it with the Air Force Audit Agency. We also reviewed the Air Force's rates and cost methodology. 
We conducted our review from May 1998 to January 1999 in accordance with generally accepted government auditing standards. We are sending copies of this report to the Ranking Minority Member of the Subcommittee on Readiness and Management Support, Senate Armed Services Committee; Chairmen and Ranking Minority Members of the Senate and House Committees on Appropriations; the Secretaries of Defense and the Air Force; and the Director of OMB. We will make copies available to others upon request. Please contact me at 202-512-8412 if you or your staff have any questions concerning this report. Major contributors to this report are listed in appendix II. In general, the A-76 process consists of six key activities. They are: (1) developing a performance work statement and quality assurance surveillance plan; (2) conducting a management study to determine the government's most efficient organization (MEO); (3) developing an in-house government cost estimate for the MEO; (4) issuing a Request for Proposals (RFP) or Invitation for Bid (IFB); (5) evaluating the proposals or bids and comparing the in-house estimate with a private sector offer or interservice support agreement and selecting the winner of the cost comparison; and (6) addressing any appeals submitted under the administrative appeals process, which is designed to ensure that all costs are fair, accurate, and calculated in the manner prescribed by the A-76 handbook. Figure I.1 shows an overview of the process. The solid lines indicate the process used when the government issues an IFB, requesting firm bids on the cost of performing a commercial activity. This process is normally used for more routine commercial activities, such as grass-cutting or cafeteria operations, where the work process and requirements are well defined. The dotted lines indicate the additional steps that take place when the government wants to pursue a negotiated, "best value" procurement. 
While it may not be appropriate for use in all cases, this process is often used when the commercial activity involves high levels of complexity, expertise, and risk. Most Efficient Organization (MEO) activities Additional steps required for request for proposals (RFP) The circular requires the government to develop a performance work statement. This statement, which is incorporated into either the IFB or RFP, serves as the basis for both government estimates and private sector offers. If the IFB process is used, each private sector company develops and submits a bid, giving its firm price for performing the commercial activity. While this process is taking place, the government activity performs a management study to determine the most efficient and effective way of performing the activity with in-house staff. Based on this "most efficient organization," the government develops a cost estimate and submits it to the selecting authority. The selecting authority concurrently opens the government's estimate along with the bids of all private sector firms. According to OMB's A-76 guidance, the government's in-house estimate wins the competition unless the private sector's offer meets a threshold of savings that is at least 10 percent of direct personnel costs or $10 million over the performance period. This minimum cost differential was established by OMB to ensure that the government would not contract out for marginal estimated savings. If the RFP--best value process--is used, the Federal Procurement Regulations and the A-76 Supplemental Handbook require several additional steps. The private sector offerors submit proposals that often include a technical performance proposal, and a price. The government prepares an in-house management plan and cost estimate based strictly on the performance work statement. On the other hand, private sector proposals can offer a higher level of performance or service. 
The government's selection authority reviews the private sector proposals to determine which one represents the best overall value to the government based on such considerations as (1) higher performance levels, (2) lower proposal risk, (3) better past performance, and (4) cost to do the work. After the completion of this analysis, the selection authority prepares a written justification supporting its decision. This includes the basis for selecting a contractor other than the one that offered the lowest price to the government. Next, the authority evaluates the government's offer and determines whether it can achieve the same level of performance and quality as the selected private sector proposal. If not, the government must then make changes to meet the performance standards accepted by the authority. This ensures that the in-house cost estimate is based upon the same scope of work and performance levels as the best value private sector offer. After determining that the offers are based on the same level of performance, the cost estimates are compared. As with the IFB process, the work will remain in-house unless the private offer is (1) 10 percent less in direct personnel costs or (2) $10 million less over the performance period. Participants in the process--for either the IFB or RFP process--may appeal the selection authority's decision if they believe the costs submitted by one or more of the participants were not fair, accurate, or calculated in the manner prescribed by the A-76 handbook. Appeals must be submitted in writing and within 20 days after the date that all supporting documentation is made publicly available. The appeal period may be extended to 30 days if the cost comparison is particularly complex. Appeals are supposed to be adjudicated within 30 days after they are received. 
The A-76 Supplemental Handbook provides that, under certain circumstances, agencies may authorize cost comparison waivers and direct conversions to or from in-house, contract or interservice support agreements. A waiver may be granted where: The conversion will result in a significant financial or service quality improvement and a finding that the conversion will not serve to reduce significantly the level or quality of competition in the future award or performance of work; or The waiver will establish why in-house or contract offers have no reasonable expectation of winning a competition conducted under the cost comparison procedures of the Handbook. Additionally, the supplemental handbook provides that under certain circumstances, such as situations involving 65 or less full time equivalent personnel, streamlined cost comparisons may be permitted. Kimberly Seay, Site Senior Bonnie Carter, Senior Evaluator The first copy of each GAO report and testimony is free. Additional copies are $2 each. Orders should be sent to the following address, accompanied by a check or money order made out to the Superintendent of Documents, when necessary. VISA and MasterCard credit cards are accepted, also. Orders for 100 or more copies to be mailed to a single address are discounted 25 percent. U.S. General Accounting Office P.O. Box 37050 Washington, DC 20013 Room 1100 700 4th St. NW (corner of 4th and G Sts. NW) U.S. General Accounting Office Washington, DC Orders may also be placed by calling (202) 512-6000 or by using fax number (202) 512-6061, or TDD (202) 512-2537. Each day, GAO issues a list of newly available reports and testimony. To receive facsimile copies of the daily list or any list from the past 30 days, please call (202) 512-6000 using a touchtone phone. A recorded menu will provide information on how to obtain these lists. 
| Pursuant to a legislative requirement, GAO provided information on whether the Air Force complied with relevant policy and congressional notification requirements in reaching a decision to deactivate the 38th Engineering Installation Wing (EIW) at Tinker Air Force Base (AFB), Oklahoma, focusing on: (1) the scope of the Air Force's planned action; (2) whether it is subject to the requirements of Office and Management Budget (OMB) Circular A-76 and 10 U.S.C. 2461; and (3) whether an analysis was completed to examine the cost-effectiveness of the planned action. GAO noted that: (1) the Air Force plans to deactivate the 38th EIW, headquartered at Tinker AFB, Oklahoma, and transfer its wartime mission to the Air National Guard without increasing the Guard's authorized end-strength; (2) about 75 percent, or 1,752, of the unit's authorized positions will be eliminated, while 591 positions will be reassigned to existing or new organizations to assume responsibilities previously assigned to the 38th EIW; (3) these changes are expected to result in one-time savings of $33 million and annual recurring savings of $28 million; (4) the proposed action is a comprehensive restructuring of an active component unit, largely transferring its wartime mission to the National Guard; (5) it is not the type of action historically associated with OMB Circular A-76 and is not a conversion as envisioned under the circular; (6) accordingly, a cost comparison under that circular is not required; (7) likewise, the planned action is not a change in the performance from civilian personnel to contractor employees of the kind subject to the requirements of 10 U.S.C. 2461; (8) accordingly, the Air Force was not required to perform the cost study and provide congressional notification under that provision; and (9) at the same time, the Air Force's business case analysis supports the cost-effectiveness of the proposed action, with the reduction of a significant number of personnel. | 5,612 | 438 |
DOD codified its DOD Executive Agent program in 2002 and issued a directive, DOD Directive 5101.1, that defines a DOD Executive Agent and establishes the roles and responsibilities governing DOD Executive Agent assignments and arrangements. DOD officials told us that the department issued a directive for its DOD Executive Agent program in part because the term DOD Executive Agent had been used to describe a variety of management arrangements, and DOD Directive 5101.1 was intended to clarify the term. For example, in 1998, DOD identified approximately 401 Executive Agents within the military departments. However, after the directive was issued in 2002, ODCMO officials stated they worked with identified Executive Agents to determine which were to remain DOD Executive Agents under the directive. As a result, the number of activities and programs with the title of DOD Executive Agent was significantly reduced. For example, the Joint Interagency Task Force West was referred to as U.S. Pacific Command's Executive Agent to support law enforcement for counterdrug efforts in the Asia-Pacific region. However, according to ODCMO officials, this task force was not considered to be an official DOD Executive Agent per DOD Directive 5101.1, and ODCMO officials removed its DOD Executive Agent designation. DOD policy provides that directives published before March 25, 2012, are to be updated or cancelled after 10 years. ODCMO officials told us that they are in the process of updating DOD Directive 5101.1, which was certified current in 2003, but did not have a firm deadline for when the directive will be updated.
DOD Executive Agent designations are conferred when (1) the efforts of more than one DOD component need to be coordinated and no existing means to accomplish DOD objectives exists, (2) DOD resources need to be focused on a specific area or areas of responsibility in order to minimize duplication or redundancy, or (3) such designation is required by law, executive order, or government-wide regulation. Further, within the scope of its assigned responsibilities and functions, the authority of the DOD Executive Agent takes precedence over the authority of other DOD component officials performing related or collateral joint or multicomponent support responsibilities and functions. A DOD Executive Agent is the head of a DOD component. The DOD Executive Agent may delegate the authority to act to a subordinate designee within that official's component. For example, the Secretary of the Army is the designated DOD Executive Agent for DOD Biometrics, and has delegated that responsibility to the Army's Provost Marshal. DOD Directive 5101.1 assigns ODCMO the overall program management of the DOD Executive Agent program. Specifically, ODCMO oversees the implementation of the DOD Executive Agent directive, develops policy on DOD Executive Agent designations, and issues guidelines as appropriate to further define responsibilities contained in DOD Directive 5101.1. An OSD Principal Staff Assistant oversees the activities of DOD Executive Agents in their functional areas of responsibility. In addition, DOD Directive 5101.1 states that the OSD Principal Staff Assistant should assess the DOD Executive Agents in their functional areas periodically, but not less than once every 3 years, to determine each DOD Executive Agent's continued need, currency, and effectiveness and efficiency in satisfying end-user requirements.
According to ODCMO officials, these OSD Principal Staff Assistants are the Under Secretaries of Defense, the Deputy Chief Management Officer, the General Counsel of DOD, the Inspector General of DOD, and those Assistant Secretaries of Defense, Assistants to the Secretary of Defense, and OSD Directors, and equivalents, who report directly to the Secretary or Deputy Secretary of Defense. Typically, the OSD Principal Staff Assistants assess DOD Executive Agents within their functional areas. For example, the Under Secretary of Defense for Acquisition, Technology and Logistics would assess DOD Executive Agents involved in acquisition and logistics related areas, such as the DOD Executive Agents for Medical Materiel, Subsistence, Construction and Barrier Materiel, and Bulk Petroleum that are tasked with managing the logistics of supplying these products across the department. Only the Secretary of Defense or the Deputy Secretary of Defense may designate a DOD Executive Agent, and the designation remains in effect until the Secretary of Defense or the Deputy Secretary of Defense revokes or supersedes it. According to ODCMO officials, the Secretary or Deputy Secretary of Defense designates a DOD Executive Agent after an evaluation of existing organizational and management arrangements and a determination that a DOD Executive Agent would most effectively, economically, or efficiently carry out a function or task. However, according to ODCMO officials, the head of a DOD component may volunteer as a DOD Executive Agent and may formally request that the Secretary or Deputy Secretary of Defense make the assignment, or an OSD Principal Staff Assistant may propose that the Secretary or Deputy Secretary of Defense assign a DOD component as a DOD Executive Agent.
ODCMO officials stated this typically happens when a military department, defense agency, or a combatant command has substantial responsibility or expertise to execute a task on behalf of DOD, or the function is particularly sensitive or complex as differentiated from its overall organic mission. ODCMO officials also stated that DOD Executive Agent designations are typically formalized in a Secretary or Deputy Secretary of Defense memorandum, with direction to establish a DOD issuance to codify the specifics of the DOD Executive Agent arrangement at a later date. ODCMO officials stated that the issuance is important, as the designation of the title of DOD Executive Agent by itself confers no specific responsibilities. The nature and scope of the authority delegated must be stated in the memorandum or DOD issuance designating the DOD Executive Agent. According to ODCMO officials, funding of specific DOD Executive Agent activities is not determined at the time of assignment. Rather, the designated DOD Executive Agent seeks resources through DOD's planning and budgeting process. Further, according to ODCMO officials, the DOD Executive Agent often bears the major share of the cost to execute the assigned responsibilities. However, ODCMO officials explained that, as necessary, funding determinations between the DOD Executive Agent and other DOD stakeholders are negotiated through memorandums of agreement or understanding and DOD's annual program and budget review process. We determined that DOD had 81 DOD Executive Agents focused on a variety of topics and designated to 12 different DOD components, as of May 2017. Almost half (38 of 81), or 47 percent, of the DOD Executive Agents are designated to the Secretary of the Army, and 68 of 81, or 84 percent, are designated to the Secretaries of the Army, Air Force, or Navy or the Commandant of the Marine Corps. In contrast, six DOD components had one DOD Executive Agent designation each.
Additionally, 11 different OSD Principal Staff Assistants oversee 81 DOD Executive Agents. This information is based on our analysis of ODCMO's list of DOD Executive Agents. Figure 1 shows the DOD Executive Agent designations by DOD component and by OSD Principal Staff Assistant. According to ODCMO officials, a DOD Executive Agent designation is typically assigned to the DOD component that is already involved in the work related to the DOD Executive Agent. Below are several types of activities DOD Executive Agents perform and an example of a DOD Executive Agent that performs the activity:
Administrative Support--The Secretary of the Army, as the designated DOD Executive Agent for the U.S. Military Entrance Processing Command, is responsible for programming, budgeting, and funding all Military Entrance Processing Command operations.
Developing Standards--The Director of the Defense Information Systems Agency, as the DOD Executive Agent for Information Technology Standards, is responsible for developing and maintaining information-technology standards.
Developing Training Programs--The Secretary of the Air Force, as the DOD Executive Agent for Military Working Dogs, is responsible for developing required training programs and curricula for military working-dog instructors, kennel masters, and handlers.
Technology Management--The Secretary of the Navy, as the DOD Executive Agent for Printed Circuit Board and Interconnect Technology, is responsible for developing and maintaining a technology roadmap to ensure that DOD has access to manufacturing capabilities and technical expertise necessary to meet future military requirements regarding this technology.
Acquisition Support--The Commandant of the Marine Corps, as the DOD Executive Agent for Non-Lethal Weapons, is responsible for coordinating nonlethal weapon requirements across doctrine, organization, training, materiel, leadership and education, personnel, and facilities.
Department-wide Visibility--The Secretary of the Army, as the DOD Executive Agent for the Unexploded Ordnance Center of Excellence, chairs the center and executes management oversight and funding responsibilities for the center.

As part of our questionnaire for DOD Executive Agents, we asked about the reasons why DOD conferred the designation. In response, 51 percent (36 of 70) of DOD Executive Agents responding to our questionnaire reported that their designation was conferred to minimize the duplication or redundancy of DOD resources. Thirty-six percent (25 of 70) reported that their designation was conferred because no other means existed for the department to accomplish its objective. Finally, 26 percent (18 of 70) reported that their designation was conferred because it was required by law, executive order, or government-wide regulation. A majority of the DOD Executive Agents have OSD Principal Staff Assistants from one of three Under Secretaries of Defense. Forty-three percent (35 of 81) of the OSD Principal Staff Assistants for DOD Executive Agents are assigned to the Under Secretary of Defense for Acquisition, Technology and Logistics, while another 40 percent (32 of 81) are assigned to the Under Secretary of Defense for Personnel and Readiness or the Under Secretary of Defense for Policy. According to DOD Directive 5101.1, an OSD Principal Staff Assistant is to oversee the activities of DOD Executive Agents in their functional areas of responsibility. In addition, the OSD Principal Staff Assistant is assigned to assess each DOD Executive Agent to determine the DOD Executive Agent's continued need, currency, and effectiveness and efficiency in satisfying end-user requirements. Typically, the OSD Principal Staff Assistant is to assess DOD Executive Agents within their functional areas.
For example:

The Under Secretary of Defense for Acquisition, Technology and Logistics oversees 35 DOD Executive Agents and typically assesses those involved in acquisition and logistics-related areas, such as the Director of the Defense Logistics Agency, who serves as the DOD Executive Agent for Medical Materiel, Subsistence, Construction and Barrier Materiel, and Bulk Petroleum and is tasked with managing the logistics of supplying these products across the department. In addition, the Under Secretary of Defense for Acquisition, Technology and Logistics oversees two designations related to chemical and biological weapons and two designations related to the safety and security of biological toxins and hazards.

The Under Secretary of Defense for Personnel and Readiness's portfolio includes readiness; health affairs; training; and personnel requirements and management, including equal opportunity, morale, welfare, recreation, and quality-of-life matters. The Under Secretary of Defense for Personnel and Readiness oversees 20 DOD Executive Agents, including three designations related to language training or foreign language contracts; two designations related to recruitment and entrance processing; and the Armed Services Entertainment program.

The Under Secretary of Defense for Policy's portfolio includes all matters pertaining to the formulation of national security and defense policy. The office oversees 12 DOD Executive Agents, including two designations related to security cooperation activities and two designations related to multinational organizations.

We found that DOD has weaknesses in its approach to tracking its DOD Executive Agents, resulting in ODCMO not having an accurate accounting of the number of DOD Executive Agents. According to DOD Directive 5101.1, ODCMO is responsible for developing, maintaining, monitoring, revising, and making available the list of DOD Executive Agent designations.
However, we found that ODCMO did not maintain a list of DOD Executive Agents that was current or complete. For example, we found 10 designations on DOD's list of DOD Executive Agents that were not accurate, including the following:

Disestablished DOD Executive Agents: Three DOD Executive Agent designations that were on ODCMO's list had been disestablished; however, they had not been removed from the list. For example, in October 2015, a Deputy Secretary of Defense memorandum disestablished the DOD Executive Agent for Space by redesignating it as the Principal DOD Space Advisor. ODCMO officials stated that they were aware that it had been disestablished, but had not removed it from the list until a directive, issued in June 2017, cancelled the designation for the DOD Executive Agent for Space. In another example, the DOD Executive Agent for Global Command and Control Systems should have been removed from ODCMO's list in 2013.

Inactive DOD Executive Agents: Two DOD Executive Agent designations were no longer considered active, meaning that while the designations have not been cancelled, the DOD Executive Agents are no longer performing the responsibilities of the DOD Executive Agents. DOD Directive 5101.1 states that the designations are to remain in effect until the Secretary of Defense or the Deputy Secretary of Defense revokes or supersedes them. However, the Secretary of Defense or Deputy Secretary of Defense has not issued any documentation to disestablish these DOD Executive Agents. Specifically, Army officials from the Chemical Demilitarization Program stated that the responsibilities of the DOD Executive Agent had been completed in 2012 and thus the designation was no longer active.
In the other example, officials from the DOD Executive Agent for DOD Civilian Police Officers and Security Guards Physical Fitness Standards Program stated that the directive for this program was updated in 2012 and reference to the DOD Executive Agent designation had been removed because the designation was no longer necessary. Officials stated that they intended to pursue the cancellation of the designation at a later date.

Unclear DOD Executive Agent designations: Three DOD Executive Agent designations were unclear, such that they were not considered actual DOD Executive Agents, or officials in the relevant component had no knowledge of the designation. For example, Navy officials stated that they could not find any organization currently carrying out any responsibility related to the DOD Executive Agent for High School News Service or for the Force Protection of Military Sealift Assets. ODCMO officials told us that these may have been considered DOD Executive Agents at one time, but the arrangements were never documented. In the other example, the status of the DOD Executive Agent for the Global Positioning System is unclear, since Air Force officials at the program stated that they do not use the term DOD Executive Agent to refer to the program and were unaware that the program was considered to be a DOD Executive Agent. ODCMO officials stated that a determination was likely made at some point to consider this organization a DOD Executive Agent, and therefore the organization was included on ODCMO's list, but no official documentation was issued. Air Force officials who track the Air Force's DOD Executive Agents stated that the Global Positioning System program may have been considered a DOD Executive Agent at one time.

Missing DOD Executive Agent designation: One DOD Executive Agent designation was missing from ODCMO's list.
ODCMO's list included a DOD Executive Agent for Weapons of Mass Destruction and Delivery Vehicle Elimination Operations in Libya, and the Defense Threat Reduction Agency was the designated DOD Executive Agent. However, Defense Threat Reduction Agency officials stated that there are actually two separate designations, one for such operations in Libya and one in Iraq. Both ODCMO and the Defense Threat Reduction Agency lost track of the designation for Iraq, and it was not included in ODCMO's list of DOD Executive Agents.

Not a DOD Executive Agent: One DOD Executive Agent designation was on ODCMO's list that ODCMO and Army officials agree should not have been considered a DOD Executive Agent. According to ODCMO officials, the DOD Executive Agent designation for the Joint Center for International Security Force Assistance was inappropriately applied to the organization. Officials explained that the center is actually a Chairman's Controlled Activity, which is another type of management arrangement the department uses. Per DOD policy, only the Secretary of Defense or Deputy Secretary of Defense may cancel a designation. Thus, Army officials stated that until official action is taken to document that the center is not a DOD Executive Agent, it will remain on ODCMO's list and the Army will consider it a valid DOD Executive Agent.

We also identified seven other designations that ODCMO may need to revisit. ODCMO officials stated that our review highlighted several designations that may no longer be considered active and require resolution. Specifically: Army officials with whom we spoke told us that 5 of the Army's 38 designations may no longer be necessary and could be disestablished.
Officials from the DOD Executive Agent for Weapons of Mass Destruction Elimination Operations and Delivery Vehicle Elimination Operations in Libya stated in their response to our questionnaire that the DOD Executive Agent's 2004 designation is no longer needed, as considerable time has passed and the nature of U.S. government engagement and policies toward Libya have changed significantly. Officials from both the DOD Executive Agent and the OSD Principal Staff Assistant for the DOD Executive Agent for the Regional Centers for Security Studies stated that the designation may no longer be necessary, as the functions and responsibilities of this DOD Executive Agent are operating in a routine manner. According to ODCMO officials, a number of different circumstances may prompt the cancellation of a DOD Executive Agent designation, to include circumstances when the responsibilities of a DOD Executive Agent have become institutionalized as part of an office or agency. ODCMO controls its updates to the DOD Executive Agent list to ensure any changes are vetted through the appropriate offices. However, according to ODCMO officials, to maintain the list they rely on representatives from DOD Executive Agents to self-report any modifications to the DOD Executive Agent or contact information for relevant officials, which has resulted in some of the discrepancies described above. Aside from DOD Executive Agents self-reporting any changes, ODCMO officials stated that there is no process to ensure that all information on the list is current or complete. Furthermore, ODCMO officials stated that they have not issued guidance instructing DOD Executive Agent officials under what circumstances they should self-report changes. Moreover, we found that ODCMO does not have a process for being notified when a new DOD Executive Agent is established or when one is cancelled.
ODCMO officials stated that they provide consultation upon request to other DOD components that are considering establishing a new DOD Executive Agent. However, officials stated they are not always consulted and may not become aware of the new DOD Executive Agent designation until after its establishment. For example, ODCMO officials stated that they were not involved in the issuance of the January 2017 Deputy Secretary of Defense memorandum that announced the designation of the Secretary of the Army as the DOD Executive Agent for the DOD Biological Select Agent and Toxin Biosecurity Program. ODCMO officials told us they have, on at least one occasion, learned about interest in establishing a DOD Executive Agent for a function that another DOD Executive Agent was already addressing, and advised against its establishment. Furthermore, ODCMO officials said that a DOD Executive Agent designation can be removed from the list of DOD Executive Agents by cancelling or updating the DOD issuance that established the DOD Executive Agent. Even though ODCMO coordinates all issuances for the department, ODCMO officials stated that they are not informed of all changes in issuances related to DOD Executive Agent designations, such as when a designation is updated or cancelled. For example, as noted earlier, officials from the DOD Executive Agent for DOD Civilian Police Officers and Security Guards Physical Fitness Standards Program stated that reference to the DOD Executive Agent designation was removed as part of the 2012 update to the DOD directive for the DOD Executive Agent. However, ODCMO officials were not aware that the updated directive no longer included a reference to the DOD Executive Agent designation, and therefore ODCMO still had this DOD Executive Agent on its list. DOD Executive Agent officials stated that they intended to pursue the cancellation of the designation at a later date.
When consulted on DOD issuances related to the establishment, disestablishment, or modification of a DOD Executive Agent-related issuance, ODCMO officials stated they advise the OSD Principal Staff Assistants, among others, to discretely identify the actions related to the DOD Executive Agent designation to facilitate their tracking. According to DOD Directive 5101.1, ODCMO is to issue guidelines, as appropriate, to further define the policies, responsibilities and functions, and authorities contained in the directive. This could include the process for notifying ODCMO when a change is made to a DOD Executive Agent, such as when one is established, removed, or modified. Standards for Internal Control in the Federal Government states that management should use high-quality information to achieve the entity's objectives. Specifically, management obtains relevant data from reliable internal and external sources in a timely manner based on the identified information requirements. ODCMO officials agreed that they need to improve their tracking of DOD Executive Agents; however, they have not developed an approach for doing so. Without taking steps to ensure that it is accurately tracking its Executive Agents, ODCMO will not be able to effectively oversee the DOD Executive Agent program, and DOD's list of Executive Agents will continue to be outdated and incomplete. An accurate list is an important tool to help ODCMO manage its DOD Executive Agent program, including ensuring that there is no overlap in efforts across the DOD Executive Agent designations. According to the 70 DOD Executive Agents responding to our questionnaire, OSD Principal Staff Assistants responsible for assessing DOD Executive Agents have not conducted assessments of about half (37 of 70) of the DOD Executive Agents in the past 3 years, as required by DOD guidance. Of the remaining 33 DOD Executive Agents, 28 responded that their OSD Principal Staff Assistant assessed them.
Moreover, of those 28 DOD Executive Agents, almost half (13 of 28) said their assessment was not documented or that they did not know whether documentation existed. Finally, 3 DOD Executive Agents responded that they did not know whether OSD Principal Staff Assistants had assessed them. (See fig. 2.) Among the DOD Executive Agents that indicated they were assessed and provided documentation of the assessment, we found that many did not meet all of the requirements for assessments as prescribed in DOD Directive 5101.1. Specifically, in several cases the OSD Principal Staff Assistant either did not conduct the assessment or did not conduct it within the past 3 years. Of the 15 respondents who indicated that the assessment was documented, 12 provided either the documentation, the text of the document in their response (but not the document itself), or a citation to a DOD issuance related to the DOD Executive Agent that we were able to find independently. The documentation provided included, for example, minutes of annual meetings reviewing DOD Executive Agent programs, assessments the DOD Executive Agent directed independent consultants to conduct, or delegations of authority from the head of the component designated to be the DOD Executive Agent to other officials. Our review of these documents found that for half (6 of 12) of the DOD Executive Agents that provided documentation, the OSD Principal Staff Assistant did not conduct the assessment, and 3 of the 6 did not conduct it within the past 3 years, as shown in table 1 below. For example, the OSD Principal Staff Assistant did not conduct the assessments of the four DOD Executive Agents assigned to the Defense Logistics Agency (Subsistence, Bulk Petroleum, Construction/Barrier Materiel, and Medical Materiel).
According to an official from the OSD Principal Staff Assistant's office, the OSD Principal Staff Assistant delegated the responsibility to conduct the assessment directly to the DOD Executive Agent in one case, and in the other three cases the OSD Principal Staff Assistant approved the DOD Executive Agent's decision to direct an independent consultant to conduct the assessments. In addition, according to Army officials from the Office of the Administrative Assistant to the Secretary of the Army, the office that manages the Army's DOD Executive Agents, 2 of the 12 documented assessments should not be considered assessments. Specifically, the DOD Executive Agent for the Chemical and Biological Defense Program and the DOD Executive Agent for the Contract Linguist Program submitted Army memorandums stating that the Secretary of the Army was delegating the responsibilities of the DOD Executive Agent to other offices within the Army. According to the Army officials who prepared the memorandums, the Army did not conduct any review or assessment of the DOD Executive Agent while generating these memorandums. DOD Directive 5101.1 states that the OSD Principal Staff Assistant shall assess DOD Executive Agent assignments and arrangements associated with such assignments under their cognizance, as noted previously. The directive further states that the assessments shall occur periodically, but not less than once every 3 years, to determine the DOD Executive Agent's continued need, currency, and effectiveness and efficiency in satisfying end-user requirements. In addition, Standards for Internal Control in the Federal Government states that documentation is a necessary part of an effective internal control system and is required for the effective design, implementation, and operating effectiveness of an entity's internal control system.
The directive also assigns ODCMO the responsibility for overseeing the implementation of the directive. ODCMO officials told us that they did not know whether the assessments were occurring, and neither requested nor received assessments. The officials stated that they have not ensured the completion of DOD Executive Agent assessments because they narrowly interpreted their responsibility to oversee the implementation of DOD Directive 5101.1. Specifically, ODCMO officials stated that their responsibilities were limited to providing advice to other DOD components that expressed interest in using the DOD Executive Agent designation and maintaining a list of DOD Executive Agent designations. Further, we found that, according to the officials, even when assessments were completed, they were not always documented. While DOD Directive 5101.1 does not require the assessments to be documented, in the absence of such documentation the OSD Principal Staff Assistant cannot demonstrate that it has conducted an assessment in the past 3 years or that the assessment reviewed the DOD Executive Agent's continued need, currency, and effectiveness and efficiency in satisfying end-user requirements. According to DOD Directive 5101.1, ODCMO shall issue implementing guidance, which may include clarifying the responsibility of OSD Principal Staff Assistants in conducting assessments of DOD Executive Agents. ODCMO officials told us that they have not issued implementing guidance because they do not want to be prescriptive in how OSD Principal Staff Assistants should assess DOD Executive Agents, as each DOD Executive Agent designation is unique. Therefore, ODCMO wants to provide flexibility in how those OSD Principal Staff Assistants conduct the assessments, including how they define the terms continued need, currency, and effectiveness and efficiency in satisfying end-user requirements.
However, ODCMO could issue implementing guidance that ensures that the assessments are completed and documented. Several OSD Principal Staff Assistants with whom we spoke also told us that additional ODCMO guidance could help clarify the assessment requirement. Without verifying that the OSD Principal Staff Assistants for all DOD Executive Agents have completed required assessments and providing implementing guidance requiring the documentation of the assessments, the department does not have reasonable assurance that OSD Principal Staff Assistants are assessing DOD Executive Agents or that DOD Executive Agents--as a management arrangement--are accomplishing department objectives. According to DOD officials, conducting these periodic assessments would assist the department in reviewing DOD Executive Agent designations to ensure that the department is managing its resources efficiently and effectively. DOD Executive Agents can help the department achieve its range of objectives more efficiently and effectively when additional coordination is needed to focus DOD resources and minimize duplication or redundancy of activities, among other things. However, ODCMO faces challenges in overseeing DOD Executive Agents. For example, ODCMO has weaknesses in its approach to tracking its DOD Executive Agents, making it difficult to determine how effectively the office is carrying out its responsibilities. Further, ODCMO does not ensure that OSD Principal Staff Assistants are conducting required assessments or that these assessments are documented in a manner that demonstrates that DOD Executive Agents were assessed for continued need, currency, and effectiveness and efficiency in meeting end-user needs. Given its oversight responsibility for the DOD Executive Agent program, ODCMO should take action to ensure that requirements in DOD Directive 5101.1 are being met and that the program is being effectively implemented.
Without this action, DOD does not know whether its Executive Agents are effective in meeting their intended purpose and may be missing opportunities to better manage its resources and activities department-wide. We recommend that DOD's Deputy Chief Management Officer take the following three actions:

strengthen its approach to tracking DOD Executive Agents to ensure that its list and contact information are current and complete;

verify that the OSD Principal Staff Assistants for all DOD Executive Agents have completed their required assessments every 3 years; and

issue implementing guidance directing that OSD Principal Staff Assistants document their assessments of DOD Executive Agents, including documenting how the assessments address the DOD Executive Agents' continued need, currency, and effectiveness and efficiency in meeting end-user needs.

We provided a draft of this report to DOD for review and comment. In written comments, which are summarized below and reprinted in appendix II, DOD concurred with our recommendations. In addition, DOD provided technical comments, which we have incorporated into the report as appropriate. In its written comments, DOD stated that it plans to take several actions to implement the recommendations by the end of the first quarter of fiscal year 2018. Specifically, DOD stated that it will task the OSD Principal Staff Assistants to review the DOD Executive Agents under their cognizance, validate existing information, identify inaccuracies, and provide updated points of contact. In addition, DOD plans to issue guidance directing the OSD Principal Staff Assistants to provide documentation of assessments completed in the last 3 years and to initiate an assessment if one has not been completed in the last 3 years. Furthermore, this guidance will task OSD Principal Staff Assistants to conduct, document, and provide copies of these assessments for each DOD Executive Agent.
Finally, DOD stated that the Deputy Chief Management Officer, once informed by the completed assessments of DOD Executive Agents, will take the necessary actions to enhance DOD Executive Agent oversight. We are sending copies of this report to the appropriate congressional committees, the Secretary of Defense, and the Deputy Chief Management Officer. In addition, the report is available at no charge on the GAO website at http://www.gao.gov. If you or your staff have any questions about this report, please contact me at (213) 830-1011 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made key contributions to this report are listed in appendix III. The Office of the Deputy Chief Management Officer (ODCMO) maintains a list of DOD Executive Agents. The list includes information about each DOD Executive Agent, such as the title of the DOD Executive Agent assignment, the office assigned as the Office of the Secretary of Defense (OSD) Principal Staff Assistant, the department official who designated the DOD Executive Agent, and the date of the DOD Executive Agent assignment. To describe the number of DOD Executive Agents, we analyzed DOD's list and DOD issuances designating the DOD Executive Agent assignment, and contacted department officials of each DOD Executive Agent. Below are four tables listing the DOD Executive Agent responsibilities assigned to the Secretary of the Army (see table 2), the Secretary of the Air Force (see table 3), the Secretary of the Navy, including the Marine Corps (see table 4), and the heads of other DOD components (see table 5). In addition to the individual named above, key contributors to this report were Tina Won Sherman (Assistant Director), Angeline Bickner, Sarolynn Savanuagh, Tim DiNapoli, Mae Frances Jones, Lori Kmetz, Kirsten Lauber, Shari Nikoo, Daniel Ramsey, Michael Silver, and Matthew Ullengren. 
| DOD maintains military forces with unparalleled capabilities. However, the department continues to confront weaknesses in the management of its business functions that support these forces. DOD uses Executive Agents, which are intended to facilitate collaboration, to achieve critical department objectives. Senate Report 114-255, accompanying a bill for the National Defense Authorization Act for Fiscal Year 2017, included a provision that GAO review DOD Executive Agents. This report (1) describes the number and focus of DOD Executive Agents; and evaluates the extent to which DOD (2) tracks its Executive Agents and (3) conducts periodic assessments of its Executive Agents. GAO reviewed relevant DOD directives and the list of Executive Agents; developed and implemented a questionnaire to DOD's Executive Agents; and interviewed relevant DOD officials. Based on GAO's analysis, the Department of Defense (DOD) has 81 Executive Agents--management arrangements where the head of a DOD component is designated specific roles and responsibilities to accomplish objectives when more than one component is involved. These Executive Agents are assigned to 12 DOD components and support a range of activities, including managing technology and developing training programs. The Secretary of the Army is designated as the Executive Agent for almost half of them (38 of 81). DOD's Executive Agent directive requires that the Office of the Deputy Chief Management Officer (ODCMO) maintain a list of Executive Agent designations and oversee their assessments, among other things. Office of the Secretary of Defense (OSD) Principal Staff Assistants are required to assess their respective Executive Agents every 3 years to determine their continued need, currency, efficiency, and effectiveness. GAO found weaknesses in DOD's approach to tracking its Executive Agents, resulting in inaccuracies regarding 10 Executive Agents. 
For example, DOD's list of Executive Agents included several that are not currently active. While ODCMO is required to maintain a list of Executive Agents, ODCMO officials rely on self-reported information from DOD Executive Agents and OSD Principal Staff Assistants. Without taking steps to accurately track DOD Executive Agents, DOD's list will continue to be outdated and ODCMO cannot effectively oversee DOD Executive Agents. Principal Staff Assistants had not periodically assessed more than half (37 of 70) of the DOD Executive Agents that responded to GAO's questionnaire (see figure). ODCMO is responsible for overseeing the implementation of DOD's Executive Agents directive, which requires that Principal Staff Assistants conduct assessments; however, ODCMO officials told GAO they do not ensure that Principal Staff Assistants have conducted these assessments. GAO also found that Principal Staff Assistants are not required to document these assessments. Without verifying the completion of these assessments and issuing guidance requiring their documentation, DOD does not have reasonable assurance that DOD Executive Agents are accomplishing department objectives. GAO recommends that ODCMO strengthen its approach to track DOD Executive Agents; verify assessments are conducted; and issue implementing guidance for documenting assessments. DOD concurred with the recommendations.
DOD's combat casualty care researchers focus their efforts on the major causes of injury and death on the battlefield, and on improving medical care in specific battlefield conditions. For example, DOD estimates that approximately 84 percent of potentially survivable battlefield deaths are caused by bleeding. Therefore, DOD focuses a significant amount of its research on ways to control bleeding on the battlefield. Other areas on which DOD researchers focus include extremity trauma, diagnosis and treatment of traumatic brain injury, and ways to improve the care provided to casualties prior to and during evacuation to a hospital. To improve medical care in these areas, DOD researchers use various means to apply findings from combat casualty care research to develop drugs or medical devices. For example, DOD researchers convene multidisciplinary teams to decide whether a research project is ready and feasible to support development of a drug or medical device, according to DOD officials. These teams consist of researchers and other DOD personnel who are involved in acquiring and maintaining drugs and medical devices. At multiple meetings, the teams make decisions on whether to allow the project to proceed. In addition, DOD researchers work with the FDA to understand and share general information about regulatory requirements for drugs and medical devices that DOD develops. DOD officials also told us that in some cases DOD researchers share the results of DOD research with medical corporations, which develop these products. In addition to developing drugs or medical devices, DOD researchers apply findings from combat casualty care research by disseminating information on medical practices. For example, the Army Institute for Surgical Research publishes clinical-practice guidelines that clinical subject-matter experts develop in response to needs identified while providing care to combat casualties.
These guidelines are based on the best existing clinical evidence and experience, approved by senior DOD medical officials, and are available to all military medical practitioners. In addition, DOD researchers share new medical knowledge and best-practice information by publishing research results in medical journals and making presentations at conferences. In May 2008, then-Secretary of Defense Robert Gates publicly expressed his commitment to improving medical care and support for wounded servicemembers. In that same month, DOD completed a program assessment of its medical research and development investments, which became the basis for DOD's June 2008 Guidance for the Development of the Force report. Among other matters, this assessment identified gaps in DOD's capabilities to protect the health of servicemembers, including health care provided to servicemembers who are wounded on the battlefield. For example, the 2008 report identified a gap in DOD's capability to diagnose, resuscitate, and stabilize casualties with survivable wounds. DOD used the capability gaps identified in the 2008 report as the justification for funding requests that DOD subsequently made for medical research and development, including for research to address gaps in DOD's capability to provide combat casualty care. This assessment also concluded that a consolidated medical research and development budget structure with a centralized planning, programming, and budget authority and with centralized management would provide the most efficient and effective process and governance for DOD's medical research and development investment. To address the gaps in its capability to provide combat casualty care, DOD has increased this research funding overall, as shown in figure 1. In fiscal year 2010, DOD's funding for combat casualty care research increased to $537 million, and 2 years later it fell to $321 million.
Health Affairs and the Army, with 82 percent of the funding in fiscal year 2012, were responsible for the majority of this research (see fig. 2). The Navy, the Air Force, and DARPA were responsible for the remainder. Multiple officials and organizations oversee DOD's combat casualty care research and development. The Assistant Secretary of Defense for Research and Engineering--who reports to the Under Secretary of Defense for Acquisition, Technology and Logistics--is responsible for promoting coordination of all research and engineering within DOD, including health-related research such as combat casualty care research. In addition, the Assistant Secretary of Defense for Health Affairs serves as the principal advisor to the Under Secretary of Defense for Personnel and Readiness on a variety of health issues, including medical research, which includes research to improve combat casualty care. The Assistant Secretary of Defense for Research and Engineering and the Assistant Secretary of Defense for Health Affairs cochair the Armed Services Biomedical Research and Evaluation Management committee. This committee's charter states that it was established to facilitate coordination and prevent unnecessary duplication of effort within DOD's biomedical research and development program. Joint Technology Coordinating support the committee in specific research areas, including Groupscombat casualty care. Joint Technology Coordinating Groups are responsible for coordinating plans for research in their areas and for submitting recommendations on the distribution of responsibility for program execution and resources. (See fig. 3 for organizations that oversee combat casualty care research and development.) With regard to planning, there are multiple DOD organizations specifically devoted to biomedical research, and these organizations plan research and development designed to improve the medical care provided to injured servicemembers. 
They include the Army MRMC, the Office of Naval Research, the Naval Medical Research Center, the Air Force Office of Scientific Research, the Air Force Medical Support Agency, and DARPA. In March 2011, Health Affairs signed an interagency support agreement with the Army MRMC to take advantage of existing Army MRMC staff and infrastructure. Under the agreement, the Army MRMC manages certain Health Affairs funds for medical research and development. To help manage these funds, the Army MRMC established Joint Program Committees for the major areas of medical research that DOD conducts, including combat casualty care, which is managed by the Joint Program Committee for Combat Casualty Care (JPC-6). The JPC-6 includes representatives from the DOD biomedical research organizations within each military department, including the Marine Corps, as well as from DARPA, NIH, VA, and other DOD organizations that use the results of combat casualty care research--such as DOD's Special Operations Command. These organizations coordinate to prioritize how to spend the Health Affairs funding for combat casualty care research. Other DOD research organizations also conduct research that is at times related to combat casualty care. Typically these research organizations do not plan or conduct biomedical research, but sometimes they identify ways that applications of their research could improve combat casualty care. These organizations include the Army Research Laboratory and the Naval Postgraduate School. DOD's biomedical research organizations use a coordinated approach to plan for combat casualty care research and development in a manner that is consistent with key collaboration practices. Further, DOD research organizations do not always share information early in the research process. DOD has also taken steps to coordinate with other federal agencies that are involved in combat casualty care research. 
DOD's biomedical research organizations coordinate combat casualty care research and development planning in a manner that is consistent with key collaboration practices identified in prior GAO work to enhance and sustain coordination. These key practices include agreeing on roles and responsibilities and establishing a means to operate across organizational boundaries. DOD's biomedical research organizations responsible for combat casualty care research and development have agreed on their roles and responsibilities, including establishing a key leadership position responsible for combat casualty care research. As we have previously reported, agreement on roles and responsibilities among coordinating organizations is important because it enables each organization to stay informed about the others' individual and joint efforts, and it facilitates decision making. DOD's biomedical research organizations have agreed on the roles and responsibilities for the organizations involved in planning, overseeing, and executing this type of research. First, Health Affairs and the Army MRMC--the two organizations that fund most combat casualty care research and development--have outlined their roles in an Interagency Support Agreement, which designates the Army MRMC as the organization responsible for managing the day-to-day use of Health Affairs funding for medical research, including research to improve combat casualty care. Second, the JPC-6 developed a draft charter in 2010 that explains the roles and responsibilities for all of the JPC-6 member organizations, including the non-DOD organizations, such as VA and NIH. The draft charter was finalized in early January 2013, while we were conducting our review.told us that the JPC-6 began using the charter in 2010, but that they delayed finalizing it in part because they wanted to have the opportunity to incorporate lessons learned during the operation of the committee during its first 2 years. 
The charter states that JPC-6 members represent the interests of their member organizations as well as provide subject- matter expertise and advice to the JPC-6 chair on requirements, program Health Affairs and Army MRMC officials management, transition planning, and planning and programming for future investments. In addition to establishing a JPC-6 charter, Health Affairs and Army MRMC have established a key leadership position responsible for combat casualty care research by having one official serve simultaneously in three complementary roles: JPC-6 chair, Director of the Army Combat Casualty Care Research Program, and chair of the Joint Technology Coordinating Group for Combat Casualty Care. As noted in the JPC-6 charter, the group's chair is responsible for making recommendations to Health Affairs for planning, programming, budgeting, and executing research and development to improve medical care provided to combat casualties, and the chair is to make these recommendations with the advice and support of the JPC-6 members. Because the DOD official serving as JPC-6 chair also serves as Director of the Army Combat Casualty Care Research Program and chair of the Joint Technology Coordinating Group for Combat Casualty Care, this official oversees the majority of this research in DOD. From fiscal years 2008 through 2011, this official oversaw approximately 600 research projects, constituting over 80 percent of DOD's funding for combat casualty care research. Health Affairs and Army MRMC officials told us they expect that one official will lead all three organizations in the future. DOD's biomedical research organizations responsible for combat casualty care research and development have established mechanisms to facilitate working across organizational boundaries--a step that, as we have previously reported, helps to enhance and sustain coordination. 
For example, DOD located nearly all of the DOD biomedical research organizations that conduct combat casualty care research at the Joint Center of Excellence for Battlefield Health and Trauma Research at Fort Sam Houston, Texas. The center includes the U.S. Army Institute for Surgical Research and other principal DOD biomedical research organizations that conduct combat casualty care research, such as the combat casualty care research functions from the Naval Medical Research Center and from Walter Reed Army Institute of Research. DOD officials told us that being located in the same place is useful in enabling them to know what other DOD organizations are doing with their related research and development. Another example of a mechanism to facilitate working across organizational boundaries is the Military Health System Research Symposium, an annual conference that provides DOD researchers the opportunity to discuss and address multiple medical research topics, including combat casualty care, with researchers from other federal agencies, academia, and private industry. DOD officials told us that these annual conferences have led to interagency collaboration on research and development for combat casualty care. DOD organizations that typically do not conduct biomedical research are generally not involved in DOD's efforts to coordinate combat casualty care research. When these nonmedical research organizations conduct research relevant to combat casualty care, they do not always share relevant information with appropriate officials early in the research process. We have previously reported that organizations involved in similar missions should coordinate and share relevant information early to avoid unnecessary duplication of work. 
The JPC-6 chair, who is the lead official responsible for coordinating combat casualty care research, told us that he periodically has identified cases in which researchers began conducting research relevant to combat casualty care, but did not coordinate with him early in the process. He stated that in these cases, the research typically had been underway for a period of 1 to 5 years before he learned about it. He stated that he coordinates with nonmedical research organizations when he becomes aware of research relevant to combat casualty care. However, he stated that he has not always been aware of relevant research, and that there may be similar ongoing research projects about which he is currently unaware. For example, the Army Research Laboratory, which typically conducts research in the physical, engineering, and environmental sciences, started developing a product in 2006 that had the potential to control the bleeding of wounded soldiers-- the leading cause of preventable deaths on the battlefield--but did not inform the JPC-6 chair of this research until 2 years later. In addition, multiple DOD officials--including the JPC-6 chair and other officials responsible for health research--stated that other DOD research organizations, such as the Naval Postgraduate School, the Defense Threat Reduction Agency, and the Joint Improvised Explosive Device Defeat Organization, have conducted research related to combat casualty care in the past and have not always coordinated or shared information early in the research process. The JPC-6 chair also stated that some DOD researchers do not share information with him early in the research process because they are not aware of the need to coordinate early and may not fully understand medical research requirements, such as those that are necessary to support FDA processes for approval of new drugs and medical devices. 
He also stated that a lack of awareness and understanding can result in researchers duplicating each other's work. As discussed above, Army Research Laboratory researchers did not inform the JPC-6 chair of their work for 2 years, and as a result they learned that some of their initial testing did not fully adhere to medical testing protocols associated with wounds and wound severity. Subsequently, the researchers had to redo some steps in their research. An Army Research Laboratory official responsible for the project told us that they could have avoided the inefficiency of duplicating these steps if they had shared information with the JPC-6 chair at an earlier point. The JPC-6 chair stated that, since this occurrence, the Army Research Laboratory and Army MRMC now coordinate with one another regularly to identify Army Research Laboratory projects with potential implications for combat casualty care. DOD coordinates medical research information with other federal agencies, including FDA, NIH, and VA. DOD coordinates with FDA with regard to drugs and medical devices it develops because FDA is responsible for overseeing the safety and effectiveness of these products--including those that are developed through DOD's combat casualty care research--and DOD must obtain FDA's regulatory review and approval or clearance to field medical products. FDA officials stated that they regularly meet with the commanding general of the Army MRMC to review DOD's medical research priorities and to share general information about regulatory requirements. FDA officials also provide product-specific advice to DOD regarding regulatory requirements by meeting with DOD researchers throughout the development process. 
This coordination is consistent with FDA's efforts, noted in previous GAO reports, to address concerns from industry and advocacy groups, including those related to the timeliness of the review process and the need to improve communication between FDA and stakeholders throughout the development process. DOD officials told us that FDA regulators were very responsive to their regulatory questions and concerns, and they reported that sometimes this communication helped to expedite the development process. Likewise, it is important for DOD, NIH, and VA to coordinate with each other because all of these agencies conduct research that is directly related to combat casualty care research. DOD, NIH, and VA conduct joint program reviews, prepare joint strategic documents, complete joint research projects, and attend joint symposiums and conferences to share their research. Our prior work identified some issues concerning the ability of DOD, NIH, and VA to readily access comprehensive medical research information funded by the other agencies.three agencies could improve their ability to efficiently identify potential duplication if they improved access to each others' comprehensive electronic information on funded health research. DOD officials recently stated that DOD and the other two agencies are working together to address these concerns. Specifically, NIH has provided a DOD official with access to an NIH database that contains information about funded health research projects, and it has also provided training and support so that the DOD official can search the database for potential duplicated research. If this effort is successful, DOD plans to identify additional medical research officials who will be granted access to NIH's health research database. Because VA's medical research information resides We found that the in this database, DOD will also be able to identify VA research that is directly related to DOD's combat casualty care research. 
Health Affairs and Army MRMC monitor and assess the progress of combat casualty care research and development projects, but they have not assessed the extent to which this research fills gaps in DOD's capability to provide combat casualty care or achieves other goals for this research, including those related to improving DOD's ability to control bleeding, which is the primary cause of death on the battlefield. Internal control standards for the federal government state that agencies should monitor and assess their performance over time to help ensure that they meet the agency's missions, goals, and objectives. Using performance information such as performance metrics can aid agencies with monitoring results, developing approaches to improve results, and helping determine progress in meeting the goals of programs or operations. Health Affairs and Army MRMC monitor and assess the progress of combat casualty care research and development projects. For example, Health Affairs and Army MRMC monitor and assess cost, schedule, and performance metrics for individual research projects to determine whether to continue funding, make necessary corrections to, or terminate these projects. Senior leadership in these organizations reviews projects annually to determine whether they are meeting established cost, schedule, and performance baselines. In addition, these leaders assess technology readiness levels--which are measurements of maturity level--to determine whether findings from a research project are sufficiently mature to move to the next phase of development. Health Affairs and Army MRMC also monitor and assess some aspects of the progress of the overall combat casualty care research portfolio, such as the number of projects completed, ongoing, or canceled, as well as the number of products available to users in the field. These organizations have applied findings from combat casualty care research to field five such products between fiscal years 2008 and 2011. 
For example, Health Affairs and Army MRMC officials told us that DOD fielded a combat gauze product that was the result of combat casualty care research. This gauze includes a mineral to help form blood clots and is designed to stop severe bleeding in less than 4 minutes. Following the annual combat casualty care research portfolio review in September 2012, Health Affairs and Army MRMC reported that they plan to identify new performance metrics, such as data related to peer-reviewed publications and FDA approved drugs and medical devices that will provide additional information on the overall portfolio's progress. However, Health Affairs and Army MRMC have not assessed the extent to which the results of combat casualty care research fill gaps in DOD's capability to provide care to combat casualties. As we discussed earlier, DOD identified a number of gaps in its capability to provide combat casualty care in the 2008 Guidance for the Development of the Force analysis and report. Since 2008, Health Affairs and Army MRMC told us that they have completed about 44 combat casualty care research projects that are each designed to address one or more of these capability gaps. Health Affairs and Army MRMC officials told us that in 2010 they attempted to measure the extent to which the 2008 capability gaps had been filled on the basis of the research results. However, they abandoned that effort because, according to officials, in 2010 researchers had not completed a sufficient amount of research designed to fill the 2008 capability gaps. In addition, these officials indicated that the capability gaps were not specific, were not organized to correspond with DOD's research areas, and did not reflect the state of medical knowledge at the time. Health Affairs officials told us that they are currently revising these capability gaps and they expect to complete the revision in 2013. 
Following the Health Affairs revision, the Joint Staff--a group of senior military leaders in DOD--will then validate the capability gaps. Health Affairs and Army MRMC officials told us that they plan to assess whether the results of future research fill the revised capability gaps once the Joint Staff validates them. In addition, Health Affairs and Army MRMC have not developed an assessment of the extent to which the results of combat casualty care research have achieved other goals for this research. Both Health Affairs and Army MRMC have established goals for the combat casualty care research portfolio including several related to improving DOD's ability to control bleeding, which is the primary cause of death on the battlefield. For example, Health Affairs set a goal for DOD to improve its ability to control bleeding in areas of the human body where it is not feasible to apply a tourniquet, such as on internal organs or the groin. Health Affairs and Army MRMC officials told us that they periodically review and discuss progress toward these research goals for certain research topics. However, these officials have not developed an assessment that comprehensively identifies each of the goals for the portfolio and includes information about the extent to which each goal has been met. They acknowledged that more work is needed to do this. Following a review and analysis of the combat casualty care research portfolio in September 2012, Health Affairs and Army MRMC officials reported to us that they intended to complete an overarching strategic roadmap for the portfolio by March 2013. They told us that they expect the roadmap could include specific project timelines and goals, among other things. 
However, on the basis of the information provided by DOD officials, we were unable to determine if the plan will clearly delineate how Health Affairs and Army MRMC will assess the extent to which results from combat casualty care projects fill capability gaps and achieve other goals. Until Health Affairs and Army MRMC assess the results of DOD's research against revised capability gaps and other goals, DOD will not have reasonable assurance that the research it is conducting meets its needs. Coordination among the various organizations that plan and conduct combat casualty care research and development is important to effectively produce medical solutions to save or improve the lives of injured servicemembers. DOD has taken important steps to agree on roles and responsibilities and to establish the means for coordination and collaboration across organizational boundaries. However, DOD's research organizations can only coordinate with each other when they become aware of relevant research. Without communicating to nonmedical research organizations about the importance of coordinating with the JPC-6 chair early in the research process, DOD research organizations may have to redo some steps of their research to address medical research requirements that they may not fully understand. Moreover, while DOD assesses the progress of combat casualty care research projects, it is also important that DOD monitor and assess the extent to which the results of its combat casualty care research fill the gaps in DOD's capability to provide combat casualty care and achieve other goals that it established for the research. However, without a plan for monitoring and assessment, DOD runs the risk that it may not be producing results that most effectively improve combat casualty care to save lives on the battlefield. 1. 
To ensure that nonmedical DOD research organizations coordinate with the Assistant Secretary of Defense for Health Affairs early in the research process to understand medical research requirements and avoid inefficiencies that may lead to duplicative work, we recommend that the Secretary of Defense direct the Under Secretary of Defense for Acquisition, Technology and Logistics to communicate to DOD's nonmedical research organizations the importance of coordination with the JPC-6 chair on combat casualty care issues, and require this coordination early in the research process when these organizations conduct research with implications for combat casualty care. 2. To improve DOD's ability to assess the overall performance of its combat casualty care research portfolio, we recommend that the Secretary of Defense direct the Under Secretary of Defense for Personnel and Readiness to direct the Assistant Secretary of Defense for Health Affairs to develop and implement a plan to assess the extent to which combat casualty care research and development fills gaps in DOD's capability to provide combat casualty care and achieves DOD's other goals for this portfolio of research. We provided a draft of this report to DOD, VA, and the Department of Health and Human Services (HHS), which includes FDA and NIH. In response, we received written comments from DOD and HHS, which are reprinted in appendixes I and II, respectively. VA did not comment on this report. DOD and HHS also provided technical comments that we have incorporated as appropriate. In its written comments, DOD concurred with the recommendations we made to the department and also described steps it had taken or planned to take in response to our recommendations. Specifically, DOD concurred with our first recommendation to communicate to nonmedical research organizations the importance of coordination with the JPC-6 chair and require this coordination early in the research process. 
DOD also concurred with our second recommendation to develop and implement a plan to assess the extent to which combat casualty care research addresses DOD's capability gaps and achieves its other goals. In its comments on our second recommendation, DOD stated that it planned to revise its process to better assess the extent to which each combat casualty care research project closes capability gaps. Moreover, when we sent our draft report to DOD for comment in December 2012, Health Affairs and Army MRMC had not yet finalized the JPC-6 charter. Therefore, we included a recommendation in our draft report that DOD issue the final charter. In early January 2013, after we sent the draft report to DOD, the commanding general of Army MRMC signed and issued the final JPC-6 charter. As a result, we did not include the recommendation to finalize the charter in our final report. In its written comments, HHS responded to a statement in the draft report that DOD, NIH, and VA could improve their ability to efficiently identify potentially duplicative research with improved access to each agency's electronic health research information, as noted in a 2012 GAO report. HHS stated that DOD has access, to varying degrees, to NIH and VA medical research information. Consistent with our 2012 report, HHS stated that NIH and VA need access to DOD medical research information to reduce the risk of potentially duplicative research. HHS also stated that the agencies continue to evaluate the best approach to providing NIH and VA with access to DOD's medical research information. 
We are sending copies of this report to the appropriate congressional committees; the Secretary of Defense, the Deputy Under Secretary of Defense for Personnel and Readiness; the Deputy Under Secretary of Defense for Acquisitions, Technology and Logistics; the Assistant Secretary of Defense for Health Affairs; the Secretaries of the Army, Navy, and Air Force and the Commandant of the Marine Corps; the Secretary of Health and Human Services; the Secretary of Veterans Affairs; and other interested parties. In addition, the report is available at no charge on the GAO website at http://www.gao.gov. If you or your staff have any questions about this report, please contact Linda Kohn at (202) 512-7114 or [email protected] or Brenda Farrell at (202) 512-3604 or [email protected]. Contact Points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made key contributions to this report are listed in appendix III. In addition to the contacts named above, Will Simerl, Assistant Director; Steve Boyles; La Sherri Bush; James P. Klein; Monica Perez-Nelson; Michael Pose; Mike Silver; Sarah Veale; and Cheryl Weissman made key contributions to this report. | DOD estimates that about 24 percent of servicemembers who die in combat could have survived if improved and more timely medical care could be made available. Because multiple DOD organizations conduct research to develop medical products and processes to improve combat casualty care, it is critical that these organizations coordinate their work. It is also important that agencies monitor and assess their performance to help achieve organizational goals, which for DOD include addressing gaps in its capability to provide combat casualty care. The National Defense Authorization Act for Fiscal Year 2012 directed GAO to review DOD's combat casualty care research and development programs. 
This report assesses whether DOD (1) uses a coordinated approach to plan this research; and (2) monitors and assesses this research to determine the extent to which it fills capability gaps and achieves other goals. GAO reviewed DOD's policies and documentation; interviewed officials from DOD and other federal agencies; and analyzed metrics DOD used to gauge the progress of its research. The biomedical research organizations of the Department of Defense (DOD) use a coordinated approach to plan combat casualty care research and development, but not all of DOD's nonmedical research organizations share information early in the research process. GAO has previously reported that federal agencies can enhance and sustain collaboration of efforts by using key practices, such as agreeing on roles and responsibilities and establishing the means to operate across organizational boundaries. In 2010, DOD established a planning committee to coordinate the efforts of organizations conducting combat casualty care research. The committee developed a draft charter in 2010 identifying members respective roles and responsibilities. DOD issued the final charter in early January 2013, while GAO was conducting its review. DOD also facilitated operation across organizational boundaries by colocating most of the organizations conducting combat casualty care research. However, DOD organizations that typically do not conduct biomedical research, such as the Army Research Laboratory, are not involved in DOD's efforts to coordinate this research. When these organizations conduct research relevant to combat casualty care they do not always share information with appropriate officials early in the research process, as they are not aware of the need to coordinate early and may not fully understand medical research requirements. As a result, some researchers have had to repeat some work to adhere to these requirements. 
DOD has also taken steps to coordinate with other federal agencies that are involved in this research. The Office of the Assistant Secretary of Defense for Health Affairs (Health Affairs) and the Army Medical Research and Materiel Command (MRMC) assess the progress of combat casualty care research and development projects, but they have not assessed the extent to which this research fills gaps in DOD's capability to provide this care or achieves other DOD goals. Federal internal control standards state that agencies should assess their performance to ensure they meet the agency's objectives. Health Affairs and Army MRMC--the two organizations that fund most combat casualty care research and development--monitor research projects to determine whether to continue funding, make necessary corrections, or terminate these projects. However, in 2008 DOD identified gaps in its capability to provide combat casualty care, and although Health Affairs and Army MRMC have completed 44 research projects since then designed to address these gaps, they have not assessed whether the results of this research fill the gaps identified in 2008. In addition, Health Affairs and Army MRMC established other goals for this research portfolio to improve combat casualty care. For example, in 2010, Health Affairs set goals to improve DOD's ability to control bleeding. However, neither organization has developed an assessment that comprehensively identifies each of the goals for the portfolio and includes information about the extent to which each goal has been met. Health Affairs and Army MRMC officials stated that they intend to complete a strategic roadmap for the portfolio, but GAO was unable to determine if the roadmap will include a plan for a comprehensive assessment of this portfolio. 
Without such a plan for a comprehensive assessment, these organizations cannot be sure the research they are conducting is producing results that most effectively improve combat casualty care to save lives on the battlefield. GAO recommends that DOD (1) communicate the importance of early coordination among DOD's nonmedical organizations and (2) develop and implement a plan to determine the extent to which research fills gaps and achieves other goals. DOD concurred with these recommendations. | 6,037 | 928 |
The Army established the MWO program to enhance the capabilities of its fielded weapon systems and other equipment and correct any identified operational and safety problems. Modifications vary in size and complexity. For example, for a modification to the Bradley Fighting Vehicle, the Army is adding the driver's thermal viewer to improve visibility during nighttime and all-weather conditions, the battlefield combat identification system to reduce the potential for friendly fire casualties, and the global positioning receiver and digital compass system to improve navigation. In contrast to this major modification, the Army is adding updated seat belts to its fleet of High Mobility Multipurpose Wheeled Vehicles to improve safety. The Army is making a sizable investment to modify its fielded equipment. For fiscal years 1995-97, the Army received $5.1 billion for all of its modification programs, and the President has requested $6.7 billion for 208 modifications to the Army's equipment for fiscal years 1998-2003. About 80 percent of that amount is for modifications to helicopters and other aviation items and to weapons and tracked combat vehicles. According to Army headquarters officials, as the Army's budget has declined, less funding has been available for new systems. As a result, the Army will have to rely more heavily on the modification of its assets to correct deficiencies and enhance equipment capabilities. For example, to correct identified problems and add technological advances, the Army has approved 95 MWOs for its Apache helicopter since fielding the system in 1986. Management of the MWO program is shared by several Army headquarters organizations. Each organization has a wide range of decision-making responsibilities in developing and supporting weapon systems, which includes modifying weapon systems and equipment through the MWO program.
The Army defined the roles and responsibilities of its headquarters organizations and MWO sponsors in its September 6, 1990, Interim Operating Instructions for Materiel Change Management, which superseded Army Regulation 750-10. One of the objectives cited in the instructions was to decentralize the management of each MWO while retaining overall responsibility and oversight at the headquarters level. The instructions list numerous responsibilities for Army organizations; however, Army headquarters officials emphasized the following key duties for the organizations with primary responsibilities: The Deputy Chief of Staff for Operations has responsibility for prioritizing the required modifications for technical and safety issues, justifying and monitoring the overall budget, and allocating the approved funding. The Deputy Chief of Staff for Logistics has responsibility for overall supply and maintenance support and for knowing the status of MWOs. The Acquisition Executive has responsibility for modifications to correct or enhance the operations of weapon systems still being acquired. The Army Materiel Command has responsibility for modifications to correct or enhance the operations of weapon systems that are no longer being acquired and for other equipment items. In addition, the Army Materiel Command is the executive agent for the headquarters and, as such, is responsible for knowing the status of MWOs and for ensuring that each MWO is complete and conforms with Army policy and procedures before the modification is done. Program sponsors for individual weapon systems and other equipment items are responsible for executing each MWO--acquiring the various components needed to modify the weapon systems and equipment, putting together the applicable MWO kit, ensuring logistical support items are addressed, and managing the modification process on a day-to-day basis.
The MWO program sponsors for systems still being acquired are managed under the Program Executive Office of the Army Acquisition Executive, and the program sponsors for systems no longer being acquired are managed under the commodity commands of the Army Materiel Command. In January 1997, the Army formed a process action team, including representatives from the organizations with program management responsibility, to study how the program could be improved. The Army also hired a contractor to assist in evaluating how automated information might be used to support program management. We coordinated with the process action team and have provided the team with information as our evaluation progressed. The process action team expects to provide its recommendations to the Army by October 1997. The Army does not currently maintain centralized information to track the status of equipment modifications. Instead, it relies on the individual program sponsors to capture the information they need to track the separate modifications for which they are responsible. As a result, Army headquarters and Army Materiel Command officials do not have the information they need to effectively oversee this highly decentralized modification program. Moreover, the information that Army headquarters officials and maintenance personnel have for tracking modifications may not be entirely accurate. Finally, field and depot maintenance personnel do not have ready access to the information they need to determine current equipment configurations, nor do they have ready access to the technical information they need to maintain the equipment once it is modified. Individual program sponsors decide how they will track the modifications for which they are responsible. Our review showed a variety of ways that system modifications are tracked. 
As a general rule, for high-cost systems such as M1 tanks, Bradley Fighting Vehicles, and helicopters, the command or program sponsors established databases showing which systems were modified and which were not. However, for high-density, widely dispersed systems such as M113 armored personnel carriers, trucks, and radios, program sponsors made little or no attempt to track which systems were modified. To carry out its management functions, the Army Materiel Command had previously developed an integrated database to track the status of MWO installation and funding. However, the Command stopped using the system because the Army (1) discontinued funding to maintain the portion of the system used to track MWO installation and (2) canceled the remaining portion of the system because it was not chosen as a Department of Defense (DOD) standard system to track funding. As noted, a contractor is currently studying the automated data needs of the MWO program. The potential problems created by the lack of centralized information readily available to Army officials to track modifications were highlighted in a 1994 Army Audit Agency report. The report pointed out that the Army Materiel Command needed up-to-date equipment configuration information to satisfy requirements that pertain to readiness, safety, and compliance with laws. The report also noted that without a centralized information system, the Command's current and future ability to plan for the sustainment of weapon systems was weakened. Furthermore, this could affect the Army's current and future readiness position and adversely affect troop survivability. Army headquarters and Army Materiel Command officials responsible for formulating the MWO program budget and for ensuring that upgraded and enhanced equipment is available to satisfy the Army's force structure have limited information about how MWO funds have been spent, what equipment has been modified, and what equipment still needs to be modified.
Due to the decentralized nature of the program, the Army budgets for MWOs through each program sponsor, who has discretion in spending and transferring funds. While the data available from program sponsors provide some information, Army headquarters officials told us they do not have ready access to this information and that it is insufficient to enable them to track budget expenditures. As previously stated, not all program sponsors track the status of their MWOs. While the information for tracked systems provides some degree of control over the configuration, such information is not available for all weapon systems and equipment. Moreover, headquarters officials maintain that these individual tracking systems do not have all the information they need to make informed decisions and are not readily accessible. The lack of timely information on equipment configuration could have potential adverse effects. For example, if the Army deployed a mechanized infantry division, it would need to know the latest configuration of the division's tanks, Bradley Fighting Vehicles, helicopters, and trucks for mission considerations as well as to ensure that the appropriate parts needed for maintenance were on hand. To determine the latest configuration of this equipment, Army officials would have to contact the respective systems' program sponsors to determine how many tanks, Bradleys, and helicopters of each configuration there were in the division--a time-consuming process. In addition, civilian aviation and Army ground maintenance personnel at Fort Hood, Texas, and Fort Carson, Colorado, told us that the accuracy of the databases may be suspect. For example, they said that in some instances modified parts had been removed from aircraft such as the Huey utility helicopter and nonmodified parts had been reinstalled. This occurred because either the unit did so intentionally or no modified parts were in stock when the new parts broke. 
As a result, the configuration of these aircraft and ground equipment is not always accurately portrayed in the database used by the maintenance personnel, and Army headquarters officials would not know the current configuration of the aircraft or ground equipment. Without the latest and most accurate configuration information, it is difficult to ensure that deploying units have the latest, most enhanced, and most survivable equipment. Logistics support is also complicated because planners do not know which types and quantities of spare parts are needed to support the unit. Depot maintenance personnel at the Anniston Army Depot, Alabama, told us they need current and accurate configuration data to overhaul equipment but do not have such data. To overhaul equipment, they need to know whether any modifications or components are missing. The lack of good configuration data makes it difficult to accurately estimate the costs of overhauls and to have the proper kits and repair parts on hand. Officials said that, as a result, they expend additional labor on physical inspections and make allowances in their cost estimates to cover unanticipated problems. For example, depot personnel had to visually inspect 32 National Guard trucks in the depot for overhaul because they had no way of knowing whether two authorized modifications had been made when the vehicles arrived. When this happens, the overhaul program is delayed while depot personnel order the parts or kits. However, if MWO kits were not installed at the time the modification was made to the fleet, the kits are often no longer available. Field and support organization personnel also told us they have trouble identifying what the configuration of weapon systems and equipment should be and whether modifications have been made. They told us they need to know whether the configuration of weapon systems and equipment is up-to-date and what is required on the item in order to maintain it.
They said that this problem is especially acute for items that are transferred from other units. These officials said they had sometimes spent many hours inspecting equipment to determine its current configuration because determining whether modifications had been done was not easy. For example, during our visit to Fort Carson, Colorado, a maintenance chief said that all authorized modifications on two helicopters he had received from another geographic area were supposed to have been made, but in preparing them for deployment, a visual inspection showed some modifications had not been made. According to the chief, a contractor team had to make the necessary modifications before the aircraft could be deployed. For lower-dollar equipment, such as trucks, there is no tracking information and no central list of the modifications that should have been made. According to field personnel, the only way to determine the configuration of such weapon systems or equipment is to do a physical inventory and compare the results to similar items that are already assigned to the unit. Maintenance personnel at several locations said that an information system that tracks both the completion of MWOs and any removal or transfer of major components would be useful. However, they would rather have this capability added to their existing maintenance information system than have an entirely new information system to maintain and use. We were told this tracking information will become especially critical in the future as more modifications involve software revisions. Without tracking of all MWO changes, removals or transfers of major components, and software revisions, the configuration data recorded in the information system will be inaccurate. Field and support organization personnel told us that they also need up-to-date technical information to maintain equipment.
The Army's interim guidance requires technical publications to be updated and distributed to field locations before modifications are made. However, maintenance personnel from Fort Hood, Texas, and Fort Campbell, Kentucky, told us that technical manual updates are published only on a yearly basis and that they do not receive updated technical publications in a timely manner. If the modification and resulting configuration change occur between updates, the unit may have to wait months before receiving the updated technical information. This delay not only prevents maintenance personnel from using the latest techniques to troubleshoot equipment but may also waste effort and keep supply personnel from ordering the correct repair parts. A division aviation maintenance officer at Fort Campbell cited several instances in which the lack of up-to-date technical manuals caused wasted effort or delayed the installation of a modification. For example, in July 1996, when division maintenance personnel modified the fuel subsystem on the Apache attack helicopter, they did not receive revisions to the supply parts manual. Subsequently, the aircraft was grounded, and the maintenance team wasted many hours troubleshooting because the old manual did not identify the new fuel transfer valve. This new part would have been identified in the revised manual. In another instance, they had to delay the installation of the embedded global positioning system on the Apache by 2 weeks because the Apache program office did not provide changes to the maintenance test flight and operator manuals. The Army sometimes loses portions of the enhanced equipment capabilities achieved through equipment modifications because Army units cannot always obtain spare parts for modified weapon systems and equipment. This occurs because program sponsors do not always order initial spare parts for the supply system when they procure MWO kits.
Furthermore, they do not always modify the spare parts that are at the depot and unit level to the configuration of the new component. Army officials reviewing the MWO program believe that these problems occurred because Army regulations are not clear about whether program sponsors are supposed to provide initial spare parts when they acquire the MWO kits. As a result, Army units increase their efforts to keep equipment operational and ready. In addition, program sponsors and supply system personnel do not always follow policies and procedures to ensure that supply system records are updated to show the addition of new items and the deletion of replaced items. When the supply system records are inaccurate, the Army's budget may not reflect accurate requirements for new spare parts to repair and maintain modified weapon systems and equipment. Some program sponsors have not used their limited funds to order initial spare parts for the supply system, according to Army officials responsible for the management of the MWO program. Ideally, initial spare parts would be provided to bridge the gap between the modification of equipment and the entrance of the replenishment spare parts into the Army's supply system. Providing initial spare parts at the time of modification is needed because the supply system can take 18 to 24 months or more to provide replenishment spare parts, according to aviation supply representatives. According to Army civilian aviation maintenance personnel at Fort Hood and Army aviation and ground maintenance personnel at Fort Carson and Fort Campbell, program sponsors did not always modify spare parts at unit and depot locations when equipment was modified. For example, we were told that the Apache attack helicopters were being modified with an improved fuel subsystem, but at least four major components were not available in the depot supply system. 
As a result, aviation maintenance personnel had to take parts from five MWO kits intended for other aircraft. This MWO had been ongoing for 15 months. Aviation personnel said this occurred because at least some portion of the components stored at the depot had not been modified to the new configuration. One program sponsor told us his office was not required to buy initial spare parts or modify parts located at depots when it modified equipment in the field. However, the Army's interim operating instructions require program sponsors to ensure all necessary integrated logistical support items are addressed. Furthermore, according to Army Regulation 700-18, ordering initial spare parts is part of the total integrated logistical support package for systems and end items. This regulation, which does not specifically refer to modifications, requires program sponsors to coordinate logistical support requirements with all agencies and activities concerned with initial materiel support for weapon systems and equipment. According to Army headquarters officials, both the interim guidance and the regulation require program sponsors to provide initial spare parts and to modify spare parts, but neither may be clear enough to ensure that all program sponsors do so for modifications. In addition, Army headquarters officials told us that when the Army Materiel Command used configuration control boards, composed of technical and administrative representatives, to ensure that MWOs were complete and conformed with Army policies and procedures, the need to buy spare parts was part of the approval process. The Army Materiel Command lost this quality control when the reviews were decentralized to the program sponsors. Army personnel at the four locations we visited told us that they had to take additional measures to support their equipment because they had experienced problems obtaining spare parts.
They stated that if spare parts were not available, they took components from MWO kits. For example, the only way to obtain spare parts for the new fuel control panels--part of the Apache attack helicopter fuel crossover modification--was to take them from kits that were needed to modify other Apache helicopters. In addition, they had obtained parts outside the normal supply system by fabricating parts locally and by buying parts directly from contractors with local funds. These activities have led to higher costs and reduced efficiencies at the units we visited. In reviewing 73 MWO cases, we attempted to determine whether the Army had properly phased out old spare parts and added new items to its supply system to support newly modified equipment. Because the Army does not have an automated list of the major components in MWOs, we encountered difficulties in performing this analysis and could not identify a significant number of the major components. We compared information on those major components that we could identify with the Army's budget justification report and inventory records and found many irregularities. For example, national stock numbers had not been assigned for some components; some items with national stock numbers could not be traced into the supply system; and relationship codes, which show whether old items are to be phased out of the supply system, were not always assigned. We were unable to measure the impact of these irregularities from our relatively small sample of MWOs; however, we believe that they indicate long-standing weaknesses in the Army's management of spare parts. For example, using a larger universe, we reported on similar errors in the Army's budget justification report in December 1995. In that report, we noted that the Army's budget justification report for spare parts contained numerous errors, including errors in the relationship codes and inaccurate records for items being repaired at maintenance facilities.
We reported that as a result of the errors, the Army lacks assurance that its budget requests represent its actual funding needs for spare parts. Field maintenance personnel cited numerous problems in modifying their weapon systems and equipment. For example, they stated that (1) the completion of multiple MWOs on the same piece of equipment is not always coordinated, or not all equipment is modified at the same time; (2) they do not always receive adequate notice of MWOs; and (3) modified equipment does not always work with other equipment once the modification takes place. As a result, they believe some units are losing equipment capability or experiencing reduced reportable mission time, the cost to install MWOs is increasing, and the training of unit personnel may be adversely affected. Army headquarters and Army Materiel Command officials believe these problems are also occurring because of their loss of oversight and control over the program and the inconsistent implementation of policies and procedures by program sponsors, especially in negotiating fielding plans with the affected organizations. Maintenance personnel told us that the completion of multiple MWOs on the same equipment is not always coordinated. For example, the National Guard is testing a program to place some of its equipment in long-term preservation storage. Equipment in long-term storage testing at the Camp Shelby, Mississippi, mobilization and equipment training site has been taken out of storage several times so modifications could be made. As a result, the program was disrupted, and additional labor hours were expended, according to a National Guard official. The lack of coordination could have even greater cost implications in the future because the Guard is planning to place 25 percent of its equipment in preserved storage, and if it implements recommendations we are making in another report, it will put an even larger percentage in storage.
In another example, an aviation maintenance chief told us that two labor-intensive modifications were planned for consecutive years on each of 33 Blackhawk utility aircraft belonging to two units at Fort Carson. He said that making both modifications concurrently made more sense. Because a modification causes an aircraft to be grounded, the additional downtime to install each modification consecutively would adversely affect the reportable mission time for each unit. Maintenance personnel also noted that inefficiencies had resulted when not all modifications were done at the same time. For example, when the Army upgraded the armament fire control system on M1 tanks at the Camp Shelby mobilization and training site, a contractor team installed new software cards in the fire control system, and 2 months later a team from the Anniston Army Depot made needed mechanical adjustments to the same tanks. According to Army officials, both functions could have been done at the same time, thereby reducing the time the unit was without its equipment. The direct support maintenance chiefs and general support maintenance personnel at Fort Hood and Fort Carson told us they did not always receive adequate notice of modifications. This situation disrupted their ability to meet training schedules that were set up 12 months in advance and interfered with their ability to maintain their equipment. After some modifications are done, some equipment does not work together properly, according to aviation maintenance personnel at Fort Hood. For example, although civilian aviation personnel at Fort Hood modified Blackhawk utility helicopters to work with night vision goggles, they could not obtain replacement radios compatible with the night vision goggle system from a different program sponsor, and night operational capability was lost.
Army headquarters and Army Materiel Command officials believed these problems had occurred because of their loss of oversight and control over the program and the inconsistent implementation of policies and procedures by program sponsors. The Army's Interim Operating Instructions for Materiel Change Management requires individual program sponsors to prepare a fielding plan for each modification. The fielding plan calls for coordination and adequate notice when a modification is to be done. The highly decentralized nature of the MWO program underscores the need for Army headquarters officials to have ready access to program data and information and adequate management controls to ensure that program implementation complies with policies and procedures. Even though the database they used was discontinued in part because it was not accepted as a standard DOD system, Army headquarters officials told us that the unavailability of information on the status of MWOs, the status of funding, and the configuration of weapon systems and equipment has made it difficult for managers at all levels to effectively carry out their respective responsibilities and make informed decisions on such things as funding, deployment, and logistical support of weapon systems and equipment. The program sponsors have been inconsistent in providing initial spare parts, ensuring that spare parts are added to the supply system, and keeping technical information updated for the field maintainers. Furthermore, program sponsors have not always adequately coordinated the completion of MWOs with other sponsors and with the field maintainers. The Army guidance on these processes is not clear, and the headquarters' ability to ensure that existing policies and procedures were complied with was diminished when the responsibilities of configuration control boards were transferred to program sponsors. 
As a result, field maintainers have experienced difficulty in obtaining spare parts and current technical information and have experienced inefficiencies in getting their weapon systems and equipment modified. Program sponsors have varying amounts of information on their MWOs, ranging from none to fairly complete, and do not have ready access to the information needed to coordinate with other program sponsors. Those program sponsors without a database are limited in managing their own programs. Field maintainers do not have easy access to information on MWOs that should have been installed or are scheduled for future installation. At the unit level, the lack of information has manifested itself in various inefficiencies related to the coordination and scheduling of the installation of MWOs and has sometimes prevented units from knowing the configuration of their equipment. It is important that these modifications be done as efficiently as possible to minimize the time the equipment is unavailable to units. The Army's creation of a process action team to develop revised policies and procedures and its hiring of a contractor to examine automated information needs are steps toward correcting the weaknesses noted in this report. Improved management of this program would provide more assurance that improved capabilities are effectively, efficiently, and expeditiously integrated into the Army's equipment.
In considering the upcoming results of the MWO process action team, we recommend that the Secretary of the Army direct actions necessary to provide managers at all levels ready access to the information they need to oversee, manage, and implement the MWO program and to ensure compliance with Army policies and procedures; clarify regulations to ensure that program sponsors and supply system personnel provide proper logistical support for modified equipment, including (1) ordering appropriate initial spare parts when MWO kits are ordered, (2) updating technical information and providing it to units when MWO kits are installed, and (3) properly phasing out old spare parts and adding new items to the supply system; and establish an effective mechanism for program sponsors to coordinate and schedule their MWOs, among themselves and with their customers, to reduce the amount of manpower and to minimize the equipment downtime required to complete the MWOs. In written comments on a draft of this report, DOD concurred with our findings and recommendations (see app. I), acknowledging that improvements to the weapon system and equipment modification program were needed. Regarding our first recommendation, DOD agreed that managers at all levels need ready access to information to oversee, manage, and ensure compliance with Army policies and procedures. It noted that the process action team is developing a recommendation for an MWO integrated management information system that would obtain information from already established databases. DOD believes that such a system would provide a cost-efficient, nonlabor-intensive management tool to assist managers in tracking all facets of MWOs. Approval of a proposal for a new study effort to design and develop this system is pending.
DOD also agreed with our recommendation that the Secretary of the Army clarify regulations to ensure that program sponsors and supply system personnel provide proper logistical support for modified equipment. DOD stated that Army Regulation 750-10 is being totally revised to clearly define roles and responsibilities, thereby making it a joint acquisition and logistics regulation that can be used by both communities. The revised regulation will adopt a modified materiel release process that would address the logistical support issues raised in our recommendation as well as other areas of concern identified by the process action team. Finally, DOD agreed with our recommendation that the Secretary of the Army establish an effective mechanism for program sponsors to coordinate and schedule their MWOs, among themselves and with their customers. DOD stated that the revised Army Regulation 750-10 will address the issue of coordination between program sponsors and ensure that MWOs are completed at all units at one location at the same time where possible. We believe that these actions, if properly implemented, will help to further improve the effectiveness and efficiency of this program. We interviewed officials and reviewed program records at the Army Materiel Command, Alexandria, Virginia; the Army Aviation and Troop Command, St. Louis, Missouri; and the Army Tank-Automotive and Armament Command, Warren, Michigan, to identify how the MWO program works and to identify any problems. We also interviewed officials and reviewed records at the U.S. Army Materiel Command; the Assistant Secretary of the Army for Research, Development and Acquisition; the Deputy Chief of Staff for Logistics; and the Deputy Chief of Staff for Operations at Army headquarters to determine their role in the modification program and what information they need to manage funding, resource allocations, deployment decisions, and supportability.
We also interviewed Directorate of Logistics personnel and general and direct support personnel, reviewed records, and made on-site observations at Fort Hood, Texas; Fort Campbell, Kentucky; and Fort Carson, Colorado, to determine whether they were having any difficulties with the completion, scheduling, or supply support obtained for MWOs. In addition, we interviewed civilian and contractor personnel that provided regional aviation maintenance support at Fort Hood and Fort Campbell and reviewed records to determine whether they were experiencing similar problems. Furthermore, we interviewed officials at Anniston Army Depot, Alabama, and Camp Shelby, Mississippi, to determine how the MWO programs affect maintenance and overhaul programs. To evaluate how well the Army integrates its MWO program with the supply support system, we judgmentally selected 73 recent MWOs for aviation systems; weapons and tracked combat vehicle systems; and small arms. The Army does not have a complete list of MWOs, MWO kits, or the major components in the kits. It has automated data only on MWOs for high-dollar weapon systems. For the MWOs selected, we attempted to manually identify the major components in the kits, enter them into a database, and compare them to the Army's automated inventory (April-June 1997 master data record) and budget justification (Sept. 1996 budget stratification report) records. We were not able to quantify the problems with the supply system identified in this report because (1) we could not identify a significantly large universe of new replacement items and match them with the related item being phased out of the system and (2) for the items identified, we could not consistently trace them into the automated inventory and budget justification records. 
Furthermore, we could not determine the extent of some of the problems identified through our field visits because some of the newer MWOs in our sample have not been operational long enough for their parts to fail. We have used the automated budget justification records and automated inventory databases in prior evaluations and reported that they contain significant errors regarding the relationship codes between secondary inventory items being added to the system and the replaced items. These databases are, however, the only available information on inventory and budget justifications for Army secondary items. We performed our review between January 1996 and August 1997 in accordance with generally accepted government auditing standards. We are sending copies of this report to the Secretaries of Defense and the Army; the Director, Office of Management and Budget; and other interested parties. Please contact me on (202) 512-5140 if you have any questions concerning this report. Major contributors to this report are listed in appendix II. Gary Billen Mark Amo Leonard Hill Robert Sommer Robert Spence The first copy of each GAO report and testimony is free. Additional copies are $2 each. Orders should be sent to the following address, accompanied by a check or money order made out to the Superintendent of Documents, when necessary. VISA and MasterCard credit cards are accepted, also. Orders for 100 or more copies to be mailed to a single address are discounted 25 percent. U.S. General Accounting Office P.O. Box 37050 Washington, DC 20013 Room 1100 700 4th St. NW (corner of 4th and G Sts. NW) U.S. General Accounting Office Washington, DC Orders may also be placed by calling (202) 512-6000 or by using fax number (202) 512-6061, or TDD (202) 512-2537. Each day, GAO issues a list of newly available reports and testimony. To receive facsimile copies of the daily list or any list from the past 30 days, please call (202) 512-6000 using a touchtone phone. 
A recorded menu will provide information on how to obtain these lists. | Pursuant to a congressional request, GAO provided information on the Army's management of its modification work order (MWO) program, focusing on: (1) the availability of information needed by Army headquarters and field personnel to effectively oversee and manage the MWO program; (2) the availability of spare parts needed by personnel in the field to maintain modified equipment; and (3) field personnel's experiences in implementing the MWO program. GAO noted that: (1) Army headquarters officials and Army Materiel Command officials no longer have the information they need to effectively oversee and manage the MWO program; (2) this occurred because the centralized database to track installation and funding was discontinued; control over modification installation funding was transferred from the headquarters level to individual program sponsors; and the authority over configuration control boards, which ensured the completeness and compliance of MWOs with policy, was transferred to individual program sponsors; (3) as a result, Army officials do not have an adequate overview of the status of equipment modifications across the force, funding requirements, logistical support requirements, and information needed for deployment decisions; (4) the lack of information is also a problem at field units; (5) maintenance personnel have not always known which modifications should have been made to equipment or which modifications have actually been made; (6) in addition, maintainers of equipment have not always received the technical information they need in a timely manner to properly maintain modified equipment; (7) maintenance personnel in the field have had difficulty obtaining spare parts to maintain modified equipment because program sponsors frequently had not ordered initial spare parts when they acquired modification kits; (8) Army officials believe these problems occurred because they lost 
oversight and control of the program and policies and procedures were not being consistently applied by the individual program sponsors; (9) because spare parts have often not been available, maintenance personnel have made additional efforts to maintain modified equipment; (10) supply system personnel have not always followed policies and procedures to ensure that supply system records were updated to show the addition of new spare parts and the deletion of replaced spare parts; (11) as a result, the Army's budget for spare parts may not reflect accurate requirements for new components to repair and maintain modified weapon systems and equipment; (12) maintenance personnel in the field have also experienced a variety of problems in implementing MWOs; (13) maintainers have not always received adequate notice of pending modifications, and training schedules and equipment maintenance have been adversely affected; (14) GAO was told that various items of equipment did not always work together once some modifications were made; and (15) according to Army officials, these problems also occurred because of their loss of oversight and control. | 6,599 | 541 |
Multimission stations, formerly referred to as small boat stations, are involved in all Coast Guard missions, including search and rescue, recreational and commercial fishing vessel safety, marine environmental response, and law enforcement activities such as drug and migrant interdiction. Search and rescue has traditionally been the stations' top priority. However, after the terrorist attacks of September 11, 2001, the Coast Guard elevated the maritime homeland security mission to a level commensurate with the search and rescue mission. Congress's action to provide the Coast Guard with an additional $15.7 million for these stations in fiscal year 2003 was part of a longer-standing effort to address readiness concerns. In 2001, Congress directed the Department of Transportation's Office of Inspector General (OIG) to conduct a thorough review of the operational readiness capability of stations, following a series of accidents involving search and rescue efforts initiated at these stations. The OIG reported that readiness levels at stations had been deteriorating for more than 20 years and were continuing to decline. In response, Congress provided an earmarked appropriation in fiscal year 2002 and directed the Inspector General to review the use of the earmarked funds. The OIG found that the Coast Guard generally complied with the intent of the earmark but also concluded that improving operational readiness at stations would require a substantial and sustained investment. The OIG also recommended that, to improve congressional oversight of expenditures, the Coast Guard improve its accounting system to allow tracking of certain station expenditures.
Since the additional funding efforts began, in fiscal year 2002, Coast Guard officials told us they have, among other actions, added approximately 1,100 personnel to stations, increased levels of personal protection equipment for station personnel, and started to replace old and nonstandard boats with new standard boats. In December 2002 the Coast Guard also developed, in response to a recommendation from the OIG in its 2001 report and at the direction of the Senate Appropriations Committee, a draft strategic plan to guide the recruiting and hiring of personnel. In its 2002 report, the OIG criticized the plan for being too general, specifically regarding how and when the Coast Guard would increase staffing, training, equipment, and experience levels at stations. Because the Coast Guard's automated databases are not set up to fully identify expenditure data at the station level, we were unable to fully determine expenditures for all four categories. However, through a combination of data runs and unit surveys performed at our request, the Coast Guard was able to estimate staffing and personnel retention expenditures, and develop actual expenditure data for personal protection equipment (PPE). Within these three categories, the Coast Guard estimates it spent at least $291 million in fiscal year 2003. The information available by category was as follows: Staffing: The Coast Guard incurred estimated costs of $277.6 million for 5,474 active duty personnel assigned to stations during fiscal year 2003. This figure does not include costs for the 1,657 reserve personnel assigned to stations, or an unknown number of auxiliary personnel. PPE: Reported expenditures for this category totaled $7.5 million. Personnel retention: Expenditure data for all aspects of this category are not available.
However, in one specific category--reenlistment bonuses--the Coast Guard expended $5.9 million for bonuses to boatswain's mates and machinists assigned to stations. Training: Coast Guard officials attempted to identify estimated costs of training station personnel at national training centers during fiscal year 2003 but could not provide reliable data for this category. Officials told us the Coast Guard has separate databases that track costs incurred by the national training centers but does not have a database that can identify training costs expended on personnel after they have been assigned to stations. Further, expenditures incurred by stations in providing on-the-job training (a significant component of total training provided to station personnel) were not available because the Coast Guard, like many agencies, does not track time spent on this type of training. Using fiscal year 2002 data derived through similar analyses, we determined that estimated station expenditures for fiscal year 2003 exceeded fiscal year 2002 levels by at least $20.5 million--or $4.8 million more than the $15.7 million earmarked appropriation. Table 1 shows the differences in estimated expenditures (levels of effort) by fiscal year for the three categories that had available data. Only partial data were available on personnel retention, and no data were available on training expenditures. Although expenditure data for all personnel retention efforts were not available, the Coast Guard was able to provide annual expenditure data for reenlistment bonuses offered to selected multimission station personnel. Other information we gathered in discussions with Coast Guard personnel indicates that the Coast Guard's levels of effort in station training also increased during fiscal year 2003.
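The dollar figures above reconcile with simple arithmetic. The following minimal sketch (values in millions of dollars, taken from this report) reproduces the comparison; the reenlistment-bonus increase shown is the implied residual of the stated totals, not a figure the report gives directly:

```python
# Figures ($ millions) from the report's fiscal year 2002-2003 comparison.
earmark = 15.7            # FY2003 earmarked appropriation
total_increase = 20.5     # estimated FY2003 spending above FY2002 levels

staffing_increase = 14.4  # estimated staffing cost increase (app. II)
ppe_increase = 5.0        # PPE spending vs. FY2002 planned allocation (app. II)

# Residual implied for the third category with (partial) data,
# reenlistment bonuses -- an inference, not a reported figure.
implied_bonus_increase = round(total_increase - staffing_increase - ppe_increase, 1)

print(f"Implied reenlistment-bonus increase: ${implied_bonus_increase}M")
print(f"Amount above the earmark: ${total_increase - earmark:.1f}M")  # $4.8M
```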
In fiscal year 2003, the Coast Guard increased the number of instructors and classrooms at two national training centers, which provide training to station and other personnel, in order to increase the number of total students graduated. Appendix I describes our methodology for developing these estimates, and appendix II contains a more detailed description of the data in each category. Because complete comparative data could not be identified for all four categories, we cannot say with certainty that Coast Guard expenditures for multimission stations in fiscal year 2003 were at least $15.7 million above fiscal year 2002 levels. However, we believe this is a reasonable conclusion based on the following: Although the staffing data provided to us are based on budget cost formulas, we determined that the data are sufficiently reliable for the purpose of demonstrating increases in staffing levels between the two years. Discussions with station officials indicate that station personnel have sufficient levels of PPE. In its fiscal year 2002 audit, the OIG reported that the Coast Guard did not provide PPE for 69 percent of the personnel added to stations during fiscal year 2002. Our visits to a limited number of stations--8 out of 188 stations--and discussions with station personnel indicated that all active and reserve personnel assigned to these stations--even newly assigned personnel--had received what they considered to be an appropriate level of PPE (basic and cold weather). Although available quantitative data were limited for this category, over the past few years the Coast Guard has implemented a variety of financial incentives aimed at improving personnel retention. Training officers at the 8 stations we visited indicated that training for station personnel did not decrease in fiscal year 2003 compared with the prior year.
In addition, in fiscal year 2003 the Coast Guard increased training resources in two areas--the boatswain's mate training school increased its training output by over a third, and unit training provided by headquarters to station personnel also increased. The Coast Guard did not have adequate processes in place to sufficiently account for the expenditure of the entire $15.7 million earmarked fiscal year 2003 appropriation or to provide assurance that these earmarked funds were used appropriately, as set forth by federal management and internal control guidelines. The purpose of an earmark is to direct an agency to spend a certain amount of its appropriated funds for a specific purpose. Federal guidelines and government internal control standards indicate that agencies should account for the obligation and expenditure of earmarked appropriations both as a sound accounting practice and to demonstrate compliance in the event of an audit. The expectation that agencies will be able to effectively demonstrate compliance in their use of earmarked funds stems from the following: Office of Management and Budget Circulars: These circulars hold that agencies' management controls should reasonably ensure that laws and regulations are followed. The Federal Managers' Financial Integrity Act: This act establishes specific requirements regarding management controls and directs agency heads to establish controls to reasonably ensure that obligations and costs comply with applicable laws. Standards for Internal Control in the Federal Government: These standards specify that internal controls should provide reasonable assurance that an agency is in compliance with applicable laws and regulations. They also direct that internal controls and transactions should be clearly documented and the documentation should be readily available for examination. 
Further, the Department of Homeland Security (DHS), the parent agency for the Coast Guard, recently issued budget execution guidance that encourages component agencies to identify the obligation and expenditure of earmarked funds separately from other appropriated funds. (This guidance was issued in fiscal year 2004 after the Coast Guard had obligated the fiscal year 2003 earmark.) In response to a recommendation made in our recent report on the reprogramming of Federal Air Marshal Service funds, DHS has agreed to make this a requirement. The Coast Guard told us at the outset of our review that it did not have adequate processes in place to collect data with respect to earmarked expenditures. Although officials had taken steps to account for PPE expenditures (because purchase receipts could be easily tracked), they did not have adequate processes in place to account for earmarked funds spent on staffing and training needs at the station level. Consequently, the Coast Guard could not demonstrate conclusively that it was complying with the earmark. The Coast Guard's databases were not designed for this purpose and would have to be modified to provide actual expenditure data for stations, according to Coast Guard officials. On the basis of lessons learned from the OIG's audit in fiscal year 2002, which faulted the Coast Guard for not having cost accounting systems in place to allow for the tracking of certain multimission station expenditures, Coast Guard officials developed a plan to show how various allocations would add up to $15.7 million if expended. The plan, although useful as an indicator of the Coast Guard's intentions, is not sufficient to show that the Coast Guard had expended the earmarked appropriation as directed.
Coast Guard officials also told us that, in response to the OIG's 2002 recommendation to allow for the tracking of certain station expenditures, they are assisting DHS in developing a new enterprise-wide financial system called "electronically Managing enterprise resources for government effectiveness and efficiency" (eMerge). As part of the overall system requirements, the Coast Guard expects that eMerge will provide the capability to track station-level expenditures; however, the Coast Guard was unable to provide us with system specifications prior to the issuance of this report. On the basis of available data and other information, the Coast Guard appears to have met the Congress's requirement to spend at least $15.7 million more on multimission stations in fiscal year 2003 than in fiscal year 2002. However, the Coast Guard does not have adequate processes in place to track actual expenditures related to earmarks. Rather, agency officials could provide only estimates for much of the station expenditures. Without the ability to accurately and completely account for these expenditures, the Coast Guard cannot provide assurance that it complied with the earmark. Moreover, Congress's ability to hold the Coast Guard accountable for future earmarks is seriously diminished. In light of our recent recommendation to DHS on the need to track earmarks--and its subsequent concurrence--we believe the Coast Guard should take immediate steps to ensure that future accounting systems include the capability to track earmarks. To improve the Coast Guard's ability to respond to congressional oversight and to provide greater assurance that earmarked funds are used appropriately, we recommend that the Secretary of Homeland Security direct the Commandant of the Coast Guard to develop, in accordance with the fiscal year 2004 departmental guidelines, processes to accurately and completely account for the obligation and expenditure of earmarked appropriations. We requested comments on a draft of this report from the Secretary of Homeland Security or his designee.
On May 14, 2004, Coast Guard officials, including the Chief, Office of Budget and Programs, provided us with oral comments, with which the DHS GAO Liaison concurred. Coast Guard officials generally agreed with the facts and our recommendation to better track earmarked expenditures. We did not review the Coast Guard's financial databases to determine if modifications to them would be necessary to better track earmarked expenditures (obligations). Coast Guard officials, however, expressed concern that developing better procedures to track some station expenditures (obligations), such as those for staffing or training, would prove challenging and could be costly due to the need to significantly modify their financial systems. Officials stated that accounts are centrally managed and specific expenditures would not be easily tracked at the station level. The Coast Guard officials said they plan to explore this issue more thoroughly and to examine how organizations with comparable activities have overcome similar obstacles to tracking earmarked funds. The officials also provided a number of technical clarifications, which we incorporated where appropriate. We will send copies of this report to interested congressional committees and subcommittees. We will also make copies available to others on request. In addition, the report will be available at no charge on GAO's Web site at http://www.gao.gov. If you or your staffs have any questions about this report or wish to discuss the matter further, please contact me at (415) 904-2200 or Randall B. Williamson at (206) 287-4860. Additional contacts and key contributors to this report are listed in appendix III.
We used a variety of approaches in our work to determine the amount of the general appropriation the Coast Guard expended on multimission stations in fiscal year 2003 across the four areas covered by the earmark--staffing, personal protection equipment (PPE), personnel retention, and training--and whether this amount exceeded by $15.7 million the level of effort expended in fiscal year 2002. Because Congress directed that we review the amount of general appropriations expended on station readiness needs, we did not review expenditures of funds received through supplemental appropriations. We determined at the outset of our work that Coast Guard databases did not contain information that would allow us to fully report on station expenditures for the four earmark categories. To identify available information and possible limitations of the information, we worked extensively with Coast Guard headquarters officials from the Offices of Budget and Programs; Financial Analysis; Boat Forces; Resource Management; Workforce Management; Personnel Command; and Workforce Performance, Training and Development. We also obtained documentation from headquarters, stations, groups, and districts. After reviewing the reliability of available data and the feasibility of Coast Guard officials' proposals for gathering additional data, we agreed on a combination of expenditure and allocation data, which would be collected through special data runs, analyses, and unit surveys. Coast Guard officials provided data for three of the four categories. Although officials attempted to develop information on training costs, they were not able to produce reliable data. Some of the information we needed was obtained not at headquarters but at specific Coast Guard sites, which we judgmentally selected according to size, location, and type.
The specific data and analyses used to develop estimates for each of the four categories were as follows: Staffing: To determine the number and cost of personnel assigned to multimission stations, we requested Coast Guard personnel expenditure data for fiscal years 2002 and 2003, but we were told that expenditure data were not available at the station level. To develop estimated staffing costs, Coast Guard officials merged information from personnel and position databases to identify the number of personnel assigned to stations and then applied a personnel cost formula to arrive at total estimated costs. Developing estimates was complicated because the fiscal year 2002 data were developed from a different database than the fiscal year 2003 data, and because the Coast Guard has more personnel assigned to stations than actual authorized (or funded) positions, a variance that requires periodic adjustment of the databases. However, after discussing these factors at length with Coast Guard officials, we determined that the data developed by the Coast Guard were sufficiently reliable for the purpose of providing estimates of expenditures for fiscal years 2002 and 2003. The following Coast Guard offices contributed to the methodology and process for developing the data: Budget and Programs, Resource Management, Workforce Management, and Personnel Command. PPE: To obtain fiscal year 2003 expenditure data for this category, we asked the Coast Guard to survey all 188 stations and their oversight units. Each station and unit was asked to provide the total amount of fiscal year 2003 funds spent on PPE for personnel assigned to the station during the year. These totals included expenditures made for station personnel at the group and district levels as well. To verify the accuracy of these data, we reviewed original expenditure documentation for a judgmentally selected sample of 29 stations.
On the basis of this documentation, we independently quantified PPE expenditures for each station. Our count of total PPE purchases at the 29 stations was 9 percent higher than the total provided by the Coast Guard (our count was 4 percent less than the Coast Guard's after removing expenditures for one outlier station). Coast Guard officials attributed the difference to errors made by station personnel when compiling the expenditure data. As a result of these differences, however, we refer to the total expenditure for fiscal year 2003 as an estimate. Because Coast Guard officials considered gathering expenditure data for fiscal year 2002 too labor intensive for station personnel, given their current workloads, we used the Coast Guard's data on planned PPE expenditures for fiscal year 2002. After reviewing possible limitations in the PPE data provided, we determined that the data provided were sufficiently reliable for the purpose of providing estimates of expenditures. The PPE planning data were provided to us by the Offices of Boat Forces and Budget and Programs. Personnel retention: We were not able to determine total retention expenditures because the Coast Guard does not specifically track these costs, and retention efforts encompass a diverse array of direct and indirect activities. We were able to identify certain direct activities--selective reenlistment bonus expenditures for multimission stations and various financial incentives available to Coast Guard personnel--and some indirect incentives. After reviewing how data provided by the Personnel Services Center on selective reenlistment bonus expenditures were collected and maintained, we determined that the data were sufficiently reliable for the purposes of this report. The personnel retention expenditure data were provided to us by the Office of Budget and Programs.
Training: The Coast Guard was unable to provide actual or estimated expenditure data for training multimission station personnel in fiscal years 2002 and 2003. Officials from the Office of Budget and Programs and the Office of Workforce Performance, Training, and Development told us at the outset of our review that they would not be able to identify total training costs because the Coast Guard does not track the amount of time station personnel devote to on-the-job training (which accounts for a significant amount of total training). Headquarters officials attempted to obtain data on the estimated annual costs for training station staff at the Coast Guard's national training centers by cross-referencing data from multiple databases and applying a cost formula. However, Coast Guard officials identified a number of serious anomalies in the data and concluded the data were too unreliable to be used. To determine whether the Coast Guard had adequate processes in place to account for the expenditure of the $15.7 million earmarked appropriation, we interviewed and obtained documentation from stations, groups, and districts. We also interviewed and obtained documentation from officials in the following headquarters offices: Boat Forces, Budget and Programs, and Financial Analysis. Further, we studied the Coast Guard's funding plan, which showed how the earmark was intended to be spent. We also reviewed federal management guidelines and government internal control standards to identify earmark accountability requirements that apply to agencies. The $15.7 million earmark presented to the Coast Guard in its fiscal year 2003 appropriation called for funds to be spent across four categories of multimission station needs--staffing, PPE, personnel retention, and training. 
In determining the amount of funds spent by the Coast Guard in 2003 on station needs and whether this amount exceeded the fiscal year 2002 level of effort by $15.7 million, we developed cost information for three of the four categories. Coast Guard officials attempted but were unable to develop reliable data on the cost of training station personnel during fiscal years 2002 and 2003. This appendix has two main sections. The first presents additional information about estimated station expenditures in the areas of staffing, PPE, and personnel retention in fiscal year 2003, and the second contains additional information about the changes that occurred between fiscal years 2002 and 2003. Using a combination of estimated and actual expenditure data, we determined that estimated fiscal year 2003 costs for staffing, PPE, and personnel retention efforts at stations amounted to at least $291 million. The Coast Guard could not provide us with the actual amount of fiscal year 2003 appropriation funds spent on station staffing because the agency's automated databases do not fully identify personnel expenditures at the station level. However, using a combination of budget and personnel data, officials were able to estimate that in fiscal year 2003 the Coast Guard incurred costs of $277.6 million to support 5,474 active duty station personnel. This estimate does not include costs for the 1,657 reserve personnel assigned to stations in fiscal year 2003, nor does it include the costs of volunteer auxiliary personnel who assisted in station operations during the year. The Coast Guard did not calculate estimated expenditures for reservists because of the complex and labor-intensive nature of the analysis. Coast Guard officials determined that the agency spent approximately $7.5 million in fiscal year 2003 on PPE for station personnel. As shown in table 2, the cost of a complete basic PPE outfit in fiscal year 2003 was $1,296.
The cost of a cold weather PPE outfit, which is used by personnel working at stations where the outdoor temperature falls below 50 degrees Fahrenheit, was $1,431. (Figure 1 shows a station crew member in cold weather PPE.) A May 2002 Coast Guard Commandant directive emphasized the importance of proper supplies and use of PPE as one of the top priorities of Coast Guard management. In this directive, the Commandant cited an internal research report that attributed 20 percent of the total risk facing boat personnel to exposure to extreme weather conditions. The directive also states that the use of appropriately maintained PPE could improve the Coast Guard's operational capability. The Coast Guard provided data demonstrating how it promotes personnel retention through a variety of direct and indirect incentives. Direct incentives include financial benefits paid directly to the individual, while indirect incentives include projects, such as facility improvements, that may indirectly contribute to retention by increasing staff morale. Coast Guard officials provided expenditure data for selected direct incentives provided to station personnel in fiscal year 2003 because officials could not quantify the total amount of funds expended on direct incentives. Likewise, the total amount expended on indirect incentives cannot be readily identified because of the numerous and varied nature of the efforts. The Coast Guard's direct financial incentives include selective reenlistment bonuses. During fiscal year 2003, the Coast Guard spent $5.9 million on 312 selective reenlistment bonuses for station personnel--$4.2 million of this went to boatswain's mates while the remaining $1.7 million went to machinery technicians. A variety of other financial benefit improvements were also recently implemented: Between fiscal year 2003 and fiscal year 2004 the Coast Guard increased the surfman pay premium by 33 percent.
Since fiscal year 2000 the average portion of housing costs paid by personnel has decreased annually, going from 18.3 percent in fiscal year 2000 to 3.5 percent in fiscal year 2004; in fiscal year 2005 this expense will be reduced to zero. Since fiscal year 2002 enlisted personnel have been entitled to a basic allowance for food; before fiscal year 2002 they received no funds for food purchased outside of a Coast Guard galley (kitchen). Since fiscal year 2002 first-term enlisted personnel have received a "dislocation allowance" that provides funds for rental deposits and other incidentals that may occur when personnel are required to move. Since fiscal year 2003 junior personnel have been able to ship greater weights of household goods when transferring between stations. During fiscal year 2004 the death gratuity issued to assist survivors of deceased Coast Guard active duty personnel doubled.

Multiple indirect Coast Guard efforts also serve as personnel retention tools by improving staff morale. At our request, Coast Guard officials asked 29 (15 percent) of the 188 multimission stations to provide data on estimated expenditures incurred for projects that indirectly contributed to staff retention. For the 24 stations that responded, infrastructure and lifestyle improvements totaled over $350,000 in fiscal year 2003. Improvements cited by multimission stations include such items as new furniture, sports equipment, televisions, satellite TV service, and entertainment systems. According to a Coast Guard official, the source of funds for these improvements can be station, group, or district operating budgets or donations by Coast Guard support groups. Table 3 shows examples of some of the projects cited by the 24 survey respondents.
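Purely as an illustration of how the component figures quoted in this appendix roll up, the following sketch sums the three quantified fiscal year 2003 cost categories. The dictionary labels are descriptive, not Coast Guard account names, and the reported total ("at least $291 million") additionally excludes reservist, auxiliary, training, and indirect-incentive costs that could not be quantified.

```python
# Illustrative arithmetic only: component estimates quoted in this
# appendix, in millions of dollars.
fy2003_station_costs = {
    "staffing (5,474 active duty personnel)": 277.6,
    "personal protection equipment (PPE)": 7.5,
    "selective reenlistment bonuses": 5.9,
}

total = sum(fy2003_station_costs.values())
print(f"Estimated FY 2003 station costs: at least ${total:.1f} million")
# prints "Estimated FY 2003 station costs: at least $291.0 million"
```

Note that the $5.9 million in reenlistment bonuses is only the quantified portion of retention spending; indirect retention improvements (over $350,000 at the 24 responding stations) are not included in the total.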
While we could not determine with certainty the difference in estimated expenditures (levels of effort) on stations between fiscal years 2002 and 2003 because of financial system limitations, the information available suggests that the difference amounted to at least $20.5 million. The following discusses estimated differences in fiscal year 2002 and 2003 staffing, PPE, and personnel retention costs for multimission stations.

As shown in table 4, the Coast Guard increased staffing at multimission stations by an estimated 466 personnel (9.3 percent) in fiscal year 2003. The estimated cost of this staffing increase was $14.4 million above the level of effort expended for staffing in fiscal year 2002.

The Coast Guard estimates that it spent approximately $5 million more for PPE in fiscal year 2003 than it had planned to spend during fiscal year 2002. We used fiscal year 2002 planned allocation data for this expenditure comparison because Coast Guard officials considered a survey of stations to collect fiscal year 2002 expenditure data--similar to the survey conducted for the fiscal year 2003 expenditure data--too burdensome for station personnel, given their current workload. Coast Guard officials told us that historically the amount of funds allocated for station PPE at the beginning of a fiscal year is not enough to fund PPE for all station personnel estimated to need it during the year. The Coast Guard's method for allocating PPE funds to stations uses the number of positions authorized to stations as a primary factor in determining the amount of funds allocated to individual stations. Because Coast Guard stations have more personnel assigned to them than authorized positions, in the past personnel not assigned to an authorized position were typically not included in PPE allocation calculations. To address this shortfall, the Coast Guard initially planned to allocate $3 million of the earmarked funds in fiscal year 2003.
During fiscal year 2003 the Coast Guard added another $2.6 million of the earmarked funds, bringing the total to $5.6 million.

Reenlistment bonuses issued to boatswain's mates and machinery technicians assigned to stations increased by $1.1 million from fiscal year 2002 to fiscal year 2003. During fiscal year 2002, the Coast Guard issued $4.8 million in bonuses to the two classes of station personnel; the amount issued in fiscal year 2003 rose to $5.9 million. Expenditures for other, more indirect forms of retention activities, such as station infrastructure improvements, are not tracked annually and therefore are not available for comparative purposes.

The Coast Guard was not able to identify training costs for multimission station personnel for fiscal year 2002 or fiscal year 2003 despite extensive efforts. Officials told us the Coast Guard has separate databases in place to track training costs by national training center, but it does not have a database that identifies costs for station personnel. The Coast Guard conducted several queries from available databases, but the resulting data were not accurate. The lack of available training cost data precluded us from making a comparison of annual expenditure data in this area. However, some information indicates that levels of effort expended on training station personnel increased in fiscal year 2003. For example, the Coast Guard's boatswain's mate training school increased its training output by over a third in fiscal year 2003.

In addition to those named above, Cathleen A. Berrick, Barbara A. Guffy, Dorian R. Dunbar, Ben Atwater, Joel Aldape, Marisela Perez, Stan G. Stenersen, Michele C. Fejfar, Casey L. Keplinger, Denise M. Fantone, and Shirley A. Jones made key contributions to this report.

The Coast Guard conducts homeland security and search and rescue operations from nearly 200 shoreside stations along the nation's coasts and waterways.
After several rescue mishaps that resulted in the deaths of civilians and station personnel, Congress recognized a need to improve performance at stations and appropriated additional funds to increase stations' readiness levels. For fiscal year 2003, the Coast Guard received designated funds of $15.7 million specifically to increase spending for stations' staffing, personal protection equipment (such as life vests and cold weather protection suits), personnel retention, and training needs. Congress directed GAO to determine if the Coast Guard's fiscal year 2003 outlays for stations increased by this amount over fiscal year 2002 expenditure levels. GAO also assessed the adequacy of the processes used by the Coast Guard to account for the expenditure of designated funds. According to our analyses of available data, and anecdotal and other information, it appears that the Coast Guard spent at least $15.7 million more to improve readiness at its multimission stations in fiscal year 2003 than it did the previous year. However, this statement cannot be made with certainty, because the Coast Guard's databases do not fully identify expenditures at the station level. GAO worked with the Coast Guard to develop expenditure estimates for the stations, using budget plans and available expenditure data, and this effort produced full or partial estimates for three of the four categories--staffing, personal protection equipment, and personnel retention efforts. For these three categories, fiscal year 2003 expenditure estimates were at least $20.5 million more than the previous year, or about $4.8 million more than the $15.7 million designated appropriation. Although estimates could not be developed for training expenditures, other available information indicates that training levels increased in fiscal year 2003. Taken together, these results suggest that the Coast Guard complied with Congress' direction to increase spending for stations by $15.7 million. 
Federal management guidelines and internal control standards call for greater accountability for designated--earmarked--appropriations than was provided by the processes the Coast Guard had in place to track these funds. The purpose of an earmark is to ensure agencies spend a certain amount of their appropriated funds for a specific purpose. Guidelines and standards indicate that agencies should account for the obligation and expenditure of earmarked appropriations--a step the Coast Guard thoroughly implemented only for personal protection equipment. Coast Guard officials developed a plan showing how they planned to spend the earmark, but such a plan, while useful as an indication of an agency's intentions, is not sufficient to show that the earmark was expended in accordance with congressional direction.
Military test and training ranges are used primarily to test weapon systems and to train military forces. Test ranges are used to evaluate warfighting systems and functions in a natural environment and under simulated operational conditions. Training ranges include air ranges for air-to-air, air-to-ground, drop zone, and electronic combat training; live-fire ranges for artillery, armor, small arms, and munitions training; ground maneuver ranges to conduct realistic force-on-force and live-fire training; and sea ranges to conduct ship or submarine maneuvers. In February 2014, DOD reported to Congress that it had 533 test and training ranges throughout the United States and overseas. These included 456 Army ranges, of which 384 were in the United States; 23 Navy ranges, of which 18 were in the United States; 40 Air Force ranges, of which 35 were in the United States; and 14 Marine Corps ranges, of which 13 were in the United States. Figure 1 shows the location of major DOD test and training ranges throughout the United States as of June 2014. Before DOD can determine whether a project or transaction poses a potential security threat to a range by providing a foreign entity a permanent platform for observing operations, it must first become aware of the proposed project or transaction. Multiple federal entities may be involved in identifying and approving potential business activities near DOD ranges. DOD, working with these federal entities, uses multiple methods to determine what activities are occurring in proximity to its ranges. None of these methods, with the exception of the Committee on Foreign Investment in the United States (CFIUS), discussed below, was designed to consider security concerns. The following entities and processes are available to DOD to become aware of and gather information on projects located near ranges. 
CFIUS, an interagency committee chaired by the Department of the Treasury and including DOD as a member, reviews certain covered transactions to assess the impact on national security of foreign control of U.S. companies, such as by considering the control of domestic industries and commercial activity by foreign citizens as it affects the capability and capacity of the United States to meet the requirements of national security. DOD has the opportunity to comment on these transactions, including raising any security concerns. For more information on CFIUS, see appendix II. The Bureau of Land Management within the Department of the Interior administers over 245 million acres of federal land for a variety of uses, including energy development, recreation, and timber harvesting. The Bureau issues a wide variety of permits, licenses, or leases for use of public land, including permits and leases for energy development, and administers mining claims. According to Bureau of Land Management and DOD officials, local Bureau of Land Management personnel may work with DOD installations within their jurisdictions to notify them of projects in proximity to the installation and test and training ranges. In some cases, the office may notify the installation when leases are issued or projects are proposed in proximity to test and training ranges. The Bureau of Ocean Energy Management within the Department of the Interior promotes energy independence and economic development and manages the natural resources of the Outer Continental Shelf, including oil and gas, marine minerals, and renewable energy. Under a 1983 Memorandum of Agreement, the Bureau and DOD consult to resolve conflicts between Outer Continental Shelf exploration and development and the requirements for DOD to use the Outer Continental Shelf for national defense and security. 
Following these consultations, DOD and the Department of the Interior agree on areas that may require deferral from leasing or that can be leased subject to lessee advisories or lease stipulations allowing for joint use. The Bureau of Safety and Environmental Enforcement within the Department of the Interior works to promote safety, protect the environment, and conserve resources offshore through regulatory oversight and enforcement. Key functions of the Bureau include oil and gas permitting, facility inspections, regulations and standards development, safety research, data collection, technology assessments, field operations, incident investigation, environmental compliance and enforcement, and oil spill prevention and readiness. The Federal Aviation Administration within the Department of Transportation works to provide a safe and efficient aerospace system and reviews proposed structures for obstruction concerns. In addition, parties proposing any project over 200 feet in height or within certain distances of an airport or runway are required by law and regulation to provide notice and certain project information to the Federal Aviation Administration. As part of its evaluation process, the Federal Aviation Administration's obstruction evaluation system automatically notifies interested agencies, including DOD and the individual military services, based on the agencies' preferences. DOD's Siting Clearinghouse, which was set up to work with renewable energy project developers to mitigate encroachment concerns at DOD installations, is automatically notified about all renewable energy projects filed with the Federal Aviation Administration. The National Environmental Policy Act of 1969 process requires environmental reviews of certain actions on federally controlled land. As part of this process, the public must be notified of impending action on federal land and is invited to comment. 
DOD may be included as a cooperating agency when a project is located near a DOD installation or when there is an identified DOD interest involved. Community Planning and Liaison Officers at Navy and Marine Corps installations establish relationships with nearby communities and local governments, and provide a mechanism by which the installations can become aware of and address any concerns stemming from proposed projects near ranges. These entities and processes may apply to a wide variety of activities that can occur in proximity to DOD test and training ranges, including renewable and conventional energy projects, mineral extraction (mining), and oil and natural gas exploration. For example, the Bureau of Land Management grants mining rights on federal land near DOD ranges. Moreover, as discussed above, the Bureau also administers minerals mining claims, including those in proximity to DOD's test and training ranges. Figure 2 shows the mining claims on federal land outside of the Fallon Range Training Complex in Nevada. Similar to the large number of mining claims near the Fallon Range Training Complex, there is also extensive oil and gas exploration in the Gulf of Mexico near many onshore Navy and Air Force installations, including Eglin Air Force Base, where DOD set up the Integrated Training Center for the F-35 Joint Strike Fighter. In 2006, Congress passed the Gulf of Mexico Energy Security Act of 2006, which opened several areas of the Gulf of Mexico to new oil and gas leasing, but also placed a moratorium on oil and gas leases in portions of the gulf, in part to avoid interfering with DOD's training mission. Figure 3 shows oil and gas activity as of October 2013 in the Gulf of Mexico and the moratorium area. Our prior work has shown that utilizing a risk management approach allows an agency to more effectively prioritize its resources and enhance its ability to respond to a threat.
Under DOD Instruction 3020.45, DOD utilizes a risk management approach to manage its critical infrastructure program. According to DOD, risk management is the process of identifying, assessing, and controlling risks arising from operational factors and making decisions that balance risk cost with mission benefits. A key step in this approach is to conduct a risk assessment to provide a way to continuously evaluate and prioritize risks and recommend strategies for mitigation. DOD's risk assessment process has three core elements: criticality, vulnerability, and threats. Criticality identifies the consequence of the loss of a particular asset based on national security concerns or the impact to DOD's missions. A criticality assessment identifies key assets and infrastructure that support DOD missions, units, or activities and are deemed mission critical by military commanders or civilian agency managers. Vulnerability is a weakness or susceptibility of an installation, system, asset, application, or its dependencies that could cause it to suffer a degradation or loss as a result of having been subjected to a certain level of threat or hazard. A vulnerability assessment is a systematic examination of the characteristics of an installation, system, asset, or its dependencies to identify vulnerabilities. Threats refer to an adversary having the intent, capability, and opportunity to cause loss or damage. DOD has not conducted a risk assessment that includes prioritizing ranges based on mission criticality, determining their vulnerabilities to foreign encroachment, and assessing the degree to which foreign encroachment could pose a threat to the mission of the ranges. As a result, the department does not know the extent to which foreign encroachment poses a threat to its test and training ranges. Neither DOD nor the services have determined which of their ranges are the most critical to protect or assessed any vulnerabilities and threats posed by foreign encroachment. 
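The three core elements of DOD's risk assessment process described above could, purely for illustration, be combined into a simple prioritization scheme. The sketch below is a hypothetical assumption, not DOD's actual methodology: the 1-to-5 scales, the multiplicative risk score, and the range entries are all invented for the example.

```python
# Hypothetical sketch of a three-element risk assessment
# (criticality, vulnerability, threat). Scales, scoring rule,
# and data are illustrative assumptions, not DOD guidance.
from dataclasses import dataclass

@dataclass
class RangeAssessment:
    name: str
    criticality: int    # consequence of loss to DOD missions (1 = low, 5 = high)
    vulnerability: int  # susceptibility to observation by a persistent presence
    threat: int         # adversary intent, capability, and opportunity

    @property
    def risk_score(self) -> int:
        # One common convention: risk as the product of the three elements.
        return self.criticality * self.vulnerability * self.threat

ranges = [
    RangeAssessment("Range A", criticality=5, vulnerability=3, threat=2),
    RangeAssessment("Range B", criticality=3, vulnerability=4, threat=4),
]

# Prioritize mitigation resources toward the highest-risk ranges first.
for r in sorted(ranges, key=lambda r: r.risk_score, reverse=True):
    print(f"{r.name}: risk score {r.risk_score}")
```

Under a scheme like this, a moderately critical range that is both highly vulnerable and highly threatened can outrank a more critical but better-protected one, which is why the report treats all three elements, not criticality alone, as the basis for prioritization.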
As discussed above, utilizing a risk management approach, which includes conducting a risk assessment, allows an agency to more effectively prioritize its resources and enhance its ability to respond to a threat. A DOD instruction governing its critical infrastructure program states that determining the criticality of key assets is a core element of conducting a risk assessment. While this instruction provides a framework that could be used to manage critical infrastructure across the department, it does not specifically mention risk assessment in relation to foreign encroachment. Rather, it establishes policy to manage the identification, prioritization, and assessment of defense critical infrastructure as a comprehensive program. Therefore, this instruction could be used by DOD as a model for how to deal with the issue of foreign encroachment. Navy and Air Force officials said that the lack of an established methodology or criteria, as well as the unique mission capabilities of each range, makes it difficult to determine the relative criticality of each range as it relates to foreign encroachment, including which ranges would be the most valuable collection points for foreign adversaries trying to gather intelligence and which ranges house the most sensitive test and training activities. In addition, the services do not have guidance on how to conduct such an assessment because the issue of foreign encroachment is new. However, DOD has resolved similar challenges in the past. For instance, in an October 2009 review of DOD's management of electrical disruptions, we found that DOD had not developed guidelines for addressing the unique challenges related to conducting some vulnerability assessments of electrical power assets. We recommended that DOD develop explicit guidelines, based on existing Defense Critical Infrastructure Program guidance, for assessing critical assets' vulnerabilities to long-term electrical power disruptions.
DOD concurred with this recommendation and developed a tool for assessing critical assets' vulnerabilities to power disruptions. (See GAO, Defense Critical Infrastructure: Actions Needed to Improve the Identification and Management of Electrical Power Risks and Vulnerabilities to DOD Critical Assets, GAO-10-147 (Washington, D.C.: Oct. 23, 2009).) Similarly, specific guidance on foreign encroachment could assist DOD and the services in managing this issue. Navy officials told us they are developing guidance designed to assess the criticality of Navy ranges in terms of foreign encroachment and expect this guidance to be issued sometime during 2015; however, as of December 2014, little progress had been made in developing this guidance. These officials further stated that, once the guidance is finalized, they intend to begin the assessment process. Officials told us that they expect that, once this assessment process is complete, it will be a critical component of any effort to prioritize ranges by their importance. This, in turn, could support any Navy efforts to address foreign encroachment by targeting counterintelligence activities on the most critical ranges. According to DOD, another core element of a risk assessment is to determine vulnerabilities, or the weaknesses of an asset that could cause it to suffer a loss. DOD and the services have raised concerns about the level of vulnerability facing some of their test and training ranges with regard to foreign encroachment. Specifically, Navy and Air Force headquarters officials as well as officials from all four of the ranges in our review told us that they had concerns about the number of investment-related projects by foreign entities occurring near their respective ranges--projects that they stated could pose potential security threats.
Those officials told us that they were particularly concerned that foreign entities may have an increased ability to observe sensitive military testing or training activities if they are able to establish a persistent presence outside the services' test or training ranges. Further, officials at all four of the ranges in our review expressed such concerns to varying degrees. For example, officials from the Fallon Range Training Complex and the Nevada Test and Training Range, both of which are used to provide realistic air-to-ground combat training, told us that they have observed a number of energy development and mining projects near both ranges that may be owned or controlled by foreign entities. Officials at Eglin Air Force Base, where the Air Force conducts land, air, and water test and training, and at White Sands Missile Range, where the services evaluate new weapon systems, also expressed concerns about the potential for foreign entities to observe testing and training activities at their respective ranges. DOD officials noted, however, that the services have not conducted formal assessments to determine the extent to which these vulnerabilities exist at their ranges. According to DOD's instruction, along with establishing the criticality and vulnerability of assets, the third core element of a risk assessment is to assess the threats and hazards. Counterintelligence officials from the services' criminal investigation agencies said that they have conducted some threat or risk assessments on specific locations or installations, as well as investigated some individual instances of commercial activity. However, they have not conducted threat assessments focused on foreign encroachment across DOD's test and training ranges. 
Although these counterintelligence officials have investigated some individual instances of commercial activity, they have not conducted a systematic assessment of this potential threat because in most of the cases that they have investigated, they have not seen evidence that foreign encroachment posed a threat to the range. Therefore, they were reluctant to assign additional resources to this issue. DOD officials stated that given the uncertainty surrounding this issue, a risk assessment would be beneficial. However, DOD has not taken steps to initiate such a risk assessment or established a time frame for doing so. Without guidance from DOD for the services to follow in conducting a risk assessment that establishes a time frame for completion, identifies critical ranges, and then assesses vulnerabilities and threats to these ranges, DOD may not be able to determine what, if any, negative impact foreign encroachment may be having on its test or training ranges. DOD does not have information that officials say they need, such as the ownership of companies conducting business on federally managed land near DOD's ranges, to determine if specific transactions on federally owned or managed land pose a threat to ranges. Leading practices state that to support decision-making, it is important for organizations to have complete, accurate, and consistent information. Range officials at all four installations in this review stated that they need more specific information to determine whether an individual transaction poses a threat to their range. Further, DOD officials have identified some possible sources or methods for obtaining this information but have not formally collaborated with other federal agencies on how to gather this information. Collaboration can be broadly defined as any joint activity that is intended to produce more public value than could be produced when organizations act alone. 
Leading practices state that agencies can enhance and sustain collaborative efforts by engaging in several practices that are necessary for a collaborative working relationship. These practices include identifying and addressing needs by leveraging resources; agreeing on roles and responsibilities; and establishing compatible policies, procedures, and other means to operate across agency boundaries. In order for DOD to determine if an entity engaging in investment activities near one of its test or training ranges poses a potential risk for foreign encroachment, DOD officials said that they would need additional identifying information from governing federal agencies responsible for issuing public land-use permits or leases. Such information could include, for example, identification of any parent companies or whether a U.S.-based entity is owned or controlled by a foreign entity. Service headquarters and range officials at all four ranges in our review said that they generally have good informal working relationships with governing federal agencies that allow them to find out some information about transactions on federal land near the ranges and that federal agency officials, including those from the Bureau of Land Management or the Bureau of Ocean Energy Management, frequently contact them to informally let them know of a proposed transaction near the range. Despite these relationships, DOD and range officials expressed concerns that these governing agencies are not able to provide DOD with the necessary information to identify potential encroachment.
Officials from the Bureau of Land Management, the Bureau of Ocean Energy Management, and the Bureau of Safety and Environmental Enforcement within the Department of the Interior and the Federal Aviation Administration within the Department of Transportation told us that they face legal, regulatory, or resource challenges that may prevent them from collecting information that is unrelated to their respective missions, leading to knowledge gaps that may be acceptable for approving leases or permits on federal lands but could adversely affect DOD's ability to identify potential security threats near the ranges. For example: The Paperwork Reduction Act of 1980 requires, among other things, that agencies undertake a number of procedural steps before collecting information from the public, including justifying the need for such information collection to the Office of Management and Budget. As part of this process individual agencies are required to certify to the Office of Management and Budget, among other things, that proposed collections of information are necessary for the proper performance of the functions of the agency. As a result, Department of the Interior officials said that they generally limit the information they collect to what is directly tied to the agencies' respective missions of effective land or resource management. Officials from the Bureau of Land Management said that the information that they are permitted to collect on potential lease-holders or permit applicants is prescribed by regulation based on a longstanding interpretation of their authorizing statute. 
Further, Federal Aviation Administration officials told us that while they have a process for collecting information about proposed structures that are more than 200 feet in height or are within certain distances of airports or runways, this process is designed to support the agency's mission of maintaining a safe and efficient aerospace system, not to collect the information that DOD would need to help identify instances of foreign encroachment on its ranges. Therefore, officials from these agencies stated that they believe they would need some type of change in either their authorizing statutes or regulations to be able to collect this information. Agency officials also raised resource challenges as an issue in collecting additional information. Department of the Interior officials expressed concerns that any changes to either their statutory authorities or implementing regulations in order to collect additional information may create additional costs to the Department of the Interior as its bureaus conduct their respective missions. The officials told us they recognize the challenges DOD faces in identifying potential cases of foreign encroachment, but they also said their agencies' respective missions have little to do with national security issues and agency officials questioned whether, under current law, their appropriations could properly be used to finance data collection unrelated to their mission and for DOD's exclusive use. These officials expressed concerns that changes to their authorities or additional requirements imposed upon them may be burdensome, given their limited available resources. DOD has had some success in obtaining information that could be used to identify activities that could provide opportunities for foreign encroachment, but has not discussed options for obtaining additional information with other federal agencies.
However, as discussed above, DOD officials said that when they do find out that an entity proposing a project near a range is foreign-owned, they generally obtain information on an informal basis through developed interagency relationships and not through any systematic process. For example, at some DOD installations, officials work with Bureau of Land Management district and field office managers to receive notifications on or discuss projects that may have an impact on DOD activities or interests. At one location--Naval Air Station Fallon--the Navy and the Bureau of Land Management have established a military liaison position to provide further coordination on both Navy and Bureau of Land Management interests due to the large number of energy development and mining projects occurring near the Fallon Range Training Complex. This military liaison position is funded by the Navy and the duties of the liaison include coordinating with the Navy on use of public lands and providing advice on highly technical and complex programs. In addition, Bureau of Land Management and DOD installation officials said that other DOD installations have good working relationships with their local Bureau of Land Management offices to discuss issues of importance to both agencies. Through these relationships, individual ranges are often notified of potential transactions near the ranges, but for the reasons stated above, range officials at the four installations in our review stated that they still feel that they need additional information on the transactions to be able to assess whether a transaction poses a threat to the range. The Navy, through the Center for Naval Analyses, recently conducted a study on this issue and identified additional sources of information that DOD could possibly leverage, including the Bureau of Economic Analysis within the Department of Commerce.
Because this is an emerging issue for DOD, DOD has not taken steps to fully identify all potential sources of information or to reach out to other federal agencies that may have this information to discuss options for obtaining it. Without engaging potential sources of information on commercial activities near its ranges, DOD is hindered in its efforts to determine if a project could present a threat to test or training range activities. DOD's concerns about various forms of encroachment have been long- standing. As potential opportunities for foreign encroachment have presented themselves, some in DOD have become increasingly concerned about the potential vulnerability and risk to its domestic air, land, and sea test and training ranges from such encroachment. However, DOD has not determined the likelihood of foreign encroachment through persistent presence on federally owned or managed lands in proximity to the test and training ranges, versus other means that may give foreign adversaries the opportunity to observe new weapon systems and operational tactics. Although the Navy has taken steps to develop guidance on assessing the risk of foreign encroachment to its ranges, as of December 2014, this guidance has not been issued. Further, the other departments have not taken any steps toward developing this type of guidance. Without guidance from DOD for the military departments to follow in conducting a risk assessment--including a time frame for completion--that identifies critical ranges, then assesses vulnerabilities and threats to these ranges, DOD may not be able to determine what, if any, negative impact foreign encroachment may be having on its test or training ranges. In addition, without a means to collect more information on the entities conducting business in proximity to its ranges, DOD cannot adequately assess individual transactions as to their potential threat to a range. 
Because of the degree to which DOD and other agencies must manage legal, regulatory, and resource constraints in taking action to identify and address any significant encroachment concerns, it is critical that DOD have a complete picture of where it is at greatest risk, what information is needed to fully assess any risks, and what options are available to mitigate or manage risks in a manner that is consistent with DOD and other agencies' missions and resources. To improve the ability of the Department of Defense and the military departments to manage the potential for foreign encroachment near their test and training ranges, we recommend that the Secretary of Defense, in consultation with the military departments, develop and implement guidance for assessing risks to test and training ranges from foreign encroachment in particular, to include: determining the criticality and vulnerability of DOD's ranges and the level of the threat; and a time frame for completion of risk assessments. To identify potential foreign encroachment concerns on federally-owned land near test and training ranges, we recommend that the Secretary of Defense collaborate with the secretaries of relevant federal agencies, including at a minimum the Secretaries of the Interior and Transportation, to obtain additional information needed from federal agencies managing land and transactions adjacent to DOD's test and training ranges. If appropriate, legislative relief should be sought to facilitate this collaborative effort. In a written response on a draft of this report, DOD concurred with both recommendations. In addition, the Department of the Interior and the Department of Treasury provided technical comments, which we incorporated in our report as appropriate. The Department of Justice and the Department of Transportation did not provide any comments. DOD's comments are reproduced in their entirety in appendix III. 
We are sending copies of this report to the appropriate congressional committees; the Secretaries of Defense, the Army, the Navy, and the Air Force; the Secretaries of Interior, Transportation, and Treasury; the Attorney General of the United States; and the Director, Office of Management and Budget. In addition, the report is available at no charge on the GAO website at http://www.gao.gov. If you or your staff have any questions about this report, please contact me at (202) 512-4523 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made key contributions to this report are listed in appendix IV. To determine the extent to which DOD has conducted a risk assessment to identify the existence and extent of any threats posed by foreign encroachment to its domestic air, land, and sea test and training ranges, we reviewed statutes, regulations, and guidance pertaining to federal agencies' oversight of transactions by private entities on air, land, and sea. We compared DOD's efforts to key elements of conducting a risk assessment that we previously developed as well as criteria for identifying and protecting critical infrastructure that DOD uses in managing its Defense Critical Infrastructure Program. We also reviewed DOD counterintelligence guidance and intelligence reporting on surveillance threats to DOD activities and facilities. To understand DOD's concerns related to the potential presence of foreign entities near its test and training ranges, we interviewed appropriate officials from the Office of the Secretary of Defense as well as the Departments of the Navy, Army, and Air Force. 
We interviewed officials from DOD and the services' intelligence agencies, as well as the Defense Intelligence Agency and the Federal Bureau of Investigation, to understand the extent to which any foreign encroachment concerns raised are based on information provided by the intelligence community. We also interviewed appropriate officials from the entities that govern activities on federally managed land in order to understand how and the extent to which DOD works with civilian governing agencies to identify areas of potential foreign encroachment., i.e., the Bureau of Land Management and Bureau of Ocean Energy Management (both within the Department of the Interior), which have responsibility for approving and administering permits and leases for projects on public lands, and the Federal Aviation Administration (within the Department of Transportation), which is responsible for reviewing potential obstructions to aviation safety. Finally, we interviewed officials from the Department of the Treasury, which chairs the Committee on Foreign Investment in the United States (CFIUS). To determine the extent to which DOD has obtained information on specific transactions near test and training ranges that it needs to determine if these transactions pose a threat to the range, we interviewed officials from OSD and the military service headquarters, as well as military department intelligence agencies, the Defense Intelligence Agency, and the Federal Bureau of Investigation. We also interviewed officials from selected federal agencies including the Bureau of Land Management and Bureau of Ocean Energy Management within the Department of the Interior, the Federal Aviation Administration within the Department of Transportation, and the Department of the Treasury, who all have a role overseeing transactions on federal land surrounding DOD's ranges. We compared DOD's efforts in obtaining information to leading practices on decision making and collaboration from our prior work. 
For both objectives, we spoke with officials from selected DOD test and training ranges: the Nevada Test and Training Range, Nevada (Air Force); the Fallon Range Training Complex, Nevada (Navy); Eglin Air Force Base, Florida (Air Force); and White Sands Missile Range, New Mexico (Army). After discussions with DOD officials, we selected locations (1) that included at least one range from each military department, (2) where security encroachment from foreign countries on federally owned land near test or training ranges has been raised as a concern, and (3) where ranges were surrounded by federally controlled land or ocean areas, thus requiring coordination with other federal agencies. At the Nevada Test and Training Range and the Fallon Range Training Complex we also interviewed officials from the Bureau of Land Management and the Federal Bureau of Investigation, as these agencies have responsibilities for the approval of public-use leases and permits or domestic counterintelligence efforts outside of both of these locations, respectively. Because federally owned land is disproportionately located in the western United States, the majority of our visits and discussions were with ranges in that area. The information from these four ranges is not generalizable to all of DOD's domestic ranges. We limited the scope of this engagement to projects in which the federal government plays a role in approving, evaluating, or permitting the project. In addressing our objectives, we contacted officials representing a wide range of organizations (see table 1). We conducted this performance audit from July 2013 to December 2014 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. 
We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. The only formal option in regard to transactions involving foreign companies or entities that accounts for national security concerns is the Committee on Foreign Investment in the United States (CFIUS) process. However, the CFIUS process is limited in two main ways. First, CFIUS only reviews transactions that meet certain criteria. Specifically, the CFIUS process reviews covered transactions, which include any merger, acquisition, or takeover that results in foreign control of any person engaged in interstate commerce in the United States. However, there are also many types of non-covered transactions that could result in a foreign entity having access to or a persistent presence near DOD ranges. These non-covered transactions include starts-ups, as well as acquisitions of assets other than an interest in a U.S. company, such as equipment or intellectual property. In addition, foreign purchases or leases of private real property--for business or non-business uses--near installations would not be covered by CFIUS. Second, CFIUS primarily relies on voluntary reporting of transactions by the involved parties to bring covered transactions to its attention, although the President or any member of CFIUS can also initiate a review of a covered transaction should they discover its occurrence. In the absence of voluntary reporting by the parties involved or independent discovery of the transaction by CFIUS, however, covered transactions will not be reviewed. For covered transactions it does review, CFIUS determines the effects of the transaction on national security, which includes consideration of a number of factors, including the potential national security-related effects on United States critical infrastructure. 
After a review of a covered transaction is initiated, the Committee evaluates the transaction and then either approves the transaction, approves the transaction with mitigation or makes a recommendation to the President to block the transaction. In the case of transactions that the Committee approves with mitigation, the Committee and participating companies typically execute national security agreements that impose some type of limitations or monitoring of projects, such as limitations on the citizenship of employees of the company or reporting of visitation by foreign citizens. However, according to DOD and installation officials, these agreements are often difficult to enforce. Finally, if the President finds that (1) there is credible evidence that the foreign interest exercising control might take action that will impair the national security and that (2) other laws, in the judgment of the President, do not provide adequate and appropriate authority for the President to protect national security, then the President can direct that the transaction be suspended or prohibited. This has only happened rarely, though. For example, in 1990 the President ordered a foreign- owned company to divest its acquisition of a manufacturing firm producing metal parts and assemblies for aircraft, and in 2012 the President blocked a foreign acquisition of a U.S. energy firm that was constructing a wind-turbine plant near a specialized Navy training facility. In addition to the contact named above, GAO staff who made key contributions to this report include Maria Storts, Assistant Director; Mark Wielgoszynski, Assistant Director; Leslie Bharadwaja; Simon Hirschfeld; Terry Richardson; Amie Lesser; Erik Wilkins-McKee; Michael Willems; and Richard Winsor. 
| For many years, DOD has reported that it faces challenges in carrying out realistic training because of the cumulative result of outside influences--such as urban growth and endangered species habitat--that DOD refers to as encroachment. In January 2014, DOD reported concerns with security encroachment by foreign entities conducting business near its test and training ranges. GAO was mandated by the House Armed Services Committee report accompanying a bill for the National Defense Authorization Act for Fiscal Year 2014 to review encroachment on DOD's test and training ranges. This report examines the extent to which DOD has (1) conducted a risk assessment to identify the existence and extent of any threats of foreign encroachment and (2) obtained information needed on specific transactions to determine if they pose a threat. GAO reviewed statutes, regulations, and guidance on federal agency oversight of transactions on federal land. GAO interviewed DOD and service officials, as well as officials from other federal agencies identified by DOD as having a role in such transactions. The Department of Defense (DOD) has not conducted a risk assessment that includes prioritizing test and training ranges based on mission criticality, determining their vulnerabilities to foreign encroachment (i.e., foreign entities acquiring assets, such as mines or energy projects, or otherwise conducting business transactions near test and training ranges), and assessing the degree to which foreign encroachment could pose a threat to the mission of the range. Some DOD officials stated that they are concerned about foreign encroachment, which may provide an opportunity for persistent surveillance of DOD test and training activities. However, DOD has not prioritized its ranges or assessed such threats because, among other things, there is no clear guidance on how to conduct assessments of the risks and threats posed by foreign encroachment. 
Some DOD officials told GAO they have considered conducting such assessments, but DOD has not issued guidance directing the services to conduct these assessments. Officials from the Navy and the Air Force stated that given the unique nature of each range, it would be difficult to assess their criticality. However, Navy officials stated that they had expected to issue guidance for conducting risk assessments sometime in 2015. Without clear guidance from DOD for the services to follow in conducting a risk assessment, DOD may not be able to determine what, if any, negative impact foreign encroachment may be having on its test or training ranges. DOD has not obtained sufficient information on commercial activity being conducted near test and training ranges in the level of detail officials say they need--such as if a U.S.-based entity is owned or controlled by a foreign entity--to determine if specific transactions on federally owned or managed land in proximity to ranges pose a threat to the range. Such information is generally not collected by other agencies with responsibilities for these transactions because, in some cases, legal, regulatory, or resource challenges may prevent them from collecting information that is unrelated to their agencies' missions. For example, the Federal Aviation Administration collects information about proposed structures that are more than 200 feet in height to support the agency's mission of maintaining a safe and efficient aerospace system, but does not collect information on the ownership of the companies building the structures because it is beyond the scope of its mission. DOD has identified some potential sources of information, but it has not formally collaborated with other federal agencies on how to gather this information. Leading practices state that agencies can enhance and sustain collaboration by engaging in several practices, including addressing needs and leveraging resources and agreeing on roles and responsibilities. 
Without engaging potential sources of information on commercial activities near its ranges, DOD is hindered in its efforts to determine if a project could present a threat to test or training range activities. GAO recommends that DOD (1) develop and implement guidance for conducting a risk assessment on foreign encroachment and (2) collaborate with other federal agencies to obtain additional information on transactions near ranges. In written comments on a draft of this report, DOD concurred with both recommendations. | 7,101 | 886 |
The National Airlift Policy, issued in June 1987, reinforced the need for and use of the Civil Reserve Air Fleet (CRAF) program, established in 1951. The policy states that military and commercial airlift resources are equally important; that DOD should determine which resources must be moved by the military and which can be moved by commercial air carriers; and that commercial carriers will be relied upon to provide airlift capability beyond the capability of the military fleet. It also states that during peacetime, DOD requirements for passenger and/or cargo airlift augmentation shall be satisfied by the procurement of airlift from commercial air carriers participating in the CRAF program. Military airlift requirements are fulfilled by a mix of both military and civilian aircraft. Currently, the military airlift fleet comprises 82 C-17, 110 C-5, 468 C-130, and 69 C-141 aircraft. The older C-141 aircraft are being phased out and replaced by additional C-17 aircraft. There are also 54 KC-10 aircraft, which perform both airlift and refueling missions. The CRAF program includes 927 cargo and passenger aircraft from U.S. commercial air carriers. CRAF participants are required to respond within 24 hours of activation in the event of stage I (a regional crisis in which the Air Mobility Command's (AMC) aircraft fleet cannot meet both deployment and other traffic requirements simultaneously) or stage II (a major war that does not warrant full national mobilization). Stage III--multiple theater wars or a national mobilization--requires that total CRAF airlift capability be made available to DOD within 48 hours of activation. Aircraft used in stages I and II are also available in subsequent stages. In the event of activation, AMC assumes mission control, but the carriers continue to operate and support the aircraft (support includes fuel, spare parts, and maintenance).
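The staged activation structure described above can be sketched as a simple data model. This is purely illustrative: the stage triggers and response windows come from the report, but the data structure and the per-stage aircraft splits in the example are our own assumptions, not DOD's.

```python
# Illustrative model of CRAF activation stages (stage definitions from
# the report; the data structure and example commitment splits are
# assumptions for demonstration only).
from dataclasses import dataclass

@dataclass(frozen=True)
class CrafStage:
    name: str
    trigger: str
    response_hours: int  # hours carriers have to make aircraft available

STAGES = [
    CrafStage("I", "regional crisis exceeding AMC fleet capacity", 24),
    CrafStage("II", "major war short of full national mobilization", 24),
    CrafStage("III", "multiple theater wars or national mobilization", 48),
]

def aircraft_available(stage_index, committed_by_stage):
    """Aircraft committed at earlier stages remain available in later ones."""
    return sum(committed_by_stage[: stage_index + 1])

# Hypothetical split of the 927 committed aircraft across the three stages.
commitments = [100, 200, 627]
```

Under this sketch, a stage III activation draws on the full commitment (`aircraft_available(2, commitments)` returns 927), reflecting the report's statement that stage I and II aircraft carry over into subsequent stages.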
Stage I was activated for the first and only time on August 17, 1990, during Operation Desert Shield. Stage II was activated on January 17, 1991, for Operation Desert Storm. The total number of aircraft committed to CRAF (see table 1) accounts for about 15 percent of all U.S.-owned commercial aircraft forecasted for 2003. Appendix I lists the carriers participating in the CRAF program as of October 2002 and the total number of aircraft each has committed through stage III. More aircraft are committed to the CRAF program than are needed to fulfill the wartime requirements established by the Mobility Requirements Study 2005 (MRS-05). There was a shortage of aeromedical evacuation aircraft, but this has recently been eliminated. Program participants stated that they would be capable of providing the needed levels of aircraft and crews within the necessary time frames, even with recent furloughs and with crewmembers who have National Guard or Reserve commitments. A new mobility requirements study could show an increased need for CRAF, based on the shift from the two-major-theater-war scenario to the new strategy of planning for a range of military operations described in DOD's recent Quadrennial Defense Review Report, issued in September 2001. Under MRS-05's two major war scenario, the study assumed that both military and CRAF aircraft were needed and that CRAF would be required to move 20.5 million ton miles a day, or 41 percent of all military bulk cargo deliveries. CRAF would also carry 93 percent of all passengers and provide almost all aeromedical evacuation needs. In fiscal year 2002, only 31 of the 40 B-767s required for conversion to aeromedical evacuation were available. However, commercial carriers increased their commitment to 46 of these aircraft for fiscal year 2003. Table 2 compares the requirements for a stage III CRAF activation with commitments by program participants.
Officials from CRAF air carrier participants that we visited confirmed that they would be able to provide the agreed levels of airlift capacity within the necessary time frames and that the turmoil in the airline industry after the attacks of September 11, 2001, would not affect their ability to do so. The officials said they would also be able to provide at least four flight crews per aircraft (crewmembers must also be U.S. citizens), as they are required to do by AMC Regulation 55-8. This is in spite of the fact that some carriers have had to furlough pilots during the recent economic downturn and that employees with National Guard or Reserve commitments cannot be included in available crew lists. The same regulation requires that commercial carrier personnel with military Reserve or National Guard commitments not be considered in the cockpit crew-to-aircraft ratio. They can, however, be used in CRAF carrier work until their military units have alerted them of a recall to active duty. Officials from the carriers we visited said they monitor their crewmembers' reserve commitments carefully and usually maintain a higher crew-to-plane ratio than DOD requires. For example, one carrier we visited operates with a crew-to-plane ratio of 10 to 1, instead of the 4 to 1 DOD requires for CRAF carriers. DOD also inspects carriers annually, and the inspectors have been satisfied that the carriers could meet the crew-to-plane ratio. The MRS-05 did not consider CRAF's full capacity, and it set a ceiling of 20.5 million ton miles on daily CRAF airlift requirements. According to DOD officials, the study restricted CRAF cargo capacity to 20.5 million ton miles per day because DOD's airfields can accommodate only a certain number of aircraft at the same time. Also, they stated that using additional CRAF aircraft would reduce efficiency because of the type of cargo CRAF is modeled to carry. 
They said that commercial aircraft can take longer to unload than military aircraft and require special material handling equipment to be available at an off-loading base. Military aircraft, on the other hand, do not need specialized loading equipment because they are high-winged and lower to the ground. Furthermore, the MRS-05 did not consider the ability of the commercial industry to carry different cargo sizes. The MRS-05 modeled CRAF aircraft carrying only bulk cargo. According to Air Force officials, the U.S. commercial cargo fleet has limited ability to carry oversized cargo and no ability to carry outsized cargo. They stated that it is difficult, from a planning perspective, to model CRAF aircraft carrying oversized cargo because the models would need to distinguish between the types of oversized cargo and the types of aircraft. They also stated that using more CRAF capacity than the 20.5 million ton mile limit would flow more bulk cargo into a theater instead of oversized and outsized unit equipment brought in by the larger military aircraft. In reality, however, commercial aircraft do carry some oversized cargo. DOD is examining how much oversized equipment can be moved by CRAF so that this capability can be included in future mobility studies. DOD's Defense Planning Guidance, issued in August 2001, requires that mobility requirements be reevaluated by 2004, and DOD officials believe that future requirements will be higher because of the increased number of possible scenarios included in the guidance. We believe that a study that also takes into consideration excess CRAF capacity and the types of cargo that CRAF can accommodate could provide a more realistic picture of needs and capabilities. It could also mitigate some of the concerns about airfield capacity and flow of cargo into a theater if CRAF aircraft could move some of the oversized cargo. 
This could deliver larger cargo to a unit as needed, rather than bulk cargo, which may be less time-critical. One of the key stated incentives of the CRAF program--the ability to bid on peacetime government business--may be losing its effectiveness because DOD uses almost exclusively one type of aircraft, the B-747, for its peacetime cargo missions. Over 94 percent, or 892, of 946 wide-body missions flown by CRAF participants in the first 10 months of fiscal year 2002 were carried out by B-747s, which accounted for only 38 percent of wide-body cargo aircraft committed to the CRAF program. Some major CRAF participants who do not have B-747s have suggested that they might reduce or end their participation in the program if they do not receive any business in return for their commitment. This could have a serious effect on the program's ability to meet future requirements, especially if those requirements increase due to the change in focus from two major theater wars to a range of military operations outlined in the recent Quadrennial Defense Review. Only carriers that participate in the CRAF program can bid on peacetime mobility business. Carriers can bid on a percentage of peacetime business in direct proportion to their commitment to the program. Participants earn mobilization value points, which are based on the number and type of committed aircraft. In assigning mobilization value points, DOD measures each volunteered passenger or cargo aircraft against the capacity and airspeed of a B-747-100. Participants in the aeromedical evacuation segment of CRAF receive double the mobilization value points because of the significant reconfiguration their aircraft (B-767s) must undergo. The points are used to determine how much commercial business each participant can bid on out of the total, which in fiscal year 2002 more than doubled to $1.28 billion from $572 million the previous year (see app. II for annual amounts since fiscal year 1998).
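The point-allocation arithmetic works roughly as follows. This is a minimal sketch, not DOD's actual formula: the report says only that each aircraft is measured against the capacity and airspeed of a B-747-100 and that aeromedical commitments earn double points, so the ratio formula, baseline figures, and the two carriers' fleets below are all illustrative assumptions.

```python
# Sketch of mobilization value point allocation (assumed formula and
# figures; only the B-747-100 baseline and the 2x aeromedical
# multiplier come from the report).
B747_100 = {"capacity_tons": 90.0, "airspeed_mph": 555.0}  # assumed baseline values

def mobilization_points(capacity_tons, airspeed_mph, count, aeromedical=False):
    """Points for `count` aircraft, scaled against the B-747-100 baseline."""
    ratio = (capacity_tons / B747_100["capacity_tons"]) * (
        airspeed_mph / B747_100["airspeed_mph"]
    )
    multiplier = 2.0 if aeromedical else 1.0  # aeromedical segment earns double
    return ratio * multiplier * count

# Hypothetical commitments for two carriers.
fleet = {
    "carrier_a": mobilization_points(90.0, 555.0, 10),   # ten B-747s
    "carrier_b": mobilization_points(86.0, 550.0, 20),   # twenty MD-11-class aircraft
}
total = sum(fleet.values())
shares = {carrier: points / total for carrier, points in fleet.items()}
```

The `shares` dictionary is each carrier's proportion of total points, and hence of the peacetime business it may bid on. Note that in this sketch the MD-11 carrier earns the larger point share, which mirrors the report's finding: point share determines what a carrier may *bid* on, not what business it actually receives.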
Participants with 62 percent of the wide-body cargo aircraft committed to CRAF are not able to bid on most peacetime cargo business because they do not have B-747s. An AMC official said that most requests for cargo aircraft require a 90-ton capacity, the same as that of a 747-type aircraft but slightly more than those of other wide-body aircraft such as the MD-11 (86 tons) or the DC-10 (75 tons). One carrier with over 100 wide-body cargo planes smaller than B-747s committed to the program (and accounting for 41 percent of all total mobilization value points awarded to cargo carriers) received only about 4 percent of peacetime cargo business in fiscal year 2002. By contrast, a carrier committing 10 B-747 type aircraft (7 percent of total cargo points) flew 37 percent of all peacetime cargo business. AMC officials claim that they must use 90-ton capacity aircraft because they need the flexibility and capacity to clear ports as quickly as possible. The B-747 can carry more and larger cargo than other wide-body aircraft because it has more capacity and larger doors. Officials also noted that the B-747 can carry standard-sized bulk cargo pallets that are the same size as those used by commercial industry, the Defense Logistics Agency, and other DOD activities and contractors. Standard pallets also fit aboard all military cargo aircraft. In order to fit aboard other wide-body aircraft such as the DC-10 or the MD-11, cargo handlers at military bases must disassemble and rebuild the standard pallets to fit the aircrafts' lower profile (see fig. 2). Some cargo carrier officials said they could not bid on the amount of peacetime business they believe they are entitled to based on their CRAF participation. Consequently, they indicated that unless this problem improves, they might reduce or end their participation at some point in the future. 
AMC officials acknowledged that the requirements from Operation Enduring Freedom, DOD's operation in Afghanistan, amounted to the equivalent of a stage I activation. Activation was avoided because CRAF participants volunteered the airlift capability needed in fiscal year 2002. Although commitments to the CRAF program currently exceed requirements, this situation could change if some cargo carriers continue to be left out of the peacetime business and eventually decide to reduce or terminate their participation in the program. In our opinion, DOD cannot afford to lose CRAF participants, particularly in view of a new mobility requirements study and a potential increase in requirements. Furthermore, some cargo carriers stated that the CRAF B-747s are not flying with full loads and claimed that it would be less expensive to use smaller wide-body aircraft with lower per-mile costs. We obtained mission data and found that almost half of the 892 CRAF missions flown on B-747s in the first 10 months of fiscal year 2002 did not use all available space or weight capacities. These loads might have fit on smaller wide-body aircraft, such as the MD-11, which have lower per-mile operating costs and would have cost less to fly. See table 3 for a cost comparison by plane type for a round-trip flight from Dover Air Force Base to Ramstein Air Force Base, Germany. Over 40 percent of these recent missions flown by B-747s did not utilize all the available pallet positions and carried less than 55.7 tons. In fiscal year 2002, AMC officials used the 55.7-ton mark as a breakeven point--the point at which the per-pound cost that the customer pays to have the cargo shipped equals the B-747's per-mile cost that AMC pays the carrier to fly the mission.
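The breakeven logic can be expressed as a short calculation. The dollar figures below are hypothetical (chosen only so the arithmetic lands near the report's 55.7-ton mark), and the `could_downsize` check is our own illustration, not an AMC procedure.

```python
# Sketch of the breakeven-load calculation (assumed rates; only the
# 55.7-ton breakeven concept comes from the report).

def breakeven_tons(cost_per_mile, rate_per_ton_mile):
    """Payload (tons) at which customer revenue per mile covers the charter cost per mile."""
    return cost_per_mile / rate_per_ton_mile

B747_COST_PER_MILE = 16.71   # hypothetical charter cost AMC pays, $/mile
RATE_PER_TON_MILE = 0.30     # hypothetical rate customers pay, $/ton-mile

threshold = breakeven_tons(B747_COST_PER_MILE, RATE_PER_TON_MILE)  # ~55.7 tons

def could_downsize(mission_load_tons, smaller_capacity_tons=86.0):
    """Flag missions light enough that an MD-11-class aircraft might fly them instead."""
    return mission_load_tons < threshold and mission_load_tons <= smaller_capacity_tons
```

A mission carrying 40 tons falls below the breakeven threshold and within MD-11-class capacity, so under these assumed rates it is a candidate for a smaller, cheaper aircraft; a 80-ton load is above breakeven and would stay on the B-747.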
We were unable to determine whether a smaller, more economical aircraft could have been used for these missions because, at the time we requested the data, DOD was not obtaining data on cargo volume. However, it has since begun to accumulate this information, which will help determine whether aircraft are flying at full capacity. Military port handlers assured us that DOD's use of B-747 aircraft during peacetime would not decrease their capability to build and load different types of pallets on other types of aircraft, which AMC data show account for 62 percent of the CRAF wide-body cargo fleet, during wartime. They stated that they "frequently" build pallets and can use available templates for nonstandard shapes. When we questioned how effectively they could do this in the very first and most urgent phases of a conflict, they stated that during wartime, supplies such as ammunition and food are delivered in pallets that can be loaded directly aboard smaller wide-body planes. According to port officials, loading aircraft is easily accomplished once the pallets are built. Another incentive for passenger air carriers to participate in the CRAF program is annual government air passenger business under the General Services Administration's City Pairs program. General Services Administration officials said that passenger air carriers have expressed dissatisfaction because they believe the program is too restrictive and does not allow them to manage aircraft capacity to generate the highest profit. However, the 2003 contract includes some changes that program officials believe will resolve many of the carriers' concerns. The upcoming reevaluation of mobility requirements may increase the need for CRAF in the future. However, the last study did not consider some factors--such as the ability of commercial aircraft to carry different sized cargo--that, if included, could provide more accurate and realistic requirements. 
The last study also set a ceiling on the amount of cargo carried by CRAF that provided the needed flow of cargo into a theater and that DOD's infrastructure could process efficiently. This figure needs to be revalidated so that the next mobility requirements study can provide decision makers with accurate and helpful information on true needs and capabilities. There are strong indications that some major program participants are dissatisfied with their share of a key CRAF incentive, the opportunity to bid on peacetime mobility business, because DOD uses almost exclusively one type of aircraft for peacetime cargo missions. If they are unable to see some benefit from the incentive program, some participants might reduce or end their participation in the program. This could cause difficulties in meeting requirements at a time when participation in peacetime business or CRAF activation is crucial. DOD needs to study ways to expand the use of smaller wide-body aircraft to ensure an equitable distribution of the peacetime business and determine whether smaller wide-body aircraft could carry out a higher proportion of its peacetime missions as efficiently as, and possibly more economically than, the B-747 does. We recommend that the Secretary of Defense direct that the reevaluation of mobility requirements mandated by the Defense Planning Guidance include a more thorough study of CRAF capabilities, to include the types of cargo CRAF can carry and how many CRAF aircraft can land and be unloaded and serviced at military bases, and that the Air Mobility Command determine whether smaller wide-body aircraft could be used as efficiently and effectively as the larger B-747-type planes to handle the peacetime cargo business that DOD uses as an incentive for CRAF participants. In written comments on a draft of this report, DOD concurred with our recommendations.
However, DOD believed it would be more appropriate to ensure that ongoing study efforts be given greater emphasis and require that any resulting reports specifically address our issues. We agree that these studies could address our first recommendation concerning a more thorough study of CRAF capabilities. In a subsequent discussion, a DOD official stated that DOD intends to perform an additional study that would address the second recommendation. DOD's comments are presented in their entirety in appendix III. We used the MRS-5, DOD regulations, and discussions with officials at the U.S. Transportation and U.S. Air Mobility Commands, located at Scott Air Force Base, Illinois, to establish the aircraft and time frame requirements for the CRAF program. We obtained and reviewed data from and interviewed officials at the U.S. Transportation Command, U.S. Air Mobility Command, Office of the Secretary of Defense, and representatives of six CRAF participants, which represent about 38 percent of the total CRAF aircraft commitment, to conclude whether the participants could respond to an activation with the required number of aircraft and crews and in the required time frame. We also interviewed representatives of six CRAF participants, representing both passenger and cargo air carriers, to determine whether the incentives used to attract and retain program participants are effective. For clarification on the incentives and how they are used, we referred to DOD regulations and interviewed officials at the U.S. Transportation Command, the U.S. Air Mobility Command, and the General Services Administration. We analyzed AMC mission data to determine the capacity at which aircraft were flying. We met with officials at the 436th Aerial Port Squadron at Dover Air Force Base to discuss cargo and aircraft loading. We conducted our review between January and October 2002 in accordance with generally accepted government auditing standards. 
We are sending copies of this report to the Secretary of Defense, the appropriate congressional committees, and the Director, Office of Management and Budget. We will also make copies available to others upon request. In addition, the report will be available at no charge on the GAO Web site at http://www.gao.gov. If you have any questions about this report, please contact me at (757) 552-8100. See appendix III for major contributors. The Department of Defense (DOD) uses commercial carriers for two different kinds of peacetime airlift moves: The first (called fixed buy) is a set contract for "channel flights" made on a regular weekly schedule from U.S. bases to fixed points across Atlantic and Pacific routes. The second (called expansion buys) includes airlift bought after the start of the fixed buy contract to support exercises, contingencies, special airlift assignment missions, and growth in channel requirements. From fiscal years 1992 through 1997, DOD contracts for commercial passenger and cargo business averaged over $611 million a year. From fiscal years 1998 through 2001, contracts increased to an average of almost $640 million a year. In fiscal year 2002, contracts increased significantly to almost $1.3 billion, which Air Mobility Command officials attributed to missions flown in support of Operation Enduring Freedom, the operation in Afghanistan. (See table 4.) In addition to those named above, Lawrence E. Dixon, Patricia Lentini, Stefano Petrucci, and Kenneth Patton made key contributions to this report. | In the event of a national emergency, the Department of Defense (DOD) can use commercial aircraft drawn from the Civil Reserve Air Fleet to augment its own airlift capabilities. The Civil Reserve Air Fleet is a fleet of aircraft owned by U.S. commercial air carriers but committed voluntarily to DOD for use during emergencies. After the terrorist attacks of September 11, 2001, many air carriers experienced financial difficulties. 
This sparked concern about the fleet's ability to respond, if activated, and prompted the Subcommittee to ask GAO to determine whether the fleet could respond to an activation with the required number of aircraft and crews and in the required time frame. The Subcommittee also wanted to know whether the incentives used to attract and retain participants are effective. Civil Reserve Air Fleet participants can respond to an emergency or a war with the required number of aircraft and crews and within the required time frame. Currently, there are more aircraft committed to the fleet than are needed to fill the wartime requirements identified in the DOD Mobility Requirements Study 2005, which determined the requirements to fight and win two major theater wars. However, Civil Reserve Air Fleet requirements may increase the next time mobility requirements are studied. The last mobility requirements study was limited in that it did not consider the use of excess Civil Reserve Air Fleet capacity and the ability of some commercial aircraft to carry larger cargo than standard-sized pallets. The incentives currently in place to encourage participation in the program, especially the incentive to participate in DOD's peacetime business, might be losing effectiveness and could become disincentives in the future. Some participants are not able to bid on peacetime cargo business because their fleets do not include B-747s, the predominant aircraft DOD uses for peacetime cargo missions. GAO found that B-747s carried out 94 percent of 946 missions flown by commercial aircraft in the first 10 months of fiscal year 2002. Furthermore, over 40 percent of recent missions did not use all available space or weight limits aboard B-747s. These missions might have been carried out less expensively with smaller wide-body aircraft. Using smaller aircraft would provide more peacetime business to a greater share of program participants, thus enhancing current incentives.
However, the Air Force does not have sufficient management information to identify options for selecting the best available aircraft type for the mission. | 4,256 | 503 |
A major goal of Customs is to prevent the smuggling of drugs into the country by attempting to create an effective drug interdiction, intelligence, and investigation capability that disrupts and dismantles smuggling organizations. Although Customs inspectors have the option to conduct examinations of all persons, cargo, and conveyances entering the country, the inspectors may selectively identify for a thorough inspection those that they consider high risk for drug smuggling. This identification is generally done through the use of databases available to Customs, such as TECS. TECS is designed to be a comprehensive enforcement and communications system that enables Customs and other agencies to create or access lookout data when (1) processing persons and vehicles entering the United States; (2) communicating with other computer systems, such as the Federal Bureau of Investigation's National Crime Information Center; and (3) storing case data and other enforcement reports. In addition to Customs, TECS has users from over 20 different federal agencies, including the Immigration and Naturalization Service; the Bureau of Alcohol, Tobacco and Firearms; the Internal Revenue Service; and the Drug Enforcement Administration. The TECS network consists of thousands of computer terminals that are located at land border crossings along the Canadian and Mexican borders; sea and air ports of entry; and the field offices of Customs' Office of Investigations and the Bureau of Alcohol, Tobacco and Firearms. These terminals provide access to records and reports in the TECS database containing information from Customs and other Department of the Treasury and Department of Justice enforcement and investigative files. According to the TECS user manual, all TECS users (e.g., Customs inspectors and special agents) can create and query subject records, which consist of data on persons, vehicles, aircraft, vessels, businesses or organizations, firearms, and objects. 
According to TECS Data Standards, records should be created when the subject is deemed to be of law enforcement interest. This interest may be based on previous or suspected violations, such as drug smuggling, or on current or potential investigative interest. One of the reasons for creating a TECS lookout record is to place a person or vehicle in the system for possible matching at Customs' screening locations, such as land border ports of entry. For example, if a vehicle's license plate that was placed on lookout for possible drug smuggling were later matched during a primary inspection at a land border port of entry, that vehicle could be referred for additional scrutiny at a secondary inspection. Inappropriate deletions of TECS lookout records could negatively affect Customs' ability to detect drug smuggling. Although inspectors have the option to conduct a thorough examination of all persons, cargo, and conveyances entering the country, they selectively identify for a thorough inspection only those that they consider high risk for drug smuggling. This identification is generally done through the use of databases available to Customs, such as TECS. Inspectors also rely on their training and experience to detect behavior that alerts them to potential drug violators. If lookout records have been inappropriately deleted, inspectors will have less information or less accurate information on which to make their decisions. The TECS administrative control structure consists of a series of System Control Officers (SCO) at various locations, including Customs headquarters, CMCs, and ports around the country. These SCOs are responsible for authorizing and controlling TECS usage by all of the users within the network.
A national SCO has designated other SCOs at Customs headquarters for each major organization (e.g., Office of Investigations, Field Operations, and Internal Affairs) who, in turn, have designated regional SCOs who have named SCOs at each CMC and Office of Investigations field office. In some instances, SCOs have been appointed at the port of entry and Office of Investigations suboffice level. Consequently, the SCO chain is a hierarchical structure with each user assigned to a local SCO who, in turn, is assigned to a regional SCO, and so on up to the national level. One of an SCO's primary duties is to establish User Profile Records on each user. User Profile Records identify the user by name, social security number, position, duty station, and telephone number. They also identify the social security number of the user's supervisor, the SCO's social security number, and the TECS applications that the user is authorized to access. SCOs at the various levels have certain system authorities they can pass on to other users. For example, the record update level is a required field in the User Profile Record that indicates the user's authority to modify or delete records. SCOs can only assign to a user the level that they have, or a lower level. According to the TECS user manual, record update levels include the following:
1. Users can only modify or delete records they own (i.e., the user created the records or received them as a transfer from the previous owner).
2. Users can modify or delete any record within their specific Customs sublocation, such as a port of entry, thereby ignoring the ownership chain; the user does not have to be the owner of the record.
3. Users can modify or delete any record owned by anyone in their ownership chain.
4. Users can modify or delete any record in the Customs Service, thereby ignoring the ownership chain.
5. Users have a combination of levels two and three.
9. Users can modify or delete any user's record in the database.
According to Customs TECS officials, when a TECS user creates a record and enters it into the system, the user's supervisor is automatically notified of the entry. All records must be viewed by the supervisor. The supervisor must approve the record, and the record must be linked to supporting documentation, such as a Memorandum of Information Received. According to the TECS user manual, TECS users can modify and delete records that they own, and on the basis of the record update level in their User Profile Record, may modify and delete the records of other users as follows: If the users are supervisors or SCOs with the proper record update level (three or five), they may modify and delete the records owned by users in their supervisory or SCO chain. If the users' record update level (two, four, or five) allows, they may modify and delete the records created or owned by other users in a specific Customs sublocation, such as a port of entry. No other controls or restrictions are written in the TECS user manual or any other document that we reviewed. The Federal Managers' Financial Integrity Act of 1982 required, among other items, that we establish internal control standards that agencies are required to follow (see 31 U.S.C. 3512). The resulting Comptroller General's standards for internal controls in the federal government contain the criteria we used to assess Customs' controls over the deletion of lookout records from TECS. During our review, we identified three areas of control weakness: separation of duties, documentation of transactions, and supervision. The Comptroller General's internal control standards require that key duties and responsibilities in authorizing, processing, recording, and reviewing transactions should be separated among individuals. To reduce the risk of error, waste, or wrongful acts or to reduce the risk of their going undetected, no one individual should control all key aspects of a transaction or event. 
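The record update levels and ownership rules described above can be expressed as a small permission check. The sketch below paraphrases the user manual's summary; the field names and class structure are our own illustration, not TECS code.

```python
from dataclasses import dataclass, field

@dataclass
class User:
    user_id: str
    update_level: int        # 1, 2, 3, 4, 5, or 9, per the TECS user manual summary
    sublocation: str         # e.g., a specific port of entry
    ownership_chain: set = field(default_factory=set)  # user_ids in this user's chain

@dataclass
class Record:
    owner_id: str
    sublocation: str

def can_delete(user: User, record: Record) -> bool:
    """Sketch of the deletion authorities implied by the record update levels."""
    if record.owner_id == user.user_id:
        return True                          # owners may always delete their own records
    in_subloc = record.sublocation == user.sublocation
    in_chain = record.owner_id in user.ownership_chain
    if user.update_level == 2:
        return in_subloc                     # any record in the user's sublocation
    if user.update_level == 3:
        return in_chain                      # any record in the user's ownership chain
    if user.update_level == 4:
        return True                          # any record in the Customs Service
    if user.update_level == 5:
        return in_subloc or in_chain         # combination of levels two and three
    if user.update_level == 9:
        return True                          # any user's record in the database
    return False                             # level 1: own records only
```

Note that nothing in this check requires a second person's concurrence or a documented reason; a single user holding a sufficient level can delete another user's record outright.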
Rather, duties and responsibilities should be assigned systematically to a number of different individuals to ensure that effective checks and balances exist. Key duties include authorizing, approving, and recording transactions and reviewing or auditing transactions. Customs' current policy authorizes a wide variety of people within and outside of an individual's supervisory and SCO chain to individually delete the records that another individual owns without any checks and balances (e.g., concurrence by another person). This situation increases risk because, as one SCO that we interviewed told us, the more individuals--supervisors, SCOs, or others--with the required record update levels there are, the more vulnerable TECS is to having records inappropriately altered or deleted. According to the TECS user manual, supervisors, SCOs, and other users with the proper record update level may delete TECS records that they do not own. Moreover, we noticed a range in the number of individuals who were authorized to individually delete others' records at the three CMCs and three ports we visited. For example, the Southern California CMC had 1 official--the SCO--with the authority to delete others' records, while the Arizona CMC had 41 individuals--supervisors, SCOs, and others--with that authority. In addition, 1 of the ports we visited (Nogales) had 22 individuals with the authority to delete any record within their port without the record owner's or anyone else's permission. In these instances, many individuals, by virtue of their status as a supervisor or SCO or because they possessed the required record update level, were able to delete records with no checks and balances in evidence. The Comptroller General's standards require that internal control systems and all transactions and other significant events are to be clearly documented, and that the documentation is to be readily available for examination. 
Documentation of transactions or other significant events should be complete and accurate and should facilitate tracing the transaction or event and related information from before it occurs, while it is in process, to after it is completed. Neither Customs policies nor the TECS user manual contained standards or guidance to require that Customs officials document reasons for the deletion of TECS lookout records. Although TECS can produce detailed information on what happened to records in the system and when it happened, there is no requirement that the person deleting the record is to describe the circumstances that called for the deletion. Thus, examiners cannot determine from the documentation whether the deletion was appropriate. The Comptroller General's standards require that qualified and continuous supervision is to be provided to ensure that internal control objectives are achieved. This standard requires supervisors to continuously review and approve the assigned work of their staffs, including approving work at critical points to ensure that work flows as intended. A supervisor's assignment, review, and approval of a staff's work should result in the proper processing of transactions and events, including (1) following approved procedures and requirements; (2) detecting and eliminating errors, misunderstandings, and improper practices; and (3) discouraging wrongful acts from occurring or recurring. Customs had no requirement for supervisory review and approval of record deletions, although supervisory review and approval were required for creating TECS records. TECS officials told us that users could delete records that they own without supervisory approval. In addition, anyone with a higher record update level than the record owner, inside or outside of the owner's supervisory and SCO chain, could also delete any owner's record without obtaining approval. 
TECS lookout records can provide Customs inspectors at screening areas on the Southwest border with assistance in identifying persons and vehicles suspected of involvement in drug smuggling. Internal control weaknesses over deletions of the records may compromise the value of these tools in Customs' anti-drug smuggling mission. Most of the CMCs and ports we reviewed had many individuals who were able to delete TECS records without any checks and balances, regardless of whether they owned the records or whether they were in an authorized supervisory or SCO chain of authority. In addition, Customs' current policy authorizes a wide variety of people within and outside of an individual's chain of authority to delete records that other individuals created. The more people inside or outside of the supervisory or SCO chain of authority who can delete records without proper checks and balances, the more vulnerable the records are to inappropriate deletions. Although our review was limited to Customs headquarters, three CMCs, and three ports of entry, because of the lack of systemwide (1) internal control standards concerning deletion authority and (2) specific guidance concerning the deletion of TECS records that comply with the Comptroller General's standards for internal controls, it is possible that TECS lookout records are not adequately safeguarded in other CMCs and other ports of entry as well. To better ensure that TECS lookout records are adequately safeguarded from inappropriate deletion, we recommend that the Commissioner of Customs develop and implement guidance and procedures for authorizing, recording, reviewing, and approving deletions of TECS records that conform to the Comptroller General's standards. These procedures should include requiring supervisory review and approval of record deletions and documenting the reason for record deletions.
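One way to read the recommendation is as a two-step deletion workflow: a documented reason plus a second person's approval. The sketch below is illustrative only; it is not a design Customs has adopted, and the names and structure are assumptions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DeletionRequest:
    record_id: str
    requested_by: str
    reason: str                      # documented justification, per the recommendation
    approved_by: Optional[str] = None

def approve_deletion(req: DeletionRequest, supervisor_id: str) -> DeletionRequest:
    """Require a documented reason and a second person's approval before deletion."""
    if not req.reason.strip():
        raise ValueError("a reason for the deletion must be documented")
    if supervisor_id == req.requested_by:
        raise ValueError("a requester may not approve his or her own deletion")
    req.approved_by = supervisor_id  # separation of duties: approval is by another person
    return req
```

Under such a scheme, the deletion itself would proceed only when `approved_by` is set, so no single individual controls all key aspects of the transaction.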
The Treasury Under Secretary for Enforcement provided written comments on a draft of this report, and the comments are reprinted in appendix I. Overall, Treasury and Customs management generally agreed with our conclusions. Treasury officials also provided technical comments, which have been incorporated in the report as appropriate. Customs has begun action on our recommendation. Customs recognized that there is a systemic weakness in not requiring supervisory approval for the deletion of TECS records and not requiring an explicit reason for the deletion of these records. Customs agreed to implement the necessary checks and balances to ensure the integrity of lookout data in TECS. We are providing copies of this report to the Chairmen and Ranking Minority Members of House and Senate committees with jurisdiction over the activities of the Customs Service, the Secretary of the Treasury, the Commissioner of Customs, and other interested parties. Copies also will be made available to others upon request. The major contributors to this report are listed in appendix II. If you or your staff have any questions about the information in this report, please contact me on (202) 512-8777 or Darryl Dutton, Assistant Director, on (213) 830-1000. Brian Lipman, Site Senior
| Pursuant to a congressional request, GAO reviewed the internal control techniques the Customs Service has in place to safeguard certain law enforcement records in the Treasury Enforcement Communications System (TECS) from being inappropriately deleted. GAO noted that: (1) Customs did not have adequate internal controls over the deletion of TECS lookout records; (2) standards issued by the Comptroller General require that: (a) key duties and responsibilities in authorizing, processing, recording, and reviewing transactions should be separated among individuals; (b) internal control systems and all transactions and other significant events should be clearly documented; and (c) supervisors should continuously review and approve the assigned work of their staffs; (3) however, guidance on TECS does not require these safeguards and Customs officials at the three ports GAO visited had not implemented these controls; (4) as a result, Customs employees could inappropriately remove lookout records from TECS; and (5) although GAO's review was limited to Customs headquarters, three Customs Management Centers, and three ports of entry, because of the lack of adequate systemwide internal control standards over deletion authority, it is possible that TECS lookout records may not be adequately safeguarded in other ports of entry as well. | 3,291 | 269 |
Geriatric assessment, defined as the skillful gathering of information about an elderly person's health, needs, and resources, is a potentially useful component of any program for frail elderly clients needing home and community-based long-term care. Such assessment is especially relevant to multiservice programs that pay for a wide variety of services, such as the Medicaid waiver programs found in 49 states. These programs are authorized by the Social Security Act, which allows for the waiver of certain Medicaid statutory requirements to enable states to cover home and community-based services as an alternative to client institutionalization. Such waivers, however, need not be statewide and can specifically target selected groups of individuals (for example, the elderly). The home and community-based services must be furnished in accordance with a plan of care approved by the State Medicaid Agency. The instruments used to determine the level of care, the qualifications of those using these instruments, and the processes involved in assessment are systematically reviewed and must be approved by the administrative staff of the Medicaid program. These controls on the tools, personnel, and processes involved in establishing program eligibility are likely to benefit the care planning process. However, relatively little is known about the assessments used by waiver programs to develop care plans for the elderly, how they are used, what they cover, how they are administered, and the qualifications of those who administer them. The elderly clients who apply for home and community-based care usually undergo cycles of assessment. Depending upon each client's assessment, the program determines the services that should be delivered to the client over a period of time, utilizing a clinical decision-making process that results in a plan of care.
Care planning processes vary among and within the states, and there is no single agreed-upon way to translate the results of assessment into a care plan. However, without good care planning, even the best assessment may not be helpful in achieving the most appropriate services for clients. Starting from this plan, program personnel (or personnel contracted by the program) directly authorize appropriate services and, when services are not available through the waiver program, may provide information to the client on how those services might be obtained. As the client's needs for services change or a specified period of time passes, program personnel reassess the needs and adjust the care plan accordingly. Each state Medicaid waiver program for the elderly has the freedom to develop and adopt its own assessment instrument with no specific federal guidelines for content or process of administration. Most of the information gathered by these instruments falls under one of six broad domains, which are recommended by experts in geriatric assessment and found in most of the published instruments developed to assess the frail elderly. They are: (1) physical health, (2) mental health, (3) functioning (problems with daily activities), (4) social resources, (5) economic resources, and (6) physical environment. To the extent that these domains are included, the instrument can be thought of as comprehensive. The completion of the assessment instrument is often based on one or more interviews between the client and the assessor. Information from other sources, such as medical records or interviews with family members, may also be included. Regardless of its formal elements, the entire assessment process must be skillfully coordinated by the assessor or assessors involved. 
This is necessary to maximize the useful information obtained within the limits set by the capacities of the elderly clients being served and their understandable preference to "tell their stories" as they choose. We conducted a literature review on assessment instruments; interviewed experts in geriatric assessment and state and local officials; and visited several state Medicaid programs (California, Oregon, and Florida). From the exhaustive literature review and interviews with the nationally recognized experts identified through the literature, we learned about good practices in geriatric assessment. (See appendix I for a list of experts.) From officials and visits to state programs, we learned about the goals, procedures, and difficulties of assessment in the field and gathered information to help inform our data collection. We then conducted a survey of all 50 states and the District of Columbia about their assessment instruments for the Medicaid waiver programs that provide the elderly with multiple services (in some places referred to as elderly and disabled waiver programs). We asked the head of each waiver program (or the most appropriate staff) to complete a questionnaire and send us a copy of their assessment instruments used to develop the care plans of elderly clients. The questionnaire requested two kinds of information: (1) general information about the program and (2) detailed information about the assessment instrument or instruments used to develop the clients' care plans, the assessment and care planning processes, and training and educational requirements of the assessors. After an extensive developmental process, we pretested the questionnaire in two states and incorporated necessary changes suggested by state officials. We then mailed the questionnaire to all states and gathered information between July 1994 and January 1995. 
The District of Columbia and Pennsylvania indicated that they did not have Medicaid waiver programs for the elderly and, therefore, were excluded from our sample. The 49 states with Medicaid waiver programs all responded to our questionnaire. We conducted our work in accordance with generally accepted government auditing standards. All 49 states reported to us that they use an assessment instrument to determine the care plan for each client, including the identification of needed services available both through the waiver program and outside the program. In addition, 43 states use the assessment to determine an elderly person's functional eligibility for the waiver program (level of care), and 31 states use part of the instrument as a preadmission screen for possible nursing home care. The programs rely upon several types of information to develop care plans, including client's preference, clinical impression, assessment scores, caregiver's preference, budgetary caps, and medical records. Most programs use the assessor's clinical impression, based on the assessment interview, and any scores or ratings generated by the assessment process most or all of the time. (See table 1.) Forty-eight of the programs told us that they "almost always" or "most of the time" provide clients with information about providers from whom they can get services not offered by the waiver program; 45 states provide them with referrals to such services; 35 provide them with assistance in obtaining these services; and 34 of the programs follow up clients to verify that the nonwaiver services have been obtained. It should be noted that some of these nonwaiver services may also be Medicaid-funded, such as home health care provided by Medicaid. 
We found that although all instruments gather some information on the broad domains of physical health, mental health, and functioning, not all of them cover the other three domains of a comprehensive assessment of an elderly person (84 percent cover social resources, 69 percent cover economic resources, and 80 percent cover physical environment). Within each of the six domains, certain specific topics are covered by a number of instruments. We found that all state instruments consistently gather information on assistance with activities of daily living (for example, bathing, toileting, and dressing). Table 2 shows the relative frequency of occurrence of any coverage whatsoever for each domain and for each topic found in 10 percent or more of the instruments. This list of topics does not represent an accepted standard. Different topics within a domain may yield similar or equivalent information. There may be other topics, not listed, that can also contribute to comprehensive assessment, and for some clients, skillful probing by assessors may be needed to obtain important contextual information not listed on any assessment form. It should also be acknowledged that, in particular instances, selected topics missing from instruments do not imply that states are not informed about these topics. Such information may be available from other sources. Also, the nature of the program or characteristics of the population may make certain information less relevant. For example, the financial eligibility rules of some states may obviate the need to ask about all the topics in the economic resources domain. Such repetition of topics would make the assessment unreasonably burdensome for the clients as well as for those programs with relatively limited resources (staff, time, or money). 
Less comprehensive instruments should be evaluated in the context of their particular programs to determine if sufficient information is collected about the client's physical and mental health, functional status, social and economic supports, and home environment to develop an appropriate care plan. We found that although most assessments are conducted as face-to-face interviews, only 35 percent of the instruments specify the wording of any of the interview questions that assessors ask the clients. Further, when the wording is not specified, it is often unclear in what order different elements of information are to be gathered. Instruments with specified wording, however, are usually designed to gather information in a particular order. This lack of uniformity in instrument administration may lead to unnecessary variation in how different clients perceive, and therefore respond to, requests for "the same information." For example, some replies to questions about depression may differ depending on whether they are asked before or after questions about physical health. Also, questions about activities of daily living, such as bathing, may evoke different replies depending on whether the client is asked if he or she "can bathe" or "does bathe." Although there may be no universally agreed-upon "correct" wording for such items, once such a wording is decided upon, there may be benefits to employing it consistently within a given program. We found that 53 percent of the programs using a single assessor mention a years-of-experience requirement, and 57 percent of the programs using a team of two assessors mention this requirement for their lead assessor (for the second assessor, it is 50 percent). Moreover, most states require assessors to possess specific professional credentials. Thus, programs attempt in various ways, such as by the adoption of hiring (or contracting) and training standards, to ensure that assessors perform their job competently. 
However, no particular background or training requirements can guarantee optimal assessment for all clients. We found that only 31 percent of the programs require training the assessor in how to use the instrument, although such training may be obtained without a requirement. Assessors who are not similarly trained in the use of the instrument, regardless of their credentials or other training, may not respond uniformly to common occurrences, such as a client's fatigue or a request to clarify a question. Assessors may administer the same instrument differently, even with standardized order and wording of the questions, based on differences in clinical training or experience in other situations. In light of the observed variability in waiver program assessments--with respect to instrument content, instrument standardization, and assessor requirements--the experts we consulted and the literature in gerontology make the following suggestions for improvement: First, a number of topics, such as those listed in table 2, have proved useful in assessing the elderly. Programs that do not cover a wide variety of these can increase the comprehensiveness of their assessments by including more of these topics. Second, standardizing the wording and order of questions generally increases the comparability of the clients' replies. Finally, another important element in achieving uniformity of instrument administration is assessor training in use of the instrument. We have drawn three conclusions about the assessment instruments and their administration. First, we found that although all states use assessment instruments to develop a care plan, there is variation in their level of comprehensiveness. Second, we found that although most assessments are conducted as face-to-face interviews, many state instruments do not have standardized wording. 
Third, we found that although training in the administration of the instrument may be important in achieving uniformity of administration, many states do not require such training. The Health Care Financing Administrator provided written comments on a draft of this report. (See appendix II.) The agency did not disagree with our findings, but listed some circumstances that help clarify variations across states. Specifically, they noted that waiver programs are frequently administered by different state agencies, which not only bring different perspectives to the assessments, but also use them for a variety of different purposes and may use more than one instrument. Through our state survey, we also found that some states use multiple assessment instruments, and some use them for multiple purposes. In oral comments on our draft report, responsible agency officials made some technical comments. We have incorporated these into the text where appropriate. As discussed with your office, we will be sending copies of this report to the Subcommittee Chairman, to other interested congressional committees and agencies, and to the Department of Health and Human Services and the Health Care Financing Administration. We will also send copies to others who request them. If you or your staff have any questions about this report, please call me or Sushil K. Sharma, Assistant Director, at (202) 512-3092. The major contributors to this report are listed in appendix III. Kathleen C. Buckwalter, Ph.D., University of Iowa Robert Butler, M.D., Mount Sinai Medical Center, N.Y. Donald M. Keller, Project Manager Venkareddy Chennareddy, Referencer We wish to acknowledge the assistance of R.E. Canjar in collecting and organizing the data and Richard C. Weston in ensuring data quality. The first copy of each GAO report and testimony is free. Additional copies are $2 each. 
Orders should be sent to the following address, accompanied by a check or money order made out to the Superintendent of Documents, when necessary. VISA and MasterCard credit cards are accepted, also. Orders for 100 or more copies to be mailed to a single address are discounted 25 percent. U.S. General Accounting Office P.O. Box 6015 Gaithersburg, MD 20884-6015 Room 1100 700 4th St. NW (corner of 4th and G Sts. NW) U.S. General Accounting Office Washington, DC Orders may also be placed by calling (202) 512-6000 or by using fax number (301) 258-4066, or TDD (301) 413-0006. Each day, GAO issues a list of newly available reports and testimony. To receive facsimile copies of the daily list or any list from the past 30 days, please call (202) 512-6000 using a touchtone phone. A recorded menu will provide information on how to obtain these lists. | Pursuant to a congressional request, GAO reviewed how publicly funded programs assess the need for home and community-based long-term care services for the poor disabled elderly, focusing on the: (1) comprehensiveness of the assessment instruments; (2) uniformity of their administration; and (3) uniformity of training for staff who conduct the assessments. 
GAO found that: (1) all 49 states reviewed use an assessment instrument to determine the long-term care needs of the poor disabled elderly and some also use them for other eligibility determinations; (2) 48 of the programs provide information to their clients about services not covered and most give referrals and assistance to obtain those services; (3) all of the assessment instruments covered physical and mental health and functional abilities of the disabled elderly, but inclusion of their social resources, economic resources, and physical environment ranged from 69 percent to 84 percent; (4) dependence on assistance with daily living activities was the only specific topic included in all instruments; (5) most assessments use face-to-face interviews, but only a minority of them specify the wording of questions; (6) most programs have experience and professional credential requirements for their assessors, but most programs do not require standardized training; and (7) experts believe that assessment instruments could be improved by including more topics, standardizing the wording and order of questions, and training assessors in use of the instruments. | 3,047 | 287 |
Since the end of the Cold War, the Navy has emphasized a strategy of littoral warfare. As part of this strategy, the Navy and the Marine Corps have been developing operational concepts for amphibious warfare, which rely heavily on the ability to launch and support amphibious assaults from ships up to 25 nautical miles from the enemy's shore. According to the Navy and the Marine Corps, to successfully conduct amphibious operations, the Marine Corps requires all-weather fire support. If artillery and other ground-based fire support assets are not available, Marine Corps ground forces will need long-range fire support from Navy surface ships or from attack helicopters and fixed-wing aircraft. Currently, the Navy operates the 5-inch, 54-caliber gun on cruisers and destroyers, which can fire unguided projectiles a maximum range of about 13 nautical miles. According to the Navy and the Marine Corps, this short range combined with threats to surface ships from mines and antiship missiles currently preclude the Navy from adequately supporting Marine Corps amphibious operations or engaging other long-range targets. The Congress has been interested in the Navy's plans for NSFS since 1991. The National Defense Authorization Act for Fiscal Years 1992 and 1993 required (1) the Secretary of the Navy to provide a report to the Congress outlining NSFS requirements and survey alternative technologies and other options that could meet these requirements; (2) the Secretary of Defense, through the Institute for Defense Analysis, to provide a study of naval ship-to-shore fire support requirements and cost-effective alternatives; and (3) the Navy to conduct a cost and operational effectiveness analysis (COEA) based on the requirements and technologies identified in the first report. In the conference report to the National Defense Authorization Act for Fiscal Year 1995, the Congress required the Secretary of the Navy to submit a report on the Navy's NSFS plan. 
At the time of this review, this report has not been submitted to the Congress. In February 1993, the Center for Naval Analyses began the COEA. It evaluated the performance of 10 existing and candidate 5- and 8-inch and 155-millimeter gun systems with different propellants, flight classifications, and warhead types against target sets for three scenarios, two of which represented major regional conflicts. The third scenario represented a noncombatant evacuation operation. The Navy also evaluated seven missile concepts against these scenarios because it found that none of the gun systems could handle all of the target sets. The scenarios and target sets were developed along with the Marine Corps and validated by the COEA's oversight board. The COEA identified eight gun systems that, when combined with missiles, were capable of attacking at least 95 percent of the targets in the major regional conflict scenarios at the lowest total estimated cost. Five of these systems were 155-millimeter variants, and three were 8-inch variants with different propellants and calibers. The COEA concluded that a 155-millimeter, 60-caliber gun system with an advanced propellant and precision-guided munitions in combination with the Tomahawk missile was the most cost-effective NSFS option. According to the Navy, the only 5-inch gun candidate that was able to compete with other gun systems modeled in the COEA was a 5-inch, 70-caliber Magnum gun. This gun does not exist and would have to be developed. The COEA found that, for both major regional conflict scenarios, fewer 155-millimeter munitions and long-range missiles would be needed to hit a majority of the target sets than 5-inch, 70-caliber munitions and missiles. For example, the Navy could hit 99 percent of the targets in one scenario with 1,316 fewer 155-millimeter projectiles, and 34 fewer long-range missiles at a wartime cost of about $69 million less than with a combination of 5-inch, 70-caliber projectiles and missiles. 
Also, the COEA stated that, if the NSFS program became fiscally constrained, development of a 5-inch, 70-caliber gun might save money in the near term, making it an attractive option because of lower research and development costs, but (1) wartime costs would be considerably higher than with larger guns and (2) a 5-inch, 70-caliber gun would not adequately cover the targets. The Navy subsequently developed the NSFS program based on the results of the COEA. In March 1994, the Navy proposed (1) developing a new 155-millimeter, 60-caliber gun; (2) developing, along with the Army, a new 155-millimeter precision-guided munition; and (3) researching different propellants, including electro-thermal-chemical and liquid propellants. The Navy planned to field these new systems by fiscal year 2003. The Navy also proposed providing limited upgrades to existing 5-inch guns to achieve greater ranges until the 155-millimeter gun became available and planned to conduct concept demonstrations of various missiles. According to the Navy, the NSFS program had the potential for joint development of various propellants and commonality with Army 155-millimeter munitions. To fund this overall program, the Navy included $360 million for research and development in its proposed Future Years Defense Program for fiscal years 1996-2001 and expected to field the 155-millimeter gun in fiscal year 2003 on new-production DDG-51 destroyers or on a follow-on surface ship, known as SC-21. Funding shortfalls in the Navy's fiscal year 1996 program objective memorandum led to a decision by the Navy to cut its NSFS program in August 1994 to help pay for programs that the Marine Corps considered vital to its amphibious capabilities. These programs included the V-22 medium-lift aircraft and the Advanced Amphibious Assault Vehicle. 
According to program officials, to stay within the reduced funding level, the Navy canceled plans to develop the 155-millimeter, 60-caliber gun and the 155-millimeter precision-guided munition and scaled back efforts to develop advanced propellants for 155-millimeter munitions. The Navy said it would consider this option as a long-term NSFS solution as it develops its new surface combatant ship, the SC-21. In the interim, the Navy has decided to upgrade its existing 5-inch, 54-caliber guns and develop a 5-inch precision-guided munition. According to program officials, the Navy made this decision primarily because it believed that modifying existing guns would be the quickest way to gain better gun capability at the least cost. In December 1994, the Chief of Naval Operations approved the Navy's revised NSFS plan, and in January 1995, directed the Naval Sea Systems Command to (1) initiate upgrades to the 5-inch, 54-caliber gun to deliver precision-guided munitions; (2) develop a 5-inch precision-guided munition with an initial operational capability before fiscal year 2001; and (3) scale back liquid propellant gun technology efforts. In addition, the Chief of Naval Operations directed that no funds be used to develop the 155-millimeter gun. According to the Navy, it will need about $246 million in research and development funds between fiscal years 1996 and 2001 for the revised NSFS program. About $165 million will be required to develop the precision-guided munition, $56 million to upgrade the 5-inch gun, and $25 million will be needed for research and development on NSFS-related command and control systems. The Navy included $160.2 million in its Future Years Defense Program for fiscal years 1996-2001 for research and development of the 5-inch gun and precision-guided munition, including $12 million for fiscal year 1996. As a result, the Navy's research and development program is underfunded by about $86 million. 
Navy officials told us that funds would be added to the program in fiscal year 1997. In November 1994, 3 months after the Navy proposed the 5-inch, 54-caliber gun solution, the Marine Corps established a range requirement for NSFS that is less than the range requirements assumed in the COEA. Although the COEA does not specify a range requirement, the COEA assumed that a majority of the NSFS targets in the major regional conflict scenarios were located within 75 nautical miles of the fire support ship. This requirement was consistent with the findings of the July 1992 Navy NSFS requirements study and the June 1993 Institute for Defense Analysis study, which found that 75 nautical miles was the maximum required range to support the Marine Corps' operational concepts. Although range estimates for an upgraded 5-inch, 54-caliber gun vary, all estimates are less than 75 nautical miles. The June 1993 Institute for Defense Analysis study estimated that an advanced 5-inch gun projectile with rocket-assisted propulsion could achieve a range between 45 and 65 nautical miles. Navy officials told the Chief of Naval Operations that an upgraded 5-inch gun could achieve ranges between 45 and 70 nautical miles depending on the scope of the upgrade and the type of propellant used in the precision-guided munition. According to the Navy, to achieve a 70 nautical mile range, electro-thermal-chemical propellants may be needed, but these propellants have not yet been developed. In November 1994, the Marine Corps established a requirement for NSFS in terms of range, volume of fire, and lethality. Although it participated in developing the original 75 nautical mile range target assumption used in the COEA, the Marine Corps decided that the minimum range requirement for NSFS should be 41.3 nautical miles and that the maximum range should be 63.1 nautical miles. 
The Marine Corps based these ranges on its intent to use NSFS during the initial stages of an amphibious operation until artillery is ashore. Because its 155-millimeter towed artillery would be unavailable during the initial stages of an amphibious operation, the Marine Corps concluded that NSFS, at a minimum, must provide the same range, lethality, and accuracy as current artillery systems. The minimum 41.3 nautical mile range consists of the 25 nautical mile ship-to-shore distance plus a 16.3 nautical mile (30 kilometers) distance representing the maximum range of existing Marine Corps 155-millimeter artillery with rocket-assisted projectiles. To derive the maximum range of 63.1 nautical miles, the Marine Corps used the accepted minimum range for threat artillery articulated in the Army Field Artillery COEA of 21.8 nautical miles (40 kilometers) and added this range to the minimum range of 41.3 nautical miles. The Marine Corps' intent to use NSFS during the initial stages of amphibious landing operations was outlined in the NSFS mission needs statement, which was signed by the Navy in May 1992. According to the statement, NSFS also involves suppressing and destroying hostile antiship weapons and air defense systems, delaying and disrupting enemy movements, and reinforcing defending forces. Marine Corps and Navy requirements officials also told us that the Marine Corps revised the 75 nautical mile range requirement because it was not logical, specifically defined, or formally agreed to by the Navy or the Marine Corps. We found this surprising because Navy and Marine Corps officials were involved in developing the target sets used in the COEA's scenarios. The scenarios and target sets were also approved by officials from both services serving on the COEA's oversight board. 
The fact that the Navy and the Marine Corps established the new range requirement after the Navy completed work on the COEA and restructured the program raises questions about the validity of NSFS range requirements. The Marine Corps did not assess the impact of its new requirement on the target sets originally developed for the COEA or conduct any further analysis to validate these ranges. Therefore, the importance to the NSFS mission of targets located between 63 and 75 nautical miles from the ship is not clear. According to defense acquisition management policies and procedures, a COEA is intended to assist decisionmakers in choosing the best system alternative for the money invested and not to justify decisions that have already been made. The Navy did not perform a supplemental analysis to its original COEA before it decided to restructure the NSFS program. The Navy is currently conducting a supplemental analysis to evaluate near-term alternatives for NSFS. According to the Navy, this analysis will reflect the new Marine Corps' maximum range requirement of 63.1 nautical miles and be limited only to 5-inch gun options. The Navy has asked the Center for Naval Analyses to complete this analysis by May 1995. It is not clear whether a supplemental analysis that considered all gun options--5 and 8 inch and 155 millimeter--against the Marine Corps' new distance requirements would support the Navy's decision to upgrade the 5-inch gun because (1) larger guns firing advanced projectiles with more payload can attack more targets than smaller, 5-inch guns and (2) the original COEA found that the rankings of the eight most cost-effective systems were not sensitive to range. The original COEA assessed the effectiveness of the eight most cost-effective systems when the ship-to-shore distance was reduced from 25 to 5 nautical miles and found that the cost-effectiveness rankings of the systems remained basically the same. 
Even at shorter ranges, the 155-millimeter, 60-caliber gun and Tomahawk missile combination remained the most cost-effective NSFS option. The Congress may wish to consider not authorizing or appropriating fiscal year 1996 funds for NSFS until the Navy has (1) determined and validated NSFS requirements and (2) conducted a comprehensive supplemental analysis to the COEA that includes all available gun and missile alternatives. The Department of Defense (DOD) did not concur with either the thrust of this report or the matter for congressional consideration (see app. II). DOD took issue with three major issues in the report: the Marine Corps' range requirement, the Navy's long-term plans for the 155-millimeter gun, and our suggestion that the Navy is revising the COEA to justify decisions it had already made. DOD noted that the report incorrectly alludes to a Marine Corps initial NSFS requirement of 75 nautical miles. DOD said that the minimum 41.3 and maximum 63.1 nautical mile ranges established by the Marine Corps in November 1994 was the first explicit statement of the requirement based on a practical analysis of war-fighting scenarios. We do not agree with DOD's position. Although the COEA did not include a specific range requirement, a majority of the targets in the major regional conflict scenarios modeled by the COEA were located within 75 nautical miles of the fire support ship. The 75 nautical mile range was consistent with the findings of the July 1992 Navy NSFS requirements study and the June 1993 Institute for Defense Analysis study, which found that 75 nautical miles was the maximum required range to support the Marine Corps' operational concepts. Further, the Navy did not conduct an analysis to validate the relationship between the target set used in developing the COEA and the Marine Corps' new maximum range requirement of 63.1 nautical miles. 
Also, it should be noted that the original COEA found that the rankings of the eight most cost-effective systems were not sensitive to range. The original COEA assessed the effectiveness of the eight most cost-effective systems when the ship-to-shore distance was reduced from 25 to 5 nautical miles and found that the cost-effectiveness rankings of the systems remained basically the same. Even at shorter ranges, the 155-millimeter, 60-caliber gun and Tomahawk missile combination remained the most cost-effective NSFS option. DOD said that plans to develop the 155-millimeter gun and precision-guided projectile, as recommended in the COEA, have not been canceled and that this system remains a viable option for inclusion on the SC-21. This differs sharply from what Navy officials told us during the audit. Moreover, no funds have been budgeted for this program in the Future Years Defense Program for fiscal years 1996-2001. Also, in his December 1994 decision to focus on the 5-inch gun upgrade program, the Chief of Naval Operations directed that no funds be used to develop the 155-millimeter gun. DOD said that the Navy was not revising its COEA but was conducting a supplemental analysis to the original NSFS COEA. DOD noted that the purpose of the supplemental analysis was to determine the best near-term NSFS improvements to meet the range requirements established by the Marine Corps in November 1994. However, we note the Navy requested the Center for Naval Analyses to perform the supplemental analysis 2 months after its decision to proceed with the restructured program. Because the Navy has restricted the supplemental analysis to only 5-inch gun solutions, rather than all potential gun solutions, we believe that the supplemental analysis may not determine the most cost-effective, near-term NSFS program. Our recent discussions with officials from the Center for Naval Analyses who are conducting the supplemental analysis has reinforced this view. 
According to these officials, the 5-inch precision-guided munition development program is a high-risk endeavor that requires concurrent development of a number of new technologies. One risk associated with concurrency is that fielding of the munition may be delayed beyond the year 2001. According to the Center for Naval Analyses, another risk is that the 5-inch munition may not be able to meet the Marine Corps' maximum range requirement. DOD also disagreed with the matter for congressional consideration. DOD noted that its near-term program was consistent with the 1993 Institute for Defense Analysis study, which recommended developing advanced projectiles compatible with existing 5-inch, 54-caliber guns for the near term and that sufficient analysis has been conducted for the Navy to proceed with its program. DOD also stated that removal of fiscal year 1996 funding would slow the achievement of both near- and long-term objectives. From the outset, the Navy intended to use the COEA to determine the best program for NSFS. We continue to believe the Navy has not conducted sufficient analysis to support its near-term program. To obtain information on NSFS requirements and the Navy's plans, we interviewed officials and reviewed documents from the Office of the Deputy Chief of Naval Operations for Resources, Warfare Requirements, and Assessments and the Office of the Assistant Secretary of the Navy for Research, Development, and Acquisition, Washington, D.C. We also interviewed officials and reviewed documents at the Marine Corps Combat Development Command, Quantico, Virginia; and the Naval Sea Systems Command, Crystal City, Virginia. We reviewed the Navy and the Office of the Secretary of Defense NSFS studies mandated by the Congress in the National Defense Authorization Act for Fiscal Years 1992 and 1993 and discussed them with Navy officials and representatives of the Institute for Defense Analysis, Alexandria, Virginia. 
The Navy did not provide us with a copy of the COEA, but we reviewed the COEA's summary report dated March 31, 1994, which contained its major findings and conclusions. We discussed the COEA with officials of the Center for Naval Analyses, Alexandria, Virginia. We conducted our review between July 1993 and March 1995 in accordance with generally accepted government auditing standards. We are sending copies of this letter to the Secretaries of Defense and the Navy and the Commandant of the Marine Corps. We will also make copies available to others on request. Please contact me at (202) 512-3504 if you or your staff have any questions concerning this report. Major contributors to this report are Richard Price, Assistant Director; Anton Blieberger, Evaluator-in-Charge; and Robert Goldberg, Senior Evaluator. National Defense Authorization Act for Fiscal Years 1992 and 1993 mandates the Navy and the Office of the Secretary of Defense to assess naval surface fire support (NSFS) needs and the Navy to conduct a formal cost and operational effectiveness analysis (COEA). The Navy signs the NSFS mission needs statement. The Navy issues its first congressionally mandated report on NSFS requirements. The Navy begins the COEA. The Institute for Defense Analysis completes its assessment of NSFS. The Navy completes its work on the COEA and, on the basis of its results, proposes an NSFS program and funding in its Future Years Defense Program for fiscal years 1996-2001. The Navy restructures the NSFS program in light of funding shortfalls and cancels 155-millimeter, 60-caliber gun development. The Marine Corps identifies NSFS range requirements. The COEA is signed out for distribution by the Co-Chairs of COEA oversight board, but is not released to the Congress. The Navy proposes a revised NSFS program to the Chief of Naval Operations and obtains approval. 
The Chief of Naval Operations formally approves the NSFS range requirement and issues formal program guidance directing the Navy to pursue upgrades to 5-inch guns and development of a precision-guided munition. The Navy asks the Center for Naval Analyses to provide a supplemental analysis to its original COEA that reflects the Marine Corps' new range requirements by May 1995. The first copy of each GAO report and testimony is free. Additional copies are $2 each. Orders should be sent to the following address, accompanied by a check or money order made out to the Superintendent of Documents, when necessary. Orders for 100 or more copies to be mailed to a single address are discounted 25 percent. U.S. General Accounting Office P.O. Box 6015 Gaithersburg, MD 20884-6015 Room 1100 700 4th St. NW (corner of 4th and G Sts. NW) U.S. General Accounting Office Washington, DC Orders may also be placed by calling (202) 512-6000 or by using fax number (301) 258-4066, or TDD (301) 413-0006. Each day, GAO issues a list of newly available reports and testimony. To receive facsimile copies of the daily list or any list from the past 30 days, please call (301) 258-4097 using a touchtone phone. A recorded menu will provide information on how to obtain these lists. | Pursuant to a congressional request, GAO reviewed the Navy's upgrade of its surface ships' guns to determine whether the Navy has chosen the most cost-effective system for improving naval surface fire support (NSFS). 
GAO found that: (1) the Navy did not sufficiently analyze its needs before deciding on the upgrade of its 5-inch, 54-caliber guns and the development of a 5-inch precision-guided munition; (2) the Navy determined that the most cost-effective system to meet NSFS needs by fiscal year (FY) 2003 would be a 155-millimeter, 60-caliber gun with an advanced propellant and precision-guided munitions in combination with the Tomahawk Land Attack Missile; (3) although it initially proposed to develop the guns at a cost of about $360 million, the Navy has decided to limit the program to upgrading existing guns and developing precision-guided munitions to meet the reduced funding level; (4) the Navy estimates that research and development (R&D) costs for the 5-inch guns will be about $246 million; (5) the Navy R&D budget has a $86-million shortfall that will be corrected in FY 1997; (6) the Marine Corps has revised its minimum NSFS range requirement to reflect the Navy's restructured gun program; and (7) the Navy is conducting a supplemental analysis to evaluate near-term alternatives for NSFS, but it is unclear whether this analysis will support the Navy's decision to upgrade the 5-inch gun. | 4,817 | 329 |
Because of such emergencies as natural disasters, hazardous material spills, and riots, all levels of government have had some experience in preparing for different types of disasters and emergencies. Preparing for all potential hazards is commonly referred to as the "all-hazards" approach. While terrorism is a component within an all-hazards approach, terrorist attacks potentially impose a new level of fiscal, economic, and social dislocation within this nation's boundaries. Given the specialized resources that are necessary to address a chemical or biological attack, the range of governmental services that could be affected, and the vital role played by private entities in preparing for and mitigating risks, state and local resources alone will likely be insufficient to meet the terrorist threat. Some of these specific challenges can be seen in the area of bioterrorism. For example, a biological agent released covertly might not be recognized for a week or more because symptoms may only appear several days after the initial exposure and may be misdiagnosed at first. In addition, some biological agents, such as smallpox, are communicable and can spread to others who were not initially exposed. These characteristics require responses that are unique to bioterrorism, including health surveillance, epidemiologic investigation, laboratory identification of biological agents, and distribution of antibiotics or vaccines to large segments of the population to prevent the spread of an infectious disease. The resources necessary to undertake these responses are generally beyond state and local capabilities and would require assistance from and close coordination with the federal government. National preparedness is a complex mission that involves a broad range of functions performed throughout government, including national defense, law enforcement, transportation, food safety and public health, information technology, and emergency management, to mention only a few. 
While only the federal government is empowered to wage war and regulate interstate commerce, state and local governments have historically assumed primary responsibility for managing emergencies through police, firefighters, and emergency medical personnel. The federal government's role in responding to major disasters is generally defined in the Stafford Act, which requires a finding that the disaster is so severe as to be beyond the capacity of state and local governments to respond effectively before major disaster or emergency assistance from the federal government is warranted. Once a disaster is declared, the federal government--through the Federal Emergency Management Agency (FEMA)--may reimburse state and local governments for between 75 and 100 percent of eligible costs, including response and recovery activities. There has been an increasing emphasis over the past decade on preparedness for terrorist events. After the nerve gas attack in the Tokyo subway system on March 20, 1995, and the Oklahoma City bombing on April 19, 1995, the United States initiated a new effort to combat terrorism. In June 1995, Presidential Decision Directive 39 was issued, enumerating responsibilities for federal agencies in combating terrorism, including domestic terrorism. Recognizing the vulnerability of the United States to various forms of terrorism, the Congress passed the Defense Against Weapons of Mass Destruction Act of 1996 (also known as the Nunn-Lugar-Domenici program) to train and equip state and local emergency services personnel who would likely be the first responders to a domestic terrorist event. Other federal agencies, including those in the Department of Justice, Department of Energy, FEMA, and Environmental Protection Agency, have also developed programs to assist state and local governments in preparing for terrorist events.
The attacks of September 11, 2001, as well as the subsequent attempts to contaminate Americans with anthrax, dramatically exposed the nation's vulnerabilities to domestic terrorism and prompted numerous legislative proposals to further strengthen our preparedness and response. During the first session of the 107th Congress, several bills were introduced with provisions relating to state and local preparedness. For instance, the Preparedness Against Domestic Terrorism Act of 2001, which you cosponsored, Mr. Chairman, proposes the establishment of a Council on Domestic Preparedness to enhance the capabilities of state and local emergency preparedness and response. The funding for homeland security increased substantially after the attacks. According to documents supporting the president's fiscal year 2003 budget request, about $19.5 billion in federal funding for homeland security was enacted in fiscal year 2002. The Congress added to this amount by passing an emergency supplemental appropriation of $40 billion. According to the budget request documents, about one-quarter of that amount, nearly $9.8 billion, was dedicated to strengthening our defenses at home, resulting in an increase in total federal funding on homeland security of about 50 percent, to $29.3 billion. Table 1 compares fiscal year 2002 funding for homeland security by major categories with the president's proposal for fiscal year 2003. We have tracked and analyzed federal programs to combat terrorism for many years and have repeatedly called for the development of a national strategy for preparedness. We have not been alone in this message; for instance, national commissions, such as the Gilmore Commission, and other national associations, such as the National Emergency Management Association and the National Governors Association, have advocated the establishment of a national preparedness strategy.
The attorney general's Five-Year Interagency Counterterrorism and Technology Crime Plan, issued in December 1998, represents one attempt to develop a national strategy on combating terrorism. This plan entailed a substantial interagency effort and could potentially serve as a basis for a national preparedness strategy. However, we found it lacking in two critical elements necessary for an effective strategy: (1) measurable outcomes and (2) identification of state and local government roles in responding to a terrorist attack. In October 2001, the president established the Office of Homeland Security as a focal point with a mission to develop and coordinate the implementation of a comprehensive national strategy to secure the United States from terrorist threats or attacks. While this action represents a potentially significant step, the role and effectiveness of the Office of Homeland Security in setting priorities, interacting with agencies on program development and implementation, and developing and enforcing overall federal policy in terrorism-related activities is in the formative stages of being fully established. The emphasis needs to be on a national rather than a purely federal strategy. We have long advocated the involvement of state, local, and private-sector stakeholders in a collaborative effort to arrive at national goals. The success of a national preparedness strategy relies on the ability of all levels of government and the private sector to communicate and cooperate effectively with one another. To develop this essential national strategy, the federal role needs to be considered in relation to other levels of government, the goals and objectives for preparedness, and the most appropriate tools to assist and enable other levels of government and the private sector to achieve these goals. Although the federal government appears monolithic to many, in the area of terrorism prevention and response, it has been anything but.
More than 40 federal entities have a role in combating and responding to terrorism, and more than 20 federal entities in bioterrorism alone. One of the areas that the Office of Homeland Security will be reviewing is the coordination among federal agencies and programs. Concerns about coordination and fragmentation in federal preparedness efforts are well founded. Our past work, conducted prior to the creation of the Office of Homeland Security, has shown coordination and fragmentation problems stemming largely from a lack of accountability within the federal government for terrorism-related programs and activities. There had been no single leader in charge of the many terrorism-related functions conducted by different federal departments and agencies. In fact, several agencies had been assigned leadership and coordination functions, including the Department of Justice, the Federal Bureau of Investigation, FEMA, and the Office of Management and Budget. We previously reported that officials from a number of agencies that combat terrorism believe that the coordination roles of these various agencies are not always clear. The recent Gilmore Commission report expressed similar concerns, concluding that the current coordination structure does not provide the discipline necessary among the federal agencies involved. In the past, the absence of a central focal point resulted in two major problems. The first of these is a lack of a cohesive effort from within the federal government. For example, the Department of Agriculture, the Food and Drug Administration, and the Department of Transportation have been overlooked in bioterrorism-related policy and planning, even though these organizations would play key roles in response to terrorist acts.
In this regard, the Department of Agriculture has been given key responsibilities to carry out in the event that terrorists were to target the nation's food supply, but the agency was not consulted in the development of the federal policy assigning it that role. Similarly, the Food and Drug Administration was involved with issues associated with the National Pharmaceutical Stockpile, but it was not involved in the selection of all items procured for the stockpile. Further, the Department of Transportation has responsibility for delivering supplies under the Federal Response Plan, but it was not brought into the planning process and consequently did not learn the extent of its responsibilities until its involvement in subsequent exercises. Second, the lack of leadership has resulted in the federal government's development of programs to assist state and local governments that were similar and potentially duplicative. After the terrorist attack on the federal building in Oklahoma City, the federal government created additional programs that were not well coordinated. For example, FEMA, the Department of Justice, the Centers for Disease Control and Prevention, and the Department of Health and Human Services all offer separate assistance to state and local governments in planning for emergencies. Additionally, a number of these agencies also condition receipt of funds on completion of distinct but overlapping plans. Although the many federal assistance programs vary somewhat in their target audiences, the potential redundancy of these federal efforts warrants scrutiny. In this regard, we recommended in September 2001 that the president work with the Congress to consolidate some of the activities of the Department of Justice's Office for State and Local Domestic Preparedness Support under FEMA. State and local response organizations believe that federal programs designed to improve preparedness are not well synchronized or organized. 
They have repeatedly asked for a one-stop "clearinghouse" for federal assistance. As state and local officials have noted, the multiplicity of programs can lead to confusion at the state and local levels and can expend precious federal resources unnecessarily or make it difficult for them to identify available federal preparedness resources. As the Gilmore Commission report notes, state and local officials have voiced frustration about their attempts to obtain federal funds and have argued that the application process is burdensome and inconsistent among federal agencies. Although the federal government can assign roles to federal agencies under a national preparedness strategy, it will also need to reach consensus with other levels of government and with the private sector about their respective roles. Clearly defining the appropriate roles of government may be difficult because, depending upon the type of incident and the phase of a given event, the specific roles of local, state, and federal governments and of the private sector may not be separate and distinct. A new warning system, the Homeland Security Advisory System, is intended to tailor notification of the appropriate level of vigilance, preparedness, and readiness in a series of graduated threat conditions. The Office of Homeland Security announced the new warning system on March 12, 2002. The new warning system includes five levels of alert for assessing the threat of possible terrorist attacks: low, guarded, elevated, high, and severe. These levels are also represented by five corresponding colors: green, blue, yellow, orange, and red. When the announcement was made, the nation stood at the yellow condition, indicating elevated risk. The warning can be upgraded for the entire country or for specific regions and economic sectors, such as the nuclear industry. The system is intended to address a problem with the previous blanket warning system that was used.
After September 11th, the federal government issued four general warnings about possible terrorist attacks, directing federal and local law enforcement agencies to place themselves on the "highest alert." However, government and law enforcement officials, particularly at the state and local levels, complained that general warnings were too vague and a drain on resources. To obtain views on the new warning system from all levels of government, law enforcement, and the public, the United States Attorney General, who will be responsible for the system, provided a 45-day comment period from the announcement of the new system on March 12th. This provides an opportunity for state and local governments as well as the private sector to comment on the usefulness of the new warning system, and the appropriateness of the five threat conditions with associated suggested protective measures. Numerous discussions have been held about the need to enhance the nation's preparedness, but national preparedness goals and measurable performance indicators have not yet been developed. These are critical components for assessing program results. In addition, the capability of state and local governments to respond to catastrophic terrorist attacks is uncertain. At the federal level, measuring results for federal programs has been a longstanding objective of the Congress. The Congress enacted the Government Performance and Results Act of 1993 (commonly referred to as the Results Act). The legislation was designed to have agencies focus on the performance and results of their programs rather than on program resources and activities, as they had done in the past. Thus, the Results Act became the primary legislative framework through which agencies are required to set strategic and annual goals, measure performance, and report on the degree to which goals are met. 
The outcome-oriented principles of the Results Act include (1) establishing general goals and quantifiable, measurable, outcome-oriented performance goals and related measures, (2) developing strategies for achieving the goals, including strategies for overcoming or mitigating major impediments, (3) ensuring that goals at lower organizational levels align with and support general goals, and (4) identifying the resources that will be required to achieve the goals. A former assistant professor of public policy at the Kennedy School of Government, now the senior director for policy and plans with the Office of Homeland Security, noted in a December 2000 paper that a preparedness program lacking broad but measurable objectives is unsustainable. This is because it deprives policymakers of the information they need to make rational resource allocations, and program managers are prevented from measuring progress. He recommended that the government develop a new statistical index of preparedness, incorporating a range of different variables, such as quantitative measures for special equipment, training programs, and medicines, as well as professional subjective assessments of the quality of local response capabilities, infrastructure, plans, readiness, and performance in exercises. Therefore, he advocated that the index should go well beyond the current rudimentary milestones of program implementation, such as the amount of training and equipment provided to individual cities. The index should strive to capture indicators of how well a particular city or region could actually respond to a serious terrorist event. This type of index, according to this expert, would then allow the government to measure the preparedness of different parts of the country in a consistent and comparable way, providing a reasonable baseline against which to measure progress.
In October 2001, FEMA's director recognized that assessments of state and local capabilities have to be viewed in terms of the level of preparedness being sought and what measurement should be used for preparedness. The director noted that the federal government should not provide funding without assessing what the funds will accomplish. Moreover, the president's fiscal year 2003 budget request for $3.5 billion through FEMA for first responders--local police, firefighters, and emergency medical professionals--provides that these funds be accompanied by a process for evaluating the effort to build response capabilities, in order to validate that effort and direct future resources. FEMA has developed an assessment tool that could be used in developing performance and accountability measures for a national strategy. To ensure that states are adequately prepared for a terrorist attack, FEMA was directed by the Senate Committee on Appropriations to assess states' response capabilities. In response, FEMA developed a self-assessment tool--the Capability Assessment for Readiness (CAR)--that focuses on 13 key emergency management functions, including hazard identification and risk assessment, hazard mitigation, and resource management. However, these key emergency management functions do not specifically address public health issues. In its fiscal year 2001 CAR report, FEMA concluded that states were only marginally capable of responding to a terrorist event involving a weapon of mass destruction. Moreover, the president's fiscal year 2003 budget proposal acknowledges that our capabilities for responding to a terrorist attack vary widely across the country. Many areas have little or no capability to respond to a terrorist attack that uses weapons of mass destruction. The budget proposal further adds that even the best prepared states and localities do not possess adequate resources to respond to the full range of terrorist threats we face. 
Proposed standards have been developed for state and local emergency management programs by a consortium of emergency managers from all levels of government and are currently being pilot tested through the Emergency Management Accreditation Program at the state and local levels. The program's purpose is to establish minimum acceptable performance criteria by which emergency managers can assess and enhance current programs to mitigate, prepare for, respond to, and recover from disasters and emergencies. For example, one such standard requires that (1) the program develop the capability to direct, control, and coordinate response and recovery operations, (2) an incident management system be utilized, and (3) organizational roles and responsibilities be identified in the emergency operational plans. Although FEMA has experience in working with others in the development of assessment tools, it has had difficulty in measuring program performance. As the president's fiscal year 2003 budget request acknowledges, FEMA generally performs well in delivering resources to stricken communities and disaster victims quickly. The agency performs less well in its oversight role of ensuring the effective use of such assistance. Further, the agency has not been effective in linking resources to performance information. FEMA's Office of Inspector General has found that FEMA did not have an ability to measure state disaster risks and performance capability, and it concluded that the agency needed to determine how to measure state and local preparedness programs. Since September 11th, many state and local governments have faced declining revenues and increased security costs. A survey of about 400 cities conducted by the National League of Cities reported that since September 11th, one in three American cities saw their local economies, municipal revenues, and public confidence decline while public-safety spending is up.
Further, the National Governors Association estimates fiscal year 2002 state budget shortfalls of between $40 billion and $50 billion, making it increasingly difficult for the states to take on expensive, new homeland security initiatives without federal assistance. State and local revenue shortfalls coupled with increasing demands on resources make it more critical that federal programs be designed carefully to match the priorities and needs of all partners--federal, state, local, and private. Our previous work on federal programs suggests that the choice and design of policy tools have important consequences for performance and accountability. Governments have at their disposal a variety of policy instruments, such as grants, regulations, tax incentives, and regional coordination and partnerships, that they can use to motivate or mandate other levels of government and private-sector entities to take actions to address security concerns. The design of federal policy will play a vital role in determining success and ensuring that scarce federal dollars are used to achieve critical national goals. Key to the national effort will be determining the appropriate level of funding so that policies and tools can be designed and targeted to elicit a prompt, adequate, and sustainable response while also protecting against federal funds being used to substitute for spending that would have occurred anyway. The federal government often uses grants to state and local governments as a means of delivering federal programs. Categorical grants typically permit funds to be used only for specific, narrowly defined purposes. Block grants typically can be used by state and local governments to support a range of activities aimed at achieving a broad national purpose and to provide a great deal of discretion to state and local officials. 
Either type of grant can be designed to (1) target the funds to states and localities with the greatest need, (2) discourage the replacement of state and local funds with federal funds, commonly referred to as "supplantation," with a maintenance-of-effort requirement that recipients maintain their level of previous funding, and (3) strike a balance between accountability and flexibility. More specifically: Targeting: The formula for the distribution of any new grant could be based on several considerations, including the state or local government's capacity to respond to a disaster. This capacity depends on several factors, the most important of which perhaps is the underlying strength of the state's tax base and whether that base is expanding or is in decline. In an August 2001 report on disaster assistance, we recommended that the director of FEMA consider replacing the per-capita measure of state capability with a more sensitive measure, such as the amount of a state's total taxable resources, to assess the capabilities of state and local governments to respond to a disaster. Other key considerations include the level of need and the costs of preparedness. Maintenance-of-effort: In our earlier work, we found that substitution is to be expected in any grant and, on average, every additional federal grant dollar results in about 60 cents of supplantation. We found that supplantation is particularly likely for block grants supporting areas with prior state and local involvement. Our recent work on the Temporary Assistance to Needy Families block grant found that a strong maintenance-of-effort provision limits states' ability to supplant. Recipients can be penalized for not meeting a maintenance-of-effort requirement.
Balance accountability and flexibility: Experience with block grants shows that such programs are sustainable if they are accompanied by sufficient information and accountability for national outcomes to enable them to compete for funding in the congressional appropriations process. Accountability can be established for measured results and outcomes that permit greater flexibility in how funds are used while at the same time ensuring some national oversight. Grants previously have been used for enhancing preparedness and recent proposals direct new funding to local governments. In recent discussions, local officials expressed their view that federal grants would be more effective if local officials were allowed more flexibility in the use of funds. They have suggested that some funding should be allocated directly to local governments. They have expressed a preference for block grants, which would distribute funds directly to local governments for a variety of security-related expenses. Recent funding proposals, such as the $3.5 billion block grant for first responders contained in the president's fiscal year 2003 budget, have included some of these provisions. This matching grant would be administered by FEMA, with 25 percent being distributed to the states based on population. The remainder would go to states for pass-through to local jurisdictions, also on a population basis, but states would be given the discretion to determine the boundaries of substate areas for such a pass-through--that is, a state could pass through the funds to a metropolitan area or to individual local governments within such an area. 
Although the state and local jurisdictions would have discretion to tailor the assistance to meet local needs, it is anticipated that more than one-third of the funds would be used to improve communications; an additional one-third would be used to equip state and local first responders, and the remainder would be used for training, planning, technical assistance, and administration. Federal, state, and local governments share authority for setting standards through regulations in several areas, including infrastructure and programs vital to preparedness (for example, transportation systems, water systems, public health). In designing regulations, key considerations include how to provide federal protections, guarantees, or benefits while preserving an appropriate balance between federal and state and local authorities and between the public and private sectors (for example, for chemical and nuclear facilities). In designing a regulatory approach, the challenges include determining who will set the standards and who will implement or enforce them. Five models of shared regulatory authority are: fixed federal standards that preempt all state regulatory action in the subject area covered; federal minimum standards that preempt less stringent state laws but permit states to establish standards that are more stringent than the federal; inclusion of federal regulatory provisions not established through preemption in grants or other forms of assistance that states may choose to accept; cooperative programs in which voluntary national standards are formulated by federal and state officials working together; and widespread state adoption of voluntary standards formulated by quasi-official entities. Any one of these shared regulatory approaches could be used in designing standards for preparedness. The first two of these mechanisms involve federal preemption. The other three represent alternatives to preemption.
Each mechanism offers different advantages and limitations that reflect some of the key considerations in the federal-state balance. To the extent that private entities will be called upon to improve security over dangerous materials or to protect vital assets, the federal government can use tax incentives to encourage and enforce their activities. Tax incentives are the result of special exclusions, exemptions, deductions, credits, deferrals, or tax rates in the federal tax laws. Unlike grants, tax incentives do not generally permit the same degree of federal oversight and targeting, and they are generally available by formula to all potential beneficiaries who satisfy congressionally established criteria. Promoting partnerships between critical actors (including different levels of government and the private sector) facilitates the maximizing of resources and also supports coordination on a regional level. Partnerships could encompass federal, state, and local governments working together to share information, develop communications technology, and provide mutual aid. The federal government may be able to offer state and local governments assistance in certain areas, such as risk management and intelligence sharing. In turn, state and local governments have much to offer in terms of knowledge of local vulnerabilities and resources, such as local law enforcement personnel, available to respond to threats and emergencies in their communities. The importance of readily available urban search and rescue was highlighted in the Loma Prieta earthquake in October 1989 that collapsed the Cypress section of the Nimitz Freeway in Oakland and structures in San Francisco and Santa Cruz. 
In late 1989, the Governor's Office of Emergency Services developed a proposal to enhance urban search and rescue capabilities in California, and the cornerstone of this proposal was the development of multidiscipline urban search and rescue task forces to be deployed in the event of large-scale disasters. A parallel effort was undertaken by FEMA at that time to upgrade urban search and rescue efforts nationwide. FEMA's national urban search and rescue response teams provide a framework for structuring local emergency personnel into integrated disaster response task forces. FEMA has 28 urban search and rescue teams, with 8 of those teams positioned in California. Twenty of FEMA's 28 teams were deployed to New York in the aftermath of the tragedy, and five teams were deployed to Washington to help in search and rescue efforts at the Pentagon. Since the events of September 11th, a task force of mayors and police chiefs has called for a new protocol governing how local law enforcement agencies can assist federal agencies, particularly the FBI, and how they can be given the information needed to do so. As the United States Conference of Mayors noted, a close working partnership of local and federal law enforcement agencies, which includes the sharing of intelligence, will expand and strengthen the nation's overall ability to prevent and respond to domestic terrorism. The USA Patriot Act provides for greater sharing of intelligence among federal agencies. An expansion of this act has been proposed (S.1615, H.R. 3285) that would provide for information sharing among federal, state, and local law enforcement agencies. In addition, the Intergovernmental Law Enforcement Information Sharing Act of 2001 (H.R. 3483), which you sponsored, Mr. Chairman, addresses a number of information-sharing needs.
For instance, this proposed legislation provides that the United States Attorney General expeditiously grant security clearances to governors who apply for them, and state and local officials who participate in federal counterterrorism working groups or regional terrorism task forces. Local officials have emphasized the importance of regional coordination. Regional resources, such as equipment and expertise, are essential because of proximity, which allows for quick deployment, and experience in working within the region. Large-scale or labor-intensive incidents quickly deplete a given locality's supply of trained responders. Some cities have spread training and equipment to neighboring municipal areas so that their mutual aid partners can help. These partnerships afford economies of scale across a region. In events that require a quick response, such as a chemical attack, regional agreements take on greater importance because many local officials do not think that federal and state resources can arrive in sufficient time to help. Mutual aid agreements provide a structure for assistance and for sharing resources among jurisdictions in response to an emergency. Because individual jurisdictions may not have all the resources they need to respond to all types of emergencies, these agreements allow for resources to be deployed quickly within a region. The terms of mutual aid agreements vary for different services and different localities. These agreements may provide for the state to share services, personnel, supplies, and equipment with counties, towns, and municipalities within the state, with neighboring states, or, in the case of states bordering Canada, with jurisdictions in another country. Some of the agreements also provide for cooperative planning, training, and exercises in preparation for emergencies. Some of these agreements involve private companies and local military bases, as well as local government entities. 
Such agreements were in place for the three sites that were involved on September 11th-- New York City, the Pentagon, and a rural area of Pennsylvania--and provide examples of some of the benefits of mutual aid agreements and of coordination within a region. With regard to regional planning and coordination, there may be federal programs that could provide models for funding proposals. In the 1962 Federal-Aid Highway Act, the federal government established a comprehensive cooperative process for transportation planning. This model of regional planning continues today under the Transportation Equity Act for the 21st century (TEA-21, originally ISTEA) program. This model emphasizes the role of state and local officials in developing a plan to meet regional transportation needs. Metropolitan Planning Organizations (MPOs) coordinate the regional planning process and adopt a plan, which is then approved by the state. Mr. Chairman, in conclusion, as increasing demands are placed on budgets at all levels of government, it will be necessary to make sound choices to maintain fiscal stability. All levels of government and the private sector will have to communicate and cooperate effectively with each other across a broad range of issues to develop a national strategy to better target available resources to address the urgent national preparedness needs. Involving all levels of government and the private sector in developing key aspects of a national strategy that I have discussed today--a definition and clarification of the appropriate roles and responsibilities, an establishment of goals and performance measures, and a selection of appropriate tools-- is essential to the successful formulation of the national preparedness strategy and ultimately to preparing and defending our nation from terrorist attacks. This completes my prepared statement. I would be pleased to respond to any questions you or other members of the subcommittee may have. 
For further information about this testimony, please contact me at (202) 512-6737, Paul Posner at (202) 512-9573, or JayEtta Hecker at (202) 512- 2834. Other key contributors to this testimony include Jack Burriesci, Matthew Ebert, Colin J. Fallon, Thomas James, Kristen Sullivan Massey, Yvonne Pufahl, Jack Schulze, and Amelia Shachoy. Homeland Security: Challenges and Strategies in Addressing Short- and Long-Term National Needs. GAO-02-160T. Washington, D.C.: November 7, 2001. Homeland Security: A Risk Management Approach Can Guide Preparedness Efforts. GAO-02-208T. Washington, D.C.: October 31, 2001. Homeland Security: Need to Consider VA's Role in Strengthening Federal Preparedness. GAO-02-145T. Washington, D.C.: October 15, 2001. Homeland Security: Key Elements of a Risk Management Approach. GAO-02-150T. Washington, D.C.: October 12, 2001. Homeland Security: A Framework for Addressing the Nation's Issues. GAO-01-1158T. Washington, D.C.: September 21, 2001. Combating Terrorism: Considerations for Investing Resources in Chemical and Biological Preparedness. GAO-01-162T. Washington, D.C.: October 17, 2001. Combating Terrorism: Selected Challenges and Related Recommendations. GAO-01-822. Washington, D.C.: September 20, 2001. Combating Terrorism: Actions Needed to Improve DOD's Antiterrorism Program Implementation and Management. GAO-01-909. Washington, D.C.: September 19, 2001. Combating Terrorism: Comments on H.R. 525 to Create a President's Council on Domestic Preparedness. GAO-01-555T. Washington, D.C.: May 9, 2001. Combating Terrorism: Observations on Options to Improve the Federal Response. GAO-01-660T. Washington, D.C.: April 24, 2001. Combating Terrorism: Comments on Counterterrorism Leadership and National Strategy. GAO-01-556T. Washington, D.C.: March 27, 2001. Combating Terrorism: FEMA Continues to Make Progress in Coordinating Preparedness and Response. GAO-01-15. Washington, D.C.: March 20, 2001. 
Combating Terrorism: Federal Response Teams Provide Varied Capabilities; Opportunities Remain to Improve Coordination. GAO-01- 14. Washington, D.C.: November 30, 2000. Combating Terrorism: Need to Eliminate Duplicate Federal Weapons of Mass Destruction Training. GAO/NSIAD-00-64. Washington, D.C.: March 21, 2000. Combating Terrorism: Observations on the Threat of Chemical and Biological Terrorism. GAO/T-NSIAD-00-50. Washington, D.C.: October 20, 1999. Combating Terrorism: Need for Comprehensive Threat and Risk Assessments of Chemical and Biological Attack. GAO/NSIAD-99-163. Washington, D.C.: September 7, 1999. Combating Terrorism: Observations on Growth in Federal Programs. GAO/T-NSIAD-99-181. Washington, D.C.: June 9, 1999. Combating Terrorism: Analysis of Potential Emergency Response Equipment and Sustainment Costs. GAO-NSIAD-99-151. Washington, D.C.: June 9, 1999. Combating Terrorism: Use of National Guard Response Teams Is Unclear. GAO/NSIAD-99-110. Washington, D.C.: May 21, 1999. Combating Terrorism: Observations on Federal Spending to Combat Terrorism. GAO/T-NSIAD/GGD-99-107. Washington, D.C.: March 11, 1999. Combating Terrorism: Opportunities to Improve Domestic Preparedness Program Focus and Efficiency. GAO-NSIAD-99-3. Washington, D.C.: November 12, 1998. Combating Terrorism: Observations on the Nunn-Lugar-Domenici Domestic Preparedness Program. GAO/T-NSIAD-99-16. Washington, D.C.: October 2, 1998. Combating Terrorism: Threat and Risk Assessments Can Help Prioritize and Target Program Investments. GAO/NSIAD-98-74. Washington, D.C.: April 9, 1998. Combating Terrorism: Spending on Governmentwide Programs Requires Better Management and Coordination. GAO/NSIAD-98-39. Washington, D.C.: December 1, 1997. Bioterrorism: The Centers for Disease Control and Prevention's Role in Public Health Protection. GAO-02-235T. Washington, D.C.: November 15, 2001. Bioterrorism: Review of Public Health and Medical Preparedness. GAO- 02-149T. Washington, D.C.: October 10, 2001. 
Bioterrorism: Public Health and Medical Preparedness. GAO-02-141T. Washington, D.C.: October 10, 2001. Bioterrorism: Coordination and Preparedness. GAO-02-129T. Washington, D.C.: October 5, 2001. Bioterrorism: Federal Research and Preparedness Activities. GAO-01- 915. Washington, D.C.: September 28, 2001. Chemical and Biological Defense: Improved Risk Assessments and Inventory Management Are Needed. GAO-01-667. Washington, D.C.: September 28, 2001. West Nile Virus Outbreak: Lessons for Public Health Preparedness. GAO/HEHS-00-180. Washington, D.C.: September 11, 2000. Need for Comprehensive Threat and Risk Assessments of Chemical and Biological Attacks. GAO/NSIAD-99-163. Washington, D.C.: September 7, 1999. Chemical and Biological Defense: Program Planning and Evaluation Should Follow Results Act Framework. GAO/NSIAD-99-159. Washington, D.C.: August 16, 1999. Combating Terrorism: Observations on Biological Terrorism and Public Health Initiatives. GAO/T-NSIAD-99-112. Washington, D.C.: March 16, 1999. | Federal, state, and local governments share responsibility for terrorist attacks. However, local government, including police and fire departments, emergency medical personnel, and public health agencies, is typically the first responder to an incident. The federal government historically has provided leadership, training, and funding assistance. In the aftermath of September 11, for instance, one-quarter of the $40 billion Emergency Response Fund was earmarked for homeland security, including enhancing state and local government preparedness. Because the national security threat is diffuse and the challenge is highly intergovernmental, national policymakers must formulate strategies with a firm understanding of the interests, capacity, and challenges facing those governments. The development of a national strategy will improve national preparedness and enhance partnerships between federal, state, and local governments. 
The creation of the Office of Homeland Security is an important and potentially significant first step. The Office of Homeland Security's strategic plan should (1) define and clarify the appropriate roles and responsibilities of federal, state, and local entities; (2) establish goals and performance measures to guide the nation's preparedness efforts; and (3) carefully choose the most appropriate tools of government to implement the national strategy and achieve national goals. | 8,038 | 239 |
DODIG has taken a number of actions to improve its tracking of the timeliness of military whistleblower reprisal investigations, including developing an automated tool to address statutory notification requirements. However, DODIG does not regularly report to Congress on the timeliness of military whistleblower reprisal investigations. In both 2012 and 2015, we found that DOD was not meeting its internal timeliness requirements for completing military whistleblower reprisal investigations within 180 days. Specifically, in 2012 we found that despite undertaking efforts to improve timeliness--such as changing its process for taking in complaints--DOD took a mean of 451 days to process cases, and that its efforts to improve case processing times were hindered by unreliable and incomplete data on timeliness. Further, in 2015 we found that DOD's average investigation time for cases closed in fiscal years 2013 and 2014 was 526 days, almost three times DOD's internal completion requirement of 180 days. DOD Directive 7050.06, which implements 10 U.S.C. § 1034 and establishes DOD policy, states that DODIG shall issue a whistleblower reprisal investigation report within 180 days of the receipt of the allegation of reprisal. To improve the timeliness of military whistleblower reprisal investigations, we recommended in February 2012 that DOD (1) implement procedures to track and report data on its case processing timeliness and (2) track and analyze timeliness data to identify reforms that could aid in processing cases within the 180-day time frame. DOD concurred and subsequently took several actions to implement these recommendations. For example, in December 2012 DODIG began implementing a case management system to collect key dates to track the timeliness of DODIG's investigative phases, and in March 2016 it issued a case management system guide that established procedures to help ensure accurate and complete recording and consistent tracking of case processing time.
Further, DODIG took steps to track and analyze timeliness data that could aid in processing cases within the 180-day time frame by compiling quarterly timeliness metrics starting in fiscal year 2014, and by updating its case management system in April 2016 to include additional investigation milestones. Because some of these actions were not taken until 2016, it is too early to determine whether timeliness has improved since we last reported on the issue. In both our 2012 and 2015 reports, we found that DOD generally did not meet statutory requirements for notifying servicemembers within 180 days about delays in investigations. According to 10 U.S.C. § 1034, if, during the course of an investigation, an IG determines that it is not possible to submit the report of investigation to the Secretary of Defense and the service Secretary within 180 days after the receipt of the allegation, the IG shall provide to the Secretary of Defense, the service Secretary concerned, and the servicemember making the allegation a notice of that determination, including the reasons why the report may not be submitted within that time and an estimate of the date when the report will be submitted. In 2012, we found that neither the DODIG nor military service IGs had been making the required notifications. During that review, DODIG changed its practice and started reporting this information in October 2011 and identified steps in an action plan to help ensure that it and the military service IGs followed the statutory reporting requirements. During our 2015 review, DODIG officials stated that they had taken additional steps to help ensure they met the statutory notification requirement. For example, DODIG assigned an oversight investigator to remind the service IGs to send the required letters and developed a mechanism in DODIG's case management system to indicate which cases were older than 180 days.
However, during our 2015 review, we again found that DOD had not sent the required letters to notify servicemembers about delays in their investigations in about half of reprisal investigations closed in fiscal year 2013; that the median notification time for servicemembers receiving the required letter was about 353 days after the servicemember filed the complaint; and that the letters that DOD had sent, on average, had significantly underestimated the date by which the investigation would be completed. Consequently, we recommended in our 2015 report that DOD develop an automated tool to help ensure compliance with the statutory 180-day notification requirement by providing servicemembers with accurate information regarding the status of their reprisal investigations within 180 days of receipt of an allegation of reprisal. DOD concurred with this recommendation and in April 2016 launched an automated tool within its case management system to help ensure compliance with the statutory 180-day notification requirement, instead of relying on its manual reconciliation process. Specifically, the case management system now has an alert that provides the age of the case and the date by which the notification letter must be transmitted to the required parties. This tool is to help provide assurance that servicemembers are being notified of the status of their reprisal investigations. In 2012, we found that although DODIG is required to keep Congress fully and currently informed through, among other things, its semiannual reports to Congress, DODIG was not including in these reports information on military whistleblower case processing time, including (1) statutorily required notifications of delays in the investigations or (2) those exceeding DODIG's internal 180-day completion requirement.
The semiannual report to Congress is required to include information on fraud, abuses, and deficiencies related to the administration of programs and operations managed or financed by DOD, but DOD interpreted this requirement as not applying to the military whistleblower reprisal program. Because Congress is the primary oversight body for DODIG, we recommended that DOD regularly report to Congress on the timeliness of military whistleblower reprisal investigations, including those exceeding the 180-day time frame. DOD concurred with our recommendation. On August 31, 2016, the DOD Principal Deputy Inspector General performing the duties of the DOD Inspector General stated that the office will implement this recommendation by regularly reporting timeliness information to Congress on a biannual basis. We believe that if this action is taken, it will fully implement our recommendation, provide Congress with enhanced visibility over the status of military whistleblower reprisal investigations, and thereby improve decisionmakers' ability to effectively oversee the military whistleblower reprisal program. In 2012 and 2015, we found that DODIG's oversight of military whistleblower reprisal investigations conducted by the military services was hampered by insufficient processes, including performance metrics, guidance, and plans. DOD subsequently took steps to strengthen its oversight of military whistleblower reprisal investigations conducted by the military services by establishing processes and developing guidance for overseeing these investigations--along with a plan to expand its case management system to the services. In 2012, we found that DODIG lacked reliable data on the corrective actions taken in response to substantiated whistleblower reprisal cases, thus limiting the visibility and oversight DOD and Congress have of the final portion of the military whistleblower reprisal process.
DOD Directive 7050.06 directs the Secretaries of the military departments and the heads of the other DOD components to take corrective action based on IG reports of investigations of military whistleblower reprisal allegations and to notify DODIG of the actions taken within 10 working days. Further, DODIG requires that the service IGs report back to DODIG on command actions taken against the individual alleged to have reprised against a whistleblower, according to officials from these organizations. However, in 2012 we found that DODIG had not been maintaining reliable information on command actions needed to oversee this process. Specifically, for 40 percent of all substantiated cases that DODIG closed from October 1, 2005, through March 31, 2011, the database that DODIG used during that period did not contain information on the command actions taken. As a result, we recommended in our 2012 report that DOD (1) establish standardized corrective action reporting requirements, and (2) consistently track and regularly reconcile data regarding corrective actions. DOD addressed these recommendations by issuing an update to its military whistleblower directive in April 2015 that required standardized corrective action reporting requirements by the services. DODIG also issued additional guidance in its March 2016 investigations manual requiring that investigators populate data fields for corrective actions and remedies. Finally, DODIG provided us with a report in April 2016 detailing its tracking of corrective actions taken in response to substantiated reprisal cases between October 2011 and January 2016. In 2012, we also found that DODIG had not yet fully established performance metrics for ensuring the timeliness and quality of whistleblower reprisal investigations but was taking steps to establish timeliness metrics that focused on investigation processing time. 
Federal internal control standards state that metrics are important for identifying and setting appropriate incentives for achieving goals while complying with law, regulations, and ethical standards. Further, we found in our previous work that metrics on both timeliness and quality--such as completeness of investigative reports and the adequacy of internal controls--can enhance the ability of organizations to provide assurance that they are exercising all of the appropriate safeguards for federal programs. During our 2012 review, DODIG officials stated that they recognized the importance of both timeliness and quality metrics and that they planned to develop quality metrics as part of their effort to improve case management and outcomes. They further noted that quality metrics could include measuring whether interviews are completed and documented and whether conclusions made about the case are fully supported by evidence. To assist DOD in improving oversight of the whistleblower reprisal program, we recommended in our 2012 report that DOD develop and implement performance metrics to ensure the quality and effectiveness of the investigative process, such as ensuring that the casefiles contain evidence sufficient to support the conclusions. DOD concurred with our recommendation and in 2014 fully developed timeliness metrics, along with some performance metrics to assess the completeness of a sample of (1) DODIG-conducted whistleblower reprisal investigations and (2) DODIG oversight reviews of the military services' whistleblower reprisal investigations. For example, DODIG is now to complete internal control checklists for investigations it conducts and oversight worksheets for investigations conducted by the military services to determine whether casefiles are compliant with internal policy and best practices.
On a quarterly basis, DODIG is to draw a sample of the checklists and oversight worksheets for cases closed by DODIG and the military service IGs and compare these checklists to the quality metrics that it developed. According to DODIG officials, these metrics were briefed to the DOD Inspector General in fiscal year 2014. DODIG officials stated in July 2016 that they continued to conduct quality assurance reviews and collect associated metrics in fiscal year 2015, but that they have not briefed these metrics to the DOD Inspector General since fiscal year 2014 and that changes to the metrics briefings are forthcoming per direction from the DOD Inspector General and Principal Deputy Inspector General. DODIG did not provide information on the nature of these changes. While we believe that DODIG's actions should help oversee the quality of investigations, we will continue to work with the DODIG and monitor its progress in implementing and communicating these performance metrics during our ongoing review assessing whistleblower reprisal investigation processes for DOD civilian employees and contractors. Further, we also believe that until the military services follow standardized investigation stages, as discussed later in this statement, it will be difficult for the DODIG to consistently measure the quality of the services' military whistleblower reprisal investigations. Separately, in 2015, we found that DODIG and the service IGs had processes for investigators to recuse themselves from investigations, but there was no process for investigators to document whether the investigation they conducted was independent and outside the chain of command. Council of the Inspectors General on Integrity and Efficiency standards state that in all matters relating to investigative work, the investigative organization must be free, both in fact and appearance, from impairments to independence. 
Further, guidance for documenting independence is included in generally accepted government auditing standards, which can provide guidance to service IGs as a best practice on how to document decisions regarding independence when conducting reprisal investigations. At the time of our 2015 review, DODIG officials stated that their recusal policies for investigators, their decentralized investigation structure, and their removal of the investigator from the chain of command adequately addressed independence issues and that no further documentation of independence was needed. However, during the case file review we conducted for our 2015 report, we identified oversight worksheets on which DODIG oversight investigators had noted potential impairments to investigator objectivity in the report of investigation. For example, one oversight worksheet stated that the report gave the appearance of service investigator bias, and another oversight worksheet stated that the investigator was not outside the chain of command, as is statutorily required. DODIG approved these cases without documenting how it had reconciled these case deficiencies. As a result, in our 2015 report we recommended that DOD develop and implement a process for military service investigators to document whether the investigation was independent and outside the chain of command and direct the service IGs to provide such documentation for review during the oversight process. DOD concurred with this recommendation and issued a memorandum in June 2015 that informed service IGs that DODIG would look for certification of an investigator's independence during its oversight reviews. Concurrently, DODIG also directed the service IGs to provide such documentation. In 2012, we found that DODIG was updating its guidance related to the whistleblower program but that the updates had not yet been formalized and that the guidance that existed at that time was inconsistently followed. 
According to the Council of the Inspectors General on Integrity and Efficiency's quality standards for investigations, organizations should establish appropriate written investigative policies and procedures through handbooks, manuals, directives, or similar mechanisms to facilitate due professional care in meeting program requirements. Further, guidance should be regularly evaluated to help ensure that it is still appropriate and working as intended. However, in 2012 we found, among other things, that DODIG's primary investigative guide distributed to investigators conducting whistleblower reprisal investigations had not been updated since 1996 and did not reflect some investigative processes that were current in 2012. Additionally, because guidance related to key provisions of the investigative process was unclear, it was being interpreted and implemented differently by the service IGs. As a result, we recommended in our 2012 report that DODIG update its whistleblower reprisal investigative guidance and ensure that it is consistently followed, including clarifying reporting requirements, responsibilities, and terminology. DOD concurred with this recommendation and in October 2014 released a guide of best practices for conducting military reprisal investigations and in April 2015 updated Directive 7050.06 on military whistleblower protection, which established policies and assigned responsibilities for military whistleblower protection and defined key terminology. Separately, in 2015 we found that DODIG had provided limited guidance to users of its case management system on how to populate case information into the system. The case management system, in use since December 2012, was to serve as a real-time complaint tracking and investigative management tool for investigators. 
DOD's fiscal year 2014 performance plan for oversight investigators notes that investigators should ensure that the case management system reflects current, real-time information on case activity. This intent aligns with the Council of the Inspectors General on Integrity and Efficiency's quality standards for investigations, which state that accurate processing of information is essential to the mission of an investigative organization and that this begins with the orderly, systematic, accurate, and secure maintenance of a management information system. However, based on our file review of a sample of 124 cases closed in fiscal year 2013, we found that DODIG investigators were not using the case management system for real-time case management. Specifically, we estimated that DODIG personnel uploaded key case documents to the system after DODIG had closed the case in 77 percent of cases in fiscal year 2013. Among other things, these documents included reports of investigation, oversight worksheets, and 180-day notification letters regarding delays in completing investigations. Additionally, we estimated that for 83 percent of cases closed in fiscal year 2013, DODIG staff had made changes to case variables in the case management system at least 3 months after case closure. DODIG officials stated in 2015 that they planned to further develop a manual for the case management system that was in draft form, along with internal desk aids, but that they did not plan to issue additional internal guidance for DODIG staff on the case management system because they believed that the existing guidance was sufficient. However, DODIG's draft manual did not instruct users on how to access the system, troubleshoot errors, or monitor caseloads. As a result, in our 2015 report we recommended that DOD issue additional guidance to investigators on how to use the case management system as a real-time management tool.
DOD concurred with this recommendation and in March 2016 issued a case management system user guide and in July 2016, a data entry guide. Collectively, these guides provide users with key information on how to work with and maintain data in the case management system. In 2015, we found that each military service IG conducted and monitored the status of military whistleblower reprisal investigations in a different case management system and that DODIG did not have complete visibility over service investigations from complaint receipt to investigation determination. Further, we found that DODIG did not have knowledge of the real-time status of service-conducted investigations and was unable to anticipate when service IGs would send completed reports of investigation for DODIG review. DODIG is required to review all service IG determinations in military reprisal investigations in addition to its responsibility for conducting investigations of some military reprisal complaints, and DOD Directive 7050.06 requires that service IGs notify DODIG of reprisal complaints within 10 days of the receipt of a complaint. However, our analysis indicated that DODIG's case management system did not have records of at least 22 percent of service investigations both open as of September 30, 2014, and closed in fiscal years 2013 and 2014. Further, based on our file review, we estimated that there was no evidence of the required service notification in 30 percent of the cases closed in fiscal year 2013. We concluded that without a common system to share data, DODIG's oversight of the timeliness of service investigations and visibility of its own future workload was limited. At the time of our 2015 review, DOD was taking steps to improve its visibility into service investigations, including by expanding its case management system to the military services. 
DODIG officials stated that they had created a working group comprising representatives from each of the service IGs to facilitate the expansion and that they planned a complete rollout to the service IGs by the end of fiscal year 2016. However, DODIG did not have an implementation plan for the expansion and had not yet taken steps to develop one. Project management plans should include a scope--to describe major deliverables, assumptions, and project constraints--as well as project requirements, schedules, costs, and stakeholder roles, responsibilities, and communication techniques, among other things. Given DOD's stated plans to expand the case management system to the service IGs by the end of fiscal year 2016, we recommended in our 2015 report that DOD develop an implementation plan that addresses the needs of DODIG and the service IGs and defines project goals, schedules, costs, stakeholder roles and responsibilities, and stakeholder communication techniques. DOD concurred with this recommendation and subsequently developed a plan in April 2016, in coordination with the military services, which included the elements we recommended for a plan to expand its case management system into an enterprise system. This plan states that the enterprise case management system will launch between February 2018 and May 2018 and notes that the project budget between fiscal years 2017 and 2021 is approximately $25.3 million.
In 2015, we found that the DODIG and the military service IGs use different terms in their guidance to refer to their investigations, thus hindering DODIG's ability to consistently classify and assess the completeness of cases during its oversight reviews. For example, we found that in the absence of standardized investigation stages, DODIG investigators had miscoded approximately 43 percent of the cases that DODIG had closed in fiscal year 2013 as full investigations, based on our estimate, when these investigations were instead preliminary inquiries as indicated in the services' reports of investigation. The Council of the Inspectors General on Integrity and Efficiency's quality standards for investigations state that to facilitate due professional care, organizations should establish written investigative policies and procedures that are revised regularly according to evolving laws, regulations, and executive orders. DODIG took an important step to improve its guidance by issuing an updated reprisal investigation guide for military reprisal investigations for both DODIG and service IG investigators in October 2014. However, the guide states that it describes best practices for conducting military reprisal intakes and investigations, and DODIG officials told us that the guide does not explicitly direct the services to follow DODIG's preferred investigation process and stages. These officials further stated that they have no role in the development of service IG regulations. To improve the military whistleblower reprisal investigation process and oversight of such investigations, in our 2015 report we recommended that the Secretary of Defense, in coordination with the DODIG, direct the military services to follow standardized investigation stages and issue guidance clarifying how the stages are defined. DOD concurred with this recommendation and subsequently updated its guide in June 2015.
However, this guide is still characterized as describing best practices and does not direct the services to follow standardized investigation stages. We note that 10 U.S.C. § 1034 provides the authority for the Secretary of Defense to prescribe regulations to carry out the section. Also, DOD Directive 7050.06 assigns DODIG the responsibility to provide oversight of the military whistleblower reprisal program for the department. DODIG officials noted in August 2016 that they are currently working with the military services through an established working group to standardize the investigation stages as an interim measure. The DOD Principal Deputy Inspector General performing the duties of the DOD Inspector General also indicated in August 2016 that the office is willing to coordinate with the Secretary of Defense to issue authoritative direction to the services to standardize the investigation stages, but that this will take time. As previously mentioned, we found in 2012 that DOD lacked reliable data on the corrective actions taken in response to substantiated whistleblower reprisal cases, thus limiting the visibility and oversight that DOD and Congress have of the final portion of the military whistleblower reprisal process. We also noted in 2012 that a 2009 Department of Justice review recommended that the results of investigations that substantiate allegations of reprisal be publicized as a way to heighten awareness within the services of the Military Whistleblower Protection Act, to potentially deter future incidents of reprisal, and to possibly encourage other reprisal victims to come forward. While the DODIG cannot directly take corrective action in response to a substantiated case per DOD Directive 7050.06, it is the focal point for DOD's military whistleblower reprisal program and is well positioned to collect and monitor data regarding program outcomes.
Further, DODIG officials stated in 2012 that because DODIG is the focal point, it is important for it to have visibility and information of all military whistleblower reprisal activities, not only to provide oversight but also to provide a central place within the department where internal and external stakeholders can obtain information. In addition to the recommendations we made regarding establishing corrective action reporting requirements and regularly tracking these data, we also recommended in our 2012 report that DOD regularly report to Congress on the frequency and type of corrective actions taken in response to substantiated reprisal claims. We noted that DOD could do so, for example, through its semiannual reports to Congress. DOD concurred with that recommendation and has since included examples in its semiannual reports to Congress of corrective actions taken by the military services for substantiated cases but not a comprehensive list of all corrective actions taken. However, in following up on actions that DODIG has taken regarding this recommendation in August 2016, DODIG officials stated that the corrective actions listed in its semiannual reports to Congress included all corrective actions taken during the 6-month reporting period, but that the reports incorrectly identified these actions as examples. DODIG provided us corrective action information to compare with the corrective actions reported in DODIG's December 2015 and March 2016 semiannual reports to Congress for those reporting periods. We identified some key differences. Specifically, we identified corrective actions in the information provided to us by DODIG that were not published in the December and March reports to Congress and identified discrepancies in the types of corrective action contained in the reports and in the information that DODIG provided.
As a result, we believe that DODIG's two most recent semiannual reports to Congress did not include the frequency and type of all corrective actions reported during those reporting periods. Relatedly, we also noted in August 2016 that DODIG's semiannual reports did not include other information needed to convey the frequency and type of corrective actions. Specifically, DODIG officials stated in August 2016 that their case management system would require additional capability in order to produce a list of substantiated allegations that do not have associated corrective actions, which would indicate which corrective action recommendations are outstanding. Further, these officials stated that publishing information showing the status of all DODIG corrective action recommendations--not just actions that were taken during a particular reporting period--could be misleading because the military services sometimes take actions that are different than those recommended by DODIG and that may not result from reprisal investigations. However, as noted in the 2009 Department of Justice review, publicizing the results of investigations that substantiate allegations of reprisal may help to deter future incidents of reprisal and encourage other whistleblowers to come forward. Without including information on (1) all corrective actions taken during a reporting period, (2) outstanding corrective action recommendations, and (3) actions taken by the services that are different than those recommended by DODIG, we believe that DODIG's current method of reporting does not fully address our recommendation to report to Congress on the frequency and type of corrective action taken in response to substantiated claims. Moreover, it does not meet the requirement to keep Congress fully and currently informed on the progress of implementing corrective actions through, among other things, its semiannual reports to Congress. 
We therefore continue to believe that without such information, Congress will be hindered in its ability to provide oversight of the corrective action portion of the military whistleblower reprisal program. In summary, DOD has taken actions to implement 15 of the 18 recommendations that we made to address the military whistleblower reprisal timeliness and oversight challenges we identified in our 2012 and 2015 reports. These efforts constitute progress toward improving the DODIG's ability to accurately track the timeliness of military whistleblower reprisal investigations and increase the DODIG's ability to effectively oversee the department's military whistleblower reprisal program. Fully implementing the remaining 3 recommendations would further strengthen DODIG's capacity to assess the quality of military whistleblower reprisal investigations and enhance Congress' visibility into the timeliness of investigations as well as into the corrective actions taken for substantiated allegations. We have ongoing work that will help to both monitor the actions taken by DODIG to improve its oversight of military reprisal investigations and provide additional insight on the DODIG's ability to conduct timely and quality reprisal investigations for DOD's civilian and contractor employees. Chairman DeSantis, Ranking Member Lynch, and Members of the Subcommittee, this concludes my prepared statement. I look forward to answering any questions that you might have. If you or your staff have any questions about this statement, please contact Brenda S. Farrell, Director, Defense Capabilities and Management at (202) 512-3604 or [email protected], or Lori Atkinson, Assistant Director, Defense Capabilities and Management at (404) 679-1852 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this statement.
GAO staff who made key contributions to this testimony are Tracy Barnes, Sara Cradic, Ryan D'Amore, Taylor Hadfield, and Mike Silver. This is a work of the U.S. government and is not subject to copyright protection in the United States. The published product may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.

Whistleblowers play an important role in safeguarding the federal government against waste, fraud, and abuse, and their willingness to come forward can contribute to improvements in government operations. However, whistleblowers also risk reprisal, such as demotion, reassignment, and firing. This testimony discusses DODIG's progress in (1) taking actions to track and report on the timeliness of military whistleblower reprisal investigations, and (2) strengthening its oversight of the military services' whistleblower reprisal investigations. GAO's statement is based primarily on information from May 2015 and February 2012 GAO reports on military whistleblower reprisal investigations. For those reports, GAO examined laws, regulations, and DOD guidance; conducted detailed file reviews using representative samples of cases closed in fiscal year 2013 and between January 2009 and March 2011; analyzed DODIG and military service data for cases closed in fiscal years 2013 and 2014; and interviewed DOD officials. GAO also determined what actions DOD had taken through August 2016 in response to recommendations made in the 2015 and 2012 reports. The Department of Defense Office of Inspector General (DODIG) has taken actions to improve its tracking of the timeliness of military whistleblower reprisal investigations in response to recommendations that GAO made in 2012 and 2015.
For example, in 2012 and 2015, GAO found that DOD was not meeting its internal requirement to complete whistleblower reprisal investigations within 180 days, with cases closed in fiscal years 2013 and 2014 averaging 526 days. In response, DODIG--which is responsible for both conducting investigations and overseeing investigations conducted by the military services--took steps to better track and analyze timeliness data by developing a guide to help ensure the accurate tracking of case processing time and by updating its case management system in April 2016 to include new investigation milestones. Because these actions were not taken until 2016, it is too early to determine if timeliness has improved since GAO last reported on the status. Similarly, in 2015, GAO found that DOD had not met the statutory requirement to notify servicemembers within 180 days about delays in their investigations for about half of the reprisal investigations closed in fiscal year 2013. In response, DODIG developed an automated tool in its case management system to flag cases approaching 180 days. However, DODIG still does not regularly report to Congress on the timeliness of military whistleblower reprisal investigations as GAO recommended in 2012. On August 31, 2016, a senior DODIG official stated that DODIG will implement this recommendation by reporting timeliness information to Congress biannually. DODIG has strengthened its oversight of military service reprisal investigations in response to recommendations GAO made in 2012 and 2015 by establishing processes and developing guidance for overseeing investigations, among other things. For example, in 2015, GAO found that DODIG did not have a process for documenting whether investigations were independent and were conducted by someone outside the military service chain of command. In response, DODIG directed the service IGs to certify investigators' independence for oversight reviews.
GAO also found in 2015 that DODIG had provided limited guidance to investigators using its case management system, limiting its utility as a real-time management system, as intended. In response, DODIG issued a system guide and a data entry guide, which provide key information on how to work with and maintain system data. However, in 2015 GAO also found that DODIG and the military service IGs used different terms in their guidance to investigators, hindering DODIG oversight of case completeness. GAO recommended that DOD direct the military service IGs to follow standardized investigation stages and issue related guidance. DODIG officials stated in August 2016 that they are working with the services to standardize investigation stages and that DODIG is willing to work with the Secretary of Defense to issue such direction. Separately, GAO found in 2012 that unreliable data on corrective actions taken in response to substantiated reprisal cases was hampering oversight and recommended that DOD regularly report to Congress on the frequency and type of corrective actions taken in response to substantiated reprisal claims. DODIG reports some corrective actions in its semiannual report to Congress, but does not include all relevant corrective actions or outstanding corrective action recommendations. DOD implemented 15 of the 18 recommendations GAO made to improve and track investigation timeliness and strengthen oversight of the military services' investigations, and is considering steps to implement the remaining three regarding standardized investigations and reporting to Congress. | 6,271 | 975 |
The Congress established VETS in 1980 to carry out the national policy that veterans receive priority employment and training opportunities. Faced with growing long-term challenges of new service delivery systems, an evolving labor market, and changing technology, VETS' vision is to find innovative ways to maximize the effectiveness of its efforts. VETS' strategic plan states that it will seek new and effective means to help veterans compete successfully for better paying career jobs--helping them get on a track that can provide improved income stability and growth potential. VETS provides states with grants for DVOP and LVER staff according to the formula outlined in the law. The grant agreements include assurances by states that the DVOP and LVER staff members serve eligible veterans exclusively. Under federal law, all employment service staff must give priority to serving veterans, and the assignment of DVOP and LVER staff to local offices does not relieve other employment and training program staff of this requirement. The law prescribes various duties to DVOP and LVER staff members that are intended to provide veterans with job search plans and referrals and job training opportunities. While the state-employed DVOP and LVER staff are the front-line providers for services to veterans, VETS carries out its responsibilities, as outlined in the law, through a nationwide network that includes regional and state representation. The Office of the Assistant Secretary for Veterans' Employment and Training administers the DVOP and LVER staffing grants through regional administrators and directors in each state, the District of Columbia, Puerto Rico, and the Virgin Islands. In larger states, an assistant director is appointed for every 250,000 veterans in the state. These federally paid VETS staff ensure that states carry out their obligations to provide service to veterans, including the services provided under the DVOP and LVER grants. 
To ensure priority service to veterans, VETS expects states to provide employment and training services to veterans at a rate exceeding the service provided to nonveterans. For example, VETS requires that veterans receive services at a rate 15 percent higher than nonveterans. Thus, if a state's placement rate for nonveterans was 10 percent, the placement rate for veterans should be 11.5 percent, or 15 percent higher than the nonveteran placement rate. There are also greater expectations for serving Vietnam-era veterans and disabled veterans. As required by law, VETS must report to the Congress on states' performance in five service categories. Historically, VETS has used these same performance categories to measure state performance for serving veterans at a higher rate than nonveterans. The performance categories include: (1) veterans placed in or obtaining employment; (2) Vietnam-era veterans and special disabled veterans placed in jobs on the Federal Contractor Job Listing; (3) veterans counseled; (4) veterans placed in training; and (5) veterans who received some reportable service. In our past reviews of VETS' programs, we have recommended changes to VETS' performance measures and plans. Recently, we have noted that VETS had proposed performance measures that were more in line with those established under WIA; the measures focused more on what VETS' programs achieve and less on the number of services provided to veterans relative to nonveterans. Although the law still stipulates that VETS is to report to the Congress on the five service categories, VETS plans to eliminate the requirement that states compare services provided to veterans with those provided to nonveterans. However, we have reported that VETS still lacked measures to gauge the effectiveness of services or whether more staff-intense services helped veterans obtain jobs.
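The 15 percent expectation described above reduces to a simple calculation. The sketch below is illustrative only; the function name and sample rates are our own, not VETS terminology or an official VETS formula:

```python
def required_veteran_rate(nonveteran_rate: float) -> float:
    """Minimum veteran service rate under the expectation that
    veterans be served at a rate 15 percent higher than nonveterans.
    (Illustrative sketch; not an official VETS formula.)"""
    if nonveteran_rate < 0:
        raise ValueError("rates must be nonnegative")
    return nonveteran_rate * 1.15

# The report's example: a 10 percent nonveteran placement rate
# implies a veteran placement rate of at least 11.5 percent.
target = required_veteran_rate(10.0)
print(f"required veteran placement rate: {target:.1f} percent")
```

The same multiplier applies to any of the five service categories; only the underlying rate changes.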
Veterans receive priority employment services at one-stop centers as required under the law, but the effectiveness of these services cannot be determined. Based on state-gathered data reported to VETS and interviews with state officials, we found that veterans generally received employment service at a higher rate than nonveterans. However, the effectiveness of these services is unknown because VETS lacks adequate outcome data such as information on job retention and wages. The only outcome data collected--the percentage of veterans served who entered employment--are often gathered inconsistently from state to state. Priority service to veterans at one-stop centers is usually demonstrated by the higher rates of service for veterans as compared with those for nonveterans. Most one-stop centers provide priority services to veterans through the DVOP and LVER staff who can provide an elevated level of service to veterans. Because veterans have these dedicated staff to serve them, they also receive more intensive services, and receive these services more readily, than nonveterans. Other examples of priority service include identifying and contacting qualified veterans before the universal population has access to employers' job openings that will be posted on the states' job database. States may have other special services exclusively for veterans, such as designated computers or special information packets on available resources. While priority service can be provided in different ways depending on the one-stop center, most state officials and one-stop center managers we spoke with said that they primarily used DVOP and LVER staff to provide priority service to veterans since these staff must assist veterans exclusively. DVOP and LVER staff members have smaller caseloads than other employment services staff and, consequently, have more time to spend with individuals.
Veterans also have better access to intensive services, such as counseling and case management, than nonveterans because DVOP and LVER staff are funded independently of WIA and are not subject to restrictions applicable to WIA-funded programs. According to many state officials as well as DVOP and LVER staff, the DVOP and LVER staff members relate better to veterans because they are generally veterans themselves. For example, because they are familiar with the processes at the Department of Veterans Affairs (VA), DVOP and LVER staff can more easily help veterans file disability claims with the VA or help them to receive the appropriate disability benefits. While veterans received priority employment services at one-stop centers, VETS does not currently collect appropriate data for determining the effectiveness of these services and the agency lacks sufficient employment outcome data that would indicate whether services provided to veterans were effective. VETS has proposed changes to its performance measures, such as requiring states to report job retention, but will not implement these changes until July 1, 2002. In past reviews, we have pointed out that VETS' use of relative standards comparing the percentage of veterans entering employment with that of nonveterans is not effective. This comparison results in states with poor levels of service to nonveterans being held to lower standards for service to veterans than states with better overall performance. The only outcome data that states currently report to VETS--the percentage of veterans entering employment after registering for employment services--is collected inconsistently from state to state. Some states compare their employment service registration records with unemployment insurance wage records, but others may simply call employers for employment verification or send postcards or letters to customers asking whether they have obtained employment. 
Some DVOP and LVER staff had more time than other employment and training staff for follow-ups by telephone or mail, resulting in more complete employment data for some veterans. In addition, states and local workforce investment areas choose to register customers at different stages of the job search process; thus, the percentage of "registered" veterans entering employment may differ based on when they were required to register. In some areas, customers register to use any service, including self-service; in other areas, they are only required to register when using staff-assisted services. Those who find employment before being registered are not counted as having entered employment after using self-service resources available through the one-stop center. Consequently, the reported percentage of veterans served who entered employment is not comparable from state to state. Despite recently proposed improvements to its performance measures, VETS' overall management of the DVOP and LVER grants is ineffective because the agency does not have a comprehensive system in place to manage state performance in serving veterans with these grants. VETS does not effectively communicate performance expectations to states because its goals and measures are unclear. In addition, the agency does not have meaningful incentives to encourage states to perform well. Furthermore, VETS is required by law to have federal staff in every state and to conduct annual on-site evaluations at every local office, but this monitoring is often unproductive. In order to oversee a program effectively, an agency must have a performance management system that establishes clear goals for those administering the program; however, VETS does not communicate a consistent message to states on expected performance. In fact, the agency does not have clear goals that it communicates to states or that it tracks with outcome data.
For example, while one agency goal is to provide high-quality case management to veterans, the agency does not have state performance measures for assessing the quality of case management provided to veterans. Furthermore, VETS' efforts to focus intensive services on those veterans most in need by "targeting" specific groups of veterans are unfocused. In its strategic plan, for case management and intensive services, the agency targets disabled veterans, minority veterans, female veterans, recently separated veterans, veterans with significant barriers to employment, special disabled veterans, homeless veterans, veterans provided vocational rehabilitation under the VA, and veterans who served on active duty in the armed forces under certain circumstances. This targeting includes nearly all veterans, not necessarily those most in need of service. The numerous categories of targeted veterans could result in the vast majority of veterans being targeted for case management. A VETS official said that the focus for service should be on veterans with the greatest needs as determined by the individual assessments because groups targeted on a national level do not necessarily correlate to the needs of veterans in particular states or local areas. Unnecessary performance measures from VETS add to the DVOP and LVER workload without measuring quality of service to veterans. For example, some state and VETS officials we spoke with expressed concern about having performance measures that specifically focus on service to Vietnam-era veterans. These veterans make up a small percentage of the workforce, due in part to the fact that many are at or near retirement age and may not be seeking employment; yet DVOP and LVER staff may spend much of their time trying to identify and serve this group of veterans in order to meet VETS' performance goals. State officials also identified one of VETS' performance measures that should be eliminated.
VETS requires that Vietnam-era veterans, special disabled veterans, and veterans who served on active duty under certain circumstances be placed in jobs on the Federal Contractor Job Listing. To do this, in addition to identifying qualified job candidates from this pool of veterans, DVOP and LVER staff must monitor local federal contractors to make sure that they are listing their job opportunities with the one-stop centers on the Federal Contractor Job Listing and hiring these veterans. Because the presence of federal contractors in a given state or local area is unpredictable and is determined by the federal agencies awarding contracts, state employment service officials said the federal contractor measure should be eliminated. It is the responsibility of contractors to list their job openings, and the Office of Federal Contract Compliance Programs is responsible for ensuring that these companies list their jobs with state employment service offices and take affirmative action to hire qualified veterans. Eliminating this performance measure would allow DVOP and LVER staff members more time to focus on the employment needs of individual veterans rather than compliance issues under the purview of another federal agency. For effective oversight, in addition to having clear goals, an agency must provide incentives for meeting the goals, and VETS' performance management system lacks meaningful incentives to encourage states to perform well. Presently, states are neither rewarded for meeting or exceeding their performance measures, nor penalized for failing to meet these measures. If a state fails to meet its performance measures, VETS simply requires the state to develop a corrective action plan to address the deficiencies in that state and there are no financial repercussions. States will not lose funding for failing to adequately serve veterans, and an agency official noted that taking funds away from a state would ultimately deny services to veterans.
On the other hand, VETS does not enforce fiscal compliance with the grants, and a state can overspend DVOP or LVER funds and submit a grant modification requesting additional funds. A VETS official suggested that if the grants were awarded through a competitive bid process within states, the grantees might have a greater incentive to improve services to veterans. To provide effective oversight, an agency must also gauge the quality of service offered by the program and monitor the programs' progress. As prescribed by the law, VETS has federal staff in every state to monitor, along with other duties, the DVOP and LVER grants. However, this federal monitoring effort, which includes on-site evaluations at every local office, is often unproductive, and state officials characterize the DVOP and LVER grants as being "micro-managed" by VETS. The agency's annual on-site evaluations of employment services offices that we observed or whose reports we reviewed produced few substantive findings by VETS staff. Furthermore, according to some state officials, these evaluations have little or no effect on how DVOP and LVER staff members perform their duties. Finally, we found multiple problems with VETS' monitoring efforts. For example, because states generally monitor performance at one-stop centers, including the DVOP and LVER grants, VETS' monitoring can be redundant. VETS' requirement for annual on-site monitoring may also be unnecessary for those offices that exceed their performance expectations. In addition, VETS' oversight may result in confusion about the lines of authority between the federal and state monitoring staff and the DVOP and LVER staff, who are state employees. Also, VETS' monitoring is often inconsistent because operational manuals are outdated, training of monitoring staff is limited, and interpretations of the law differ among staff.
According to the state and local officials we interviewed, the DVOP and LVER grant programs do not always operate well in one-stop centers. DVOP and LVER programs continue to operate under a law established prior to WIA, and states do not have the same flexibility granted under WIA to design their services for veterans in a way that best meets the needs of employers and veterans. Because of statutory requirements, states cannot, in all cases, assign DVOP and LVER staff to where the staff is most needed. For example, the law prescribes how to assign DVOP and LVER staff to local offices and does not give states the flexibility to move staff to locations where state and local officials believe veterans could best be served. This restriction may result in too many staff in some areas and too few in other areas. In addition, because DVOP and LVER grants are separate funding streams, states have little flexibility in staffing decisions. If a state does not spend all of its grant money, it returns the extra funding, and VETS redistributes the unspent funds to states that request additional funding. A state that overspends in its DVOP program but spends less than its allocation in the LVER program would have to use other funds to cover the amount overspent in the DVOP program, and VETS would take back the additional LVER grant money. The state may request more money from VETS for its DVOP program, but there is no guarantee that it will get the additional funding. States are also constrained when it comes to deciding what DVOP and LVER staff members do and whom they serve. The law specifies the separate duties for DVOP and LVER staff, although we found that they generally performed similar duties. Furthermore, DVOP and LVER staff members may not serve certain individuals who may qualify for veteran services under other employment and training programs.
The law governing the DVOP and LVER programs defines veterans eligible for employment assistance more narrowly than WIA or VETS for its other veterans' activities. Because of this more restricted definition, DVOP and LVER staff are not allowed, for example, to serve veterans who were on active duty for 180 days or less, and they are not permitted to serve Reservists or National Guard members. Another sign that the DVOP and LVER grants are not well integrated into the one-stop environment is that the funding year for DVOP and LVER programs does not coincide with the funding year for other employment programs offered in the one-stop center system. The appropriation to fund the DVOP and LVER grants is made available on a federal fiscal year basis--October 1 through September 30--while other employment programs and states operate on a program year basis--July 1 through June 30. Having Labor programs' funding streams on different schedules is burdensome for states and makes the budgeting process more complicated. VETS has taken a more reactive rather than proactive approach to adapting to the one-stop system and has not taken adequate steps to adapt the DVOP and LVER programs to the new environment. For example, instead of coordinating with other programs to determine how best to fit the DVOP and LVER programs into the one-stop system, VETS officials reported that they are waiting to see how states implement their programs and will then decide how to integrate the staff or adjust their programs. VETS has required states to sign an agreement to ensure that veterans will continue to receive priority services, but these agreements contained no insightful information about how DVOP and LVER staff might serve veterans within this new one-stop center environment. VETS has not developed practices for operating within the one-stop system or adequately shared innovative ways to help veterans find and retain jobs. 
Because of outdated policies and procedures, DVOP and LVER staff in many states may continue to operate separately as if they were in the old employment services system and continue to assume duties very similar to those they had in the old employment services system. Consequently, they fail to adapt to the new workforce environment created by WIA. According to one-stop managers we interviewed, this failure to adapt may diminish the quality of services to veterans. While the Congress has clearly defined employment service to veterans as a national responsibility, the law has not been amended to reflect the recent changes in the employment and training service delivery system introduced by WIA. The prescriptive nature of the law also creates a one- size-fits-all approach for service delivery, mandating many of the DVOP and LVER program activities and requirements. This approach is ineffective because it does not account for the fact that each state and one-stop center may have a different approach to satisfying the needs of local employers as well as different types of veterans who may need employment assistance. Although the law stipulates separate roles and responsibilities for DVOP and LVER staff, they perform similar duties and may not need to be separately funded. The law that governs VETS also stipulates how grant funds and staff must be allocated as well as how the grants should be monitored. These requirements hamper VETS' ability to consider alternative ways of administering or overseeing the grants. Furthermore, the law requires that VETS report annually on states' performance for serving veterans relative to serving nonveterans, which may not be a good indicator if a state serves its nonveteran population poorly. The law also requires VETS to report on requirements pertaining to the Federal Contractor Job Listing and this detracts DVOP and LVER staff members from serving veterans. 
While VETS' vision is to find innovative ways to assist veterans with employment, it has not been proactive in helping DVOP and LVER staff become an integral part of the one-stop center environment. The new one- stop center system, while giving veterans priority for employment services, gives states flexibility in planning and implementing employment and training systems and holds them accountable for performance. However, VETS has not taken steps to adjust to this new environment. The agency has not updated its oversight guidelines of staff training procedures to ensure consistent and effective monitoring of the DVOP and LVER programs within the one-stop centers. VETS has not established clear performance goals for states, nor has it given states the flexibility to decide how best to serve their veteran population. VETS has proposed ways of improving performance measures, but these measures have not yet been implemented. VETS has not proposed any incentives to hold states accountable for meeting performance goals. Our report recommended that the Secretary of Labor direct VETS to establish more effective management and monitoring of the DVOP and LVER programs by allowing states flexibility in planning how to best serve veterans, while at the same time holding states accountable for meeting the agency's goals and expectations. Specifically, our report recommended that the Secretary of Labor implement a more effective performance management system as soon as possible and take steps to ensure that the DVOP and LVER programs are more effectively monitored. In addition, because title 38 limits the amount of flexibility that VETS can grant to states, we recommended that Congress consider how the DVOP and LVER programs best fit in the current employment and training system and take steps to ensure that these programs become more fully integrated into this new environment. 
These steps may include updating the applicable law to provide more flexibility and taking other actions such as eliminating certain requirements and adjusting the DVOP and LVER grant funding cycle to correspond with that of other programs. Specifically, we suggested that the Congress consider revising title 38 to provide states and local offices more discretion to decide where to locate DVOP and LVER staff and provide states the discretion to have half-time DVOP positions; allow VETS and/or states the flexibility to better define the roles and responsibilities of staff serving veterans instead of including these duties in the law; combine the DVOP and LVER grant programs into one staffing grant to better meet states' needs for serving veterans; provide VETS with the flexibility to consider alternative ways to improve administration and oversight of the staffing grants, for example, eliminating the prescriptive requirements for monitoring DVOP and LVER grants; eliminate the requirement that VETS report to the Congress a comparison of the job placement rate of veterans with that of nonveterans; and eliminate the requirement that VETS report on Federal Contractor Job Listings. | The Department of Labor's (DOL) Disabled Veterans' Outreach Program (DVOP) and Local Veterans' Employment Representative (LVER) program allow states to hire staff members to serve veterans exclusively. The two programs are mandatory partners in the new one-stop center system created in 1998 by the Workforce Investment Act, which requires that services provided by numerous employment and training programs be made available through one-stop centers. The act also gives states the flexibility to design services tailored to local workforce needs. Although the DVOP and LVER programs must operate within the one-stop system, the act does not govern the programs--and the law that governs them does not provide the same flexibility that the act does. 
Because Congress sees employment service for veterans as a national responsibility, it established the Veterans' Employment and Training Service (VETS) to ensure that veterans, particularly disabled veterans and Vietnam-era veterans, receive priority employment and training opportunities. To make better use of DVOP and LVER staff services, VETS needs the legislative authority to grant each state more flexibility to design how this staff will fit into the one-stop center system. VETS also needs to be able to hold states accountable for achieving agreed upon goals. Veterans receive priority employment service at one-stop centers as required under the law, but the effectiveness of the services, as indicated by the resulting employment, cannot be determined because VETS does not require states to collect sufficient data to measure outcomes. VETS does not adequately oversee the DVOP and LVER program grants because it does not have a comprehensive system in place to manage state performance in serving veterans. VETS has not adequately adapted the DVOP and LVER programs to the new one-stop environment and determined how best to fit them into the one-stop system. | 4,747 | 381 |
For nearly 25 years, the United States has provided the Cuban people with alternative sources of news and information. In 1983, Congress passed the Radio Broadcasting to Cuba Act to provide the people of Cuba, through Radio Marti, with information they would not ordinarily receive due to the censorship practices of the Cuban government. Subsequently, in 1990, Congress authorized BBG to televise programs to Cuba. According to BBG, the objectives of Radio and TV Marti are to (1) support the right of the Cuban people to seek, receive, and impart information and ideas through any media and regardless of frontiers; (2) be effective in furthering the open communication of information and ideas through use of radio and television broadcasting to Cuba; (3) serve as a consistently reliable and authoritative source of accurate, objective, and comprehensive news; and (4) provide news, commentary, and other information about events in Cuba and elsewhere to promote the cause of freedom in Cuba. OCB employs several avenues to broadcast to Cuba, including shortwave, AM radio, and television through various satellite providers and airborne and ground-based transmitters (see fig. 1). IBB's international broadcasters generally must comply with the provisions of the U.S. Information and Educational Exchange Act of 1948 (commonly known as the Smith-Mundt Act), as amended, which bars the domestic dissemination of official American information aimed at foreign audiences. In 1983, however, the Radio Broadcasting to Cuba Act authorized the leasing of time on commercial or noncommercial educational AM radio broadcasting stations if it was determined that Radio Marti's broadcasts to Cuba were subject to a certain level of jamming or interference. Similarly, in 1990, the Television Broadcasting to Cuba Act authorized BBG to broadcast information to the Cuban people via television, including broadcasts that could be received domestically, if the receipt of such information was inadvertent. 
BBG has interpreted the act to allow OCB to use domestic television stations. In fiscal year 2007, OCB obligated over $35 million in support of its mission. As shown in figure 2, OCB obligated about 50 percent of this amount for salaries, benefits, and travel for OCB employees and 41 percent for mission-related contracting efforts. OCB obligated nearly $3 million to procure talent services. Federal statutes require, with certain limited exceptions, that contracting officers promote and provide for full and open competition in soliciting offers and awarding government contracts. The FAR states that full and open competition, when used with respect to a contract action, means that all responsible sources are permitted to compete. The process is intended to permit the government to rely on competitive market forces to obtain needed goods and services at fair and reasonable prices. When not providing for such competition, the contracting officer must, among other things, justify the reason for using other than full and open competition, solicit offers from as many potential sources as is practicable under the circumstances, and consider actions to facilitate competition for any subsequent acquisition of the supplies or services. For contracts that do not exceed the simplified acquisition threshold--currently $100,000, with limited exceptions--contracting officers are to promote competition to the maximum extent practicable. In December 2006, IBB awarded contracts to two Miami-based radio and television broadcasting stations, Radio Mambi and TV Azteca, to broadcast Radio and TV Marti programming, respectively. IBB justified the use of other than full and open competition on the basis of two specific statutory authorities cited in the FAR--that there was only one responsible source capable of meeting the agency's needs and that there was an unusual and compelling urgency to award the contracts. Table 1 provides selected information on the two contracts.
OCB's talent services contracts typically fall below the simplified acquisition threshold and therefore are solicited and awarded directly by OCB. OCB generally awards each talent services contractor a blanket purchase agreement, which provides OCB a simplified method of obtaining specific services as needed during the course of the year. On a quarterly basis, OCB places orders against the agreements, specifying the anticipated amount of services required during that period. While certain competition requirements do not apply below the simplified acquisition threshold, contracting officers are to promote competition to the maximum extent practicable. IBB's approach for awarding the Radio Mambi and TV Azteca contracts did not reflect sound business practices in certain key aspects. IBB's approach was predicated on the confluence of several interrelated events--ongoing interagency deliberations, the issuance of a July 2006 report by the Commission for Assistance to a Free Cuba, and concerns about the health of Fidel Castro. According to BBG and IBB officials, these events required a course of action to obtain additional broadcasting services to Cuba quickly by using other than full and open competition. In certain respects, however, IBB did not document in its contract files key information or assumptions underlying its decisions not to seek competitive offers or to limit the number of potential providers it considered, nor the basis it used to negotiate the final prices for the services provided. In addition, IBB did not actively involve its contracting office until just prior to contract award. Finally, while justifying the December 2006 award of the two contracts on the basis of urgent and compelling need and the determination that only one source would meet its minimum needs, IBB chose to exercise multiple options on the two contracts to extend their period of performance into 2008 and has only recently taken steps to identify additional providers.
Our prior work has found that establishing a valid need and translating that into a well-defined requirement is essential for federal agencies to obtain the right outcome. Our review of IBB's contract files and interviews with program and contracting officials identified several interrelated events that established the need to increase radio and television broadcasting to Cuba. BBG and IBB officials noted that beginning in the spring of 2006, agency officials were involved in interagency discussions with officials from the Department of State, the Department of Defense, the National Security Council, the U.S. Agency for International Development, and other agencies on the need to expand broadcasting options. These discussions coincided with the issuance of the July 2006 report by the Commission for Assistance to a Free Cuba, which recommended funding the transmission of TV Marti by satellite television. The report did not provide a time frame in which this was to be completed, nor did it address expanding radio broadcasting. Additionally, BBG and IBB officials noted that shortly after the release of the report there was widespread concern about the health of Fidel Castro and that his death could result in unrest in Cuba, adding to the urgency to find additional ways to broadcast news to Cuba. IBB officials told us that based on their internal deliberations and discussions with the other agencies involved, it was clear that the expectation was for IBB to quickly identify additional broadcasters, including radio providers, and award the resulting contracts to address potential unrest in the event of a transfer of power. IBB officials were unable to provide documentation of certain classified aspects of the deliberations, or of the specific time frame in which these activities were to be completed. Given this expectation, IBB subsequently decided against seeking competitive offers from radio and television broadcasters.
IBB officials told us they believed they could not do so for several reasons: (1) concerns that publicly seeking competitive offers would not yield responses from potential service providers that met its needs; (2) that advertising its plans would alert the Cuban government to IBB's intentions, which might enable Cuba to jam the new broadcasts; and (3) that IBB had not discussed its plans with cognizant congressional committees and, in particular, its efforts to comply with the Smith-Mundt Act and other relevant legislation. IBB officials determined that they would limit the number of providers they would consider and quickly developed a set of basic requirements that broadcasters would need to meet. For radio, IBB wanted a Spanish-language station with the strongest AM signal to reach as much of Cuba as possible. To identify one, officials with IBB's Office of Marketing and Program Placement stated they reviewed a prior consulting study and broadcasting databases maintained by the Federal Communications Commission, and consulted with OCB on Cuban listening habits. For television, IBB wanted a station with a limited domestic audience and one that had a contract with DirecTV, since the DirecTV signal can be received in Cuba. At IBB's request, OCB provided a list of Miami channels carried by DirecTV, highlighting three Spanish-language stations and one English-language station for consideration. BBG and IBB officials subsequently told us that their decision to limit television broadcasters to the Miami area was based on information indicating that DirecTV receivers in Cuba likely came from the Miami area and therefore were programmed to receive only Miami television stations. A senior IBB official provided IBB's Office of Engineering a list of four radio and three television stations and requested that the office assess the extent to which the television stations' signals were viewable in the United States and the extent to which the radio stations' signals would reach Cuba.
Office of Marketing and Program Placement officials then made two trips to Miami to meet with these stations and determine their willingness to broadcast Radio and TV Marti programming. In making their recommendation for a radio broadcaster to a senior IBB official, Marketing and Program Placement officials concluded that Radio Mambi provided the most powerful signal among those stations surveyed that could reach most of Cuba. IBB officials acknowledged, however, that this station was likely jammed in Havana, as there is a Cuban station that broadcasts on the same frequency as Radio Mambi. IBB officials believed that broadcasting on two frequencies that cover most of Cuba would be the most effective way to overcome the Cuban government's jamming efforts. In recommending a television broadcaster, Marketing and Program Placement officials initially recommended a television station (other than TV Azteca) based on the station's verbal offer to split its DirecTV signal, enabling it to broadcast TV Marti programming only to Cuba and not to domestic audiences. According to IBB officials, the broadcaster subsequently withdrew its offer once it determined that it could not split its DirecTV signal and was unwilling to sell time for which it already had programming. Consequently, IBB decided to contract with TV Azteca. While TV Azteca's broadcasts could be viewed by domestic audiences, IBB officials believed that since its signal covered a small domestic area, it better met the intent of the Smith-Mundt Act to limit the extent to which broadcasts intended for foreign audiences could be received domestically. In a sole-source environment, the government cannot rely on market forces to determine a fair and reasonable price and therefore must conduct market research to do so.
As part of its market research, IBB officials in August and September 2006 asked a consultant to gather pricing information from the radio and television stations being considered without identifying IBB as the potential buyer. The consultant forwarded quotes from seven radio and television stations, which varied significantly in the dates, time slots, and prices offered. For instance, the pricing information quoted by the television stations was based on one-half hour "infomercials," which IBB officials believed was useful for gauging the relative prices the stations might offer but of only limited value for negotiating specific prices for the actual programming IBB sought to broadcast. While the contract files did not document the basis by which IBB determined the final prices, IBB officials stated that after they had selected Radio Mambi and TV Azteca, the stations provided quotes for various broadcast times, which IBB officials used to reach a final price agreement. According to BBG's acquisition regulations, programming offices should discuss a prospective request for other than full and open competition with the contracting office as early as possible during the acquisition planning stage. The regulations note that these discussions may resolve uncertainties, provide offices with names of other sources, and allow proper scheduling of the acquisition, among other benefits. Further, our prior work has found that to promote successful acquisition outcomes, stakeholders with the requisite knowledge and skills must be involved at the earliest point possible. This helps ensure that the acquisition is executable and tailored to a level of risk commensurate with the individual transaction.
We found, however, that neither the contracting office nor the legal office was actively involved in developing the acquisition strategies for the radio and television broadcasting services, nor were these offices involved in developing or reviewing the terms and conditions until very late in the acquisition process. For example, according to IBB officials, the substance of the agreements with the radio and television broadcasters was generally completed by mid-October 2006. However, the contracting officer who awarded the contracts indicated he was not made aware of the planned acquisition until Friday, December 1. At that time, IBB notified the contracting officer to prepare to award the contracts as early as the following Monday, based on the terms and conditions that Marketing and Program Placement officials had agreed to with the broadcasters. According to representatives from IBB's contracting and legal offices, they had been unaware of the proposed contract actions until that time, precluding their ability to provide input into the acquisition strategy or to assess the potential for conducting a more robust competition. As a result, the contracting office's role appeared to be limited to verifying the terms and conditions that the programming officials had reached with the broadcasters during their internal assessment. When agencies cite urgency as the basis for using other than full and open competition, the FAR requires them to describe the actions, if any, they will take to remove barriers to competition before any subsequent acquisition for the services is required. IBB, however, had taken few steps to determine how it might compete future broadcasting requirements. Rather, IBB extended the radio broadcasting contract by just over 8 months, through February 2008, when it ended the contract due to budget constraints. Similarly, IBB exercised two options to extend the television broadcasting contract by a total of 12 months, to June 2008.
IBB's contracting officer told us that, in his opinion, by December 2007 IBB had sufficient knowledge of its requirements, and sufficient time, to plan for and conduct a full and open competition if IBB continued to require these services. As an interim measure, on April 25, 2008, agency officials advertised in Federal Business Opportunities their intention to exercise the final option with TV Azteca to extend contract services into December 2008. In the notice, IBB officials identified the specific times during which TV Marti programming was being broadcast and the other services provided by TV Azteca, and requested that interested firms submit adequate documentation of their capability to provide these services. While the notice did not constitute a solicitation and IBB was not seeking proposals, quotes, or bids, the notice indicated that if IBB received responses it might consider competing the contract rather than exercising the option. IBB received no responses to the notice within the 15-day period allowed.

OCB's practices provide limited visibility into key steps in soliciting, evaluating, and selecting its talent services contractors. In that regard, OCB does not require managers to document instances in which resumes were received from sources other than formal solicitations, nor does it require managers to document their evaluation of the resumes received. IBB officials told us they would expect that, pursuant to their guidance, the contract files would contain such documentation. Additionally, OCB relies on the rates provided in IBB's Contracting for Talent and Other Professional Services Handbook when justifying what it pays for talent services. The usefulness of the handbook's pricing guidance, however, may be limited because the rates in the handbook are neither current nor based on the local Miami market and because OCB, at times, has reduced the rates it pays due to budget constraints.
In general, OCB officials believe that they are paying their talent services contractors below market rates. Both the FAR and IBB guidance require that contracts be competitively solicited and awarded. To identify qualified contractors, OCB seeks resumes through three different means of solicitation: (1) quarterly Federal Business Opportunities notices, (2) annual advertisements in the Miami Herald newspaper, and (3) public building notices in OCB's lobby. These solicitations generally identify the wide range of services OCB requires annually, but do not specify the amount of work required or when the work may be needed. According to OCB officials, these solicitations result in a continuous stream of resumes throughout the year that are directed to its contracting office. Overall, an OCB official estimated OCB received over 600 resumes in 2006. Contracting officials group the resumes into talent and production services categories and distribute them to the pertinent radio, television, and technical managers. OCB's practices, however, provide limited visibility into the source of the resumes it receives, including those that may be received from outside the formal solicitation processes. In 31 of the 37 contract files we reviewed, OCB provided copies of all three formal solicitations to document compliance with competitive solicitation requirements. In at least three instances, however, the file contained this documentation even though the managers we interviewed stated that the resume was obtained through a recommendation from an OCB employee or contractor. In one case, for example, a manager noted that the broad nature of the solicitation did not provide suitable candidates for a specific requirement. Consequently, the manager solicited referrals from colleagues and through this means found a contractor who met the requirement. 
All seven of the managers we spoke with indicated that they have received, at one point or another, resumes from outside the formal solicitation process. A senior OCB official stated that OCB does not, however, require program managers to document when resumes are received outside of the formal solicitation process. A senior IBB contracting official stated that any resume received informally should be sent to OCB's contracting office, which in turn should distribute it to the relevant managers for consideration along with all of the other resumes received. Further, OCB managers do not document their evaluation of the resumes they review or their rationale for selecting one contractor over another. After receiving resumes from the contracting office, the managers are responsible for evaluating the resumes and, in turn, recommending contractors for award. Six managers said that they reviewed the resumes they received, though to differing degrees. For example, some managers indicated that they always reviewed the resumes obtained through OCB's formal solicitations when selecting a contractor, though none documented their assessments. On the other hand, three managers indicated that they have selected contractors based on the recommendation of an OCB employee without reviewing other resumes. While each of the 37 contract files we reviewed documented the rationale for selecting the contractor, there was no documentation to indicate that other potential contractors were considered. The Contracting for Talent and Other Professional Services Handbook requires, however, that contracting personnel maintain contract files, which must contain a justification for the contractor selected along with an evaluation of prospective contractors. OCB contracting officials told us that the guidance was not clear on how OCB was to meet this requirement.
Senior IBB contracting officials with whom we spoke told us that, pursuant to the guidance in the handbook, they would expect to see documentation in OCB's contract files of all the contractors considered and the rationale for selecting one contractor over another. After selecting a talent services contractor for award, OCB managers generally rely on the price ranges established in IBB's handbook to justify the price OCB will pay for the service. In that regard, we found that the rates for each of the 37 contracts we reviewed were within or below the rates established in IBB's handbook. OCB managers explained that the rates actually paid may fall below IBB's guidance because OCB's budgetary resources limit what it can afford to pay for these services. For example, an OCB manager told us that in February 2008 the decision was made to reduce programming costs, including decreasing the rates paid to many contractors, to stay within OCB's budget. The usefulness of the handbook's pricing guidance may be further limited given that the rates reflected in the handbook are not based on the local Miami market and are not current. For example, the market research used to support the rates for on- and off-camera performers was dated between October 2000 and February 2001 and referenced prices only in the Washington, D.C., and Baltimore, Maryland, areas. IBB indicated that it is in the process of determining how to update its handbook, including its guidance on how managers and contracting officers are to use the rates when establishing prices for specific contracts. For their part, OCB officials believed that they were paying less than the local market rate for talent services. Competition is a fundamental principle underlying the federal acquisition process, as it allows federal agencies to identify contractors who can meet their needs while relying on market forces to obtain fair and reasonable prices.
The competition laws and regulations provide agencies considerable flexibility to use noncompetitive procedures, if adequately justified, to meet their needs and permit agencies to use less rigorous procedures for lower-dollar acquisitions. In certain respects, however, the practices IBB and OCB used to award the contracts we reviewed lacked the discipline required to ensure transparency and accountability for their decisions. IBB did not fully document information or assumptions underlying its decisions, involve its contracting office in a timely manner, or actively take steps to promote competition on future efforts. Similarly, OCB's practices do not fully adhere to the requirements established by IBB's handbook to document important steps in soliciting and awarding talent services contracts, in part due to questions about how to meet the handbook's requirements. Furthermore, the pricing guidance in the handbook may be of limited use as a tool to justify prices paid to talent services contractors. Collectively, these weaknesses underscore the need for IBB and OCB to improve their practices to enhance competition, improve transparency, and ensure accountability. To better inform acquisition decisions, improve transparency, and ensure that competition is effectively utilized, we recommend that the Broadcasting Board of Governors direct IBB to take the following three actions: (1) reinforce existing requirements to fully document information and assumptions supporting key decisions, such as when awarding contracts using other than full and open competition; (2) reinforce existing policy for its programming staff to involve contracting personnel at the earliest possible time during the acquisition planning stage; and (3) plan for full and open competition on any future contracts for radio and television broadcasting services that exceed the simplified acquisition threshold.
With respect to improving IBB's guidance governing contracts for talent services, we recommend that the Broadcasting Board of Governors direct IBB to take the following two actions: clarify requirements in IBB's Contracting for Talent and Other Professional Services Handbook on the receipt and evaluation of resumes and ensure that OCB's practices are consistent with IBB's guidance, and determine how the pricing guidance in IBB's handbook could better meet users' needs as part of its planned revision to the handbook. In written comments on a draft of this report, BBG did not formally comment on our recommendations. BBG subsequently informed us that it did not take exception to our recommendations and has begun to take steps to implement them. In its written comments, BBG expressed concern that the draft report title may be misconstrued as an evaluation of the overall fitness of the agency's contracting efforts. We modified the draft report title for additional clarity. BBG also noted that we did not have access to certain classified information, which BBG officials believed prevented them from fully illustrating the sense of urgency that surrounded their efforts to award the broadcasting contracts. We noted in the draft report that BBG was unable to provide documentation of certain classified aspects of the deliberations, but we did not question BBG's determination that there was an urgent and compelling need to award the broadcasting contracts. Rather, we noted that the agency failed to follow sound practices in such areas as documentation, stakeholder involvement, and planning for future competition, practices that are required by federal or agency acquisition regulations, and were not related to or dependent on BBG's disclosure of classified information. BBG also provided additional context for the actions it took in awarding the broadcasting contracts and OCB's processes for awarding talent services. 
We believe the draft report reflected this information, but have, where appropriate, incorporated BBG's comments. These comments are reprinted in appendix II. BBG also provided technical comments, which we incorporated where appropriate. As agreed with your office, unless you publicly announce the contents of this report earlier, we plan no further distribution until 15 days from the report date. At that time, we will send copies of this report to interested congressional committees; the Broadcasting Board of Governors; the Executive Director, Broadcasting Board of Governors; the Director, Office of Cuba Broadcasting; the Secretary of State; and the Director, Office of Management and Budget. This report will also be made available to others on request. This report will be available at no charge on GAO's Web site at http://www.gao.gov. If you or your staff have any questions about this report or need additional information, please contact me at (202) 512-4841 or [email protected]. Contact points for our Office of Congressional Relations and Public Affairs may be found on the last page of this report. Staff acknowledgments are listed in appendix III. Our objectives were to evaluate the processes used (1) by the International Broadcasting Bureau (IBB) to award the Radio Mambi and TV Azteca contracts, and (2) by the Office of Cuba Broadcasting (OCB) to award its talent services contracts. For the purposes of this review, talent services contracts refer to those contracts awarded by OCB for writers, performers, program hosts, reporters, and technical support required to produce and broadcast radio and TV news and entertainment programming. To determine the laws and regulations governing the award of these contracts, we reviewed the Competition in Contracting Act, the Federal Acquisition Regulation, the Broadcasting Board of Governors' (BBG) Acquisition Regulations, and IBB's Contracting for Talent and Other Professional Services Handbook. 
Collectively, these provide guidance applicable to IBB and OCB on soliciting, evaluating, and awarding contracts above and below the simplified acquisition threshold of $100,000. We also reviewed the Smith-Mundt Act, as amended, as well as the Radio Broadcasting to Cuba Act and the Television Broadcasting to Cuba Act to determine the authority by which OCB may broadcast radio and television programming to Cuba. We did not specifically assess whether the award and the terms and conditions of the broadcasting contracts were in compliance with these acts. To evaluate the processes used by IBB to award the Radio Mambi and TV Azteca contracts, we reviewed the contract files to determine the information and assumptions supporting IBB's decisions leading to the award of the two contracts in December 2006. As both contracts were awarded using other than full and open competition, we reviewed IBB's justification and approval documents and other unclassified documentation supporting the solicitation process and award decision, including the July 2006 report of the Commission for Assistance to a Free Cuba. We also interviewed officials in IBB's offices of Marketing and Program Placement, Engineering, and Contracts, as well as officials from BBG's offices of General Counsel and Congressional Relations, to determine their roles and responsibilities to identify potential service providers and to negotiate and award the two contracts. Additionally, we interviewed the Director, OCB, and other senior OCB officials, as well as officials from the Department of State, to obtain information on their involvement with the award of these contracts. To assess the processes used by OCB to award its talent services contracts, we compiled information from the Federal Procurement Data System-Next Generation on the contracts awarded by OCB from fiscal years 2005 through 2007. This analysis identified 723 contracts or contract actions valued at over $3,000 for various goods and services. 
We then selected a stratified random selection of 37 talent services contracts-- examining at least 10 from each year--for a more in-depth review. Because of our sample size, the results of our analysis of these contracts can not be generalized to describe the process used to award all of OCB's contracts. Specifically, we reviewed the contract files to determine the extent to which the files contained documentation of the process used to solicit and evaluate resumes from potential talent services contractors. We also analyzed how the rates paid to the contractors compared against the rates recommended by IBB's handbook. We also interviewed OCB program managers and senior contracting officials to obtain insight into how OCB determined its requirements and selected talent services contractors. We interviewed senior IBB officials to obtain information on how IBB's handbook was developed and the procedures that OCB should follow when awarding talent services contracts. To provide context for OCB's contracting activities, we analyzed budget and financial data provided by BBG's Chief Financial Officer for fiscal year 2007 and verified our summary of the information using budget activities data provided by OCB's Director of Administration. We conducted this performance audit from February 2008 through June 2008 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. The following are GAO's comments on the Broadcasting Board of Governors' letter dated July 2, 2008. 1. As noted by BBG, the draft report provided to BBG correctly characterized the scope of GAO's review. 
In that regard, we reviewed two broadcasting contracts awarded by IBB on behalf of OCB. Similarly, the draft report noted that the results of our analysis of OCB's talent contracts are not generalizable to the processes used by OCB to award all of its contracts. Our work, however, does provide a sound basis for discussing OCB's processes for awarding its talent services contracts, which are essential for providing the on-air talent, writers, and technical support services needed to produce and broadcast its programming. We modified the draft report title for additional clarity. 2. We noted in the draft report that BBG was unable to provide documentation of certain classified aspects of deliberations involving several agencies, including the National Security Council and the Departments of Defense and State. BBG officials stated that they were not authorized to disclose this information. As a result, BBG officials expressed concern that they were unable to fully illustrate the sense of urgency that surrounded their contracting efforts. We do not see this as a limitation to our scope, however, as we did not question BBG's determination that there was an urgent and compelling need to award the broadcasting contracts. Rather, we noted that the agency failed to follow sound practices in such areas as documentation, stakeholder involvement, and planning for future competition, which are required by federal or agency acquisition regulations, and are not related to or dependent on BBG's disclosure of classified information. We also note that the July 2006 report by the Commission for Assistance to a Free Cuba is unclassified and publicly available on the Commission's Web site. 3. We believe the draft report appropriately reflected the context and process used by IBB to identify and award the radio and television broadcasting contracts. 
We do note, however, that the contract files do not make reference to the 2002 engineering study; rather, IBB officials provided that information during the course of our review to supplement the information in the files. We also note that the 2002 engineering report discussed only radio stations, and not television stations. In that regard, we found that OCB identified four local Miami television stations carried on DirecTV for IBB's consideration in August 2006. 4. We stated in the draft report that as an interim measure to conducting a full and open competition, on April 25, 2008, agency officials advertised in Federal Business Opportunities their intention to exercise the final option with TV Azteca to extend contract services into December 2008. We note, however, that agency officials had not taken any action in this regard until we brought it to their attention during the course of our review that they were not in compliance with the notice requirements prescribed by the Federal Acquisition Regulation. 5. We believe the draft report appropriately reflected the context and process used by IBB to identify and award the radio and television broadcasting contracts. As the draft report noted, however, IBB did not document in its contract files key information or assumptions underlying its decision not to seek competitive offers, to limit the number of potential providers it considered, and to limit the basis used to negotiate final prices for the services provided. In these cases, IBB officials supplemented the information contained in the contract files by providing information and e-mails from their personal files. We do note that BBG's description of TV Azteca as the best overall value to the government (factoring the broadcast schedule times, surrounding programming, as well as cost) is somewhat inconsistent with the information contained in the contract files and subsequently provided by IBB. 
Our review found that TV Azteca was the only television station with a feasible offer after a preferred station withdrew its offer, and thus became the basis for IBB's determination that only one responsible source could meet its needs. In addition to the contact above, Timothy J. DiNapoli, Assistant Director; Katherine Trimble; Justin Jaynes; Leigh Ann Nally; Julia Kennon; and John Krump made key contributions to this report. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. | The United States has long provided the Cuban people with alternative sources of news and information. As part of this effort, in December 2006 the International Broadcasting Bureau (IBB) awarded sole-source contracts to two Miami radio and television stations--Radio Mambi and TV Azteca--to provide additional broadcasting options. Additionally, the Office of Cuba Broadcasting (OCB) annually awards millions of dollars in contracts for talent services--writers, reporters, and technical support--needed to produce and broadcast news and entertainment programming. GAO evaluated the processes used to award (1) the Radio Mambi and TV Azteca broadcasting contracts, and (2) talent services contracts. We reviewed contract files and other documentation and interviewed program managers and contracting officers to determine the process used to award the two broadcasting contracts and a nongeneralizable selection of 37 talent services contracts. IBB's approach for awarding the Radio Mambi and TV Azteca contracts did not reflect sound business practices. 
According to officials from IBB and the Broadcasting Board of Governors--IBB's and OCB's parent organization--the confluence of several interrelated events--ongoing interagency deliberations, the issuance of a July 2006 report by a Cabinet-level commission, and concerns about the health of Fidel Castro--required them to quickly obtain additional broadcasting services to Cuba. Competition laws and regulations provide agencies considerable flexibility to use noncompetitive procedures, if adequately justified, to meet their needs. In certain respects, however, IBB did not fully document in its contract files key information or assumptions underlying its decisions to not seek competitive offers, limit the number of potential providers it considered, or the basis used to negotiate the final prices for the services provided. Additionally, IBB did not actively involve its contracting office until just prior to contract award, though agency regulations and our prior work identify that timely involvement by stakeholders helps promote successful acquisition outcomes. Finally, though it partly justified its awards based on urgency, IBB exercised multiple options on the two contracts to extend their period of performance into 2008. Only recently has it taken steps to identify additional providers. OCB's practices for soliciting, evaluating, and selecting its talent contractors provide limited visibility at key steps. OCB issues quarterly announcements in Federal Business Opportunities, advertises annually in a local newspaper, and posts announcements at OCB's headquarters. OCB does not require, however, that managers document instances in which resumes were received from sources outside these processes, such as when a contractor is recommended by an OCB employee. Further, OCB does not document why other potential providers were not selected as required by IBB's guidance, in part due to questions about how to meet this requirement. 
Lastly, OCB managers use an IBB handbook to justify how much it pays for talent services, but the usefulness of the handbook's pricing guidance may be limited as the recommended rates are not current or based on the local market. | 7,313 | 650 |
Information technology should enable government to better serve the American people. However, according to OMB, despite spending more than $600 billion on IT over the past decade, the federal government has achieved little of the productivity improvements that private industry has realized from IT. Too often, federal IT projects run over budget, behind schedule, or fail to deliver promised functionality. In combating this problem, proper oversight is critical. Both OMB and federal agencies have key roles and responsibilities for overseeing IT investment management, and OMB is responsible for working with agencies to ensure investments are appropriately planned and justified. However, as we have described in numerous reports, although a variety of best practice documentation exists to guide their successful acquisition, federal IT projects too frequently incur cost overruns and schedule slippages while contributing little to mission-related outcomes. IT acquisition best practices have been developed by both industry and the federal government. For example, the Software Engineering Institute has developed highly regarded and widely used guidance on best practices, such as requirements development and management, risk management, configuration management, validation and verification, and project monitoring and control. This guidance also describes disciplined project management practices that call for the development of project details, such as objectives, scope of work, schedules, costs, and requirements against which projects can be managed and executed. In the federal government, GAO's own research in IT management best practices led to the development of the Information Technology Investment Management Framework, which describes essential and complementary IT investment management disciplines, such as oversight of system development and acquisition management, and organizes them into a set of critical processes for successful investments.
This guidance further describes five progressive stages of maturity that an agency can achieve in its investment management capabilities, and was developed on the basis of our research into the IT investment management practices of leading private- and public-sector organizations. GAO has also identified opportunities to improve the role played by Chief Information Officers (CIO) in IT management. In noting that federal law provides CIOs with adequate authority to manage IT for their agencies, GAO also reported on limitations that impeded their ability to exercise this authority. Specifically, CIOs have not always had sufficient control over IT investments; more consistent implementation of CIOs' authority could enhance their effectiveness. Congress has also enacted legislation that reflects IT management best practices. For example, the Clinger-Cohen Act of 1996, which was informed by GAO best practice recommendations, requires federal agencies to focus more on the results they have achieved through IT investments, while concurrently improving their IT acquisition processes. Specifically, the act requires agency heads to implement a process to maximize the value of the agency's IT investments and assess, manage, and evaluate the risks of its IT acquisitions. Further, the act establishes CIOs to advise and assist agency heads in carrying out these responsibilities. The act also requires OMB to encourage agencies to develop and use best practices in IT acquisition. Additionally, the E-Government Act of 2002 established a CIO Council, which is led by the Federal CIO, to be the principal interagency forum for improving agency practices related to the development, acquisition, and management of information resources, including sharing best practices. Although these best practices and legislation can have a positive impact on major IT programs, we have previously testified that the federal government continues to invest in numerous failed and troubled projects. 
We stated that while OMB's and agencies' recent efforts had resulted in greater transparency and oversight of federal spending, continued leadership and attention was necessary to build on the progress that had been made. In an effort to end the recurring cycle of failed IT projects, this committee has introduced legislation to improve IT acquisition management. Among other things, this legislation would eliminate duplication and waste in IT acquisition, increase the authority of agency CIOs, and strengthen and streamline IT acquisition management practices. We have previously testified in support of this legislation. OMB plays a key role in helping federal agencies manage their investments by working with them to better plan, justify, and determine how much they need to spend on projects and how to manage approved projects. In June 2009, OMB established the IT Dashboard to improve the transparency into and oversight of agencies' IT investments. According to OMB officials, agency CIOs are required to update each major investment in the IT Dashboard with a rating based on the CIO's evaluation of certain aspects of the investment, such as risk management, requirements management, contractor oversight, and human capital. According to OMB, these data are intended to provide a near real-time perspective of the performance of these investments, as well as a historical perspective. Further, the public display of these data is intended to allow OMB, congressional and other oversight bodies, and the general public to hold government agencies accountable for results and progress. In January 2010, the Federal CIO began leading TechStat sessions--reviews of selected IT investments between OMB and agency leadership--to increase accountability and transparency and improve performance.
OMB has identified factors that may result in an investment being selected for a TechStat session, such as--but not limited to--evidence of (1) poor performance; (2) duplication with other systems or projects; (3) unmitigated risks; and (4) misalignment with policies and best practices. OMB reported that as of April 2013, 79 TechStat sessions had been held with federal agencies. According to OMB, these sessions enabled the government to improve or terminate IT investments that were experiencing performance problems. For example, in June 2010 the Federal CIO led a TechStat on the National Archives and Records Administration's (NARA) Electronic Records Archives investment that resulted in six corrective actions, including halting fiscal year 2012 development funding pending the completion of a strategic plan. Similarly, in January 2011, we reported that NARA had not been positioned to identify potential cost and schedule problems early, and had not been able to take timely actions to correct problems, delays, and cost increases on this system acquisition program. Moreover, we estimated that the program would likely overrun costs by between $205 and $405 million if the agency completed the program as originally designed. We made multiple recommendations to the Archivist of the United States, including establishing a comprehensive plan for all remaining work, improving the accuracy of key performance reports, and engaging executive leadership in correcting negative performance trends. Drawing on the visibility into federal IT investments provided by the IT Dashboard and TechStat sessions, in December 2010, OMB issued a plan to reform IT management throughout the federal government over an 18-month time frame. Among other things, the plan noted the goal of turning around or terminating at least one-third of underperforming projects by June 2012.
The plan contained two high-level objectives: achieving operational efficiency and effectively managing large-scale IT programs. To achieve operational efficiencies, the plan outlined actions required to adopt cloud solutions and leverage shared services. To effectively manage IT acquisitions, the plan identified key actions, such as improving accountability and governance and aligning acquisition processes with the technology cycle. Our April 2012 report on the federal government's progress on implementing the plan found that not all action items had been completed. These findings are discussed in greater detail in the next section. We have previously reported that OMB has taken significant steps to enhance the oversight, transparency, and accountability of federal IT investments by creating its IT Dashboard, by improving the accuracy of investment ratings, and by creating a plan to reform federal IT. However, we also found issues with the accuracy and reliability of cost and schedule data, and recommended steps that OMB should take to improve these data. In July 2010, we reported that the cost and schedule ratings on OMB's Dashboard were not always accurate for the investments we reviewed, because these ratings did not take into consideration current performance. As a result, the ratings were based on outdated information. We recommended that OMB report on its planned changes to the Dashboard to improve the accuracy of performance information and provide guidance to agencies to standardize milestone reporting. OMB agreed with our recommendations and, as a result, updated the Dashboard's cost and schedule calculations to include both ongoing and completed activities. Similarly, our report in March 2011 noted that OMB had initiated several efforts to increase the Dashboard's value as an oversight tool and had used its data to improve federal IT management.
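The cost and schedule ratings discussed above rest on earned value management arithmetic: comparing the budgeted value of work performed against both its actual cost and the value of work planned to date. The sketch below illustrates that arithmetic; the 10 and 30 percent rating thresholds and the red/yellow/green mapping are illustrative assumptions for the example, not OMB's published Dashboard formula.

```python
# Earned-value-style cost and schedule variance, as used in IT
# investment oversight. Negative variances indicate overrun/slippage.

def variances(planned_value, earned_value, actual_cost):
    """Return (cost variance, schedule variance) in dollars."""
    cv = earned_value - actual_cost    # CV = EV - AC
    sv = earned_value - planned_value  # SV = EV - PV
    return cv, sv

def rating(cv, sv, budget):
    """Map the worse variance, as a fraction of total budget, to a
    coarse rating. Thresholds here are assumed for illustration."""
    worst = min(cv, sv) / budget
    if worst >= -0.10:
        return "green"
    if worst >= -0.30:
        return "yellow"
    return "red"

# Example: $3.5M of work earned against $4.0M planned and $4.2M spent.
cv, sv = variances(planned_value=4_000_000, earned_value=3_500_000,
                   actual_cost=4_200_000)
print(cv, sv)                          # -700000 -500000
print(rating(cv, sv, budget=5_000_000))  # yellow
```

The point of the example is the direction of the comparison: earned value is the common yardstick, so a project can be on budget yet behind schedule (or vice versa), which is why the Dashboard tracks the two variances separately.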
However, we also reported that agency practices and the Dashboard's calculations contributed to inaccuracies in the reported investment performance data. For instance, we found missing data submissions or erroneous data at each of the five agencies we reviewed, along with instances of inconsistent program baselines and unreliable source data. As a result, we recommended that the agencies take steps to improve the accuracy and reliability of their Dashboard information, and that OMB improve how it rates investments relative to current performance and schedule variance. Most agencies generally concurred with our recommendations and three have taken steps to address them. OMB agreed with our recommendation for improving ratings for schedule variance. It disagreed with our recommendation to improve how it reflects current performance in cost and schedule ratings, but more recently made changes to Dashboard calculations to address this while also noting challenges in comprehensively evaluating cost and schedule data for these investments. Subsequently, in November 2011, we noted that the accuracy of investment cost and schedule ratings had improved since our July 2010 report because OMB refined the Dashboard's cost and schedule calculations. Most of the ratings for the eight investments we reviewed as part of our November 2011 report were accurate, although we noted that more could be done to inform oversight and decision making by emphasizing recent performance in the ratings. We recommended that the General Services Administration comply with OMB's guidance for updating its ratings when new information becomes available (including when investments are rebaselined). The agency concurred and has since taken actions to address this recommendation. Since we previously recommended that OMB improve how it rates investments, we did not make any further recommendations. 
Further, in April 2012, we reported that OMB and key federal agencies had made progress on implementing action items from its plan to reform IT management, but found that there were several areas where more remained to be done. Specifically, we reviewed 10 actions and found that 3 were complete, while 7 were incomplete. For example, we found that OMB had reformed and strengthened investment review boards, but had only partially issued guidance on modular development. Accordingly, we recommended, among other things, that OMB ensure that the action items called for in the plan be completed by the responsible parties prior to the completion of the plan's 18-month deadline of June 2012, or if the June 2012 deadline could not be met, by another clearly defined deadline. OMB agreed to complete the key action items. Finally, we reviewed OMB's efforts to help agencies address IT projects with cost overruns, schedule delays, and performance shortfalls in June 2013. In particular, we reported that OMB used CIO ratings from the Dashboard, among other sources, to select at-risk investments for reviews known as TechStats. OMB initiated these reviews in January 2010 to further improve investment performance, and subsequently incorporated the TechStat model into its plan for reforming IT management. We reported that OMB and selected agencies had held multiple TechStat sessions but additional OMB oversight was needed to ensure that these meetings were having the appropriate impact on underperforming projects and that resulting cost savings were valid. Among other things, we recommended that OMB require agencies to address their highest-risk investments and to report on how they validated the outcomes. OMB generally agreed with our recommendations, and stated that it and the agencies were taking appropriate steps to address them.
Subsequent to the launch of the Dashboard and the TechStat reviews, and to help the federal agencies address the well-documented acquisition challenges they face, in 2011, we reported on nine common factors critical to the success of IT investment acquisitions. Specifically, we reported that department officials from seven departments each identified a successful investment acquisition, in that it best achieved their respective cost, schedule, scope, and performance goals. To identify these investments, we interviewed officials from the 10 departments with the largest planned IT budgets in order for each department to identify one mission-critical, major IT investment that best achieved its cost, schedule, scope, and performance goals. Of the 10 departments, 7 identified a successful IT investment, for a total of 7 investments. Officials responsible for the 7 investments cited a number of factors that contributed to their success. According to federal department officials, the following seven investments (shown in table 1) best achieved their respective cost, schedule, scope, and performance goals. The estimated total life-cycle cost of the seven investments is about $5 billion. Among these seven IT investments, officials identified nine factors as critical to the success of three or more of the seven investments. The factors most commonly identified include active engagement of stakeholders, program staff with the necessary knowledge and skills, and senior department and agency executive support for the program. These nine critical success factors are consistent with leading industry practices for IT acquisitions. Table 2 shows how many of the investments reported each of the nine factors; selected examples of how agencies implemented them are discussed below. A more detailed discussion of the investments' identification of success factors can be found in our 2011 report.
Officials from all seven selected investments cited active engagement with program stakeholders--individuals or groups (including, in some cases, end users) with an interest in the success of the acquisition--as a critical factor to the success of those investments. Agency officials stated that stakeholders, among other things, reviewed contractor proposals during the procurement process, regularly attended program management office-sponsored meetings, were working members of integrated project teams, and were notified of problems and concerns as soon as possible. In addition, officials from the two investments at the National Nuclear Security Administration and U.S. Customs and Border Protection noted that actively engaging with stakeholders created transparency and trust, and increased the support from the stakeholders. Additionally, officials from six of the seven selected investments indicated that the knowledge and skills of the program staff were critical to the success of the program. This included knowledge of acquisitions and procurement processes, monitoring of contracts, large-scale organizational transformation, Agile software development concepts, and areas of program management such as earned value management and technical monitoring. Finally, officials from five of the seven selected investments identified having the end users test and validate the system components prior to formal end user acceptance testing for deployment as critical to the success of their program. Similar to this factor, leading guidance recommends testing selected products and product components throughout the program life cycle. Testing of functionality by end users prior to acceptance demonstrates, earlier rather than later in the program life cycle, that the functionality will fulfill its intended use. If problems are found during this testing, programs are typically positioned to make changes that are less costly and disruptive than ones made later in the life cycle would be.
In summary, the expanded use of these critical IT acquisition success factors, in conjunction with industry and government best practices, should result in the more effective delivery of mission-critical systems. Further, these factors support OMB's objective of improving the management of large-scale IT acquisitions across the federal government, and wide dissemination of these factors could complement OMB's efforts. While OMB's and agencies' recent efforts have resulted in greater transparency and oversight of federal spending, continued leadership and attention are necessary to build on the progress that has been made. By improving the accuracy of information on the IT Dashboard, and holding additional TechStat reviews, management attention can be better focused on troubled projects and establishing clear action items to turn these projects around or terminate them. Further, legislation such as that proposed by this committee can play an important role in increasing the authority of agency CIOs and improving federal IT acquisition management practices. Overall, the implementation of our numerous recommendations regarding key aspects of IT acquisition management can help OMB and federal agencies continue to improve the efficiency and transparency with which IT investments are managed, in order to ensure that the federal government's substantial investment in IT is being wisely spent. Chairman Issa, Ranking Member Cummings, and Members of the Committee, this completes my prepared statement. I would be pleased to respond to any questions that you may have at this time. If you or your staffs have any questions about this testimony, please contact me at (202) 512-9286 or at [email protected]. Individuals who made key contributions to this testimony are Dave Hinchman (Assistant Director), Deborah Davis, Rebecca Eyler, Kaelin Kuhn, Thomas Murphy, Jamelyn Payan, and Jessica Waselkow. This is a work of the U.S. 
government and is not subject to copyright protection in the United States. The published product may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.

The federal government reportedly plans to spend at least $82 billion on IT in fiscal year 2014. Given the scale of such planned outlays and the criticality of many of these systems to the health, economy, and security of the nation, it is important that federal agencies successfully acquire these systems--that is, ensure that the systems are acquired on time and within budget and that they deliver the expected benefits and functionality. However, GAO has previously reported and testified that federal IT projects too frequently incur cost overruns and schedule slippages while contributing little to mission-related outcomes. To help improve these efforts, OMB has launched several initiatives intended to improve the oversight and management of IT acquisitions. In addition, during the past several years GAO has issued multiple reports and testimonies on federal initiatives to acquire and improve the management of IT investments. As discussed with committee staff, GAO is testifying today on IT best practices, with a focus on the results of its report issued on the critical success factors of major IT acquisitions. To prepare this statement, GAO drew on previously published work. Information technology (IT) acquisition best practices have been developed by both industry and the federal government. For example, the Software Engineering Institute has developed highly regarded and widely used guidance on best practices, such as requirements development and management, risk management, validation and verification, and project monitoring and control. 
GAO's own research in IT management best practices led to the development of the Information Technology Investment Management Framework, which describes essential and complementary IT investment management disciplines, such as oversight of system development and acquisition management, and organizes them into a set of critical processes for successful investments. GAO also recently reported on the critical factors underlying successful IT acquisitions. Officials from federal agencies identified seven investments that were deemed successfully acquired in that they best achieved their respective cost, schedule, scope, and performance goals. Agency officials identified nine common factors that were critical to the success of three or more of the seven investments. Officials from all seven investments cited active engagement with program stakeholders as a critical factor to the success of those investments. Agency officials stated that stakeholders regularly attended program management office sponsored meetings; were working members of integrated project teams; and were notified of problems and concerns as soon as possible. Additionally, officials from six investments indicated that knowledge and skills of the program staff, and support from senior department and agency executives were critical to the success of their programs. Further, officials from five of the seven selected investments identified having the end users test and validate the system components prior to formal acceptance testing for deployment as critical to the success of their program. These critical factors support the Office of Management and Budget's (OMB) objective of improving the management of large-scale IT acquisitions across the federal government; wide dissemination of these factors could complement OMB's efforts. GAO has made numerous recommendations to OMB and agencies on key aspects of IT acquisition management, as well as the oversight and management of those investments.
Laws enacted since 2006 have directed CMS to collect performance information on providers and eventually reward quality and efficiency of care rather than reimburse for volume of services alone. The Tax Relief and Health Care Act of 2006 required the establishment of the Physician Quality Reporting System (PQRS) to encourage physicians to successfully report data needed for certain quality measures. PQRS applies payment adjustments to promote reporting by eligible Medicare professionals (EPs)--including physicians, nurses, physical therapists, and others. In 2013, EPs could report data to PQRS using claims, electronic health records (EHR), or a qualified registry, or opt for CMS to calculate quality measures using administrative claims data. Under its group practice reporting option, CMS allows EPs to report to PQRS as a group, either through a registry or a web-based interface. The Medicare Improvements for Patients and Providers Act of 2008 established the Physician Feedback Program, under which CMS was required, beginning in 2009, to distribute confidential feedback reports, known as Quality and Resource Use Reports (QRUR), to show physicians their performance on quality and cost measures. The Patient Protection and Affordable Care Act required HHS to coordinate the Physician Feedback Program with a Value Modifier (VM) that will adjust fee-for-service (FFS) physician payments for the relative quality and cost of care provided to beneficiaries. In implementing the VM, CMS's Center for Medicare intends to use PQRS data and cost data from groups of EPs defined at the taxpayer identification number level to calculate the VM and then report the payment adjustments in the QRURs. As required in the act, CMS plans to apply the VM first to select physicians in 2015 and to all physicians in 2017. As required by law, CMS implemented a performance feedback program for Medicare physicians, which serves as the basis for eventual payment adjustments. (See fig. 1.) 
In our December 2012 report on physician payment incentives in the VM program, we found that CMS had yet to develop a method of reliably measuring the performance of physicians in small practices, that CMS planned to reward high performers and penalize poor performers using absolute performance benchmarks, and that CMS intended to annually adjust payments 1 year after the performance measurement period ends. We recommended that CMS develop a strategy to reliably measure the performance of small physician practices, develop benchmarks that reward physicians for improvement as well as for meeting absolute performance benchmarks, and make the VM adjustments more timely, to better reflect recent physician performance. CMS agreed with our recommendations, but noted that it was too early to fully implement these changes. Private entities we reviewed provided feedback mostly to groups of primary care physicians practicing within newer delivery models. Each entity decided which measures to report and which performance benchmarks to use, leading to differences in report content across entities. Largely relying on claims data, health insurers spent from 4 to 6 months to produce the annual reports. To meet the information needs of physicians, they all provided feedback throughout the year. The entities also generally offered additional report detail and other resources to help physicians improve their performance. The private entities in our review had discretion in determining the number and type of physicians to be included in their performance reporting initiatives, and their feedback programs generally included only physician groups participating in newer delivery models--medical homes and ACOs--with which they contract. Within this set of providers, the entities used various approaches to further narrow the physician groups selected to receive performance feedback. 
For example, one entity told us that only physician groups accredited by a national organization focused on quality were eligible for participation in its medical home program, which included physician feedback reports. Private entities' feedback programs were generally directed toward primary care physician practices. One entity defined primary care as family medicine, internal medicine, geriatrics, and pediatrics; and included data on the services furnished by nurse practitioners and physician assistants in its medical group reports. The entities indicated that they rarely provided reports directly to specialty care physician groups. Among those that did, the programs typically focused on practice areas considered significant cost drivers--obstetrics/gynecology, cardiology, and orthopedics. Entities further limited their physician feedback programs to groups participating in medical homes with a sufficient number of attributed enrollees to ensure the reliability of the reported measures. In medical home models, enrollees are attributed to a physician (or physicians) responsible for their care, who is held accountable for the quality and cost of care, regardless of by whom or where the services are provided. Among those entities we spoke with, the minimum enrollment size for feedback reporting varied widely, with most requiring a minimum of between 200 and 1,000 attributed enrollees to participate in the program. For example, one entity had two levels of reporting in its medical home program, differentiated by the number of attributed enrollees. In one medical home model, the entity required more than 2,000 attributed enrollees for participation and rewarded the practices through shared savings. In a second medical home model, the entity included practices with fewer than 1,000 attributed enrollees, but these practices did not share in any savings. 
According to the entities in our study, small physician practices (including solo practitioners) typically received performance reports for quality improvement purposes only. Because smaller practices may not meet minimum enrollment requirements needed for valid measurement, private entities generally did not link their performance results to payment or use them for other purposes. For example, one entity provided feedback to practices of one to three primary care physicians upon request, but did not publicly report these practices' data on its website. To increase the volume of patient data needed for reliable reporting, some entities pooled data from several small groups and solo practitioners and issued aggregate reports for those small practices. Most of the entities that used this method said they applied their discretion in forming these "virtual" provider groups; however, another entity commented that allowing small practices to voluntarily form such groups for measurement purposes would be advantageous. Because each private entity in our study determined the number and type of measures on which it evaluated physician performance, the measures used in each feedback program differed. Each entity decided on quality measures to include, and many also identified utilization or cost measures for inclusion. For example, one entity allowed its ACOs to choose 8 to 10 measures from among a set of about 18 measures. To assess physicians' quality and utilization/cost results, the entities used absolute or relative performance benchmarks. Private entities generally reported on physician quality using many more process of care measures than outcomes of care measures. Entities in our review commonly included indicators of clinical care in areas such as diabetes care, cardiovascular health, and prevention or screening services for both their adult and pediatric patients. 
The most common measure reported by all entities was breast cancer screening, followed by hemoglobin A1C measures, a service used to monitor diabetes. We found wide variation in the number and type of measures in private entities' quality measure sets. The total number of quality measures used in the feedback reports ranged from 14 to 51. Measures typically fell into one of several measurement areas, each with as few as one or as many as 20 individual measures. For example, in the quality measurement areas for pulmonary and respiratory conditions, one private entity reported on a single measure (appropriate use of medications for asthma), while another reported three measures (appropriate use of medications for asthma, appropriate testing for pharyngitis, and avoidance of antibiotic treatment for adults with acute bronchitis). Although primarily focused on clinical quality measures, entities also included nonclinical measures, such as patient safety and patient satisfaction. (See app. II for more information on the number and types of quality measures included in sample reports provided by the entities we reviewed.) Even when entities appeared to report on similar types of measures in common areas, we found considerable variability in each measure's definition and specification. For this reason, results shown in physician feedback reports may not be comparable across entities. As shown in figure 2, the diabetes hemoglobin A1C measure was defined and used in different ways in our selected entities' reports. In some cases, entities calculated the percentage of enrollees with diabetes within a certain age range that received the test. In other cases, the entities calculated the percentage of enrollees with diabetes within a certain age range that had either good or poor control of the condition, as determined from a specified hemoglobin A1C result. 
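As an illustration of how the definitional differences shown in figure 2 play out in practice, the following sketch computes two common variants of the diabetes hemoglobin A1C measure from a simplified enrollee panel. This is hypothetical Python, not drawn from any entity's actual specification: the `Enrollee` fields, the default age band, and the convention of counting untested enrollees as poorly controlled are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Enrollee:
    """Minimal claims-derived record for one enrollee (illustrative fields)."""
    age: int
    has_diabetes: bool
    had_a1c_test: bool
    last_a1c_result: Optional[float]  # None if never tested

def a1c_screening_rate(enrollees, min_age=18, max_age=75):
    """Share of diabetic enrollees in the age band who received an A1C test."""
    denom = [e for e in enrollees if e.has_diabetes and min_age <= e.age <= max_age]
    if not denom:
        return None
    return sum(e.had_a1c_test for e in denom) / len(denom)

def a1c_poor_control_rate(enrollees, threshold=9.0, min_age=18, max_age=75):
    """Share of diabetic enrollees whose most recent A1C exceeds the threshold.
    Untested enrollees are counted as poorly controlled (an assumed convention)."""
    denom = [e for e in enrollees if e.has_diabetes and min_age <= e.age <= max_age]
    if not denom:
        return None
    poor = sum(1 for e in denom
               if e.last_a1c_result is None or e.last_a1c_result > threshold)
    return poor / len(denom)
```

Two entities both reporting "the diabetes A1C measure" could thus produce different, non-comparable rates for the same panel, depending on whether they measure screening or control and on which age band they apply.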
In addition, some entities defined their diabetic patient population as enrollees from 18 to 75 years of age, while another did not indicate the age range, and one entity set the age range from 18 to 64 years of age. Some, but not all, private entities in our review included utilization or cost measures in their performance reports to physicians. Total cost of care per enrollee was the most commonly used measure, but cost measures disaggregated by type of service--facility, pharmacy, primary care physician, and specialty--were also used. Some entities described how they limited their reporting of a total cost of care measure to those medical groups with a large number of enrollees. In one case the minimum enrollment size was 20,000 enrollees and in another it was 2,500 enrollees. Officials from one entity also told us that they allowed smaller physician practices to combine their data in order to meet the required number of enrollees for receiving feedback on cost of care. In addition to feedback on the total cost of care per enrollee, some reports given to groups of primary care physicians contained information on the cost of care provided by specialists in the entity's network. For example, one entity provided trend data that included the number of specialist visits (total and by type) and the number of patients with one or more visits for these specialty areas. (See fig. 3.) For the two specialties with the most enrollee visits during the measurement period--orthopedic surgery and dermatology--the entity also provided the medical group with data on which specialists were seen most frequently and their cost per visit. This information was intended to encourage cost-efficient referrals. Another entity said that, as of July 2013, it was focusing on a program to provide feedback to primary care physicians on cardiologists' performance, showing where care was being delivered most efficiently. 
By providing such information, the entity expected primary care physicians to take cost differences into account when making referrals, rather than basing referrals solely on historical habits. Disseminating information to primary care physicians about the relative cost of specialty care providers is a key aspect of medical home and ACO programs. The entities were fairly consistent in the number and types of utilization measures they selected for feedback reporting. The most common utilization measures reported by our private entities were physicians' generic drug prescribing rates, followed by emergency department visits, inpatient visits, hospital readmissions, and specialist visits. One entity provided additional detail under the emergency department visits measure to show the number of patients that repeatedly seek care at emergency departments. Officials from the entity told us that this measure was included to alert physicians of potentially avoidable hospital visits so that they can encourage patients to use office-based care before seeking care in more costly settings. (See examples of this measure as presented by private entities in their sample reports in fig. 4.) To evaluate physician performance, the selected private entities compared the measures data to different types of benchmarks. Some entities compared each physician group's performance results to that of a peer group (e.g., others in the entity's network or others in the collaborative's state or region); some entities compared physician groups' results to a pre-established target; and others gauged physician groups' progress relative to their past performance. (See fig. 5.) Entities generally used two or three such benchmarks in their feedback reports. For example, one entity separately displayed results for the medical home's commercially insured, Medicare insured, and composite patient population. 
Within each of these population groups, it compared the practice's performance to the average for nonmedical home practices, as well as to the practice's performance in the prior measurement year. The entity also gave narrative detail to indicate favorable or unfavorable performance. The most common benchmark for the entities in our study was a physician group's performance relative to the previous measurement period. However, some entities used this benchmark only for utilization/cost measures and not for quality measures. Private entity officials told us they relied on claims as their primary data source for performance reporting. However, several private entities noted shortcomings in relying solely on claims data--the billing codes that describe a patient's diagnoses, procedures, and medications--for performance reporting. Some entities supplemented their claims data by obtaining information from EHRs, patient satisfaction surveys, or chart extractions. Entities noted that using EHR data was resource-intensive for both providers and payers, because they depended on physician groups to submit the information. The entities we spoke to have had limited success in using EHR data as a primary data source, although many saw it as complementary to claims data. Another entity supplemented its claims data with data from registries that compile information from administrative data sets, patient medical records, and patient surveys, and thus have the capacity to track trends in quality over time. The health insurers in our review typically spent from 4 to 6 months to produce and distribute annual performance reports; in contrast, the health care collaboratives spent 9 to 10 months. (See illustrations of these timelines in fig. 6.) 
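The multi-benchmark comparisons described earlier--peer-group average, pre-established target, and prior-period performance--can be sketched as follows. This is a hypothetical simplification: the function name, its "favorable"/"unfavorable" labels, and its handling of ties are illustrative assumptions, not any entity's actual methodology.

```python
def evaluate_measure(group_rate, peer_rates, target, prior_rate, higher_is_better=True):
    """Compare one group's measure result against three benchmark types:
    the peer average, a pre-established target, and the prior period's result.
    Returns a mapping of benchmark name -> 'favorable' or 'unfavorable'."""
    def label(value, reference):
        better = value >= reference if higher_is_better else value <= reference
        return "favorable" if better else "unfavorable"

    peer_avg = sum(peer_rates) / len(peer_rates)
    return {
        "vs_peer_average": label(group_rate, peer_avg),
        "vs_target": label(group_rate, target),
        "vs_prior_period": label(group_rate, prior_rate),
    }
```

The `higher_is_better` flag reflects that quality rates (e.g., screening rates) and utilization/cost measures (e.g., emergency department visits) point in opposite directions, so a single group can be favorable on one benchmark and unfavorable on another, as the entities' reports showed.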
As is common in the health insurance industry, payers require a 3-month interval after the performance period ends--referred to as the claims run-out--to allow claims for the services furnished late in the measurement period to be submitted and adjudicated for the report. The claims run-out was followed by 1 to 3 months to prepare the data, a period that allowed for provider attribution, risk-adjustment, measure calculation, and quality assurance. One collaborative stated that the quality assurance process is helpful in increasing physician trust because the group is able to compare its own data with the collaborative's data before results are final. The statewide health care collaboratives we spoke with required additional time to collect and aggregate data from multiple health insurers, and their final reports were issued at least 9 months after the end of the performance period. The time needed for some or all of these report production steps varied depending on the entity and the types of measures included. Collaboratives often used all-payer claims databases--centralized data collections to which each payer submits claims data on that state's health care providers--for aggregate reporting to providers. Officials from entities told us that all-payer claims databases are helpful because they provide physicians with a better picture of their entire patient panel, not just results determined by individual payers for limited sets of patients. One entity noted that it aggregates its quality data with other payers in its commercial market through a statewide organization, and no one payer can provide statistically meaningful data to a physician group on its own. Officials from one entity with all-payer claims database experience told us that the addition of Medicare data into these databases would improve the information available for measurement and feedback. 
In addition, one entity suggested that a multipayer database could help with feedback to physicians in groups of all sizes, including small practices, because the higher number of patients would generate sufficient data for calculating reliable measures. However, one entity acknowledged that using all-payer databases requires more time for merging data from different payers in different formats, and another entity noted the challenges of customizing reports for each medical group's patient population. Private entities told us that physicians valued frequent feedback on their performance so that they have time to make practice changes that may result in better performance by the end of the measurement period. In response, these entities typically provided feedback reports on an interim basis throughout the measurement period. Interim reports typically covered a 1-year performance period, and were commonly issued on a rolling monthly, quarterly, or semiannual schedule. Entities also noted that frequent reporting throughout the period updated physicians on their performance so that year-end results were better anticipated and understood. Some entities in our study elected to issue interim reports that build up to the 12-month performance period by continually adding data from month to month. Those that used preliminary data that may not account for all final claims in building reports told us that such data starts to become useful about 3 to 6 months into the performance year. They also stated that, although the interim reports may be limited by the use of rolling or incomplete data, providers generally seek this information for early identification of gaps in care. Private entities generally offered additional report detail intended to enhance physicians' understanding of the information contained in their reports or in response to physician requests for more data. 
Private entity officials told us that, because physicians prefer dynamic reports with as much detail as possible, they generally sent reports that can be expanded to show individual physician or patient-level data. Some entities formatted their reports to include summary-level information on quality and cost measures in labeled sections, with supplemental information following the summary data. Other entities provided additional reports or supplemental data through a web portal that allowed providers to see individual physician or patient-level detail. Private entities sent reports in multiple file formats, such as in a spreadsheet, some of which allowed report recipients to sort their data. Entities in our study also offered resources designed to assist physician groups with actionable steps they can take to improve in the next performance period. Most entities told us they offered resources to physician groups, such as consultations with quality improvement professionals, forums for information-sharing, and documents on best practices. For example, one entity's staff worked directly with practices to improve their results by distributing improvement guidelines for each performance measure included in the feedback report. In addition, the entity's officials told us they also convened workgroups to review trend information and paid particular attention to differences between medical homes and nonmedical homes. CMS has provided feedback to increasing numbers of physician practices each year in order to eventually reach all physicians. Each medical group's chosen method of quality data submission determined the quality measures included in its report, to which CMS added health care costs and certain outcomes measures. CMS's report generation process took slightly longer than that of most private entities in our study, and the agency did not provide interim performance data during the measurement period. 
CMS feedback reports have included information to assist providers in interpreting their performance results. Unlike the private entities we contacted, which selected a limited set of physicians to receive feedback reports, CMS is mandated to apply the VM to all physicians by 2017. Therefore, the agency faces certain challenges not faced by private entities as it has expanded its feedback program to reach increasing numbers of physicians. In preparation for implementation of the VM, CMS provided performance reports to nearly 4,000 medical groups in September 2013. In 2014, CMS plans to disseminate reports to physicians in practices of all sizes. As of September 2013, CMS had not yet determined how to report to smaller groups and physicians in solo practices. According to CMS, the decision not to present VM information to smaller groups stemmed from concerns regarding untested cost metrics and administrative complexity. CMS agreed with our December 2012 recommendation to develop a strategy to reliably measure the performance of solo and small physician practices, but has not yet finalized such a strategy. Under the CMS approach to performance reporting, the content of feedback reports related to quality measures may vary across providers. Unlike our selected private entities, the agency has allowed physician groups to select the method by which they will submit quality-of-care data, which, in turn, determines the measures on which they receive feedback. CMS used claims data for a consistent set of measures in all of its feedback reports for performance on cost and outcomes. For the CMS 2013 reports, medical groups submitted data on quality measures to CMS via a web interface or through a qualified registry; if a group did not select either of these options, the agency calculated quality measures based on claims data. Both CMS and private entities focused on preventive care and management of specific diseases.

Web interface. Quality measures under this method pertain to care coordination, disease management, and preventive services. CMS required groups reporting via the web interface to submit data on 17 quality measures--such as hemoglobin A1C levels for control of diabetes--for a patient sample of at least 218 beneficiaries.

Registries. Some groups submitted data for quality measures via qualified registries--independent organizations, typically serving a particular medical specialty, that collect and report these data to CMS. CMS required groups reporting to a qualified registry to submit at least three measures--such as whether cardiac rehabilitation patients were referred to a prevention program--for at least 80 percent of patients.

Administrative claims. As a default, if a group did not report via web interface or qualified registry, CMS calculated quality measures using claims data. In September 2013, the majority of groups with 25 or more EPs--nearly 90 percent--received quality scores based on claims data. CMS calculated performance on a set of 17 quality indicators, including several composite measures. For example, the diabetes composite measure included several different measures of diabetes control.

Regardless of the method a group selected to submit quality-of-care data, CMS used claims to calculate three outcomes measures--two ambulatory care composite measures and hospital readmission. One ambulatory care composite included hospitalization rates for three acute conditions: bacterial pneumonia, urinary tract infections, and dehydration. Another composite included hospitalization rates for three chronic conditions: diabetes, chronic obstructive pulmonary disease (COPD), and heart failure. CMS included cost measures--several of which differed from the measures private entities in our study reported to physicians--in all 2013 feedback reports (see fig. 7). 
Using claims data, CMS calculated an overall measure of the cost of care as the total per capita costs for all beneficiaries attributed to each physician group. CMS separately reported total per capita costs for attributed beneficiaries with any of four chronic conditions: diabetes, heart failure, COPD, or coronary artery disease. This contrasts with the private entities, which typically reported a more limited set of measures focused on physicians' generic drug prescribing rates and hospital utilization. CMS's report generation process took longer than that of most private entities in our study because it required more steps. While most health insurers generated performance reports in 4 to 6 months, CMS issued reports about 9 months after the end of the January to December 2012 reporting period. To produce its 2013 physician feedback reports using administrative claims, CMS began with the standard claims run-out period followed by intervals for provider attribution, measure calculation, risk-adjustment, and quality assurance. (See fig. 10.) CMS officials said they allowed a 3-month run-out interval to account for providers' late-year claims submissions. After the run-out period, CMS required 5 to 6 months for a series of additional tasks needed to prepare the data for reporting. For groups that submitted data to CMS via the web interface or registry options, CMS gave these groups 3 months to submit such data after the end of the 12-month performance period. CMS then calculated the measures for these options over the next several months. Although FFS beneficiaries see multiple physicians, CMS attributed each beneficiary to a single medical group through its yearly attribution process. It used the claims for the 12-month reporting period to determine which groups provided the beneficiary the most primary care and then assigned responsibility for performance on quality and cost measures to that group. 
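The plurality attribution step just described can be sketched as follows. This is a deliberate simplification, not CMS's actual algorithm (which involves additional steps, such as distinguishing services furnished by primary care physicians from those furnished by other providers), and the data shapes are hypothetical.

```python
from collections import defaultdict

def attribute_beneficiaries(claims):
    """Assign each beneficiary to the single medical group that billed the most
    primary care allowed charges during the 12-month reporting period.
    `claims` is an iterable of (beneficiary_id, group_tin, allowed_charges)
    tuples for primary care services only -- a simplified stand-in for claims data."""
    totals = defaultdict(lambda: defaultdict(float))
    for bene_id, group_tin, charges in claims:
        totals[bene_id][group_tin] += charges
    # Plurality rule: the group with the highest total wins the beneficiary,
    # and becomes accountable for that beneficiary's quality and cost measures.
    return {bene_id: max(by_group, key=by_group.get)
            for bene_id, by_group in totals.items()}
```

Because each beneficiary is assigned to exactly one taxpayer identification number, a group's measured per capita costs include services its attributed beneficiaries received elsewhere, which is why the subsequent risk-adjustment and payment-standardization steps matter.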
Following attribution, the agency risk-adjusted the cost measures to account for differences in beneficiary characteristics and complexity, and standardized the cost measures by removing all geographic payment adjustments. Finally, CMS officials said they performed data checks to ensure accuracy before the reports were disseminated. According to health insurers and collaboratives, physicians find that frequent feedback enables them to improve their performance more quickly; CMS, however, did not provide physicians interim performance feedback. With only annual feedback from CMS, physicians may be missing an opportunity to improve their performance on a more frequent basis. Asked if more frequent reporting was considered, CMS officials cited concerns about the time it would take to generate each set of reports. With each round, the agency would need to attribute all beneficiaries to a medical group, risk-adjust and standardize the cost measures, and compute the benchmarks for each measure. In addition, providing interim reports on quality data would require certain providers to report more frequently. For example, providers who submit via registry would need to finalize their data more often than annually. However, experts and CMS officials have stated that, with continued adoption of advanced data reporting technology, CMS may be able to generate reports more frequently. CMS provided general information on its website and through the Medicare Learning Network to assist providers in understanding the performance feedback and VM. Unlike private entities, CMS has not provided tailored guidance or action steps to help providers improve their scores. CMS resources did, however, include steps to access reports, a review of methodology, suggested ways to use the data in reports, and contact information for technical support. A representative acting on behalf of a medical group could access the group's QRUR.
In addition, CMS's web-based reports allowed providers to access further detail on the Medicare beneficiaries attributed to the group. For example, physicians could view their patients' percentage of total cost by type of service and hospital admission data. CMS included explanatory information within the reports for providers. In addition to comparative performance data, reports made available in September 2013 included a description of the attribution methods, the number of providers billing in each medical group, information about each attributed patient's hospitalizations during the year, and other details about the group's performance. In addition, CMS included within the QRUR a glossary of terms used in the feedback report. Payers have been refining their performance reports for physicians, a key component of their VBP initiatives. Private entities have selectively rolled out their feedback programs, generally applying them to relatively large groups of primary care physicians participating in medical homes and ACOs. Although they are not uniform in their approaches, the entities in our study used their discretion to select a limited number of quality and utilization/cost measures, calculated them using claims data, and used them to assess performance against a variety of benchmarks. In response to physicians' needs, their feedback reports tended to be frequent, timely, and dynamic. CMS's approach to performance reporting faces some unique challenges. First, it is driven by the statutory requirement that, by 2017, Medicare pay FFS physicians in groups of all sizes, including specialists, using a VM. Second, the agency has had to develop the feedback program in the context of pre-existing incentive programs, such as PQRS. CMS finalized several key changes to the feedback program for future reporting periods, as it expands the application of the VM to all physicians. 
Specifically, CMS continues to modify program components such as measures and reporting mechanisms as it works to align the reporting and feedback aspects of multiple programs. Despite these program modifications, we found that certain features of private entities' feedback programs, which are lacking in CMS's program, could enhance the usefulness of the reports in improving the value of physician care. CMS's use of a single nationwide benchmark to compare performance on quality and cost ignores richer benchmarking feedback that could benefit physicians. Private entities in our study measured provider performance against several benchmarks. CMS's reliance on a national average as the sole benchmark precludes providers from gauging their performance relative to their peers in the same geographic area. Without such contextual information, providers lack the feedback to better manage their performance and target improvement efforts. Additionally, CMS disseminates feedback reports only once a year (for example, September 2013). This gives physicians little time (October through December) to analyze the information and make changes in their practices to score better in the next measurement period. The private entities we reviewed sent reports more than once a year, and reported that greater frequency of reporting enabled more frequent improvements. Without interim performance reports, providers may not be able to make needed changes to their performance in advance of their annual VM payment modifications. Our findings also support past GAO recommendations that CMS reward physicians for improvement as well as performance against absolute benchmarks, and develop a strategy to reliably measure solo and small practices, such as by aggregating data. 
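The difference between single- and multi-benchmark feedback can be shown with a small sketch. The function name and sample figures are hypothetical, chosen only to illustrate how an additional regional benchmark adds context that a national average alone cannot.

```python
def benchmark_feedback(group_score, national_avg, regional_avg=None):
    """Report a group's percentage-point difference from each
    available benchmark. Hypothetical sketch: with only the national
    average, the group below looks above average; a regional
    benchmark reveals it trails its local peers."""
    feedback = {"vs_national": round(group_score - national_avg, 1)}
    if regional_avg is not None:
        feedback["vs_regional"] = round(group_score - regional_avg, 1)
    return feedback

# Hypothetical quality scores (percent of patients meeting a measure):
print(benchmark_feedback(82.0, national_avg=78.5))
print(benchmark_feedback(82.0, national_avg=78.5, regional_avg=84.0))
```

In this illustration the group is 3.5 points above the national average but 2 points below its regional peers, the kind of local context the report argues CMS's single national benchmark leaves out.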
As CMS implements and refines its physician feedback and VM programs, the Administrator of CMS should consider taking the following two actions to help ensure physicians can best use the feedback to improve their performance: develop additional performance benchmarks, such as state or regional averages, against which physicians' performance can be compared; and disseminate performance reports more frequently than the current annual distribution--for example, semiannually. We provided a draft of this report to HHS for comment. In its written response, reproduced in appendix III, the department generally agreed with our recommendations and reiterated our observation that the agency faces unique challenges with its mandate to report to Medicare FFS providers in groups of all sizes that encompass all specialty care areas. HHS conditionally agreed with our recommendation that reporting physician performance using multiple benchmarks would be beneficial, but asked for further information on private entities' practices and their potential use for Medicare providers. As we stated in the report, private entities generally use two or three different types of benchmarks to provide a variety of performance assessments. We found alternative benchmarks that could enhance Medicare feedback reporting by allowing physicians to track their performance in their own historical and geographic context. For example, some entities' reports included physician group performance on certain measures relative to past performance, a recommendation we previously made to HHS in December 2012. Although it agreed to consider developing benchmarks for performance improvement, HHS has yet to do so. A comparison to past performance allows a medical group to see how much, if at all, it has improved, regardless of where it stands relative to its peers. In this way, CMS can motivate physicians to continuously improve their performance.
In addition, some entities in our review compared physician performance data to statewide or regional-level benchmarks. Because of the number of Medicare physicians, CMS has extensive performance data, which could enable more robust localized peer benchmarks than any individual health plan could generate. As we noted, such benchmarks reflect more local patterns of care that may be more relevant to physicians than comparisons to national averages alone. HHS further asserted that, because the physician feedback program's key purpose is to support the national VM program, it is appropriate to limit reporting to a single national benchmark. HHS expressed concern that displaying other benchmarks could be misleading and confusing for the purposes of the VM. However, CMS's reports provide a group's VM payment adjustment in a concise, one-page summary, as shown in figure 9. We do not believe that additional benchmark data, displayed separately, would detract from the information provided on the summary page, and could enhance the value of the reports for physicians. HHS agreed with our second recommendation to disseminate feedback reports more frequently than on an annual basis. As seen in the private entity practices of using rolling or preliminary data for interim reporting, disseminating reports more frequently can assist physicians in making improvements to their performance before CMS determines their VM payment adjustment. HHS commented that producing more frequent reports would first require modifying the PQRS data collection schedules. For example, groups of EPs that use the web interface and registry options currently are only required to submit data to CMS once a year. The registry option will eventually require groups to submit data to CMS on a quarterly or semiannual basis, and HHS noted that these requirements would have to be synchronized with the timing of data submission through the web interface and EHR options. 
The agency also provided technical comments, which we incorporated as appropriate. We are sending copies of this report to the appropriate congressional committees and the Administrator of CMS. The report also is available at no charge on GAO's website at http://www.gao.gov. If you or your staffs have any questions regarding this report, please contact me at (202) 512-7114 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made major contributions to this report are listed in appendix IV. This appendix contains information on the similarities and differences between private entities' and Medicare's performance reporting to hospitals. The private entities in our study provided feedback through a variety of value-based payment (VBP) initiatives, and several entities have made accountable care organizations the focus of their feedback programs. Payers' efforts to provide feedback to hospitals on their performance are centered on rewarding higher-quality and lower-cost providers of care. We followed the same methodology for comparing how private entities and the Centers for Medicare & Medicaid Services (CMS) conduct performance feedback reporting for hospitals as we did for examining physician-focused feedback programs. We interviewed representatives of the nine selected private entities about their feedback reporting to hospitals, if any, with regard to report recipients, data sources used, types of performance measures and benchmarks, frequency of reporting, and efforts to enhance the utility of performance reports. One statewide health care collaborative in our review was established through a partnership between the state medical society and hospital association, and only provides feedback reports to hospitals. We similarly requested sample feedback reports for hospitals.
We interviewed CMS officials and obtained CMS documentation on its hospital feedback reporting activities, and compared these to private entity efforts. We also reviewed a sample CMS hospital feedback report from July 2013. CMS's hospital VBP efforts over the past decade have evolved to provide performance feedback to a range of hospital types, with a focus on acute care hospitals. In 2003 the agency began with a quality incentive demonstration program designed to see whether financial incentives to hospitals were effective at improving the quality of inpatient care, and to publicly report that information. Since then, a number of laws have required CMS to conduct both feedback reporting and VBP programs for hospitals. These included the following: The Medicare Prescription Drug, Improvement, and Modernization Act of 2003 required the establishment of the Hospital Inpatient Quality Reporting Program, a pay-for-reporting initiative. The act also required CMS to make downward payment adjustments to hospitals that did not successfully report certain quality measures. That downward payment adjustment percentage was increased by the Deficit Reduction Act of 2005. The Patient Protection and Affordable Care Act established Medicare's Hospital VBP Program for inpatient care provided in acute care hospitals. Under this program, CMS withholds a percentage of all eligible hospitals' payments and distributes those funds to high-performing hospitals. In reviewing current feedback reporting practices, we found that private entities and CMS report to hospitals on similar performance measures and that entities' feedback generally contains publicly available data. Table 1 compares features of the hospital feedback produced by those private entities in our study that report to hospitals through a VBP initiative and CMS's hospital VBP program.
Table 2 summarizes the number of quality measures included in sample physician feedback reports we received from private entities in our study. These entities used their discretion to determine which measures to include in their reports. We analyzed the measures focused on quality of care and categorized them into common areas. In addition to the contact named above, individuals making key contributions to this report include Rosamond Katz, Assistant Director; Sandra George; Katherine Perry; and E. Jane Whipple. Electronic Health Record Programs: Participation Has Increased, but Action Needed to Achieve Goals, Including Improved Quality of Care. GAO-14-207. Washington, D.C.: March 6, 2014. Clinical Data Registries: HHS Could Improve Medicare Quality and Efficiency through Key Requirements and Oversight. GAO-14-75. Washington, D.C.: December 16, 2013. Medicare Physician Payment: Private-Sector Initiatives Can Help Inform CMS Quality and Efficiency Incentive Efforts. GAO-13-160. Washington, D.C.: December 26, 2012. Medicare Program Integrity: Greater Prepayment Control Efforts Could Increase Savings and Better Ensure Proper Payment. GAO-13-102. Washington, D.C.: November 13, 2012. Medicare Physician Feedback Program: CMS Faces Challenges with Methodology and Distribution of Physician Reports. GAO-11-720. Washington, D.C.: August 12, 2011. Value in Health Care: Key Information for Policymakers to Assess Efforts to Improve Quality While Reducing Costs. GAO-11-445. Washington, D.C.: July 26, 2011. Medicare: Per Capita Method Can Be Used to Profile Physicians and Provide Feedback on Resource Use. GAO-09-802. Washington, D.C.: September 25, 2009. Medicare: Focus on Physician Practice Patterns Can Lead to Greater Program Efficiency. GAO-07-307. Washington, D.C.: April 30, 2007.

Health care payers--including Medicare--are increasingly using VBP to reward the quality and efficiency instead of just the volume of care delivered.
Both traditional and newer delivery models use this approach to incentivize providers to improve their performance. Feedback reports serve to inform providers of their results on various measures relative to established targets. The American Taxpayer Relief Act of 2012 mandated that GAO compare private entity and Medicare performance feedback reporting activities. GAO examined (1) how and when private entities report performance data to physicians, and what information they report; and (2) how the timing and approach CMS uses to report performance data compare to that of private entities. GAO contacted nine entities--health insurers and statewide collaboratives--recognized for their performance reporting programs. Focusing on physician feedback, GAO obtained information regarding report recipients, data sources used, types of performance measures and benchmarks, frequency of reporting, and efforts to enhance the utility of performance reports. GAO obtained similar information from CMS about its Medicare feedback efforts. Private entities GAO reviewed for this study selected a range of measures and benchmarks to assess physician group performance, and provided feedback reports to physicians more than once a year. Private entities almost exclusively focused their feedback efforts on primary care physician groups participating in medical homes and accountable care organizations, which hold physicians responsible for the quality and cost of all services provided. They limited their feedback reporting to those with a sufficient number of enrollees to ensure the reliability of reported measures. The entities decided on the number and type of measures for their reports, and compared each group's performance to multiple benchmarks, including peer group averages or past performance. All the entities used quality measures, and some also used utilization or cost measures. 
Because of the variety of quality measures and benchmarks, feedback report content differed across the entities. Some entities noted that in addition to national benchmarks, they compared results to state or regional level rates to reflect local patterns of care which may be more relevant to their physicians. Most health insurers spent from 4 to 6 months to generate their performance reports, a period that allowed them to amass claims data as well as to make adjustments and perform checks on the measure calculations. Commonly, private entities issued interim feedback reports, covering a 1-year measurement period, on a rolling monthly, quarterly, or semiannual schedule. They told GAO that physicians valued frequent feedback in order to make changes that could result in better performance at the end of the measurement period. Feedback from the Centers for Medicare & Medicaid Services (CMS) included quality measures determined by each medical group, along with comparison to only one benchmark, and CMS did not provide interim reports to physicians. The agency has phased in performance feedback in order to meet its mandate to apply value-based payment (VBP) to all physicians in Medicare by 2017, a challenge not faced by private entities. In September 2013, CMS made feedback reports available to 6,779 physician groups. While private entities in this study chose the measures for their reports, CMS tied the selection of specific quality measures to groups' chosen method of submitting performance data. Although both CMS and private entities focused their feedback on preventive care and management of specific diseases, CMS's reports contained more information on costs and outcomes than some entities. While private entities employed multiple benchmarks, the agency only compared each group's results to the national average rates of all physician groups that submitted data on any given measure. 
CMS's use of a single benchmark precludes physicians from viewing their performance in fuller context, such as relative to their peers in the same geographic areas. CMS's report generation process took 9 months to complete, several months longer than health insurers in the study, in part because it included more steps. In contrast to private entity reporting, CMS sent its feedback report to physicians once a year, a frequency that may limit physicians' opportunity to make improvements in advance of their annual payment adjustments. The Department of Health and Human Services generally concurred with GAO's recommendations and asked for additional information pertaining to the potential value of using multiple benchmarks to assess Medicare physicians' performance. The Administrator of CMS should consider expanding performance benchmarks to include state or regional averages, and disseminating feedback reports more frequently than the current annual distribution.
A structured settlement is the payment of money for a personal injury claim in which at least part of the settlement calls for future payment. The payments may be scheduled for any length of time, even as long as the claimant's lifetime, and may consist of installment payments and/or future lump sums. Payments can be in fixed amounts, or they can vary. The schedule is structured to meet the financial needs of the claimant. For years, structured settlements have been widely used in the tort area to compensate severely injured, often profoundly disabled, tort victims. Cases generally involve medical malpractice and other personal injury. The Federal Tort Claims Act (FTCA) is the statute by which the United States authorizes tort suits to be brought against itself. With certain exceptions, it makes the United States liable for injuries caused by the negligent or wrongful act or omission of any federal employee acting within the scope of his or her employment, in accordance with the law of the state where the act or omission occurred. Generally, a tort claim against the United States is barred unless it is presented in writing to the appropriate federal agency within 2 years after the claim accrues. In addition, the National Childhood Vaccine Injury Act of 1986, as amended, created a mechanism for compensating persons injured by certain pharmaceutical products. The act established the National Vaccine Injury Compensation Program (VICP) as an alternative to traditional product liability and/or medical malpractice litigation for persons injured by their receipt of one or more of the standard childhood vaccines required for admission to schools and by certain employers. VICP is "no-fault." That is, claimants need not establish that the vaccine was defective, or that any degree of negligence was involved in its administration. The only liability-related question is causation--did the vaccine cause the injury for which compensation is sought?
The industry standard of practice requires the use of a licensed broker or insurance agent to obtain a settlement annuity. DOJ's Civil Division estimated that structured settlements constitute between 1 and 2 percent of all settlements in litigated tort cases. Brokers receive no direct compensation from the government; rather, they are compensated by the insurance company from whom the annuity is purchased. The insurance company typically pays the brokers' commissions, which amount to 3 or 4 percent of the annuity premium. The government attorney negotiating the case is responsible for selecting the broker. Structured settlements for the federal government are negotiated by the Civil Division's torts attorneys, Assistant United States Attorneys (AUSAs), or agency attorneys. AUSAs are authorized to settle certain cases. An agency may not settle a tort claim for more than $25,000 without the prior written approval of the Attorney General or her designee, unless the Attorney General has delegated to the head of the agency the authority to do so. To ascertain DOJ's policies and guidance for the selection of settlement brokers, we reviewed the Torts Branch handbook, Damages Under the Federal Tort Claims Act (section V: Settlements), and other relevant documents pertaining to broker selection policies. In addition, to obtain information about the procedures used to select brokers, we interviewed attorneys in DOJ's Civil Division and representatives from the Executive Office for United States Attorneys (EOUSA). To obtain information on broker selection policies and guidance used by federal agencies, we asked DOJ to identify other federal agencies that handled structured settlement claims. DOJ identified six agencies--HHS and VA; the Air Force, Army, and Navy; and the U.S. Postal Service. At each of the six agencies, we met with officials who were responsible for negotiating structured settlement claims.
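The commission arrangement described above reduces to simple arithmetic. A minimal sketch, using an illustrative 3.5 percent midpoint of the 3 to 4 percent range the report cites (the function name and sample premium are hypothetical):

```python
def broker_commission(annuity_premium, rate_pct=3.5):
    """Commission the insurance company pays the broker, as a
    percentage of the annuity premium. The 3.5 percent default is
    an illustrative midpoint of the 3-4 percent range; the
    government pays the broker nothing directly."""
    return annuity_premium * rate_pct / 100

# On a hypothetical $1 million annuity premium at 3.5 percent:
print(broker_commission(1_000_000))  # 35000.0
```

The size of these commissions, paid entirely by the insurer, is part of why an appearance of favoritism in broker selection matters.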
We discussed their policies and procedures for selecting structured settlement brokers and asked them what factors they considered during the selection process. In addition, we obtained and reviewed a copy of the Army's standard operating procedures pertaining to structured settlements. Also, we asked the six agencies to supply information pertaining to the number of structured settlements since May 1997. To provide the list of DOJ's structured settlement annuities between May 1, 1997, and May 1, 1999, we used data DOJ collected from the Civil Division and the United States Attorneys Offices. The Civil Division's data came from the Torts Branch, which routinely handles structured settlements. The United States Attorneys' data were collected by EOUSA and include all the data received by EOUSA as of August 12, 1999. As of that date, 34 of the 94 United States Attorneys offices had reported annuity settlements during the relevant time period. We did not verify the accuracy of the information collected from the Torts Branch or EOUSA. To gain a broader understanding of structured settlements, we met with the Executive Vice President of the National Structured Settlement Trade Association (NSSTA). We obtained information concerning brokers working with federal structured settlements. We did our audit work between June and December 1999 in accordance with generally accepted government auditing standards. We requested comments on a draft of this report from the United States Attorney General or her designee. Also, in January we discussed the contents of this report with VA's Assistant General Counsel, the U.S. Postal Service's Claims Division Counsel, and the Army's Torts Claims Division Chief. We obtained comments for the Air Force and Navy from DoD's Senior Report Analyst for the GAO Affairs Directorate. In addition, we spoke with HHS' Associate General Counsel. The written and oral comments we received are discussed near the end of the report.
Although DOJ had established policies and guidance for the selection of structured settlement brokers, the policies and guidance did not include an internal control requiring attorneys to document their reasons for selecting a specific broker. Similarly, although the six agencies we reviewed said they generally followed DOJ's policy guidance for selecting a structured settlement broker, they were not required to document their reasons for selecting a particular broker. None of these agencies documented the reasons why they selected particular brokers. DOJ had established policies and guidance governing the selection of structured settlement brokers, but it did not require that the reasons for selecting a specific broker be documented. On July 16, 1993, the Director of the Civil Division's Torts Branch, which is responsible for FTCA claims and litigation, issued a memorandum that was intended to supplement the guidance on structured settlements in the Damages Handbook and to codify previous informal guidance on the selection of structured settlement brokers. Neither the Damages Handbook nor the memorandum addressed documenting the reasons for selecting a specific broker. On June 30, 1997, the Acting Associate Attorney General expanded the policy guidance by issuing a memorandum to United States Attorneys. However, the new guidance did not address documenting the reasons for broker selections. Generally, the 1997 policy guidance outlined procedures concerning the selection of structured settlement brokers. These included: Every broker was to be given an opportunity to promote its services. No lists of "approved," "preferred," or "disapproved" brokers were to be maintained. Brokers who performed well in the past were to be appropriately considered for repeated use; however, such use could not be to the exclusion of new brokers.
Attorneys were expected to look to supervisory attorneys for assistance; however, final broker selection was the responsibility of the attorney negotiating the settlement. When a structured settlement in an FTCA case included a reversionary interest in favor of the United States, the Torts Branch's FTCA staff was to be consulted to maintain appropriate records and ensure consistency. Any activity tending toward an appearance of favoritism, any action contrary to any of the above rules, or any activity incongruent with the spirit of the memorandum was to be scrupulously avoided. According to agency officials, attorneys sometimes asked each other about their experiences with a particular broker, but the attorney negotiating the case is responsible for making the final broker selection and is not required to consult with the FTCA staff. DOJ officials told us that in the absence of a requirement to do so, they did not document the reasons for selecting particular settlement brokers. The Comptroller General's guidance on internal controls in the federal government, Standards for Internal Control in the Federal Government (GAO/AIMD-00-21.3.1), requires that all transactions and significant events are to be clearly documented and that the documentation is to be readily available for examination. The documentation should appear in management directives, administrative policies, or operating manuals and may be in paper or electronic form. All documentation and records should be properly managed and maintained. During 1999, DOJ provided its policy guidance to the six selected agencies in our review--HHS and VA; the Air Force, Army, and Navy; and the Postal Service. Generally, the selection processes the agencies said they had were similar to DOJ's (e.g., the attorney negotiating a case made the final decision, and no list of approved or disapproved structured settlement brokers was maintained).
Five agencies in our review identified various factors they considered when selecting a structured settlement broker. For example: HHS, Postal Service, and VA officials told us that they tended to select brokers with offices in the Washington, D.C., area. According to VA officials, the use of distantly located brokers created problems because of (1) differences in time zones and (2) the inability of nonlocal brokers to physically conduct work on short notice. Air Force, Navy, and VA officials told us that they put considerable weight on an impressive presentation given by the broker's firm. HHS, Navy, Postal Service, and VA officials said they looked at the broker's knowledge and experience in handling structured settlement cases for the federal government and based their selections on positive past experiences. Navy and Postal Service officials said they looked for brokers with a reputation for being dependable and responsible. In addition, the Army had established supplemental policies governing the selection of structured settlement brokers. According to the Army's standard operating procedures, brokers were to be selected on a case-by- case basis according to the following criteria: (1) the broker's ability to become a member of the negotiating team, participate in negotiations, and travel at his or her own expense; (2) the selecting administrative officer's previous interviews with or knowledge of the broker; (3) the broker's ability to present his views verbally (if the case requires in-person negotiations); and (4) the broker's experience if the administrative officer is inexperienced. In certain more specialized cases, the selecting administrative officer's choice of a specific broker must be approved by a higher authority. Even though federal agencies we surveyed said they provided policy guidance on broker selection, none of them required documentation of the reasons for selecting a structured settlement broker. 
In the absence of this requirement, none documented the reason for selection. DOJ has selected several structured settlement brokerage companies to handle most of the structured settlement claims. Between May 1, 1997, and May 1, 1999, DOJ used 27 different structured settlement brokerage companies to settle 242 claims for $236 million. (See table 1 for the number and total annuity costs of annuity settlements handled by brokers.) Of the 242 claims awarded, 70 percent (169 cases) were awarded to 4 brokerage companies. One of the four companies was awarded 30 percent (72 cases) of the total number of cases. The remaining 23 companies were awarded 30 percent of the total number of cases. Because DOJ did not document the reasons for selecting a particular broker, DOJ officials could not specifically say why certain companies received more business than others. However, as noted previously, DOJ officials cited a variety of reasons for selecting a specific structured settlement broker, such as experience, dependability, and knowledge of federal structured claims. According to DOJ, the companies frequently have multiple offices and brokers that compete with each other within the same company. Thus, a simple count of the number of companies could be misleading. DOJ has developed policies and guidance for selecting structured settlement brokers and disseminated this information to the six other federal agencies with authority to handle structured settlement claims that we contacted. However, the policies and guidance lacked an internal control requiring that the reasons for selecting a broker be documented and readily available for examination. This is important because without documentation of transactions or other significant events, DOJ cannot be certain that its policies and guidance on selecting structured settlement brokers are being followed.
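The concentration figures above follow directly from the reported counts; a quick arithmetic check (the function and constant names are ours, the counts are from the report):

```python
def share_pct(cases, total):
    """Share of total settlements, rounded to the nearest percent."""
    return round(100 * cases / total)

TOTAL_CLAIMS = 242   # structured settlements, May 1997 to May 1999
TOP_FOUR = 169       # claims placed with the 4 busiest companies
TOP_ONE = 72         # claims placed with the single busiest company

print(share_pct(TOP_FOUR, TOTAL_CLAIMS))  # 70 (percent)
print(share_pct(TOP_ONE, TOTAL_CLAIMS))   # 30 (percent)
```

The same arithmetic shows the remaining 23 companies split the other 73 claims, about 30 percent, among them.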
Further, without documentation of the reasons settlement brokers were selected, it is more difficult to avoid the appearance of favoritism and preferential treatment in a situation where some brokers get significantly more business than others. We recommend that the Attorney General of the United States direct the Director of the Torts Branch responsible for FTCA claims and litigation, Civil Division, to

- develop an adequate internal control to ensure that the reasons for selecting structured settlement brokers are always fully documented and readily available for examination; and
- disseminate this guidance to federal agencies, including those in our survey, responsible for handling structured settlement claims.

We requested comments on a draft of this report from the Attorney General or her designee. On January 18, 2000, the Acting Assistant Attorney General, Civil Division, provided us with written comments, which are printed in full in appendix I. The Justice Department expressed appreciation that the report "outlines the many steps undertaken by the Department to ensure fairness in the broker selection process." DOJ said its existing policies and guidance to ensure that the selection of brokers is fair are effective. Therefore, it disagreed with our recommendation that DOJ implement an adequate internal control to ensure that the reasons for selecting a specific structured settlement broker are always fully documented and readily available for examination. DOJ noted that the Comptroller General's Standards for Internal Control in the Federal Government specify that management should design and implement internal controls based on the related costs and benefits. It stated that it was DOJ's belief that the costs of implementing the recommendation, in terms of diversion of attention from substantive issues and generation of extra paperwork, would substantially outweigh any benefits.
We recognize that determining whether to implement a particular internal control involves a judgment about whether the benefits outweigh the costs. We believe that the benefits of implementing our recommendation would outweigh any associated costs and paperwork. As stated in this report, these benefits are twofold: requiring documentation would help enable DOJ to (1) determine if its policies and guidance on selecting brokers are being followed and (2) protect DOJ from charges of favoritism towards a specific broker or brokers. Further, noting the reasons for selecting a specific broker in the case file at the time the selection is made would appear to require only minimal paperwork or cost. For example, a concise memo to the file stating the rationale for the selection would suffice. DOJ also expressed concern that, although we observed that most structured settlements have been awarded to a relatively small number of companies, we did not mention that many of the selected companies had multiple offices and brokers that competed for the same work. According to DOJ, by "treating as a monolith all brokers affiliated with the major companies, the draft report ignores the actual way those businesses are run and runs the risk of significantly understating the actual number of brokers competing to handle DOJ structured settlements." In response, we have noted that according to DOJ, because structured settlement companies may have multiple offices and brokers, the number of companies could be misleading. Data were not readily available for us to determine the extent to which multiple brokers within a single company competed for the same settlement. Nevertheless, the number and cost of settlements by brokerage company show that DOJ placed the majority of its settlement work with a relatively small number of companies--a situation that still could open it up to charges of favoritism towards these companies. 
Cognizant officials at HHS, VA, Air Force, Army, Navy, and the Postal Service said they generally agreed with the information presented in the report. The Army provided additional information to clarify its policy for selecting structured settlement brokers, and we incorporated this information in the report where appropriate. We are sending copies of this report to Senator Orrin G. Hatch, Chairman, and Senator Patrick J. Leahy, Ranking Minority Member, Senate Committee on the Judiciary; Representative Henry J. Hyde, Chairman, and Representative John Conyers, Jr., Ranking Minority Member, House Committee on the Judiciary; and the Honorable Janet Reno, the Attorney General. We are also sending copies to other interested congressional parties. Copies will also be made available to others upon request. If you or your staff have any questions, please call me or Weldon McPhail on (202) 512-8777. Key contributors to this assignment were Mary Hall and Jan Montgomery. The first copy of each GAO report and testimony is free. Additional copies are $2 each. Orders should be sent to the following address, accompanied by a check or money order made out to the Superintendent of Documents, when necessary. VISA and MasterCard credit cards are also accepted. Orders for 100 or more copies to be mailed to a single address are discounted 25 percent.

U.S. General Accounting Office
P.O. Box 37050
Washington, DC 20013

Room 1100
700 4th St. NW (corner of 4th and G Sts. NW)
U.S. General Accounting Office
Washington, DC

Orders may also be placed by calling (202) 512-6000 or by using fax number (202) 512-6061, or TDD (202) 512-2537. Each day, GAO issues a list of newly available reports and testimony. To receive facsimile copies of the daily list or any list from the past 30 days, please call (202) 512-6000 using a touch-tone phone. A recorded menu will provide information on how to obtain these lists.
| Pursuant to a congressional request, GAO provided information on the Department of Justice's (DOJ) policy and guidance for selecting structured settlement brokers, focusing on: (1) the policies and guidance for selecting structured settlement brokers used by DOJ and six selected agencies; and (2) a list of the structured settlement brokerage companies used by DOJ and the number of settlements awarded to each company since May 1997. GAO noted that: (1) in 1993 and 1997, DOJ issued policies and guidance on the selection of structured settlement brokers to promote fairness and to avoid the appearance of favoritism; (2) DOJ officials told GAO that its policies and guidelines permit some discretion and that when selecting a particular broker, they generally relied on such factors as reputation, past experience, knowledge, and location; (3) however, DOJ officials also told GAO they were unable to specify reasons why attorneys selected particular brokers to settle specific cases, because DOJ did not require documentation of these decisions; (4) without an internal control requiring the reasons for selecting a particular settlement broker be documented and readily available for examination, it is more difficult to verify that selection policies and guidelines were followed and, in turn, to avoid the appearance of favoritism and preferential treatment; (5) overall, the six federal agencies surveyed described policies and guidance in selecting structured settlement brokers that were similar to DOJ's; (6) none of the agencies had internal controls requiring their attorneys to document their reasons for selecting a specific broker; (7) one agency had a written supplemental policy governing the use of structured settlements, but it did not require documentation of decisions; (8) officials at the other five federal agencies said they also generally relied on such factors as reputation, past experience, knowledge, and location for selecting a particular structured settlement 
broker; (9) however, the reasons why particular brokers were selected for specific cases were not documented; (10) GAO's review of the list of structured settlement brokerage companies used by DOJ and the number of settlements assigned to each company showed that DOJ selected a few companies to handle most of its structured settlement business; (11) according to DOJ, the companies frequently have multiple offices and brokers that compete with each other within the same company; (12) thus, a simple count of the number of companies could be misleading; (13) although DOJ used 27 different structured settlement companies to settle 242 claims for about $236 million between May 1, 1997, and May 1, 1999, 70 percent (169 cases) were awarded to 4 brokerage companies; and (14) of the remaining 23 companies, none were awarded more than 17 cases each. | 3,830 | 550 |
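The concentration figures cited above can be restated as a short calculation. This is an illustrative sketch using only the case counts reported by DOJ; the variable names are our own.

```python
# Illustrative check of the DOJ broker-concentration figures
# (case counts taken from the report; May 1, 1997 - May 1, 1999).

total_claims = 242
top_four = 169            # claims awarded to the 4 most-used companies
top_company = 72          # claims awarded to the single most-used company

remaining = total_claims - top_four            # 73 claims among 23 companies

pct_top_four = top_four / total_claims         # about 70 percent
pct_top_company = top_company / total_claims   # about 30 percent
pct_remaining = remaining / total_claims       # about 30 percent

print(f"Top 4 companies: {pct_top_four:.0%} of claims")
print(f"Single most-used company: {pct_top_company:.0%} of claims")
print(f"Remaining 23 companies: {pct_remaining:.0%} of claims")
```

The arithmetic confirms the report's rounded percentages: 169 of 242 claims is roughly 70 percent, and the single most-used company's 72 claims and the 23 other companies' 73 claims are each roughly 30 percent.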
Contracts, grants, cooperative agreements, and other transactions are among the tools DOD has to support or acquire research. The instruments are not interchangeable, but rather are to be used according to the nature of the research and the type of government-recipient relationship desired. Contracts are procurement instruments and, as such, are governed by the Federal Acquisition Regulation (FAR) and DOD procurement regulations. Contracts are to be used when the principal purpose of the project is the acquisition of goods and services for the direct benefit of the federal government. In contrast, grants, cooperative agreements, and other transactions are assistance instruments used by DOD when the principal purpose is to stimulate or support research and development efforts for more public purposes. Assistance instruments are generally not subject to the FAR or DOD procurement regulations, thereby providing DOD a considerable degree of flexibility in negotiating terms and conditions with the recipients. Between fiscal years 1990 and 1994, DOD cited the authority provided under 10 U.S.C. 2371 to enter into 72 agreements, of which 56 were categorized as other transactions and 16 as cooperative agreements. At time of award, the planned contributions by DOD and recipients totaled about $1.5 billion. DARPA has been the primary user of the authority, entering into all 56 agreements that were identified as other transactions. The Air Force and Navy entered into a total of 16 cooperative agreements, while through fiscal year 1994 the Army had not entered into any agreements using this authority. For various policy and implementation reasons, DOD generally did not enter into assistance relationships with commercial organizations prior to the enactment of 10 U.S.C. 2371 in 1989. However, 59--or about 82 percent--of the agreements entered into under the authority of 10 U.S.C. 2371 were with consortia comprised primarily of for-profit firms. 
This high number of consortia-led projects was due in part to the fact that most of the programs under which the agreements were entered into--such as the Technology Reinvestment Project (TRP)--required or expected that some type of partnership arrangement be formed. Nearly all of the remaining agreements were entered into with single commercial firms. Appendix I provides additional information on various recipient characteristics. The use of cooperative agreements and other transactions appears to provide some opportunities to remove barriers between the defense and civilian industrial bases, in particular by attracting firms that traditionally did not perform research for DOD. In a previous report, we pointed out that government acquisition requirements have caused some companies to separate their defense and commercial research and development organizations or to decline accepting government research and development funds. The flexibility inherent in these instruments has enabled DOD to attract firms that have historically declined to participate in research projects sponsored under a contract--such as Cray Research, Hewlett-Packard, and the commercial division of IBM--to participate in one or more projects either as a consortium member or as a single party. Overall, based on information provided by DOD and recipient officials, we estimate that about 42 percent of the 275 commercial firms that participated in 1 or more agreements were firms that traditionally had not performed research for DOD. DOD officials stressed that a contracting officer cannot elect to use a cooperative agreement or other transaction to attract a nontraditional firm when the principal purpose of the research is for the direct benefit of the government. 
However, they indicated that for projects in which the use of such instruments was appropriate, the ability to attract such firms was a significant benefit, especially in those areas in which these firms' technological capabilities exceed those possessed by traditional defense firms. For example, in 1 Air Force agreement, 14 firms, including 5 that traditionally had not performed research for DOD, entered into a $60 million cooperative agreement to develop computer interface standards. The consortium manager told us that the commercial firms involved would not have participated had DOD imposed standard FAR clauses for certified cost and pricing data or intellectual property provisions. The Air Force program manager noted that the consortium includes both large, multinational firms, such as IBM, and small, specialized companies working together. Representatives from the consortium and the Air Force believed that the mix of participants facilitated information exchange and consensus building on the interface standards. Discussions with DOD officials and recipients indicated that the specific terms and conditions that led to the decision to participate varied from company to company. For some, such as IBM, it was the ability to use their commercial accounting systems rather than establish systems or practices that complied with government-unique requirements; for others, such as Hewlett-Packard, it was the ability to limit the government's access to and audits of the firm's financial records or the increased flexibility in the allocation of intellectual property rights that were key factors in their decision to do business with DOD. A 1994 other transaction with a Hewlett-Packard-led consortium provides insights into how the authority was used to negotiate terms and conditions affecting both financial management and intellectual property matters that are atypical of contracts, grants, or standard cooperative agreements.
We had previously reported that Hewlett-Packard declined to accept government research and development funds to protect its technical data rights. In this case, however, Hewlett-Packard responded to a DARPA announcement soliciting proposals to advance the state of the art in the manufacture of more affordable optoelectronics systems and components. According to DARPA, this technology will enable data transmissions at high rates from high performance parallel processors at far lower costs than current technology allows. Under the agreement, the financial management provisions require consortium members to maintain adequate records to account for federal funds received under the agreement, and account for the members' contributions toward the project. The members are required to have an accounting system that complies with generally accepted accounting principles, but commercial firms do not have to follow the accounting requirements specified by the FAR. The agreement does not require an annual audit and does not specifically provide DARPA or our office direct access to these records. Rather, for up to 3 years after the agreement is completed, these records may be subject to an audit by an independent auditor, who will provide a report to DARPA. In comparison, under a cost-reimbursement research contract, a traditional defense contractor would be typically required to (1) follow the FAR accounting requirements, (2) undergo audits, and (3) provide the federal contracting agency and our office with access to the contractors' pertinent records. Similarly, the intellectual property provisions were structured to provide Hewlett-Packard more flexible provisions than typically allowed under contracts, grants, or standard cooperative agreements, all of which are governed by the provisions of Public Law 96-517, as amended. 
The provisions of this act, commonly referred to as the Bayh-Dole Act, provide the government's general policy regarding patent rights in inventions developed with federal assistance and are intended, in part, to facilitate the commercialization and public availability of inventions. In general, the government's policy is to allow the contractor to elect to retain title to the subject invention while providing the government a nonexclusive, nontransferable, irrevocable, paid-up license to practice or have practiced for or on behalf of the United States any subject invention throughout the world. Recipients must comply with certain administrative requirements. For example, under a research contract, a contractor is required to notify the government of an invention within 2 months after it has been disclosed to contractor personnel responsible for such matters. Large contractors are required to notify the government in writing whether they intend to retain rights to that invention within 8 months after disclosing the invention to the government, while small businesses are provided up to 24 months. Failure to comply with these administrative requirements provides the government the right to obtain title to an invention. Under the Hewlett-Packard agreement, the intellectual property provisions were structured so that

- the consortium has up to 4 months after the inventor discloses a subject invention to his or her company to notify the government;
- the consortium has up to 24 months after disclosure to the government to inform DARPA whether it intends to take title to inventions arising from the agreement;
- DARPA agreed to delay exercising its government purpose license rights to inventions in which the consortium retains title until 5 years after the agreement is completed; and
- the consortium has the authority to maintain inventions and data as trade secrets for an unspecified period of time under certain conditions.
Further, under the agreement, DARPA does not receive any rights to any technical data produced under the agreement unless DARPA invokes its "march-in" rights. These rights can be invoked only if the consortium fails to reduce an invention to practical application or for other specified reasons, such as in the case in which the consortium grants another firm an exclusive right to use or sell the invention in a product that is substantially manufactured outside of the United States or Canada. In combination, these terms provide the consortium additional time to commercialize the technology, while somewhat limiting the government's rights to that technology. These clauses illustrate the trade-offs that DOD may face as it attempts to attract firms that have not traditionally performed research for the government or move toward more commercial-like practices. Many of the oft-cited barriers to integrating the defense and civilian industrial bases, such as government cost accounting and auditing requirements, rights in technical data, and other government unique requirements, were instituted to safeguard or protect the government's and taxpayer's interests, assist suppliers, or help achieve a variety of national goals. In the Hewlett-Packard example, two of the government's traditional methods of oversight--audits and access to records--were not included, while the government's standard rights to information developed under federally sponsored research are somewhat constrained. DARPA and service program management and contracting officials acknowledged that there may be some added risks to the government due to the less stringent oversight requirements. 
However, most indicated that factors such as the recipient's interest in having the project succeed (given its commercial applications), the recipient's willingness to cost share, and the tendency of consortium members to self-police its agreements (since each member wants to assure that its partners are contributing as agreed), acted to reduce that risk. Similarly, DARPA officials commented that the added flexibility within the intellectual property provisions would assist the firms' efforts to develop and commercialize the technology. The instruments appear to be fostering new relationships and practices within the defense industry, especially for those projects being undertaken by consortia. Under a consortium, members mutually develop and sign articles of collaboration, which cover such issues as the consortium's management structure, each member's technical and financial responsibilities, and the exchange or protection of each member's proprietary information. Several officials we interviewed noted that developing the articles of collaboration tended to be contentious and time-consuming. Once the consortium is established, however, DOD officials and recipients indicated that a synergistic effect tended to occur because of the exchange of information under consortia, thereby expediting technology development. For example, recognizing their common interest in developing more affordable composite engine components, General Electric and Pratt & Whitney agreed to collaborate with material suppliers on a $32 million project. These two firms--normally competitors--developed mutually agreeable terms that balanced proprietary interests with research objectives. According to Air Force officials responsible for the effort, there was better information flow and greater technical progress using this joint approach than if each firm had undertaken the project separately. 
Depending on the project, DOD program management and contracting officials viewed themselves as being more actively involved in coordinating and facilitating activities than performing a traditional government oversight function. However, DOD officials and recipients we spoke with noted that negotiating cooperative agreements was significantly different than negotiating contracts, in which most provisions are governed by a standard FAR clause and in which negotiations tend to focus on the cost proposal. These officials noted that since the FAR is not applicable to assistance instruments, more provisions were subject to negotiation. DOD officials and consortia representatives noted that moving away from the traditional reliance on FAR-based contracting approaches and clauses to which they are accustomed and increasing the use of assistance instruments would require significant cultural or mindset changes by both parties. The potential exists for traditional defense contractors to use cooperative agreements and other transactions to develop or use new practices that may be viewed as more efficient or less cumbersome than those employed in acquisition programs under FAR-based contracts. Officials from such firms, however, generally indicated that given their investment in systems that complied with FAR or DOD requirements and the need to use these systems for procurement contracts, developing or using alternative practices was not considered cost-effective. Leveraging the private sector's financial investment is considered an important element of projects sponsored by a cooperative agreement or other transaction for several reasons. First, by having commercial firms contribute to the cost of developing technologies with both military and commercial applications, DOD hopes to stretch its research funding. Secondly, cost-sharing is seen as appropriate since commercial firms are intended to benefit financially from sales of the technology. 
Finally, DOD officials indicated that the participants' contributions demonstrated commitment to the project and enabled less rigid government oversight requirements, since the firms were expending their own resources. Participants' contributions may be in cash or in-kind contributions, such as the use of equipment, facilities, and other assets. As shown in table 1, the 72 agreements DOD entered into between fiscal years 1990 and 1994 have a current value of about $1.7 billion, toward which participants have agreed to contribute about $1.0 billion, or about 58 percent. Measured another way, participants planned to contribute about $1.39 for each dollar provided by DOD. It should be noted that the government's actual share of the projects' costs may be higher than indicated by table 1. Under FAR 31.205-18(e), research costs incurred by contractors under projects entered into under 10 U.S.C. 2371 should be considered allowable IR&D expenses if such costs would have been allowed in the absence of the agreement. Consequently, to the extent that participants use IR&D as their cost-share contributions and include such costs as overhead under other government contracts, a portion of these costs subsequently will be reimbursed by DOD. Participants also were allowed to propose the value of prior research as part of their cost-sharing contributions. These contributions do not represent the cost of prior research, but rather the estimated value of that research for the current project. On several agreements, DOD's acceptance of prior research enabled firms to offset their current contributions significantly. For example, in one DARPA agreement, 89 percent of the consortia's planned contribution of approximately $4.7 million was attributable to the value of prior research. Similarly, in three other agreements, more than 50 percent of the consortia's planned contributions consisted of the value of prior research. 
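The cost-share arithmetic above can be sketched as a quick calculation. This is an illustrative sketch using the report's rounded totals (in billions of dollars); the report's $1.39-per-DOD-dollar figure reflects more precise underlying data than these rounded inputs, which yield roughly $1.43.

```python
# Illustrative cost-share arithmetic from the report's rounded totals
# (dollar amounts in billions; variable names are our own).

total_value = 1.7        # current value of the 72 agreements
participants = 1.0       # participants' planned contributions
dod = total_value - participants          # DOD's share, about 0.7

pct_participants = participants / total_value   # about 58 percent
per_dod_dollar = participants / dod             # about 1.43 with rounded inputs

prior_research = 0.098   # about $98 million in valued prior research
pct_prior = prior_research / participants       # about 10 percent

print(f"Participants' share of total value: {pct_participants:.1%}")
print(f"Participant dollars per DOD dollar (rounded inputs): ${per_dod_dollar:.2f}")
print(f"Prior research as share of planned contributions: {pct_prior:.1%}")
```

Run against the rounded totals, the figures line up with the report: participants' contributions are roughly 58 percent of the agreements' value, and valued prior research makes up roughly 10 percent of those contributions.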
Overall, we estimate that participants' planned contributions included about $98 million--or about 10 percent--in the form of the value of prior research, with such contributions representing more than 20 percent in 8 of the 72 agreements. DOD officials expressed various views as to whether the value of prior research should be accepted and to what extent. For example, an Army official told us that while they believed prior research should be taken into consideration in evaluating the project's risk, he expressed some reservation about accepting prior research as a cost-share contribution. Similarly, a February 1995 Air Force memorandum noted that while it was permissible to accept the value of prior research as a cost-share contribution, Air Force negotiators should proceed with caution. The memorandum noted that evaluating such contributions is complicated and that grant officers have a responsibility to ensure that the prior research is relevant to and brings value to the proposed effort. DARPA officials noted that while cash or concurrent in-kind contributions are the more preferred forms of contributions, they believed that the value of prior research is acceptable in certain circumstances, such as when the participant possesses significant technical knowledge but is unable or unwilling to provide cash or in-kind contributions. Accordingly, DARPA officials told us they did not place a limit on the percentage of prior research that could be accepted. Conversely, the Navy generally included a provision in its agreements that limited the contributions of intellectual property, patents, trade secrets, and other nonfederal sources to not more than 10 percent of the participants' planned cost-sharing contributions. While 10 U.S.C. 
2371 does not prohibit DOD from accepting the value of prior research as part of the participants' cost share, the legislation requires that to the extent that the Secretary deems practicable, the funds provided by the government under the cooperative agreement or other transaction should not exceed the total amount provided by other parties to the agreement. Accepting prior research in lieu of concurrent financial or in-kind contributions may obscure each party's relative contributions in the current project. Our review identified two emerging issues pertaining to instrument selection and structure of cooperative agreements and other transactions. First, we found that DARPA always designated its agreements as "other transactions," while the services always employed "cooperative agreements." While the instruments share many similar characteristics, DARPA officials indicated that a DARPA other transaction did not require participants to be subject to annual audit and generally did not require recipients to provide our office with access to their pertinent financial records. In contrast, Air Force officials indicated that their cooperative agreements generally required an annual audit, though not necessarily access to records by our office, while Navy officials indicated that their agreements generally required both. The selection of different instruments, coupled with different treatment of specific issues among the services, has led to some confusion among firms that were negotiating agreements with both DARPA and the services. Second, there remains some disagreement within DOD regarding intellectual property provisions. While DOD officials agree that cooperative agreements are subject to the provisions of the Bayh-Dole Act, there is less consensus regarding other transactions. DARPA officials maintain that other transactions entered into under the authority of 10 U.S.C. 
2371 are not subject to the Bayh-Dole Act because, in their opinion, the act only applies to contracts, grants, and standard cooperative agreements. In support, they noted that Congress has twice commented favorably on DARPA's use of other transactions to provide more flexible intellectual property provisions. However, a representative from the Office of Naval Research's Office of Corporate Counsel argued that the provisions of the Bayh-Dole Act are applicable to such agreements. The representative stated that it was his office's position that the act was to be interpreted broadly as to which types of instruments were covered. Reaching resolution on the issue may be important as DOD attempts to expand its research base. For example, while Air Force and Navy officials noted that they have been able to negotiate intellectual property provisions with participants that are consistent with Bayh-Dole, DARPA officials contended that the ability to provide more flexible intellectual property provisions than would be possible under Bayh-Dole was instrumental in reaching their agreements. DOD is updating its February 1994 draft guidance on the use of these instruments, in part to provide more consistency in the selection and structure of the agreements. However, DOD was unable to provide an estimate of when the revised guidance would be issued. Because inconsistent selection of a particular instrument and treatment of specific clauses may unnecessarily increase confusion for government and industry users and may hinder their effective use, we recommend that the Secretary of Defense ensure that DOD's revised guidance on the use of cooperative agreements and other transactions promotes increased consistency among DOD components on the selection and structure of these instruments.
In particular, the guidance should specifically address the extent to which the value of prior research should be accepted as part of a participant's cost-sharing contribution and the extent to which these instruments are subject to the provisions of the Bayh-Dole Act and under what conditions. In commenting on a draft of this report, DOD generally concurred with the thrust of our findings and recommendation. DOD noted that it shared our assessment that the instruments, if used appropriately, could be valuable tools that help DOD take advantage of technology development in the commercial sector. DOD's comments are presented in their entirety in appendix III. DOD officials also provided technical and editorial comments on a draft of this report. We have incorporated their comments where appropriate. We are sending copies of this report to other congressional committees; the Secretaries of Defense and Commerce; the Administrator, National Aeronautics and Space Administration; and the Director, Office of Management and Budget. Copies will be provided to other interested parties upon request. Please contact me at (202) 512-4587 if you or your staff have any questions concerning this report. Major contributors to this report are listed in appendix IV. The Department of Defense (DOD) entered into 72 agreements using the authority of 10 U.S.C. 2371 between fiscal years 1990 and 1994. Of these agreements, 59, or about 82 percent, were with consortia, which were comprised of some 400 participants. Based on information provided by DOD officials and participants, we estimate that about two-thirds of consortia participants were for-profit commercial firms. Of the 13 agreements with single participants, 12 agreements were awarded to for-profit firms. Overall, we estimate that about 42 percent of the 275 commercial firms that participated in one or more agreements were firms that traditionally had not performed research for DOD.
Table I.1 shows selected characteristics of participants of cooperative agreements and other transactions between fiscal years 1990 and 1994. To determine the number of cooperative agreements and other transactions DOD entered into using the authority of 10 U.S.C. 2371, we reviewed the annual reports and notifications DOD submitted to Congress from fiscal years 1990 to 1993. As the fiscal year 1994 report was not available during our review, we requested information from DARPA and the services regarding their fiscal year 1994 usage. We included in our review only those other transactions that were used principally in an assistance-type relationship with commercial firms or consortia for government-sponsored research projects. Consequently, we excluded one agreement that was entered into under the authority provided by section 845 of the National Defense Authorization Act for Fiscal Year 1994 (P.L. 103-160, Nov. 30, 1993). This authority is distinct from agreements entered into under 10 U.S.C. 2371 as it enables DARPA to conduct prototype projects that are directly relevant to weapons or weapon systems proposed to be acquired or developed by DOD. Further, we did not attempt to identify to what extent DOD had used the authority of 10 U.S.C. 2371 to enter into other assistance-type relationships, such as in cases where DOD loaned equipment to firms to conduct research or in reimbursable arrangements that allow a firm to conduct experiments aboard a government experimental launch vehicle. To characterize the agreements and analyze each participant's financial or technical contributions to the agreement, we reviewed the agreement file, which generally included the agreement, articles of collaboration, the contracting officer's agreement analyses, legal review, funding documentation, and other pertinent information. 
We summarized key elements of the agreement, including the recipient's planned cost-sharing information, and requested that DOD verify our interpretation or provide additional information. We did not attempt to independently verify the financial information we obtained. Further, we did not attempt to determine the extent to which participants were using DOD funds to conduct projects that would have been undertaken in the absence of DOD funding. To obtain views on the benefits and risks of using such instruments, we interviewed program management and contracting officials from DARPA, the Navy, and the Air Force, as well as representatives from various participants. We also interviewed senior management individuals from each of the services and DARPA, and from the following organizations: Office of the Director, Defense Research and Engineering; Office of the Director, Defense Procurement; Office of the Assistant Secretary of Defense (Economic Security); and Office of the Deputy Under Secretary of Defense (Acquisition Reform). Some DOD officials cautioned against making broad comparisons between the terms and conditions found in contracts and those found in cooperative agreements and other transactions since the principal purpose of the instruments--acquisition and stimulation, respectively--differs significantly. However, as acknowledged by DOD officials, DOD's relationship with commercial firms has generally been through procurement contracts. Consequently, comparing the instruments can be illustrative of the types of changes and issues that may arise as business practices evolve. We conducted our work from May 1994 to December 1995 in accordance with generally accepted government auditing standards. Rae Ann Sapp, James R. Wilson, and Shari A. Kolnicki were major contributors to this report.
GAO evaluated the Department of Defense's (DOD) use of cooperative agreements and other transactions to further its objectives of: (1) helping to reduce the barriers to integrating the defense and civilian sectors of the industrial base; (2) promoting new relationships and practices within the defense industry; and (3) allowing the government to leverage for defense purposes the private sector's financial investment in research and development of commercial products and processes. GAO also discussed two emerging issues concerning the selection and structure of the instruments.
GAO found that: (1) cooperative agreements and other transactions appear to have contributed to reducing some of the barriers between the defense and civilian industrial bases by attracting firms that traditionally did not perform research for DOD; (2) the instruments have enabled the use of more flexible financial management and intellectual property provisions than those typically found in contracts and grants; (3) the instruments appear to be fostering new relationships and practices within the defense industry, especially for projects being undertaken by consortia; (4) DOD has partially offset its own costs by sharing project costs with recipients, but the DOD practice of accepting the value of recipients' prior research efforts in lieu of concurrent financial or in-kind contributions may increase the actual DOD monetary share of the project's costs; (5) differences between DARPA and the military services regarding the selection of instruments and treatment of specific provisions have led to some confusion among firms that were negotiating agreements with different DOD components; and (6) DOD is revising its interim regulations to provide clearer guidance on the instruments' selection, use, and structure. | 5,532 | 322 |
PTSD can develop following exposure to life-threatening events, natural disasters, terrorist incidents, serious accidents, or violent personal assaults like rape. PTSD is the most prevalent mental disorder arising from combat. People who experience stressful events often relive the experience through nightmares and flashbacks, have difficulty sleeping, and feel detached or estranged. These symptoms may occur within the first 4 days after exposure to the stressful event or be delayed for months or years. Symptoms that appear within the first 4 days after exposure to a stressful event are generally diagnosed as acute stress reaction or combat stress. If the symptoms of acute stress reaction or combat stress continue for more than 1 month, PTSD is diagnosed. PTSD services are provided in VA medical facilities and VA community settings. VA medical facilities offer PTSD services as well as other services, which range from complex specialty care, such as cardiac or spinal cord injury, to primary care. VA's community settings include more than 800 community-based outpatient clinics and 206 Vet Centers. Community-based outpatient clinics are an extension of VA's medical facilities and mainly provide primary care services. Vet Centers offer PTSD and family counseling, employment services, and a range of social services to assist veterans in readjusting from wartime military service to civilian life. Vet Centers also function as community points of access for many returning veterans, providing them with information and referrals to VA medical facilities. Vet Centers were established as entities separate from VA medical facilities to serve Vietnam veterans, who were reluctant to access health care provided in a federal building. As a result, Vet Centers are not located on the campuses of VA medical facilities. VA has specialized PTSD programs that are staffed by clinicians who have concentrated their clinical work in the area of PTSD treatment.
VA specialized PTSD programs are located in 97 VA medical facilities and provide services on an inpatient and outpatient basis. VA PTSD services include individual counseling, support groups, and drug therapy and can be provided in non-specialized clinics, such as general mental health clinics. Veterans who served in any conflict after November 11, 1998, are eligible for VA health care services for any illness, including PTSD services, for 2 years from the date of separation from military service, even if the condition is not determined to be attributable to military service. This 2-year eligibility includes those Reserve and National Guard members who have left active duty and returned to their units. After 2 years, these veterans will be subject to the same eligibility rules as other veterans, who generally have to prove that a medical problem is connected to their military service or have relatively low incomes. In July 2004, VA reported that so far 32,684, or 15 percent, of veterans who have returned from service in Iraq or Afghanistan, including Reserve and National Guard members, have accessed VA for various health care needs. DOD and VA have formed a Seamless Transition Task Force with the goal of meeting the needs of servicemembers returning from Iraq and Afghanistan who will eventually become veterans and may seek health care from VA. To achieve this goal, DOD and VA plan to improve the sharing of information, including individual health information, between the two departments in order to enhance VA's outreach efforts to identify and serve returning servicemembers, including Reserve and National Guard members, in need of VA health care services. Since April 2003, VA requires that every returning servicemember from the Iraq and Afghanistan conflicts who needs health care services receive priority consideration for VA health care appointments.
DOD uses two approaches to identify servicemembers who may be at risk of developing PTSD: the combat stress control program and the post- deployment health assessment questionnaire. DOD's combat stress control program identifies servicemembers at risk for PTSD by training all servicemembers to identify the early onset of combat stress, which if left untreated, could lead to PTSD. DOD uses the post-deployment health assessment questionnaire to screen servicemembers for physical ailments and mental health issues commonly associated with deployments, including PTSD. The questionnaire contains four screening questions that were developed jointly by DOD and VA mental health experts to identify servicemembers at risk for PTSD. DOD's combat stress control program identifies servicemembers at risk for PTSD by training all servicemembers to identify the early onset of combat stress symptoms, which if left untreated, could lead to PTSD. The program is based on the principle of promptly identifying servicemembers with symptoms of combat stress in a combat theater, with the goal of treating and returning them to duty. This principle is consistent with the views of PTSD experts, who believe that early identification and treatment of combat stress symptoms may reduce the risk of PTSD. To assist servicemembers in the combat theater, teams of DOD mental health professionals travel to units to reinforce the servicemembers' knowledge of combat stress symptoms and to help identify those who may be at risk for combat stress or PTSD. The teams may include psychiatrists, psychologists, social workers, nurses, mental health technicians, and chaplains. DOD requires that the effectiveness of the combat stress control program be monitored on an annual basis. DOD generally uses the post-deployment health assessment questionnaire, DD 2796, to identify servicemembers at risk for PTSD following deployment outside of the United States. (See app. II for a copy of the DD 2796.) 
DOD requires certain servicemembers deployed to locations outside of the United States to complete a DD 2796 within 30 days before leaving a deployment location or within 5 days after returning to the United States. This applies to all servicemembers returning from a combat theater, including Reserve and National Guard members. The DD 2796 is a questionnaire used to determine the presence of any physical ailments and mental health issues commonly associated with deployments, any special medications taken during deployment, and possible environmental or occupational exposures. The DD 2796 includes the following four screening questions that VA and DOD mental health experts developed to identify servicemembers at risk for PTSD: Have you ever had any experience that was so frightening, horrible, or upsetting that, in the past month, you (1) have had any nightmares about it or thought about it when you did not want to; (2) tried hard not to think about it or went out of your way to avoid situations that remind you of it; (3) were constantly on guard, watchful, or easily startled; or (4) felt numb or detached from others, activities, or your surroundings? Once completed, the DD 2796 must be initially reviewed by a DOD health care provider, which could range from a physician to a medic or corpsman. Figure 1 illustrates DOD's process for completion and review of the DD 2796. The form is then reviewed, completed, and signed by a health care provider, who can be a physician, physician assistant, nurse practitioner, or an independent duty medical technician or corpsman. This health care provider reviews the completed DD 2796 to identify any "yes" responses to the screening questions--including questions related to PTSD--that may indicate a need for further medical evaluation.
The review is to take place in a face-to-face interview with the servicemember and be conducted either on an individual basis, as we observed at the Army's Fort Lewis in Washington, or in a group setting, as we found at the Marine Corps' Camp Lejeune in North Carolina. If a servicemember answers "yes" to a PTSD question, the health care provider is instructed to gather additional information from the servicemember and use clinical judgment to determine if the servicemember should be referred for further medical evaluation to a physician, physician assistant, nurse, or an independent duty medical technician. To document completion of the DD 2796, DOD requires that the questionnaire be placed in the servicemember's permanent medical record and a copy sent to the Army Medical Surveillance Activity, which maintains a database of all servicemembers' completed health assessment questionnaires. The National Defense Authorization Act for Fiscal Year 1998 required DOD to establish a quality assurance program to ensure, among other things, that post-deployment mental health assessments are completed for servicemembers who are deployed outside of the United States. Completion of the DD 2796 is tracked as part of this quality assurance program. DOD delegated responsibility for developing procedures for the required quality assurance program to each of its uniformed services. The uniformed services have given unit commanders the responsibility to ensure completion of the DD 2796 by all servicemembers under their command. To ensure the DD 2796 is completed, one DOD official we interviewed told us that servicemembers would not be granted leave to go home until the DD 2796 was completed. Another official told us that Reserve and National Guard members would not be given their active duty discharge paperwork until the DD 2796 was completed. VA does not have all the information it needs to determine whether it can meet an increase in demand for VA PTSD services.
VA does not have a count of the total number of veterans currently receiving PTSD services at its medical facilities and Vet Centers. Without this information, VA cannot estimate the number of veterans its medical facilities and Vet Centers could treat for PTSD. VA could use demographic information it receives from DOD to broadly estimate the number of servicemembers who may access VA health care, including PTSD services. By assuming that 15 percent or more of returning servicemembers will develop PTSD, VA could use the demographic information to broadly estimate demand for PTSD services. However, predicting which veterans will seek VA care and at which facilities is inherently uncertain, particularly given that the symptoms of PTSD may not appear for years. VA does not have a count of the total number of veterans currently receiving PTSD services at its medical facilities and Vet Centers. Without this information, VA cannot estimate the number of additional veterans its facilities could treat for PTSD. On August 27, 2004, a Northeast Program Evaluation Center (NEPEC) official told us that a count of the total number of veterans with a diagnosis of PTSD who receive VA services at medical facilities could be obtained from VA's existing database. However, this database does not include Vet Centers' information because this information is kept separate from the medical facilities' data. VA publishes two reports that contain information on some of the veterans receiving PTSD services at its medical facilities. Neither report includes all veterans receiving PTSD services at VA medical facilities and Vet Centers. VA's annual capacity report, which is required by law, provides data on VA's most vulnerable populations, such as veterans with spinal cord injuries, blind veterans, and seriously mentally ill veterans with PTSD. The NEPEC annual report mainly provides data on veterans with a primary diagnosis of PTSD. 
VA has not developed a methodology that would allow it to count the number of veterans receiving PTSD services at its medical facilities and Vet Centers. The PTSD data used in VA's annual capacity report and the data used in NEPEC's annual report are drawn from different--though not mutually exclusive--subgroups of veterans receiving PTSD services at VA's medical facilities. VA developed criteria that allow it to determine which veterans should be included in each subgroup. VA's criteria, which differ in each report, are based on the type and frequency of mental health services provided to veterans with PTSD at its medical facilities. (See Figure 2 for the veterans included in each of VA's annual reports.) Veterans who are receiving VA PTSD services may be counted in both reports, only counted in the NEPEC report, or not included in either report. For example, a veteran who is seriously mentally ill and has a primary diagnosis of PTSD is counted in both reports. On the other hand, a veteran who has a primary diagnosis of PTSD but is not defined as seriously mentally ill is counted in the NEPEC report but not in the capacity report. Finally, a veteran who is receiving PTSD services only at a Vet Center is not counted in either report. Furthermore, both the VA OIG and VA's Committee on Care of Veterans with Serious Mental Illness have found inaccuracies in the data used in VA's annual capacity report. For example, OIG found inconsistencies in the PTSD program data reported by some VA medical facilities. OIG found that some medical facilities reported having active PTSD programs, although the facilities reported having no staff assigned to these programs.
Additionally, the Committee on Care of Veterans with Serious Mental Illness, commenting on VA's fiscal year 2002 capacity report, stated that the data VA continues to use for reporting information on specialized programs are inaccurate and recommended changes in future reporting. (The Committee on Care of Severely Chronically Mentally Ill Veterans, established within VA under 38 U.S.C. § 7321 to assess VA's capability to meet the rehabilitation and treatment needs of such veterans, is generally referred to as the Committee on Care of Veterans with Serious Mental Illness. See Department of Veterans Affairs, Capacity Report Fiscal Year 2002 (Washington, D.C.: May 2003).) VA agreed with OIG that the data were inaccurate and is continuing to make changes to improve the accuracy of the data in its annual capacity report. VA's fiscal year 2003 capacity report to Congress is currently undergoing review by OIG, which informed us that VA has not incorporated all of the changes necessary for OIG to certify that the report is accurate. OIG further stated that it will continue to oversee this process. The demographic information VA receives from DOD could also help it predict the facilities or Vet Centers that could experience an increase in demand for care. By assuming that 15 percent or more of returning servicemembers will eventually develop PTSD, based on the predictions of mental health experts, VA could use the demographic information to broadly estimate the number of returning servicemembers who may need VA PTSD services and the VA facilities located closest to servicemembers' homes. However, predicting which veterans will seek VA care and at which facilities is inherently uncertain, particularly given that the symptoms of PTSD may not appear for years. VA headquarters received demographic information from DOD in September 2003; however, during our review we found that VA had not shared this information with its facilities.
On July 21, 2004, VA provided this information to its medical facilities for planning future services for veterans returning from the Iraq and Afghanistan conflicts. However, VA did not provide the demographic information to Vet Centers. Officials at seven VA medical facilities told us that while the demographic information VA receives from DOD has limitations, it is the best national data currently available and would help them plan for new veterans seeking VA PTSD services. Officials at six of the seven VA medical facilities we visited explained that while they are now able to keep up with the current number of veterans seeking PTSD services, they may not be able to meet an increase in demand for these services. In addition, some of the officials expressed concern about their ability to meet an increase in demand for VA PTSD services from servicemembers returning from Iraq and Afghanistan based on DOD's demographic information. Officials are concerned because facilities have been directed by VA to give veterans of the Iraq and Afghanistan conflicts priority appointments for health care services, including PTSD services. As a result, VA medical facility officials estimate that follow-up appointments for veterans currently receiving care for PTSD may be delayed. VA officials estimate the delay may be up to 90 days. Veterans of the Iraq and Afghanistan conflicts will not be given priority appointments over veterans who have a service-connected disability and are currently receiving services. While the VA OIG continues to oversee VA's efforts to improve the accuracy of data in the capacity reports, VA does not have a report that counts all veterans receiving VA PTSD services. Although VA can use DOD's demographic information to broadly estimate demand for VA PTSD services, VA does not know the number of veterans it now treats for PTSD at its medical facilities and Vet Centers.
As a result, VA will be unable to estimate its capacity for treating additional veterans who choose to seek VA's PTSD services, and therefore, unable to plan for an increase in demand for these services. To help VA estimate the number of additional veterans it could treat for PTSD and to plan for the future demand for VA PTSD services from additional veterans seeking these services, we recommend that the Secretary of Veterans Affairs direct the Under Secretary for Health to determine the total number of veterans receiving VA PTSD services and provide facility-specific information to VA medical facilities and Vet Centers. In commenting on a draft of this report, VA concurred with our recommendation and acknowledged that more coordinated efforts are needed to improve its existing PTSD data. VA stated that it plans to aggregate, at the national level, the number of veterans receiving PTSD services at VA medical facilities and Vet Centers. We believe VA should provide these data to both its medical facilities and Vet Centers so they have the information needed to plan for future demand for PTSD services. In addition, VA provided two points of clarification. First, VA stated that it is in the process of developing a mental health strategic plan that will project demand by major diagnoses and identify where projected demand may exceed resource availability. VA stated that future revisions to the mental health strategic plan would include Vet Center data. Second, VA stated that it would seek additional information from DOD on servicemembers who have served in Iraq and Afghanistan to improve its provision of health care services to these new veterans. VA's written comments are reprinted in appendix III. DOD concurred with the findings and conclusions in this report and provided technical comments, which we incorporated as appropriate. DOD's written comments are reprinted in appendix IV. 
As agreed with your office, unless you publicly announce its contents earlier, we plan no further distribution of this report until 30 days after its date. We will then send copies of this report to the Secretary of Veterans Affairs and other interested parties. We also will make copies available to others upon request. In addition, the report will be available at no charge at the GAO Web site at http://www.gao.gov. If you or your staff have any questions about this report, please call me at (202) 512-7101. Another contact and key contributors are listed in appendix V. To determine the approaches DOD uses to identify servicemembers who are at risk for PTSD, we reviewed directives on screening servicemembers deployed to locations outside of the United States, interviewed DOD officials, and visited a military installation for each of DOD's uniformed services. At each of the military installations, we discussed with officials the steps taken by each of the uniformed services to implement DOD's approaches, particularly the steps involved in completing the post-deployment health assessment questionnaire, DD 2796, as it relates to PTSD. How well the uniformed services implemented DOD's approaches was reported in other GAO reports. The uniformed services included in our review were the Army, Marines, Air Force, and Navy. We did not include the Coast Guard in this review because few Coast Guard servicemembers are involved in the Iraq and Afghanistan conflicts. The military installations visited were: Fort Lewis Army Base and Madigan Army Medical Center in Washington, Seymour Johnson Air Force Base in North Carolina, Camp Lejeune Marine Base and the Naval Hospital Camp Lejeune in North Carolina, and the Naval Medical Center San Diego in California. We also asked DOD officials whether they provide information to VA that could help VA plan how to meet the demand for VA PTSD services from servicemembers returning from the Iraq and Afghanistan conflicts.
To determine whether VA has the information it needs to determine whether it can meet an increase in demand for PTSD services, we interviewed PTSD experts from the National Center for PTSD established within VA and members of the Under Secretary for Health's Special Committee on PTSD. We also visited three divisions of the National Center for PTSD: the Executive Division in White River Junction, Vermont; the Education Division in Palo Alto, California; and NEPEC in West Haven, Connecticut, to review the Center's reports on specialized PTSD programs. We also reviewed VA's fiscal year 2001 and 2002 annual reports on VA's capacity to provide services to special populations, including veterans with PTSD, and NEPEC's annual reports on specialized PTSD programs to determine the criteria VA uses to count the number of veterans receiving VA PTSD services. We reviewed the findings of VA's Committee on Care of Veterans with Serious Mental Illness and the VA OIG, which have reported on the accuracy of VA's annual capacity report to Congress on the number of veterans receiving specialized services, including PTSD services. We interviewed officials from each of these groups to clarify their findings. We did not include data from the annual capacity reports because the OIG reported that the data were not sufficiently reliable. We also interviewed the director of NEPEC to discuss the information included in NEPEC's annual reports. To determine whether VA facilities have the information needed to determine whether they can meet an increase in demand for PTSD services, we interviewed officials at seven VA medical facilities and 15 Vet Centers located near the medical facilities to discuss the number of veterans currently receiving VA PTSD services and the impact that an increase in demand would have on these services. We also discussed DOD's demographic information with four of the seven medical facilities we visited.
We contacted VA medical facilities located in Palo Alto and San Diego in California; Durham and Fayetteville in North Carolina; White River Junction, Vermont; West Haven, Connecticut; and Seattle, Washington. We also contacted Vet Centers located in Vista, San Diego, and San Jose in California; Raleigh, Charlotte, Greenville, Greensboro, and Fayetteville in North Carolina; South Burlington and White River Junction in Vermont; Hartford, Norwich, and New Haven in Connecticut; and Seattle and Tacoma in Washington. Our work was conducted from May through September 2004 in accordance with generally accepted government auditing standards.

[Appendix II reproduces DD Form 2796, Post-Deployment Health Assessment (April 2003). The form collects deployment dates and locations, vaccinations and medications taken during deployment, current physical and mental health symptoms, possible environmental or occupational exposures, and the four PTSD screening questions quoted earlier in this report.]

In addition to the contact named above, Mary Ann Curran, Linda Diggs, Martha Fisher, Krister Friday, and Marion Slachta made key contributions to this report.
Post-traumatic stress disorder (PTSD) is caused by an extremely stressful event and can develop after the threat of death or serious injury as in military combat. Experts predict that about 15 percent of servicemembers serving in Iraq and Afghanistan will develop PTSD. Efforts by VA to inform new veterans, including Reserve and National Guard members, about the expanded availability of VA health care services could result in an increased demand for VA PTSD services.
GAO identified the approaches DOD uses to identify servicemembers at risk for PTSD and examined if VA has the information it needs to determine whether it can meet an increase in demand for PTSD services. GAO visited military bases and VA facilities, reviewed relevant documents, and interviewed DOD and VA officials to determine how DOD identifies servicemembers at risk for PTSD, and what information VA has to estimate demand for VA PTSD services. DOD uses two approaches to identify servicemembers at risk for PTSD: the combat stress control program and the post-deployment health assessment questionnaire. The combat stress control program trains servicemembers to recognize the early onset of combat stress, which can lead to PTSD. Symptoms of combat stress and PTSD include insomnia, nightmares, and difficulties coping with relationships. To assist servicemembers in the combat theater, teams of DOD mental health professionals travel to units to reinforce the servicemembers' knowledge of combat stress symptoms and to help identify those who may be at risk for combat stress and PTSD. DOD also uses the post-deployment health assessment questionnaire to identify physical ailments and mental health issues commonly associated with deployments, including PTSD. The questionnaire includes the following four screening questions that VA and DOD mental health experts developed to identify servicemembers at risk for PTSD: Have you ever had any experience that was so frightening, horrible, or upsetting that, in the past month, you (1) have had any nightmares about it or thought about it when you did not want to; (2) tried hard not to think about it or went out of your way to avoid situations that remind you of it; (3) were constantly on guard, watchful, or easily startled; and/or (4) felt numb or detached from others, activities, or your surroundings? VA lacks the information it needs to determine whether it can meet an increase in demand for VA PTSD services. 
VA does not have a count of the total number of veterans currently receiving PTSD services at its medical facilities and Vet Centers--community-based VA facilities that offer trauma and readjustment counseling. Without this information, VA cannot estimate the number of new veterans its medical facilities and Vet Centers could treat for PTSD. VA has two reports on the number of veterans it currently treats, with each report counting different subsets of veterans receiving PTSD services. Veterans who are receiving VA PTSD services may be counted in both reports, one of the reports, or not included in either report. VA does receive demographic information from DOD, which includes home addresses of servicemembers that could help VA predict which medical facilities or Vet Centers servicemembers may access for health care. By assuming that 15 percent or more of servicemembers who have left active duty status will develop PTSD, VA could use the home zip codes of servicemembers to broadly estimate the number of servicemembers who may need VA PTSD services and identify the VA facilities located closest to their homes. However, predicting which veterans will seek VA care and at which facilities is inherently uncertain, particularly given that the symptoms of PTSD may not appear for years.
Immunizations are widely considered one of the leading public health achievements of the 20th century. Mandatory immunization programs have eradicated polio and smallpox in the United States and reduced the number of deaths from several childhood diseases, such as measles, to near zero. A consistent supply of many different vaccines is needed to support this effort. CDC currently recommends routine immunizations against 11 childhood diseases: diphtheria, tetanus, pertussis (whooping cough), Haemophilus influenzae type b (most commonly meningitis), hepatitis B, measles, mumps, rubella (German measles), invasive pneumococcal disease, polio, and varicella (chicken pox). By combining antigens (the component of a vaccine that triggers an immune response), a single injection of a combination vaccine can protect against multiple diseases. The federal government, primarily through agencies of the Department of Health and Human Services (HHS), has a role both as a purchaser of vaccines and as a regulator of the industry. The federal government is the largest purchaser of vaccines in the country. CDC negotiates large purchase contracts with manufacturers and makes the vaccines available to public immunization programs under the Vaccines for Children (VFC) program. Under VFC, vaccines are provided for certain children, including those who are eligible for Medicaid or uninsured. Participating public and private health care providers obtain vaccines through VFC at no charge. A second program, established under section 317 of the Public Health Service Act, provides project grants for preventive health services, including immunizations. Currently, CDC supports 64 state, local, and territorial immunization programs (for simplicity, we refer to them as state immunization programs). In total, about 50 percent of all the childhood vaccines administered in the United States each year are obtained by public immunization programs through CDC contracts.
The federal government is also responsible for ensuring the safety of the nation's vaccine supply. FDA regulates the production of vaccines. It licenses all vaccines sold in the United States, requiring clinical trials to demonstrate that vaccines are safe and effective, and reviews the manufacturing process to ensure that vaccines are made consistently in compliance with current good manufacturing practices. Once vaccines are licensed, FDA also conducts periodic inspections of production facilities to ensure that manufacturers maintain compliance with FDA manufacturing requirements. States also have an important role in immunization efforts. Policies for immunization requirements, including minimum school and day care entry requirements, are made almost exclusively at the state level, although cities occasionally impose additional requirements. Each state has also established an immunization infrastructure to monitor infectious disease outbreaks, administer federal immunization grants, manage centralized supplies of vaccine, and otherwise promote immunization policies. Recent vaccine shortages have necessitated temporary modifications to the recommended immunization schedule and have caused states to scale back immunization requirements. In our survey of 64 state immunization programs, administered through the Association of State and Territorial Health Officials (ASTHO), all 52 responding programs indicated that they had experienced shortages of two or more vaccines and had taken some form of action to deal with the shortages. Vaccine shortages experienced at the state level have, in turn, prompted cutbacks in immunization requirements for admission to day care or school. Thirty-five states reported putting into effect new, less stringent immunization requirements that allow children who have received fewer than the recommended number of vaccinations to attend school.
In general, these states have reduced the immunization requirements for day care and/or school entry or have temporarily suspended enforcement of those requirements until vaccine supplies are replenished. For example, the Minnesota Department of Health suspended the school and postsecondary immunization laws for Td vaccine for the second year in a row, with the suspension extending through the 2002-2003 school year. Other states, including South Carolina and Washington, reported allowing children to attend day care or school even if they were not in compliance with immunization requirements, under the condition that they be recalled for vaccinations when supplies became available. While it is too early to measure the effect of deferred vaccinations on immunization rates, a number of states reported that vaccine shortages and missed make-up vaccinations may take a toll on coverage and, therefore, increase the potential for infectious disease outbreaks. The full impact of vaccine shortages is difficult to measure for several reasons. For example, none of the national immunization coverage surveys measures vaccination coverage of children under the age of 18 months--the age cohort receiving the majority of vaccinations. While immunization experts generally agree that the residual effects of historically high immunization rates afford temporary protection for underimmunized children, missed immunizations could make susceptible children vulnerable to disease outbreaks. For example, a CDC analysis of a 1998 outbreak of measles in an Anchorage, Alaska, school showed that only 51 percent of the 2,186 children exposed had received the requisite two doses of measles vaccine. No single reason explains the rash of recent vaccine shortages; rather, multiple factors coincided that affected both the supply of and demand for vaccines. We identified four key factors, as follows. Production Problems - Manufacturing production problems contributed to the shortage of certain vaccines. 
In some cases, production slowdowns or interruptions occurred when planned maintenance activities took longer than expected; in other cases, production was affected as manufacturers addressed problems identified in FDA inspections. Changes over the last several years in FDA inspection practices may have resulted in the identification of more or different instances of manufacturers' noncompliance with FDA manufacturing requirements. For example, prior to these changes, biologics inspections tended to focus primarily on scientific or technical issues and less on compliance with good manufacturing practices and documentation issues. FDA did take some steps to inform manufacturers about its inspection program changes; however, some manufacturers reported problems related to how well the changes were communicated. FDA issued a compliance program guidance manual, intended for FDA staff, detailing the new protocol for conducting inspections. The information in it could have given manufacturers a better understanding of the scope of the inspections, but the manual was not made widely available--it could be obtained only upon request. Removal of Thimerosal - Calls for the removal of the preservative thimerosal from childhood vaccines illustrate the effect that policy changes can have on the supply of vaccine. As a precautionary measure, in July 1999, the American Academy of Pediatrics (AAP) and the U.S. Public Health Service (PHS) issued a joint statement advising that thimerosal in vaccines be eliminated or reduced as soon as possible. While thimerosal was present in several vaccines, removing it from some vaccines was more complex than from others. For example, one manufacturer of the diphtheria-tetanus-acellular pertussis vaccine (DTaP) had to switch its packaging from multidose to single-dose vials due to the removal of the preservative. This process reduced the manufacturer's output of vaccine by 25 percent, according to the manufacturer.
Manufacturer's Decision to Discontinue Production - Another major factor in the shortage of DTaP, and also Td, was the decision of one manufacturer to discontinue production of all products containing tetanus toxoid. With little advance warning, the company announced in January 2001 that it had ceased production of these vaccines. According to the manufacturer, prior to its decision, it produced approximately one-quarter of all Td and 25 to 30 percent of all DTaP distributed in the United States, so the company's departure from these markets was significant. In the previous year, another manufacturer that supplied a relatively small portion of DTaP also had stopped producing this vaccine. Together these decisions decreased the number of major manufacturers of DTaP from four to two and of Td from two to one. Unanticipated Demand - The addition of new vaccines to the recommended immunization schedule can also result in shortages if the demand for vaccine outstrips the predicted need and production levels. This was the case with a newly licensed vaccine, pneumococcal conjugate vaccine (PCV), which protects against invasive pneumococcal diseases in young children. PCV was licensed by FDA in February 2000 and formally added to the recommended schedule in January 2001. Company officials said an extensive education campaign prior to its availability resulted in record-breaking initial demand for the vaccine. CDC reported shortages of PCV existed through most of 2001, and the manufacturer was only able to provide about half the needed doses during the first 5 months of 2002. Ongoing manufacturing problems limit production, exacerbating the shortage. While the recent shortages have been largely resolved, the vaccine supply remains vulnerable to any number of disruptions that could occur in the future--including those that contributed to recent shortages and other potential problems, such as a catastrophic plant fire. 
One key reason is that the nature of vaccine manufacturing prevents the quick production of more vaccine when disruptions occur. Manufacturing a vaccine is a complex, highly controlled process, involving living biological organisms, that can take several months to over a year. Another underlying problem is the limited number of manufacturers--five of the eight recommended childhood vaccines have only one major manufacturer each. Consequently, if there are interruptions in supply or if a manufacturer ceases production, there may be few or no alternative sources of vaccine. One situation that may help add to the supply of existing vaccines is the development of new vaccines. A recent example is a new formulation of DTaP that recently received FDA approval and has helped ease the shortage of DTaP. We identified 11 vaccines in development that could help meet the current recommended immunization schedule. These vaccines, some of which are already licensed for use in other countries, are in various stages of development, but all must undergo a rather lengthy process of clinical testing and FDA review. While FDA has mechanisms available to shorten the review process, they are not used for most vaccines under development. FDA policies generally restrict the use of its expedited review processes to vaccines that offer protection against diseases for which there are no existing vaccines. Because childhood vaccines under development often involve new forms or combinations of existing vaccines, they typically do not qualify for expedited FDA review. Federal efforts to strengthen the nation's vaccine supply have taken on greater urgency with the recent incidents of shortages. As part of its mandate to study and recommend ways to encourage the availability of safe and effective vaccines, the National Vaccine Advisory Committee formed a work group to explore the issues surrounding vaccine shortages and identify strategies for further consideration by HHS. 
In its preliminary report, the work group identified several strategies that hold promise, such as streamlining the regulatory process, providing financial incentives for vaccine development, and strengthening manufacturers' liability protection, but it concluded that these strategies needed further study. The work group did express support for expanding CDC vaccine stockpiles. In response to the work group's finding that streamlining the regulatory process needed further study, FDA recently announced that it is examining regulations governing manufacturing processes for both drugs and vaccine products to determine if reform is needed. However, FDA officials told us it is too early to define the scope and time frame for this reexamination. Regarding financial incentives for vaccine development, the Institute of Medicine is currently conducting a study of vaccine pricing and financing strategies that may address this issue. In regard to liability protections, the work group did make recommendations to strengthen the Vaccine Injury Compensation Program (VICP). VICP is a federal program authorized in 1986 to reduce vaccine manufacturers' liability by compensating individuals for childhood-vaccine-related injuries from a VICP trust fund. The program was established, in part, to help stem the exodus of manufacturers from the vaccine business due to liability concerns. Manufacturers, however, reported a recent resurgence of childhood-vaccine-related lawsuits--including class action lawsuits related to past use of thimerosal--that plaintiffs allege are not subject to VICP. While the work group acknowledged that recent vaccine shortages do not appear to be related to VICP liability issues, it indicated that strengthening VICP would encourage manufacturers to enter, or remain in, the vaccine production business. Legislation has been introduced for the purpose of clarifying and modifying VICP.
Also consistent with the work group's recommendations, CDC is considering whether additional vaccine stockpiles will help stabilize the nation's vaccine supply. In 1993, with the establishment of the VFC program, CDC was required to purchase sufficient quantities of pediatric vaccines not only to meet normal usage, but also to provide an additional 6-month supply to meet unanticipated needs. Further, to ensure funding, CDC was authorized to make such purchases in advance of appropriations. Despite this requirement, to date, CDC has established partial stockpiles for only two--measles-mumps-rubella (MMR) and inactivated polio vaccine (IPV)--of the eight recommended childhood vaccines. Even if CDC decides to stockpile additional vaccines, the limited supply and manufacturing capacity will restrict CDC's ability to build certain stockpiles in the near term. CDC estimates it could take 4 to 5 years to build stockpiles for all the currently recommended childhood vaccines--at a cost of $705 million. Past experience also demonstrates the difficulty of rapidly building stockpiles. Neither the current IPV nor MMR stockpiles have ever achieved target levels because of limited manufacturing capacity. In addition to these challenges, CDC will also need to address issues regarding its authority, strategy, and information needed to use stockpiled vaccines. Authority - It is uncertain whether stockpiled vaccines purchased with VFC funds can be used for non-VFC-eligible children. While the 1993 legislation required the Secretary of HHS to negotiate for a 6-month stockpile of vaccines to meet unanticipated needs, the legislation did not state that the supply of stockpiled vaccines may be made available for children not otherwise eligible through the VFC program. CDC officials said that the VFC legislation is unclear as to whether stockpiled vaccines can be used for all children. 
Strategy - Expanding the number of CDC vaccine stockpiles will require a substantial planning effort--an effort that is not yet complete. For example, CDC has not made key decisions about vaccine stockpiles to ensure their ready release, including the quantity of each vaccine to stockpile, the form of storage, and storage locations. Also, to ensure that use of a stockpile does not disrupt supply to other purchasers, procedures would need to be developed to ensure that stockpiles represent quantities in addition to a manufacturer's normal inventory levels.

Vaccine shortages began to appear in November 2000, when supplies of the tetanus and diphtheria booster fell short. By October 2001, the Centers for Disease Control and Prevention (CDC) reported shortages of five vaccines that protect against eight childhood diseases. In addition to diphtheria and tetanus vaccines, vaccines to protect against pertussis, invasive pneumococcal disease, measles, mumps, rubella, and varicella were in short supply. In July 2002, updated CDC data indicated supplies were returning to normal for most vaccines. However, the shortage of vaccine to protect against invasive pneumococcal disease was expected to continue through at least late 2002. Shortages have prompted federal authorities to recommend deferring some vaccinations and have caused most states to reduce or suspend immunization requirements for school and day care programs so that children who have not received all mandatory immunizations can enroll. States are concerned that failure to be vaccinated at a later date may reduce the share of the population protected and increase the potential for disease to spread; however, data are not currently available to measure these effects. Many factors, including production problems and unanticipated demand for new vaccines, contributed to recent shortages. Although problems leading to the shortages have largely been resolved, the potential exists for shortages to recur.
Federal agencies and advisory committees are exploring ways to help stabilize the nation's vaccine supply, but few long-term solutions have emerged. Although CDC is considering expanding vaccine stockpiles to provide a cushion in the event of a supply disruption, limited supply and manufacturing capacity will restrict CDC's ability to build them. | 3,303 | 377 |
From its origins as a research project sponsored by the U.S. government, the Internet has grown increasingly important to American businesses and consumers, serving as the host for hundreds of billions of dollars of commerce each year. It is also a critical resource supporting vital services, such as power distribution, health care, law enforcement, and national defense. Similar growth has taken place in other parts of the world. The Internet relies upon a set of functions, called the domain name system, to ensure the uniqueness of each e-mail and Web site address. The rules that govern the domain name system determine which top-level domains (the string of text following the right-most period, such as .gov) are recognized by most computers connected to the Internet. The heart of this system is a set of 13 computers called "root servers," which are responsible for coordinating the translation of domain names into Internet addresses. Appendix I provides more background on how this system works. The U.S. government supported the implementation of the domain name system for nearly a decade, largely through a Department of Defense contract. Following a 1997 presidential directive, the Department of Commerce began a process for transitioning the technical responsibility for the domain name system to the private sector. After requesting and reviewing public comments on how to implement this goal, in June 1998 the Department issued a general statement of policy, known as the "White Paper." In this document, the Department stated that because the Internet was rapidly becoming an international medium for commerce, education, and communication, the traditional means of managing its technical functions needed to evolve as well. Moreover, the White Paper stated the U.S. government was committed to a transition that would allow the private sector to take leadership for the management of the domain name system. Accordingly the Department stated that the U.S. 
government was prepared to enter into an agreement to transition the Internet's name and number process to a new not-for-profit organization. At the same time, the White Paper said that it would be irresponsible for the U.S. government to withdraw from its existing management role without taking steps to ensure the stability of the Internet during the transition. According to Department officials, the Department sees its role as the responsible steward of the transition process. Subsequently, the Department entered into an MOU with ICANN to guide the transition. ICANN has made significant progress in carrying out MOU tasks related to one of the guiding principles of the transition effort--increasing competition. However, progress has been much slower on activities designed to address the other guiding principles: increasing the stability and security of the Internet; ensuring representation of the Internet community in domain name policy-making; and using private, bottom-up coordination. Earlier this year, ICANN's president concluded that ICANN faced serious problems in accomplishing the transition and needed fundamental reform. In response, ICANN's Board established an internal committee to recommend options for reform. ICANN made important progress on several of its assigned tasks related to promoting competition. At the time the transition began, only one company, Network Solutions, was authorized to register names under the three publicly available top-level domains (.com, .net, and .org). In response to an MOU task calling for increased competition, ICANN successfully developed and implemented procedures under which other companies, known as registrars, could carry out this function. As a result, by early 2001, more than 180 registrars were certified by ICANN. The cost of securing these names has now dropped from $50 to $10 or less per year. 
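As a brief illustration of the name hierarchy described earlier--the top-level domain is the string following the right-most period, and resolution proceeds from the root toward the full name--the structure can be sketched in a few lines of Python. This is only an illustrative sketch: the hostnames are hypothetical examples, and real resolution is performed by DNS servers (root, top-level, and authoritative), not by this code.

```python
def top_level_domain(name: str) -> str:
    # The TLD is the label after the right-most period, e.g. "gov" in "www.example.gov".
    # A trailing dot (the fully qualified form) is stripped first.
    return name.rstrip(".").rsplit(".", 1)[-1].lower()

def resolution_order(name: str) -> list[str]:
    # Delegation proceeds right to left: the root servers point to the TLD's
    # servers, which delegate step by step toward the full name.
    labels = name.rstrip(".").split(".")
    return [".".join(labels[i:]) for i in range(len(labels) - 1, -1, -1)]

print(top_level_domain("www.example.gov"))   # gov
print(resolution_order("www.example.gov"))   # ['gov', 'example.gov', 'www.example.gov']
```

The delegation chain printed by `resolution_order` mirrors the role of the 13 root servers described above: they answer only the first step, and each subsequent zone answers the next.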
Another MOU task called on ICANN to expand the pool of available domain names through the selection of new top-level domains. To test the feasibility of this idea, ICANN's Board selected seven new top-level domains from 44 applications; by March 2002, it had approved agreements with all seven of the organizations chosen to manage the new domains. At a February 2001 hearing before a Subcommittee of the U.S. House of Representatives, witnesses presented differing views on whether the selection process was transparent and based on clear criteria. ICANN's internal evaluation of this test was still ongoing when we finished our audit work in May 2002. Several efforts to address the White Paper's guiding principle for improving the security and stability of the Internet are behind schedule. These include developing operational requirements and security policies to enhance the stability and security of the domain name system root servers, and formalizing relationships with other entities involved in running the domain name system. Recent reports by federally sponsored organizations have highlighted the importance of the domain name system to the stability and security of the entire Internet. A presidential advisory committee reported in 1999 that the domain name system is the only aspect of the Internet where a single vulnerability could be exploited to disrupt the entire Internet. More recently, the federal National Infrastructure Protection Center issued several warnings in 2001 stating that multiple vulnerabilities in commonly used domain name software present a serious threat to the Internet infrastructure. In recognition of the critical role that the domain name system plays for the Internet, the White Paper designated the stability and security of the Internet as the top priority of the transition. 
The MOU tasked ICANN and the Department with developing operational requirements and security policies to enhance the stability and security of the root servers--the computers at the heart of the domain name system. In June 1999, ICANN and the Department entered into a cooperative research and development agreement to guide the development of these enhancements, with a final report expected by September 2000. This deadline was subsequently extended to December 2001 and the MOU between ICANN and the Department was amended to require the development of a proposed enhanced architecture (or system design) for root server security, as well as a transition plan, procedures, and implementation schedule. An ICANN advisory committee, made up of the operators of the 13 root servers and representatives of the Department, is coordinating research on this topic. Although the chairman of the committee stated at ICANN's November 2001 meeting that it would finish its report by February or March 2002, it had not completed the report as of May 2002. To further enhance the stability of the Internet, the White Paper identified the need to formalize the traditionally informal relationships among the parties involved in running the domain name system. The White Paper pointed out that many commercial interests, staking their future on the successful growth of the Internet, were calling for a more formal and robust management structure. In response, the MOU and its amendments included several tasks that called on ICANN to enter into formal agreements with the parties that traditionally supported the domain name system through voluntary efforts. However, as of May 2002, few such agreements had been signed. ICANN's Board has approved a model agreement to formalize the relationship between the root server operators and ICANN, but no agreements had been reached with any of the operators as of May 2002. 
Similarly, there are roughly 240 country-code domains (2-letter top-level domains reserved mainly for national governments), such as .us for the United States. As with the root servers, responsibility for these domains was originally given by the Internet's developers to individuals who served as volunteers. Although the amended MOU tasked ICANN with reaching contractual agreements with these operators, it has reached agreements with only 2 domain operators as of May 2002. Finally, the amended MOU tasked ICANN with reaching formal agreements with the Regional Internet Registries, each of which is responsible for allocating Internet protocol numbers to users in one of three regions of the world. The registries reported that progress was being made on these agreements, though none had been reached as of May 2002. Progress has also been slow regarding the other two guiding principles outlined in the White Paper, which call for the creation of processes to represent the functional and geographic diversity of the Internet, and for the use of private, bottom-up coordination in preference to government control. In order for the private sector organization to derive legitimacy from the participation of key Internet stakeholders, the White Paper suggested the idea of a board of directors that would balance the interests of various Internet constituencies, such as Internet service providers, domain name managers, technical bodies, and individual Internet users. The White Paper also suggested the use of councils to develop, recommend, and review policies related to their areas of expertise, but added that the board should have the final authority for making policy decisions. The Department reinforced the importance of a representative board in a 1998 letter responding to ICANN's initial proposal. 
The Department's letter cited public comments suggesting that without an open membership structure, ICANN would be unlikely to fulfill its goals of private, bottom-up coordination and representation. ICANN's Board responded to the Department by amending its bylaws to make it clear that the Board has an "unconditional mandate" to create a membership structure that would elect at-large directors on the basis of nominations from Internet users and other participants. To implement these White Paper principles, the MOU between ICANN and the Department includes two tasks: one relating to developing mechanisms that ensure representation of the global and functional diversity of the Internet and its users, and one relating to allowing affected parties to participate in the formation of ICANN's policies and procedures through a bottom-up coordination process. In response to these two tasks, ICANN adopted the overall structure suggested by the White Paper. First, ICANN created a policy-making Board of Directors. The initial Board consisted of ICANN's president and 9 at-large members who were appointed at ICANN's creation. ICANN planned to replace the appointed at-large Board members with 9 members elected by an open membership to reflect the diverse, worldwide Internet community. Second, ICANN organized a set of three supporting organizations to advise its Board on policies related to their areas of expertise. One supporting organization was created to address Internet numbering issues, one was created to address protocol development issues, and one was created to address domain name issues. Together these three supporting organizations selected 9 additional members of ICANN's Board-3 from each organization. Thus, ICANN's Board was initially designed to reflect the balance of interests described in the White Paper. 
Figure 1 illustrates the relationships among ICANN's supporting organizations and its Board of Directors, as well as several advisory committees ICANN also created to provide input without formal representation on its Board. Despite considerable debate, ICANN has not resolved the question of how to fully implement this structure, especially the at-large Board members. Specifically, in March 2000, ICANN's Board noted that extensive discussions had not produced a consensus regarding the appropriate method to select at-large representatives. The Board therefore approved a compromise under which 5 at-large members would be elected through regional, online elections. In October 2000, roughly 34,000 Internet users around the world voted in the at-large election. The 5 successful candidates joined ICANN's Board in November 2000, replacing interim Board members. Four of the appointed interim Board members first nominated in ICANN's initial proposal continue to serve on the Board. Parallel with the elections, the Board also initiated an internal study to evaluate options for selecting at-large Board members. In its November 2001 report, the committee formed to conduct this study recommended the creation of a new at-large supporting organization, which would select 6 Board members through regional elections. Overall, the number of at- large seats would be reduced from 9 to 6, and the seats designated for other supporting organizations would increase from 9 to 12. A competing, outside study by a committee made up of academic and nonprofit interests recommended continuing the initial policy of directly electing at-large Board members equal to the number selected by the supporting organizations. 
This committee also recommended strengthening the at- large participation mechanisms through staff support and a membership council similar to those used by the existing supporting organizations.Because of ongoing disagreement among Internet stakeholders about how individuals should participate in ICANN's efforts, ICANN's Board referred the question to a new Committee on ICANN Evolution and Reform. Under the current bylaws, the 9 current at-large Board seats will cease to exist after ICANN's 2002 annual meeting, to be held later this year. Although the MOU calls on ICANN to design, develop, and test its procedures, the two tasks involving the adoption of the at-large membership process were removed from the MOU when it was amended in August 2000. However, as we have noted, this process was not fully implemented at the time of the amendment because the election did not take place until October 2000, and the evaluation committee did not release its final report until November 2001. When we discussed this amendment with Department officials, they said that they agreed to the removal of the tasks in August 2000 because ICANN had a process in place to complete them. Nearly 2 years later, however, the issue of how to structure ICANN's Board to achieve broad representation continues to be unresolved and has been a highly contentious issue at ICANN's recent public meetings. In addition, the amended MOU tasked ICANN with developing and testing an independent review process to address claims by members of the Internet community who were adversely affected by ICANN Board decisions that conflicted with ICANN's bylaws. However, ICANN was unable to find qualified individuals to serve on a committee charged with implementing this policy. In March 2002, ICANN's Board referred this unresolved matter to the Committee on ICANN Evolution and Reform for further consideration. 
In the summer of 2001, ICANN's current president was generally optimistic about the corporation's prospects for successfully completing the remaining transition tasks. However, in the face of continued slow progress on key aspects of the transition, such as reaching formal agreements with the root server and country-code domain operators, his assessment changed. In February 2002, he reported to ICANN's Board that the corporation could not accomplish its assigned mission on its present course and needed a new and reformed structure. The president's proposal for reform, which was presented to ICANN's Board in February, focused on problems he perceived in three areas: (1) too little participation in ICANN by critical entities, such as national governments, business interests, and entities that share responsibility for the operation of the domain name system (such as root server operators and country- code domain operators); (2) too much focus on process and representation and not enough focus on achieving ICANN's core mission; and (3) too little funding for ICANN to hire adequate staff and cover other expenditures. He added that in his opinion, there was little time left to make necessary reforms before the ICANN experiment came to "a grinding halt." Several of his proposed reforms challenged some of the basic approaches for carrying out the transition. For example, the president concluded that a totally private sector management model had proved to be unworkable. He proposed instead a "well-balanced public-private partnership" that involved an increased role for national governments in ICANN, including having several voting members of ICANN's Board selected by national governments. 
The president also proposed changes that would eliminate global elections of at-large Board members by the Internet community, reduce the number of Board members selected by ICANN's supporting organizations, and have about a third of the board members selected through a nominating committee composed of Board members and others selected by the Board. He also proposed that ICANN's funding sources be broadened to include national governments, as well as entities that had agreements with ICANN or received services from ICANN. In response, ICANN's Board instructed an internal Committee on ICANN Evolution and Reform (made up of four ICANN Board members) to consider the president's proposals, along with reactions and suggestions from the Internet community, and develop recommendations for the Board's consideration on how ICANN could be reformed. The Committee reported back on May 31, 2002, with recommendations reflecting their views on how the reform should be implemented. For example, the committee built on the ICANN president's earlier proposal to change the composition of the Board and have some members be selected through a nominating committee process, and to create an ombudsman to review complaints and criticisms about ICANN and report the results of these reviews to the Board. In other cases, the committee agreed with conclusions reached by the president (such as the need for increasing the involvement of national governments in ICANN and improving its funding), but did not offer specific recommendations for addressing these areas. The committee's report, which is posted on ICANN's public Web site, invited further comment on the issues and recommendations raised in preparation for ICANN's June 2002 meeting in Bucharest, Romania. The committee recommended that the Board act in Bucharest to adopt a reform plan that would establish the broad outline of a reformed ICANN, so that the focus could be shifted to the details of implementation. 
The committee believed that this outline should be then be filled in as much as possible between the Bucharest meeting and ICANN's meeting in Shanghai in late October 2002. As mentioned previously, the Department is responsible for general oversight of work done under the MOU, as well as the responsibility for determining when ICANN, the private sector entity chosen by the Department to carry out the transition, has demonstrated that it has the resources and capability to manage the domain name system. However, the Department's public assessment of the status of the transition process has been limited in that its oversight of ICANN has been informal, it has not issued status reports, and it has not publicly commented on specific reform proposals being considered by ICANN. According to Department officials, the Department's relationship with ICANN is limited to its agreements with the corporation, and its oversight is limited to determining whether the terms of these agreements are being met. They added that the Department does not involve itself in the internal governance of ICANN, is not involved in ICANN's day-to-day operations, and would not intervene in ICANN's activities unless the corporation's actions were inconsistent with the terms of its agreements with the Department. Department officials emphasized that because the MOU defines a joint project, decisions regarding changes to the MOU are reached by mutual agreement between the Department and ICANN. In the event of a serious disagreement with ICANN, the Department would have recourse under the MOU to terminate the agreement. Department officials characterized its limited involvement in ICANN's activities as being appropriate and consistent with the purpose of the project: to test ICANN's ability to develop the resources and capability to manage the domain name system with minimal involvement of the U.S. government. 
Department officials said that they carry out their oversight of ICANN's MOU-related activities mainly through ongoing informal discussions with ICANN officials. They told us that there is no formal record of these discussions. The Department has also retained authority to approve certain activities under its agreements with ICANN, such as reviewing and approving certain documents related to root server operations. This would include, for example, agreements between ICANN and the root server operators. In addition, the Department retains policy control over the root zone file, the "master file" of top-level domains shared among the 13 root servers. Changes to this file, such as implementing a new top-level domain, must first be authorized by the Department. In addition, the Department sends officials to attend ICANN's public forums and open Board of Directors meetings, as do other countries and Internet interest groups. According to the Department, it does not participate in ICANN decision-making at these meetings but merely acts as an observer. The Department also represents the United States on ICANN's Governmental Advisory Committee, which is made up of representatives of about 70 national governments and intergovernmental bodies, such as treaty organizations. The Committee's purpose is to provide ICANN with nonbinding advice on ICANN activities that may relate to concerns of governments, particularly where there may be an interaction between ICANN's policies and national laws or international agreements. The Department made a considerable effort at the beginning of the transition to create an open process that solicited and incorporated input from the public in formulating the guiding principles of the 1998 White Paper. However, since the original MOU, the Department's public comments on the progress of the transition have been general in nature and infrequent, even though the transition is taking much longer than anticipated. 
The only report specifically called for under the MOU is a final joint project report to document the outcome of ICANN's test of the policies and procedures designed and developed under the MOU. This approach was established at a time when it was expected that the project would be completed by September 2000. So far, there has been only one instance when the Department provided ICANN with a formal written assessment of the corporation's progress on specific transition tasks. This occurred in June 1999, after ICANN took the initiative to provide the Department and the general public with a status report characterizing its progress on MOU activities. In a letter to ICANN, the Department stated that while ICANN had made progress, there was still important work to be done. For, example, the Department stated that ICANN's "top priority" must be to complete the work necessary to put in place an elected Board of Directors on a timely basis, adding that the process of electing at-large directors should be complete by June 2000. ICANN made the Department's letter, as well as its positive response, available to the Internet community on its public Web site. Although ICANN issued additional status reports in the summers of 2000 and 2001, the Department stated that it did not provide written views and recommendations regarding them, as it did in July 1999, because it agreed with ICANN's belief that additional time was needed to complete the MOU tasks. Department officials added that they have been reluctant to comment on ICANN's progress due to sensitivity to international concerns that the United States might be seen as directing ICANN's actions. The officials stated that they did not plan to issue a status report at this time even though the transition is well behind schedule, but will revisit this decision as the September 2002 termination date for the MOU approaches. 
When we met with Department officials in February 2002, they told us that substantial progress had been made on the project, but they would not speculate on ICANN's ability to complete its tasks by September 2002. The following week, ICANN's president released his report stating that ICANN could not succeed without fundamental reform. In response, Department officials said that they welcomed the call for the reform of ICANN and would follow ICANN's reform activities and process closely. When we asked for their views on the reform effort, Department officials stated that they did not wish to comment on specifics that could change as the reform process proceeds. To develop the Department's position on the effort, they said that they are gathering the views of U.S. business and public interest groups, as well as other executive branch agencies, such as the Department of State; the Office of Management and Budget; the Federal Communications Commission; and components of the Department of Commerce, such as the Patent and Trademark Office. They also said that they have consulted other members of ICANN's Governmental Advisory Committee to discuss with other governments how best to support the reform process. They noted that the Department is free to adjust its relationship with ICANN in view of any new mission statement or restructuring that might result from the reform effort. Department officials said that they would assess the necessity for such adjustments, or for any legislative or executive action, depending on the results of the reform process. In conclusion, Mr. Chairman, the effort to privatize the domain name system has reached a critical juncture, as evidenced by slow progress on key tasks and ICANN's current initiative to reevaluate its mission and consider options for reforming its structure and operations. 
Until these issues are resolved, the timing and eventual outcome of the transition effort remain highly uncertain, and ICANN's legitimacy and effectiveness as the private sector manager of the domain name system remain in question. In September 2002, the current MOU between the Department and ICANN will expire. The Department will be faced with deciding whether the MOU should be extended for a third time, and if so, what amendments to the MOU are needed, or whether some new arrangement with ICANN or some other organization is necessary. The Department sees itself as the responsible steward of the transition, and is responsible for gaining assurance that ICANN has the resources and capability to assume technical management of the Internet domain name system. Given the limited progress made so far and the unsettled state of ICANN, Internet stakeholders have a need to understand the Department's position on the transition and the prospects for a successful outcome. In view of the critical importance of a stable and secure Internet domain name system to governments, business, and other interests, we recommend that the Secretary of Commerce issue a status report detailing the Department's assessment of the progress that has been made on transition tasks, the work that remains to be done on the joint project, and the estimated timeframe for completing the transition. In addition, the status report should discuss any changes to the transition tasks or the Department's relationship with ICANN that result from ICANN's reform initiative. Subsequent status reports should be issued periodically by the Department until the transition is completed and the final project report is issued. This concludes my statement, Mr. Chairman. I will be pleased to answer any questions that you and other Members of the Subcommittee may have. For questions regarding this testimony, please contact Peter Guerrero at (202) 512-8022. 
Individuals making key contributions to this testimony included John P. Finedore; James R. Sweetman, Jr.; Mindi Weisenbloom; Keith Rhodes; Alan Belkin; and John Shumann. Although the U.S. government supported the development of the Internet, no single entity controls the entire Internet. In fact, the Internet is not a single network at all. Rather, it is a collection of networks located around the world that communicate via standardized rules called protocols. These rules can be considered voluntary because there is no formal institutional or governmental mechanism for enforcing them. However, if any computer deviates from accepted standards, it risks losing the ability to communicate with other computers that follow the standards. Thus, the rules are essentially self-enforcing. One critical set of rules, collectively known as the domain name system, links names like www.senate.gov with the underlying numerical addresses that computers use to communicate with each other. Among other things, the rules describe what can appear at the end of a domain name. The letters that appear at the far right of a domain name are called top-level domains (TLDs) and include a small number of generic names such as .com and .gov, as well as country-codes such as .us and .jp (for Japan). The next string of text to the left ("senate" in the www.senate.gov example) is called a second-level domain and is a subset of the top-level domain. Each top-level domain has a designated administrator, called a registry, which is the entity responsible for managing and setting policy for that domain. Figure 2 illustrates the hierarchical organization of domain names with examples, including a number of the original top-level domains and the country-code domain for the United States. The domain name system translates names into addresses and back again in a process transparent to the end user. 
This process relies on a system of servers, called domain name servers, which store data linking names with numbers. Each domain name server stores a limited set of names and numbers. They are linked by a series of 13 root servers, which coordinate the data and allow users to find the server that identifies the site they want to reach. They are referred to as root servers because they operate at the root level (also called the root zone), as depicted in figure 2. Domain name servers are organized into a hierarchy that parallels the organization of the domain names. For example, when someone wants to reach the Web site at www.senate.gov, his or her computer will ask one of the root servers for help. The root server will direct the query to a server that knows the location of names ending in the .gov top-level domain. If the address includes a sub-domain, the second server refers the query to a third server--in this case, one that knows the address for all names ending in senate.gov. This server will then respond to the request with an numerical address, which the original requester uses to establish a direct connection with the www.senate.gov site. Figure 3 illustrates this example. Within the root zone, one of the servers is designated the authoritative root (or the "A root" server). The authoritative root server maintains the master copy of the file that identifies all top-level domains, called the "root zone file," and redistributes it to the other 12 servers. Currently, the authoritative root server is located in Herndon, Virginia. In total, 10 of the 13 root servers are located in the United States, including 3 operated by agencies of the U.S. government. ICANN does not fund the operation of the root servers. Instead, they are supported by the efforts of individual administrators and their sponsoring organizations. Table 1 lists the operator and location of each root server. 
Because much of the early research on internetworking was funded by the Department of Defense (DOD), many of the rules for connecting networks were developed and implemented under DOD sponsorship. For example, DOD funding supported the efforts of the late Dr. Jon Postel, an Internet pioneer working at the University of Southern California, to develop and coordinate the domain name system. Dr. Postel originally tracked the names and numbers assigned to each computer. He also oversaw the operation of the root servers, and edited and published the documents that tracked changes in Internet protocols. Collectively, these functions became known as the Internet Assigned Numbers Authority, commonly referred to as IANA. Federal support for the development of the Internet was also provided through the National Science Foundation, which funded a network designed for academic institutions. Two developments helped the Internet evolve from a small, text-based research network into the interactive medium we know today. First, in 1990, the development of the World Wide Web and associated programs called browsers made it easier to view text and graphics together, sparking interest of users outside of academia. Then, in 1992, the Congress enacted legislation for the National Science Foundation to allow commercial traffic on its network. Following these developments, the number of computers connected to the Internet grew dramatically. In response to the growth of commercial sites on the Internet, the National Science Foundation entered into a 5-year cooperative agreement in January 1993 with Network Solutions, Inc., to take over the jobs of registering new, nonmilitary domain names, including those ending in .com, .net, and .org, and running the authoritative root server. At first, the Foundation provided the funding to support these functions. As demand for domain names grew, the Foundation allowed Network Solutions to charge an annual fee of $50 for each name registered. 
Controversy surrounding this fee was one of the reasons the United States government began its efforts to privatize the management of the domain name system. Working under funding provided by the Department of Defense, a group led by Drs. Paul Mockapetris and Jon Postel creates the domain name system for locating networked computers by name instead of by number. Dr. Postel publishes specifications for the first six generic top-level domains (.com, .org, .edu, .mil, .gov, and .arpa). By July 1985, the .net domain was added. President Bush signs into law an act requiring the National Science Foundation to allow commercial activity on the network that became the Internet. Network Solutions, Inc., signs a 5-year cooperative agreement with the National Science Foundation to manage public registration of new, nonmilitary domain names, including those ending in .com, .net, or .org. President Clinton issues a presidential directive on electronic commerce, making the Department of Commerce the agency responsible for managing the U.S. government's role in the domain name system. The Department of Commerce issues the "Green Paper," which is a proposal to improve technical management of Internet names and addresses through privatization. Specifically, the Green Paper proposes a variety of issues for discussion, including the creation of a new nonprofit corporation to manage the domain name system. In response to comments on the Green Paper, the Department of Commerce issues a policy statement known as the "White Paper," which states that the U.S. government is prepared to transition domain name system management to a private, nonprofit corporation. The paper includes the four guiding principles of privatization: stability; competition; representation; and private, bottom-up coordination. The Internet Corporation for Assigned Names and Numbers (ICANN) incorporates in California. ICANN's by-laws call for a 19-member Board with 9 members elected "at-large." 
The Department of Commerce and ICANN enter into an MOU that states the parties will jointly design, develop, and test the methods and procedures necessary to transfer domain name system management to ICANN. The MOU is set to expire in September 2000. ICANN issues its first status report, which lists ICANN's progress to date and states that there are important issues that still must be addressed. ICANN and the Department of Commerce enter into a cooperative research and development agreement to study root server stability and security. The study is intended to result in a final report by September 2000. ICANN and the Department of Commerce approve MOU amendment 1 to reflect the roles of ICANN and Network Solutions, Inc. The Department of Commerce contracts with ICANN to perform certain technical management functions related to the domain name system, such as address allocation and root zone coordination. At a meeting in Cairo, Egypt, ICANN adopts a process for external review of its decisions that utilizes outside experts, who will be selected at an unspecified later date. ICANN also approves a compromise whereby 5 at- large Board members will be chosen in regional online elections. ICANN issues its second Status Report, which states that several of the tasks have been completed, but work on other tasks was still under way. At a meeting in Yokahama, Japan, ICANN's Board approves a policy for the introduction of new top-level domains. The Department of Commerce and ICANN approve MOU amendment 2, which deleted tasks related to membership mechanisms, public information, and registry competition and extended the MOU until September 2001. They also agree to extend the cooperative research and development agreement on root server stability and security through September 2001. ICANN holds worldwide elections to replace 5 of the 9 interim Board members appointed at ICANN's creation. 
At a meeting in California, ICANN selects 7 new top-level domain names: .biz (for use by businesses), .info (for general use), .pro (for use by professionals), .name (for use by individuals), .aero (for use by the air transport industry), .coop (for use by cooperatives), and .museum (for use by museums). | This testimony discusses privatizing the management of the Internet domain name system. This system is a vital aspect of the Internet that works like an automated telephone directory, allowing users to reach Web sites using easy-to-understand domain names like www.senate.gov , instead of the string of numbers that computers use when communicating with each other. The U.S. government supported the development of the domain name system, and, in 1997, the President charged the Department of Commerce with transitioning it to private management. The Department issued a policy statement, called the "White Paper," that defined the four guiding principles for the privatization effort as stability, competition, representation, and private, bottom-up coordination. After reviewing several proposals from private sector organizations, the Department chose the Internet Corporation for Assigned Names and Numbers (ICANN), a not-for-profit corporation, to carry out the transition. In November 1998, the Department entered into an agreement with ICANN in the form of a Memorandum of Understanding (MOU) under which the two parties agreed to collaborate on a joint transition project. Progress on and completion of each task is assessed by the Department on a case-by-case basis, with input from ICANN. The timing and eventual outcome of the transition remains highly uncertain. 
ICANN has made significant progress in carrying out MOU tasks related to one of the guiding principles of the transition effort--increasing competition--but progress has been much slower in the areas of increasing the stability and security of the Internet; ensuring representation of the Internet community in domain name policy-making; and using private bottom-up coordination. Although the transition is well behind schedule, the Department's public assessment of the progress being made on the transition has been limited for several reasons. First, the Department carries out its oversight of ICANN's MOU-related activities mainly through informal discussions with ICANN officials. Second, although the transition is past its original September 2000 completion date, the Department has not provided a written assessment of ICANN's progress since mid-1999. Third, although the Department stated that it welcomed the call for the reform of ICANN, they have not yet taken public position on reforms being proposed. | 7,736 | 482 |
ICE has designed some management controls to govern 287(g) program implementation, such as MOAs with participating agencies that identify the roles and responsibilities of each party, background checks of officers applying to participate in the program, and a 4-week training course with mandatory course examinations for participating officers. However, the program lacks several other key controls. For example:

Program Objectives: While ICE officials have stated that the main objective of the 287(g) program is to enhance the safety and security of communities by addressing serious criminal activity committed by removable aliens, they have not documented this objective in program-related materials consistent with internal control standards. As a result, some participating agencies are using their 287(g) authority to process for removal aliens who have committed minor offenses, such as speeding, carrying an open container of alcohol, and urinating in public. None of these crimes falls into the category of serious criminal activity that ICE officials described to us as the type of crime the 287(g) program is expected to pursue. While participating agencies are not prohibited from seeking the assistance of ICE for aliens arrested for minor offenses, if all the participating agencies sought assistance to remove aliens for such minor offenses, ICE would not have the detention space to detain all of the aliens referred to it. ICE's Office of Detention and Removal strategic plan calls for using the limited detention bed space available for those aliens who pose the greatest threat to the public until more alternative detention methods are available.

Use of Program Authority: ICE has not consistently articulated in program-related documents how participating agencies are to use their 287(g) authority.
For example, according to ICE officials and other ICE documentation, 287(g) authority is to be used in connection with an arrest for a state offense; however, the signed agreement that lays out the 287(g) authority for participating agencies does not address when the authority is to be used. While all 29 MOAs we reviewed contained language that authorizes a state or local officer to interrogate any person believed to be an alien as to his right to be or remain in the United States, none of them mentioned that an arrest should precede use of 287(g) program authority. Furthermore, the processing of individuals for possible removal is to be in connection with a conviction of a state or federal felony offense. However, this circumstance is not mentioned in 7 of the 29 MOAs we reviewed, resulting in implementation guidance that is not consistent across the 29 participating agencies. A potential consequence of not having documented program objectives is misuse of authority. Internal control standards state that government programs should ensure that significant events are authorized and executed only by persons acting within the scope of their authority. Defining and consistently communicating how this authority is to be used would help ICE ensure that immigration enforcement activities undertaken by participating agencies are in accordance with ICE policies and program objectives.

Supervision of Participating Agencies: Although the law requires that state and local officials use 287(g) authority under the supervision of ICE officials, ICE has not described in internal or external guidance the nature and extent of supervision it is to exercise over participating agencies' implementation of the program. This has led to wide variation in the perception of the nature and extent of supervisory responsibility among ICE field officials and officials from 23 of the 29 participating agencies that had implemented the program and provided information to us on ICE supervision.
For example, one ICE official said ICE provides no direct supervision over the local law enforcement officers in the 287(g) program in their area of responsibility. Conversely, another ICE official characterized ICE supervisors as providing frontline support for the 287(g) program. ICE officials at two additional offices described their supervisory activities as overseeing training and ensuring that computer systems are working properly. ICE officials at another field office described their supervisory activities as reviewing files for completeness and accuracy. Officials from 14 of the 23 agencies that had implemented the program were pleased with ICE's supervision of the 287(g) trained officers. Officials from another four law enforcement agencies characterized ICE's supervision as fair, adequate, or provided on an as-needed basis. Officials from three agencies said they did not receive direct ICE supervision or that supervision was not provided daily, which an official from one of these agencies felt was necessary to assist with the constant changes in requirements for processing of paperwork. Officials from two law enforcement agencies said ICE supervisors were either unresponsive or not available. ICE officials in headquarters noted that the level of ICE supervision provided to participating agencies has varied due to a shortage of supervisory resources. Internal control standards require an agency's organizational structure to define key areas of authority and responsibility. Given the rapid growth of the program, defining the nature and extent of ICE's supervision would strengthen ICE's assurance that management's directives are being carried out.

Tracking and Reporting Data: MOAs that were signed before 2007 did not contain a requirement to track and report data on program implementation. For the MOAs signed in 2007 and after, ICE included a provision stating that participating agencies are responsible for tracking and reporting data to ICE.
However, in these MOAs, ICE did not define what data should be tracked or how it should be collected and reported. Of the 29 jurisdictions we reviewed, 9 MOAs were signed prior to 2007 and 20 were signed in 2007 or later. Regardless of when the MOAs were signed, our interviews with officials from the 29 participating jurisdictions indicated confusion regarding whether they had a data tracking and reporting requirement, what type of data should be tracked and reported, and what format they should use in reporting data to ICE. Internal control standards call for pertinent information to be recorded and communicated to management in a form and within a time frame that enables management to carry out internal control and other responsibilities. Communicating to participating agencies what data is to be collected and how it should be gathered and reported would help ensure that ICE management has the information needed to determine whether the program is achieving its objectives.

Performance Measures: ICE has not developed performance measures for the 287(g) program to track and evaluate the progress toward attaining the program's objectives. GPRA requires that agencies clearly define their missions, measure their performance against the goals they have set, and report on how well they are doing in attaining those goals. Measuring performance allows organizations to track the progress they are making toward their goals and gives managers critical information on which to base decisions for improving their programs. ICE officials stated that they are in the process of developing performance measures, but have not provided any documentation or a time frame for when they expect to complete the development of these measures. ICE officials also stated that developing measures for the program will be difficult because each state and local partnership agreement is unique, making it challenging to develop measures that would be applicable for all participating agencies.
Nonetheless, standard practices for program and project management call for specific desired outcomes or results to be conceptualized and defined in the planning process as part of a road map, along with the appropriate projects needed to achieve those results and milestones. Without a plan for the development of performance measures, including milestones for their completion, ICE lacks a road map for how this project will be achieved.

ICE and participating agencies used program resources mainly for personnel, training, and equipment, and participating agencies reported activities, benefits, and concerns stemming from the program. For fiscal years 2006 through 2008, ICE received about $60 million to provide training, supervision, computers, and other equipment for participating agencies. State and local participants provided officers, office space, and other expenses not reimbursed by ICE, such as office supplies and vehicles. ICE and state and local participating agencies cite a range of benefits associated with the 287(g) partnership. For example, as of February 2009, ICE reported enrolling 67 agencies and training 951 state and local law enforcement officers. At that time, ICE had 42 additional requests for participation in the 287(g) program, and 6 of the 42 have been approved pending approval of an MOA. According to data provided by ICE for 25 of the 29 program participants we reviewed, during fiscal year 2008, about 43,000 aliens had been arrested pursuant to the program. Based on the data provided, individual agency participant results ranged from about 13,000 arrests in one location to no arrests in two locations. Of those 43,000 aliens arrested pursuant to the 287(g) authority, ICE detained about 34,000, placed about 14,000 of those detained (41 percent) in removal proceedings, and arranged for about 15,000 of those detained (44 percent) to be voluntarily removed.
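The percentage shares reported for these detention outcomes follow directly from the rounded counts; a quick arithmetic check (the remainder is the detained population not placed in removal proceedings or voluntarily removed):

```python
# Quick check of the detention-outcome shares, using the rounded counts
# reported in the testimony (fiscal year 2008, 25 program participants).
detained = 34_000
removal_proceedings = 14_000
voluntary_removal = 15_000
remainder = detained - removal_proceedings - voluntary_removal  # 5,000

for label, count in [
    ("removal proceedings", removal_proceedings),
    ("voluntary removal", voluntary_removal),
    ("other outcomes", remainder),
]:
    # Each share is rounded to the nearest whole percent, as in the text.
    print(f"{label}: {count:,} ({round(100 * count / detained)}%)")
```

Run against the reported figures, this reproduces the 41, 44, and 15 percent shares cited in the testimony.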
The remaining 5,000 (15 percent) arrested aliens detained by ICE were either given a humanitarian release, sent to a federal or state prison to serve a sentence for a felony offense, or not taken into ICE custody given the minor nature of the underlying offense and limited availability of the federal government's detention space. Participating agencies cited benefits of the program, including a reduction in crime and the removal of repeat offenders. However, more than half of the 29 state and local law enforcement agencies we reviewed reported concerns community members expressed about the 287(g) program, including concerns that law enforcement officers in the 287(g) program would be deporting removable aliens pursuant to minor traffic violations (e.g., speeding) and concerns about racial profiling.

We made several recommendations to strengthen internal controls for the 287(g) program to help ensure the program operates as intended. Specifically, we recommended that ICE (1) document the objective of the 287(g) program for participants, (2) clarify when the 287(g) authority is authorized for use by state and local law enforcement officers, (3) document the nature and extent of supervisory activities ICE officers are expected to carry out as part of their responsibilities in overseeing the implementation of the 287(g) program, (4) specify the program information or data that each agency is expected to collect regarding their implementation of the 287(g) program and how this information is to be reported, and (5) establish a plan, including a time frame, for the development of performance measures for the 287(g) program. DHS concurred with each of our recommendations and reported plans and steps taken to address them.

Mr. Chairman and Members of the Committee, this concludes my statement. I would be pleased to respond to any questions you or other Members of the Committee may have.
For questions about this statement, please contact Richard Stana at 202-512-8777 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this statement. Individuals making key contributions to this statement include Bill Crocker, Lori Kmetz, Susanna Kuebler, and Adam Vogt. This is a work of the U.S. government and is not subject to copyright protection in the United States. The published product may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.

| This testimony discusses the Department of Homeland Security's (DHS) U.S. Immigration and Customs Enforcement's (ICE) management of the 287(g) program. Recent reports indicate that the total population of unauthorized aliens residing in the United States is about 12 million. Some of these aliens have committed one or more crimes, although the exact number of aliens that have committed crimes is unknown. Some crimes are serious and pose a threat to the security and safety of communities. ICE does not have the agents or the detention space that would be required to address all criminal activity committed by unauthorized aliens. Thus, state and local law enforcement officers play a critical role in protecting our homeland because, during the course of their daily duties, they may encounter foreign-national criminals and immigration violators who pose a threat to national security or public safety. On September 30, 1996, the Illegal Immigration Reform and Immigrant Responsibility Act was enacted and added section 287(g) to the Immigration and Nationality Act.
This section authorizes the federal government to enter into agreements with state and local law enforcement agencies, and to train selected state and local officers to perform certain functions of an immigration officer--under the supervision of ICE officers--including searching selected federal databases and conducting interviews to assist in the identification of those individuals in the country illegally. The first such agreement under the statute was signed in 2002, and as of February 2009, 67 state and local agencies were participating in this program. The testimony today is based on our January 30, 2009, report regarding the program including selected updates made in February 2009. Like the report, this statement addresses (1) the extent to which Immigration and Customs Enforcement has designed controls to govern 287(g) program implementation and (2) how program resources are being used and the activities, benefits, and concerns reported by participating agencies. To do this work, we interviewed officials from both ICE and participating agencies regarding program implementation, resources, and results. We also reviewed memorandums of agreement (MOA) between ICE and the 29 law enforcement agencies participating in the program as of September 1, 2007, that are intended to outline the activities, resources, authorities, and reports expected of each agency. We also compared the controls ICE designed to govern implementation of the 287(g) program with criteria in GAO's Standards for Internal Control in the Federal Government, the Government Performance and Results Act (GPRA), and the Project Management Institute's Standard for Program Management. More detailed information on our scope and methodology appears in the January 30, 2009 report. 
In February 2009, we also obtained updated information from ICE regarding the number of law enforcement agencies participating in the 287(g) program as well as the number of additional law enforcement agencies being considered for participation in the program. We conducted our work in accordance with generally accepted government auditing standards. In summary, ICE has designed some management controls, such as MOAs with participating agencies and background checks of officers applying to participate in the program, to govern 287(g) program implementation. However, the program lacks other key internal controls. Specifically, program objectives have not been documented in any program-related materials, guidance on how and when to use program authority is inconsistent, guidance on how ICE officials are to supervise officers from participating agencies has not been developed, data that participating agencies are to track and report to ICE has not been defined, and performance measures to track and evaluate progress toward meeting program objectives have not been developed. Taken together, the lack of internal controls makes it difficult for ICE to ensure that the program is operating as intended. ICE and participating agencies used program resources mainly for personnel, training, and equipment, and participating agencies reported activities and benefits, such as a reduction in crime and the removal of repeat offenders. However, officials from more than half of the 29 state and local law enforcement agencies we reviewed reported concerns members of their communities expressed about the use of 287(g) authority for minor violations and/or about racial profiling. We made several recommendations to strengthen internal controls for the 287(g) program to help ensure that the program operates as intended. DHS concurred with our recommendations and reported plans and steps taken to address them.
In February 2012, we reported that the increased seigniorage resulting from replacing $1 notes with $1 coins could potentially offer $4.4 billion in net benefits to the government over 30 years. We determined that seigniorage was the sole source of the net benefits and not lower production costs due to switching to the coin, which lasts much longer than a note. Seigniorage is the financial gain the federal government realizes when it issues notes or coins because both forms of currency usually cost less to produce than their face value. This gain equals the difference between the face value of currency and its costs of production, which reflects a financial transfer to the federal government because it reduces the government's need to raise revenues through borrowing. With less borrowing, the government pays less interest over time, resulting in a financial benefit. The replacement scenario of our 2012 estimate assumed the production of $1 notes would stop immediately followed by a 4-year transition period during which worn and unfit $1 notes would gradually be removed from circulation. Based on information provided by the Mint, we also assumed that the Mint would convert existing equipment to increase its production capability for $1 coins during the first year and that it would take 4 years for the Mint to produce enough coins to replace the currently outstanding $1 notes. Our assumptions covered a range of factors, but key among these was a replacement ratio of 1.5 coins to 1 note to take into consideration the fact that coins circulate with less frequency than notes and therefore a larger number are required in circulation. Other key assumptions included the expected rate of growth in the demand for currency over 30 years, the costs of producing and processing both coins and notes, and the differential life spans of coins and notes. 
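The seigniorage mechanism and the 1.5-to-1 replacement-ratio assumption described above can be sketched in a few lines. The per-unit production costs and the interest rate below are hypothetical round numbers for illustration, not inputs from GAO's actual model:

```python
# Illustrative sketch of seigniorage and the interest savings it implies.
# The production costs and interest rate are assumed values for
# illustration only, not figures from the GAO analysis.

def seigniorage(face_value, production_cost):
    """Gain to the government per unit issued: face value minus cost."""
    return face_value - production_cost

note_cost = 0.05   # assumed cost to print one $1 note
coin_cost = 0.30   # assumed cost to mint one $1 coin

per_note = seigniorage(1.00, note_cost)   # gain per note issued
per_coin = seigniorage(1.00, coin_cost)   # gain per coin issued

# With a 1.5-to-1 replacement ratio, each retired $1 note puts $1.50 of
# coin face value into circulation, so total issuance (and seigniorage)
# rises even though each coin costs more to produce than a note.
replacement_ratio = 1.5
extra_issuance_per_note = replacement_ratio * 1.00 - 1.00  # $0.50

# The extra issuance substitutes for government borrowing; the recurring
# benefit is the interest expense avoided each year (assumed 3% rate).
interest_rate = 0.03
annual_interest_saved = extra_issuance_per_note * interest_rate

print(f"per-note seigniorage: ${per_note:.2f}")
print(f"per-coin seigniorage: ${per_coin:.2f}")
print(f"interest avoided per note replaced, per year: ${annual_interest_saved:.3f}")
```

Scaled across the billions of $1 notes outstanding and discounted over 30 years, this stream of avoided interest, rather than any production-cost saving, is the source of the net benefit the analysis describes.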
We projected our analyses over 30 years to be consistent with previous GAO analyses and because that period roughly coincides with the life expectancy of the $1 coin. As shown in figure 1, we found that the net benefit accruing each year varied considerably over the 30 years. More specifically, across the first 10 years of our 30-year analysis, replacing the $1 note with a $1 coin would result in a $531 million net loss, or approximately $53 million per year in net loss, to the government. The early net loss would be due in part to the up-front costs to the Mint of increasing its coin production during the transition, together with the limited interest expense the government would avoid in the first few years after replacement began. This estimate differs from our 2011 estimate, which found that replacement would result in a net benefit of about $5.5 billion over 30 years (an average of about $184 million per year), because the 2012 estimate takes into account two key actions that occurred since our 2011 report, specifically:

In April 2011, the Federal Reserve began using new equipment to process notes, which has increased the expected life of the $1 note to an average of 56 months (or 4.7 years), according to the Federal Reserve, compared with the 40 months we used in our 2011 analysis. The longer note life reduces the costs of circulating a note over 30 years and thus reduces the expected net benefits of replacing the $1 note with a $1 coin.

In December 2011, the Treasury Department announced that it would take steps to eliminate the overproduction of dollar coins by relying on the approximately 1.4 billion $1 coins stored with the Federal Reserve as of September 30, 2011, to meet the relatively small transactional demand for dollar coins.
This new policy would reduce the cost associated with producing $1 coins that we estimated in the status quo scenario and, therefore, would reduce the net benefit, which is the difference in the estimated costs between the status quo scenario and the replacement scenario. However, like all estimates, there are uncertainties involved in developing these analyses. In particular, while the up-front costs to the Mint of increasing its coin production during the transition are reasonably certain--in large part because they are closer in time--the longer-term benefits, particularly those occurring in the later years, involve greater uncertainty because of unforeseen circumstances that could occur farther into the future. Nonetheless, looking at a longer time period allows for trends to be seen. Moreover, changes to the inputs and assumptions used in our analysis could significantly change the estimated net benefit. For example, in 2011, we compared our status quo scenario to an alternative scenario in which the growing use of electronic payments--such as making payments with a cell phone--results in a lower demand for cash and lower net benefit. If Americans come to rely more heavily on electronic payments, the demand for cash could grow more slowly than we assumed or even decrease. By reducing the public's demand for $1 currency by 20 percent in this alternative scenario, we found that the net benefit to the government would decrease to about $3.4 billion over 30 years. In another scenario, we reported in 2012 that if interest savings because of seigniorage were not considered, a net loss of approximately $1.8 billion would accrue during the first 10 years for an average cost of $179 million per year--or a $2.8 billion net loss over 30 years.
While this scenario suggests that there would be no net benefits from switching to a $1 coin, we believe that the interest savings related to seigniorage, which is a result of issuing currency, cannot be set aside because the interest savings reflects a monetary benefit to the government. Our estimates of the discounted net benefit to the government of replacing the $1 note with a $1 coin differ from the method that the Congressional Budget Office (CBO) would use to calculate the impact on the budget of the same replacement. In the mid-1990s, CBO made such an estimate and noted that its findings for government savings were lower than our estimates at that time because of key differences in the two analyses. Most important, budget scorekeeping conventions do not factor in gains in seigniorage in calculating budget deficits. Thus, the interest expense avoided in future years by reducing borrowing needs, which accounts for our estimate of net benefit to the government, would not be part of a CBO budget-scoring analysis. Additionally, CBO's time horizon for analyzing the budget impact is up to 10 years--a much shorter time horizon than we use in our recent analyses.

Two factors merit consideration moving forward. The first factor is the effect of a currency change on the private sector. Our 2011 and 2012 reports considered only the fiscal effect on the government. Because we found no quantitative estimates that could be evaluated or modeled, our estimate did not consider factors such as the broader societal impact of replacing the $1 note with a $1 coin or attempt to quantify the costs to the private sector. Based on our interviews with stakeholders representing a variety of cash-intensive industries, we believe that the costs and benefits to the private sector should be carefully weighed since some costs could be substantial. In 2011 we reported that stakeholders identified potential shorter- and longer-term costs that would likely result from the replacement.
Specifically, shorter-term costs would be those costs involved in adapting to the transition such as modifying vending machines, cash-register drawers, and night-depository equipment to accept $1 coins. Such costs would also include the need to purchase or adapt the processing equipment that businesses may need, such as coin-counting and coin-wrapping machines. Longer-term costs would be those costs that would permanently increase the cost of doing business, such as the increased transportation and storage costs for the heavier and more voluminous coins as compared to notes, and processing costs. These costs would likely be passed on to the customer and the public at large through, for example, higher prices or fees. Most stakeholders we interviewed said, however, that they could not easily quantify the magnitude of these costs, and the majority indicated that they would need 1 to 2 years to make the transition from $1 notes to $1 coins. In contrast to the stakeholders who said that a replacement would mean higher costs for their businesses, stakeholders from the vending machine industry and public transit said that the changeover might have only a minimal impact on them. For example, according to officials from the National Automatic Merchandising Association, an organization representing the food and refreshment vending industry, many of its members have already modified their vending machines to accept all forms of payment, including $1 coins. In addition, according to transit industry officials, the impact on the transit industry would be minimal since transit agencies that receive federal funds were required under the Presidential $1 Coin Act of 2005 to accept and distribute $1 coins. The second factor that merits consideration is public acceptance. Our 2012 estimate assumes that the $1 coin would be widely accepted and used by the public.
In 2002, we conducted a nationwide public opinion survey, and we found that the public was not using the $1 coin because people were familiar with the $1 note, the $1 coin was not widely available, and people did not want to carry more coins. However, when respondents were told that such a replacement would save the government about half a billion dollars a year (our 2000 estimate), the proportion who said they opposed elimination of the note dropped from 64 percent to 37 percent. Yet, two more recent national-survey results suggest that opposition to eliminating the $1 note persists. For example, according to a Gallup poll conducted in 2006, 79 percent of respondents were opposed to replacing $1 notes with $1 coins, and their opposition decreased only slightly, to 64 percent, when they were asked to assume that a replacement would result in half a billion dollars in government savings each year. We have noted in past reports that efforts to increase the circulation and public acceptance of the $1 coins--such as changes to the color of the $1 coin and new coin designs--have not succeeded, in part, because the $1 note has remained in circulation. Over the last 48 years, Australia, Canada, France, Japan, the Netherlands, New Zealand, Norway, Russia, Spain, and the United Kingdom, among others, have replaced lower-denomination notes with coins. The rationales for replacing notes with coins cited by foreign government officials and experts include the cost savings to governments derived from lower production costs and the decline over time of the purchasing power of currency because of inflation. For example, Canada replaced its $1 and $2 notes with coins in 1987 and 1996, respectively. Canadian officials determined that the conversion to the $1 coin saved the Canadian government $450 million (Canadian) between 1987 and 1991 because it no longer had to regularly replace worn out $1 notes. 
However, Canadian $1 notes did not last as long as $1 notes in the United States currently do. Stopping production of the note and actions to overcome public resistance have been important in Canada and the United Kingdom as the governments transitioned from a note to a coin. While observing that the public was resistant at first, Canadian and United Kingdom officials said that with the combination of stakeholder outreach, public relations efforts, and ending production and issuance of the notes, public dissatisfaction dissipated within a few years. Canada undertook several efforts to prepare the public and businesses for the transition to the coin. For example, the Royal Canadian Mint reached out to stakeholders in the retail business community to ensure that they were aware of the scope of the change and surveyed public opinion about using coins instead of notes and the perceived impact on consumer transactions. The Canadian Mint also proactively worked with large coin usage industries, such as vending and parking enterprises, to facilitate conversion of their equipment, and conducted a public relations campaign to advise the public of the cost savings that would result from the switch. According to Canadian officials, the $1 and $2 coins were the most popular coins in circulation and were heavily used by businesses and the public. In our analysis of replacing the $1 note with a $1 coin, we assumed that the U.S. government would conduct a public awareness campaign to inform the public during the first year of the transition and assigned a value of approximately $7.8 million for that effort. In addition, some countries have used a transition period to gradually introduce new coins or currency. For example, the United Kingdom issued the £1 coin in April 1983 and continued to simultaneously issue the £1 note until December 1984. Similarly, Canada issued the $1 coin in 1987 and ceased issuing the $1 note in 1989.
In our prior reports, we recommended that Congress proceed with replacing the $1 note with the $1 coin. We continue to believe that the government would receive a financial benefit from making the replacement. However, this finding comes with several caveats. First, the costs are immediate and certain while the benefits are further in the future and more uncertain. The uncertainty comes, in part, from the uncertainty surrounding key assumptions like the future demand for cash. Second, the benefits derive from seigniorage, a transfer from the public, and not a cost-saving change in production. Third, these are benefits to the government and not necessarily to the public at large. In fact, public opinion has consistently been opposed to the $1 coin. Keeping those caveats in mind, many other countries have successfully replaced low denomination notes with coins, even when initially faced with public opposition. Chairman Paul, Ranking Member Clay, and members of the Subcommittee, this concludes my prepared statement. I would be pleased to answer any questions at this time. For further information on this testimony, please contact Lorelei St. James, at (202) 512-2834 or [email protected]. In addition, contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this statement. Individuals making key contributions to this testimony include Teresa Spisak (Assistant Director), Lindsay Bach, Amy Abramowitz, Patrick Dudley, and David Hooper. This is a work of the U.S. government and is not subject to copyright protection in the United States. The published product may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. 
| Since coins are more durable than notes and do not need replacement as often, many countries have replaced lower-denomination notes with coins to obtain a financial benefit, among other reasons. Six times over the past 22 years, GAO has reported that replacing the $1 note with a $1 coin would provide a net benefit to the federal government of hundreds of millions of dollars annually. This testimony provides information on what GAO's most recent work in 2011 and 2012 found regarding (1) the net benefit to the government of replacing the $1 note with a $1 coin, (2) stakeholder views on considerations for the private sector and the public in making such a replacement, and (3) the experiences of other countries in replacing small-denomination notes with coins. This testimony is based on previous GAO reports. To perform that work, GAO constructed an economic model to assess the net benefit to the government. GAO also interviewed officials from the Federal Reserve and Treasury Department, currency experts, officials from Canada and the United Kingdom, and representatives of U.S. industries that could be affected by currency changes. GAO reported in February 2012 that replacing $1 notes with $1 coins could potentially provide $4.4 billion in net benefits to the federal government over 30 years. The overall net benefit was due solely to increased seigniorage and not to reduced production costs. Seigniorage is the difference between the cost of producing coins or notes and their face value; it reduces government borrowing and interest costs, resulting in a financial benefit to the government. GAO's estimate takes into account processing and production changes that occurred in 2011, including the Federal Reserve's use of new equipment to determine the quality and authenticity of notes, which has increased the expected life of the note thereby reducing the costs of circulating a note over 30 years. (The $1 note is expected to last 4.7 years and the $1 coin 30 years.) 
Like all estimates, GAO's is subject to uncertainty, especially since the costs of the replacement occur in the first several years and can be estimated with more certainty than the benefits, which occur further in the future. Moreover, changes to the inputs and assumptions GAO used in the estimate could significantly increase or decrease the results. For example, if the public relies more heavily on electronic payments in the future, the demand for cash could be lower than GAO estimated and, as a result, the net benefit would be lower. In March 2011, GAO identified potential shorter- and longer-term costs to the private sector that could result from the replacement of the $1 note with a $1 coin. Industry stakeholders indicated that they would initially incur costs to modify equipment and add storage and that later their costs to process and transport coins would increase. However, others, such as some transit agencies, have already made the transition to accept $1 coins and would not incur such costs. In addition, for such a replacement to be successful, the $1 coin would have to be widely accepted and used by the public. Nationwide opinion polls over the last decade have indicated a lack of public acceptance of the $1 coin. Efforts to increase the circulation and public acceptance of the $1 coin have not succeeded, in part, because the $1 note has remained in circulation. Over the last 48 years, many countries, including Canada and the United Kingdom, have replaced low denomination notes with coins because of expected cost savings, among other reasons. The Canadian government, for example, saved $450 million (Canadian) over 5 years by converting to the $1 coin.
Canada and the United Kingdom found that stopping production of the note combined with stakeholder outreach and public education were important to overcome public resistance, which dissipated within a few years after transitioning to the low denomination coins. GAO has recommended in prior work that Congress replace the $1 note with a $1 coin. GAO continues to believe that replacing the $1 note with a coin is likely to provide a financial benefit to the federal government if the note is eliminated and negative public reaction is effectively managed through stakeholder outreach and public education. | 2,989 | 856 |
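GAO's published model is far more detailed, but the mechanics it describes--seigniorage as face value minus production cost, the differing expected lives of notes and coins, and the sensitivity of the net benefit to future cash demand--can be illustrated with a deliberately simplified sketch. All dollar figures and the coin-to-note replacement ratio below are hypothetical placeholders, not GAO's actual inputs; only the expected lives (4.7 years for a note, 30 years for a coin) come from the testimony.

```python
# Hypothetical, simplified sketch of a 30-year net-benefit calculation
# of the kind GAO describes. Unit costs and the replacement ratio are
# illustrative assumptions, NOT GAO's actual model inputs.

NOTE_LIFE_YEARS = 4.7   # expected life of a $1 note (per the testimony)
COIN_LIFE_YEARS = 30.0  # expected life of a $1 coin (per the testimony)

def net_benefit(demand_billions, note_cost=0.05, coin_cost=0.18,
                replacement_ratio=1.5, horizon=30):
    """Net benefit (in $ billions) of coins over notes for a fixed
    annual demand for $1 currency, over a given horizon in years.

    replacement_ratio is an assumption: more coins than notes are
    needed in circulation because coins turn over more slowly. The
    seigniorage gain is the extra face value placed in circulation,
    which reduces government borrowing and interest costs.
    """
    # Cost of keeping `demand` notes in circulation: each note must be
    # replaced every NOTE_LIFE_YEARS.
    note_production = demand_billions / NOTE_LIFE_YEARS * note_cost * horizon
    # Coins: more pieces are needed, but each lasts the full horizon.
    coins_needed = demand_billions * replacement_ratio
    coin_production = coins_needed / COIN_LIFE_YEARS * coin_cost * horizon
    # Seigniorage from the additional face value in circulation.
    seigniorage_gain = coins_needed - demand_billions
    return seigniorage_gain - (coin_production - note_production)

# Lower future demand for cash shrinks the net benefit, as GAO notes.
print(round(net_benefit(8.0), 2))  # baseline demand: ~4.39
print(round(net_benefit(5.0), 2))  # weaker demand: ~2.75
```

Even in this toy version, the testimony's central caveat is visible: the benefit comes from seigniorage--a transfer from the public--rather than from cheaper production, and it evaporates proportionally if cash demand declines.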
SCI refers to classified information concerning or derived from intelligence sources, methods, or analytical processes requiring exclusive handling within formal access control systems established by the Director of Central Intelligence. The Central Intelligence Agency (CIA) is responsible for adjudicating and granting all EOP requests for SCI access. According to the EOP Security Office, between January 1993 and May 1998, the CIA granted about 840 EOP employees access to SCI. Executive Order 12958, Classified National Security Information, prescribes a uniform system for classifying, safeguarding, and declassifying national security information. It requires agency heads to promulgate procedures implementing the policies established by the order, to ensure that classified material is properly safeguarded, and to establish and maintain a security self-inspection program of their classified activities. The order also gives the Director, Information Security Oversight Office (an organization under the National Archives and Records Administration), the authority to conduct on-site security inspections of EOP's and other executive branch agencies' classified programs. Office of Management and Budget Circular Number A-123, Management Accountability and Control, emphasizes the importance of having clearly documented and readily available procedures as a means to ensure that programs achieve their intended results. Director of Central Intelligence Directive 1/14, Personnel Security Standards and Procedures Governing Eligibility for Access to Sensitive Compartmented Information, lays out the governmentwide eligibility standards and procedures for access to SCI by all U.S. citizens, including government civilian and military personnel, contractors, and employees of contractors.
The directive requires (1) the employing agency to determine that the individual has a need to know; (2) the cognizant Senior Official of the Intelligence Community to review the individual's background investigation and reach a favorable suitability determination; and (3) the individual, once approved by the Senior Official of the Intelligence Community for SCI access, to sign an SCI nondisclosure agreement. Additional guidance concerning SCI eligibility is contained in Executive Order 12968, the U.S. Security Policy Board investigative standards and adjudicative guidelines implementing Executive Order 12968, and Director of Central Intelligence Directive 1/19. Governmentwide standards and procedures for safeguarding SCI material are contained in Director of Central Intelligence Directive 1/19, Security Policy for Sensitive Compartmented Information and Security Policy Manual. The EOP Security Office is part of the Office of Administration. The Director of the Office of Administration reports to the Assistant to the President for Management and Administration. The EOP Security Officer is responsible for formulating and directing the execution of security policy, reviewing and evaluating EOP security programs, and conducting security indoctrinations and debriefings for agencies of the EOP. Additionally, each of the nine EOP offices we reviewed has a security officer who is responsible for that specific office's security program. As discussed with your office, we reviewed EOP procedures but did not verify whether the procedures were followed in granting SCI access to EOP employees, review EOP physical security practices for safeguarding classified material, conduct classified document control and accountability inspections, or perform other control tests of classified material over which the EOP has custody. (See pp. 8 and 9 for a description of our scope and methodology.)
The EOP Security Officer told us that, for the period January 1993 until June 1996, (1) he could not find any EOP-wide procedures for acquiring access to SCI for the White House Office, the Office of Policy Development, the Office of the Vice President, the National Security Council, and the President's Foreign Intelligence Advisory Board for which the former White House Security Office provided security support and (2) there were no EOP-wide procedures for acquiring access to SCI for the Office of Science and Technology Policy, the Office of the United States Trade Representative, the Office of National Drug Control Policy, and the Office of Administration for which the EOP Security Office provides security support. He added that there had been no written procedures for acquiring SCI access within the EOP since he became the EOP Security Officer in 1986. In contrast, we noted that two of the nine EOP offices we reviewed issued office-specific procedures that make reference to acquiring access to SCI--the Office of Science and Technology Policy in July 1996 and the Office of the Vice President in February 1997. According to the EOP Security Officer, draft EOP-wide written procedures for acquiring access to SCI were completed in June 1996 at the time the White House and EOP Security Offices merged. These draft procedures, entitled Security Procedures for the EOP Security Office, were not finalized until March 1998. While the procedures discuss the issuance of EOP building passes, they do not describe in detail the procedures EOP offices must follow to acquire SCI access; the roles and responsibilities of the EOP Security Office, security staffs of the individual EOP offices, and the CIA and others in the process; or the forms and essential documentation required before the CIA can adjudicate a request for SCI access. 
Moreover, the procedures do not address the practices that National Security Council security personnel follow to acquire SCI access for their personnel. For example, unlike the process for acquiring SCI access in the other eight EOP offices we reviewed, National Security Council security personnel (rather than the personnel in the EOP Security Office) conduct the employee pre-employment security interview; deal directly with the CIA to request SCI access; and, once the CIA approves an employee for access, conduct the SCI security indoctrination and oversee the individual's signing of the SCI nondisclosure agreement. Director of Central Intelligence Directives 1/14 and 1/19 require that access to SCI be controlled under the strictest application of the need-to-know principle and in accordance with applicable personnel security standards and procedures. In exceptional cases, the Senior Official of the Intelligence Community or his designee (the CIA in the case of EOP employees) may, when it is in the national interest, authorize an individual access to SCI prior to completion of the individual's security background investigation. At least since July 1996, according to the National Security Council's security officer, his office has granted temporary SCI access to government employees and individuals from private industry and academia--before completion of the individual's security background investigation and without notifying the CIA. He added, however, that this practice has occurred only on rare occasions to meet urgent needs. He said that this practice was also followed prior to July 1996 but that no records exist documenting the number of instances and the parties the National Security Council may have granted temporary SCI access to prior to this date. 
CIA officials responsible for adjudicating and granting EOP requests for SCI access told us that the CIA did not know about the National Security Council's practice of granting temporary SCI access until our review. A senior EOP official told us that from July 1996 through July 1998, the National Security Council security officer granted 35 temporary SCI clearances. This official also added that, after recent consultations with the CIA, the National Security Council decided in August 1998 to refer temporary SCI clearance determinations to the CIA. The EOP-wide security procedures issued in March 1998 do not set forth security practices EOP offices are to follow in safeguarding classified information. In contrast, the Office of Science and Technology Policy and the Office of the Vice President had issued office-specific security procedures that deal with safeguarding SCI material. The Office of Science and Technology Policy procedures, issued in July 1996, were very comprehensive. They require that new employees be thoroughly briefed on their security responsibilities, advise staff on their responsibilities for implementing the security aspects of Executive Order 12958, and provide staff specific guidance on document accountability and other safeguard practices involving classified information. The remaining seven EOP offices that did not have office-specific procedures for safeguarding SCI and other classified information stated that they rely on Director of Central Intelligence Directive 1/19 for direction on such matters. Executive Order 12958 requires the head of agencies that handle classified information to establish and maintain a security self-inspection program. 
The order contains guidelines (which agency security personnel may use in conducting such inspections) on reviewing relevant security directives and classified material access and control records and procedures, monitoring agency adherence to established safeguard standards, assessing compliance with controls for access to classified information, verifying whether agency special access programs provide for the conduct of internal oversight, and assessing whether controls to prevent unauthorized access to classified information are effective. Neither the EOP Security Office nor the security staff of the nine EOP offices we reviewed have conducted security self-inspections as described in the order. EOP officials pointed out that security personnel routinely conduct daily desk, safe, and other security checks to ensure that SCI and other classified information is properly safeguarded. These same officials also emphasized the importance and security value in having within each EOP office experienced security staff responsible for safeguarding classified information. While these EOP security practices are important, the security self-inspection program as described in Executive Order 12958 provides for a review of security procedures and an assessment of security controls beyond EOP daily security practices. Executive Order 12958 gives the Director, Information Security Oversight Office, authority to conduct on-site reviews of each agency's classified programs. The Director of the Information Security Oversight Office said his office has never conducted an on-site security inspection of EOP classified programs. He cited a lack of sufficient personnel as the reason for not doing so and added that primary responsibility for oversight should rest internally with the EOP and other government agencies having custody of classified material. 
The Director's concern with having adequate inspection staff and his view on the primacy of internal oversight do not diminish the need for an objective and systematic examination of EOP classified programs by an independent party. An independent assessment of EOP security practices by the Information Security Oversight Office could have brought to light the security concerns raised in this report. To improve EOP security practices, we recommend that the Assistant to the President for Management and Administration direct the EOP Security Officer to (1) revise the March 1998 Security Procedures for the EOP Security Office to include comprehensive guidance on the procedures EOP offices must follow in acquiring SCI access for their employees and in safeguarding SCI material and (2) establish and maintain a self-inspection program of EOP classified programs, including SCI, in accordance with provisions in Executive Order 12958. We recommend further that, to properly provide for external oversight, the Director, Information Security Oversight Office, develop and implement a plan for conducting periodic on-site security inspections of EOP classified programs. We provided the EOP, the Information Security Oversight Office, and the CIA a copy of the draft report for their review and comment. The EOP and the Information Security Oversight Office provided written comments, which are reprinted in their entirety as appendixes I and II, respectively. The CIA did not provide comments. In responding for the EOP, the Assistant to the President for Management and Administration stated that our report creates a false impression that the security procedures the EOP employs are lax and inconsistent with established standards. This official added that the procedures for regulating personnel access to classified information are Executive Order 12968 and applicable Security Policy Board guidelines, and that Executive Order 12968 and Executive Order 12958 govern safeguarding such information.
The Assistant to the President also stated that the report suggests that the EOP operated in a vacuum because the EOP written security procedures implementing Executive Order 12968 were not issued until March 1998. The official noted that EOP carefully followed the President's executive orders, Security Policy Board guidelines and applicable Director of Central Intelligence Directives during this time period. While the EOP disagreed with the basis for our recommendation, the Assistant to the President stated that EOP plans to supplement its security procedures with additional guidance. We agree that the executive orders, Security Policy Board guidelines, and applicable Director of Central Intelligence Directives clearly lay out governmentwide standards and procedures for access to and safeguarding of SCI. However, they are not a substitute for local operating procedures that provide agency personnel guidance on how to implement the governmentwide procedures. We believe that EOP's plan to issue supplemental guidance could strengthen existing procedures. The Assistant to the President also stated that it is not accurate to say that the EOP has not conducted security self-inspections. This official stated that our draft report acknowledges that "security personnel conduct daily desk, safe, and other security checks to ensure that SCI and other classified material is properly safeguarded." The Assistant to the President is correct to point out the importance of daily physical security checks as an effective means to help ensure that classified material is properly safeguarded. However, such self-inspection practices are not meant to substitute for a security self-inspection program as described in Executive Order 12958. Self-inspections as discussed in the order are much broader in scope than routine daily safe checks. 
The order's guidelines discuss reviewing relevant security directives and classified material access and control records and procedures, monitoring agency adherence to established safeguard standards, assessing compliance with controls for access to classified information, verifying whether agency special access programs (such as SCI) provide for the conduct of internal oversight, and assessing whether controls to prevent unauthorized access to classified information are effective. Our report recommends that the EOP establish a self-inspection program. In commenting on our recommendation, the Assistant to the President said that, to enhance EOP security practices, the skilled assistance of the EOP Security Office staff is being made available to all EOP organizations to coordinate and assist where appropriate in agency efforts to enhance self-inspection. We believe EOP security practices would be enhanced if this action were part of a security self-inspection program as described in Executive Order 12958. The Director, Information Security Oversight Office, noted that our report addresses important elements of the SCI program in place within the EOP and provides helpful insights for the security community as a whole. The Director believes that we overemphasize the need to create EOP-specific procedures for handling SCI programs. He observed that the Director of Central Intelligence has issued governmentwide procedures on these matters and that for the EOP to prepare local procedures would result in unnecessary additional rules and expenditure of resources and could result in local procedures contrary to Director of Central Intelligence Directives. As we discussed above, we agree that the executive orders, Security Policy Board guidelines, and applicable Director of Central Intelligence Directives clearly lay out governmentwide standards and procedures for access to and safeguarding of SCI.
However, they are not a substitute for local operating procedures that provide agency personnel guidance on how to implement the governmentwide procedures. The Director agreed that his office needs to conduct on-site security inspections and hopes to begin the inspections during fiscal year 1999. The Director also noted that the primary focus of the inspections would be classification management and not inspections of the SCI program. To identify EOP procedures for acquiring access to SCI and safeguarding such information, we met with EOP officials responsible for security program management and discussed their programs. We obtained and reviewed pertinent documents concerning EOP procedures for acquiring SCI access and safeguarding such information. In addition, we obtained and reviewed various executive orders, Director of Central Intelligence Directives, and other documents pertaining to acquiring access to and safeguarding SCI material. We also discussed U.S. government security policies pertinent to our review with officials of the Information Security Oversight Office and the U.S. Security Policy Board. Additionally, we met with officials of the CIA responsible for adjudicating and granting EOP employees SCI access and discussed the CIA procedures for determining whether an individual meets Director of Central Intelligence Directive eligibility standards. As discussed with your office, we did not verify whether proper procedures were followed in granting SCI access to the approximately 840 EOP employees identified by the EOP Security Officer. Also, we did not review EOP physical security practices for safeguarding SCI and other classified material, conduct classified document control and accountability inspections, or perform other control tests of SCI material over which the EOP has custody. We performed our review from January 1998 until August 1998 in accordance with generally accepted government auditing standards. 
At your request, we plan no further distribution of this report until 30 days after its issue date. At that time, we will provide copies to appropriate congressional committees; the Chief of Staff to the President; the Assistant to the President for Management and Administration; the Director, Information Security Oversight Office; the Director of Central Intelligence, Central Intelligence Agency; the U.S. Security Policy Board; the Director of the Office of Management and Budget; and other interested parties. Please contact me at (202) 512-3504 if you or your staff have any questions concerning this report. Major contributors to this report were Gary K. Weeter, Assistant Director, and Tim F. Stone, Evaluator-in-Charge. The following is GAO's comment to the Assistant to the President for Management and Administration's letter dated September 23, 1998. 1. A representative of the Executive Office of the President (EOP) told us that the errors referred, for example, to statements in our draft report that the EOP does not conduct self-inspections and that the EOP lacks written procedures.
Pursuant to a congressional request, GAO reviewed whether the Executive Office of the President (EOP) has established procedures for: (1) acquiring personnel access to classified intelligence information, specifically sensitive compartmented information (SCI); and (2) safeguarding such information. GAO noted that: (1) the EOP Security Officer told GAO that, for the period January 1993 until June 1996: (a) he could not find any EOP-wide procedures for acquiring access to SCI for the White House Office, the Office of Policy Development, the Office of the Vice President, the National Security Council, and the President's Foreign Intelligence Advisory Board for which the former White House Security Office provided security support; and (b) there were no EOP-wide procedures for acquiring access to SCI for the Office of Science and Technology Policy, the Office of the United States Trade Representative, the Office of National Drug Control Policy, and the Office of Administration for which the EOP security office provides security support; (2) the EOP-wide security procedures issued in March 1998 do not set forth security practices EOP offices are to follow in safeguarding classified information; (3) in contrast, the Office of Science and Technology Policy and the Office of the Vice President had issued office-specific security procedures that deal with safeguarding SCI material; (4) the remaining seven EOP offices that did not have office-specific procedures for safeguarding SCI and other classified information stated that they rely on Director of Central Intelligence Directive 1/19 for direction on such matters; (5) neither the EOP Security Office nor the security staff of the nine EOP offices GAO reviewed have conducted security self-inspections as described in Executive Order 12958; (6) EOP officials pointed out that security personnel routinely conduct daily desk, safe, and other security checks
to ensure that SCI and other classified information is properly safeguarded; (7) these same officials also emphasized the importance and security value in having within each EOP office experienced security staff responsible for safeguarding classified information; (8) Executive Order 12958 gives the Director, Information Security Oversight Office, authority to conduct on-site reviews of each agency's classified programs; and (9) the Director of the Information Security Oversight Office said his office has never conducted an on-site security inspection of EOP classified programs. | 3,985 | 491 |
In recent years it has become clear that past fire suppression policies have not worked as effectively as was once thought. In fact, they have had major unintended consequences, particularly on federally owned lands. For decades the federal wildland fire community followed a policy of suppressing all wildland fires as soon as possible. As a result, over the years, the accumulations of brush, small trees, and other hazardous vegetation (underbrush) in these areas increased substantially. Since about one-third of all land in the United States is federally owned and consists largely of forests, grasslands, or other vegetation, the widespread buildup of this underbrush has created a national problem. Today, when a fire starts on federal lands, accumulated underbrush could act as fuel that leads to larger and more intense fires than would otherwise be the case. Accumulated underbrush, in turn, causes fires to spread more rapidly. This combination of factors greatly heightens the potential for fires to become catastrophic. As several recent studies have pointed out, without changes in the way federal agencies prepare for and respond to wildland fires, communities that border fire-prone lands--commonly known as the wildland-urban interface--will increasingly be at risk for fire damage. The 2000 fire season demonstrated the impact of past fire policies. In that year--one of the most challenging on record--large numbers of intense and catastrophic fires frequently surpassed the fire-fighting capacities of federal, state, and local agencies. Many of these fires became the out-of-control disasters that routinely led national television news broadcasts as they threatened or damaged the communities in their path. While most of these fires occurred in western states, other areas of the country were also affected.
These recent experiences have led the fire-fighting community across the country and policymakers at all levels of government to call for federal action to help mitigate this growing threat. The Forest Service and Bureau of Land Management are the two major federal land management fire-fighting agencies. The Forest Service manages about 192 million acres of land in 155 national forests and grasslands, and the Bureau of Land Management manages about 264 million acres of land. Also involved are the National Park Service, the Bureau of Indian Affairs, and the Fish and Wildlife Service within the Department of the Interior. Together, these agencies are caretakers of over one-third of all the land in the United States. The five land management agencies developed the National Fire Plan. The plan consists of five key initiatives:
Firefighting--Ensure adequate preparedness for future fire seasons;
Rehabilitation and Restoration--Restore landscapes and rebuild communities damaged by wildland fires;
Hazardous Fuel Reduction--Invest in projects to reduce fire risk;
Community Assistance--Work directly with communities to ensure adequate protection; and
Accountability--Be accountable and establish adequate oversight and monitoring for results.
The plan is expected to be a long-term effort to be implemented over a 10-year period. While the agencies are to use funding provided under the National Fire Plan to implement all five aspects of the Plan, they are to use the majority of these funds to increase their capacity for fire-fighting preparedness and suppression by acquiring and maintaining additional personnel and equipment. Agencies use preparedness funding at the beginning of each fire season to place fire-fighting resources in locations where they can most effectively respond to fires that start on federal lands. Agencies use fire suppression funding to control and extinguish wildland fires.
This effort includes supporting fire-fighting personnel and equipment on the fire line and at the established fire camp. The Forest Service and Interior have not effectively determined the level of fire-fighting personnel and equipment they need to fight wildland fires. As a result, they may not be as prepared as they could be to manage fires safely and cost-effectively. In managing wildland fires, the agencies rely primarily on (1) fire management plans, which contain information on how wildland fires should be fought, and (2) computer planning models that use the planning information to identify the most efficient level of personnel and equipment needed to safely and effectively fight fires. Of the five major federal land management agencies, only the Bureau of Land Management has fully complied with the fire policy requirement that all burnable acres have fire management plans. Furthermore, even though the fire policy calls for the agencies to coordinate their efforts, the Forest Service and Interior use three different computer planning models to determine the personnel and equipment needed to achieve their fire-fighting preparedness goals. Moreover, none of the models focus on the goals of protecting communities at the wildland-urban interface or fighting fires that go across the administrative boundaries of the federal agencies. Since 1995, the national fire policy has stated that fire management plans are critical in determining fire-fighting preparedness needs--that is, the number and types of personnel and equipment needed to respond to and suppress fires when they first break out. Among other things, fire management plans identify the level of risk associated with each burnable acre--including areas bordering the wildland-urban interface--and set forth the objectives that a local forest, park, or other federal land unit is trying to achieve with fire.
The plans provide direction on the level of suppression needed and whether a fire should be allowed to burn as a natural event to either regenerate ecosystems or reduce fuel loading in areas with large amounts of underbrush. In addition, fire management plans provide information that is entered into computer planning models to identify the level of personnel and equipment needed to effectively fight fires and ultimately help to identify the funding needed to support those resources. As of September 30, 2001, 6 years after the national fire policy was developed, over 50 percent of all federal areas that were to have a fire management plan consistent with the requirements of the national fire policy were without a plan. These areas did not meet the policy's requirements because they either had no plans or had plans that were out of date with the policy requirements because, among other things, they did not address fighting fires at the wildland-urban interface. Table 1 shows that, as of September 30, 2001, the Bureau of Land Management was the only agency with all of its acreage covered by a fire management plan that was compliant with the policy. In contrast, the percentage of units with noncompliant plans ranged from 38 percent at the Fish and Wildlife Service to 82 percent at the National Park Service. When we asked fire managers why fire management plans were out of date or nonexistent, they most often told us that higher priorities precluded them from providing the necessary resources to prepare and update the plans. Some of these fire managers told us that, without a compliant fire management plan, their local unit was following a full suppression strategy in fighting wildland fires, as the current fire policy requires. That is, they extinguish all wildland fires as quickly as possible regardless of where they are, without considering other fire management options that may be more efficient and less costly.
Other fire managers told us that while their fire management plans were not in compliance with the national policy, they were still taking action to ensure their day-to-day fire-fighting strategy was following the more important principles outlined in the current policy, such as addressing the fire risks around communities in the wildland-urban interface. A January 2000 Forest Service report clearly demonstrates the importance of adequate fire management planning in determining the level of fire-fighting personnel and equipment needed. In this report, Forest Service officials analyzed the management of two large wildland fires in California that consumed 227,000 acres and cost about $178 million to contain. Fire managers at these fires did not have fire management plans that complied with the national fire policy. The report stated that a compliant fire management plan would have made a difference in the effectiveness of the suppression efforts. For example, without a fire management plan, the local fire managers were not provided with a "let burn" option. Had this option been available, it could have reduced the need for personnel and equipment for one of the fires and lowered total suppression costs. The Forest Service and Interior acknowledge the need to complete and update their fire management plans. Both agencies have initiatives underway in response to the renewed emphasis on fire management planning under the National Fire Plan. Specifically, the agencies are developing consistent procedures and standards for fire management planning that will assist local units in their efforts to have fire management plans that are in compliance with the national fire policy. The agencies are expected to have a strategy in place by the spring of 2002 for accomplishing this objective. However, developing the procedures and standards and incorporating them into fire management plans at all local units is not likely to occur until 2003, at the earliest.
Because it has been 7 years since the 1995 policy first directed agencies to complete their fire management plans, and the agencies have given the issue low priority, it is critical that the Forest Service and Interior complete this initiative as expeditiously as possible. Fire management planning decisions about the amount and types of personnel and equipment needed to reach a given level of fire-fighting preparedness are based on computer planning models that the Forest Service and the Interior agencies have developed. The national fire policy directs the agencies to conduct fire management planning on a coordinated, interagency basis using compatible planning processes that address all fire-related activities without regard to the administrative boundaries of each agency. This level of interagency coordination is not now being achieved because of historical differences in the missions of the five land management agencies. The Forest Service and Interior agencies are currently using three different computer planning models to identify the personnel and equipment needed to respond to and suppress wildland fires. As a result, each model reflects different fire-fighting objectives and approaches in calculating the level of resources needed to fight fires safely and cost-effectively in terms of its own mission and responsibilities. This disparate approach is inconsistent with the current national fire management policy, which calls upon the agencies to use a coordinated and consistent approach to fire management planning. More importantly, each of the models considers only the fire-fighting resources available on the lands for which the agency has direct fire protection responsibilities. According to agency officials, this approach has been the general practice for fire management planning. Fire protection of nonfederal lands, including lands in the wildland-urban interface that pose direct risks to communities, is not incorporated into the models.
Yet, as set out in the national fire policy, these are the areas that are currently the focus of determining appropriate fire preparedness levels. Moreover, since wildland fires do not respect agency or other administrative boundaries, the policy states that fire management planning must be conducted across federal boundaries, on a landscape scale. However, none of the models are currently designed to achieve this objective. Because the models focus only on federal lands and the personnel and equipment available at the local unit, they do not consider the fire-fighting resources that are available from state and local fire authorities. These resources could decrease the need for federal fire-fighting personnel and equipment in certain areas. As a result of these problems with the computer models, the Forest Service and Interior are not able to adequately determine the number of fire-fighting personnel and equipment needed to meet fire-fighting policy objectives in the most cost-effective manner. The Forest Service and Interior have acknowledged our concerns and are reviewing how best to replace the three different computer planning models currently being used. A revised system for determining the resources needed would also help the agencies be responsive to congressional concerns. Past appropriations committee reports have directed the Forest Service and Interior to provide more detailed budget submissions on fire management planning and to base these submissions on common methods and procedures. These reports also directed the agencies to have a coordinated approach for calculating readiness, including consideration of the resources available from state and local fire authorities. The agencies are in the early stages of replacing the models with an interagency, landscape-scale fire planning and budget system that is expected to provide a single, uniform, and performance-based system for preparedness and fire management planning.
We are encouraged by this initiative but remain concerned over its implementation because the agencies have acknowledged that, even with aggressive scheduling, full implementation may take 4 to 6 years. Until then, fire management planning will not comply with current fire policy, will continue to be conducted on the basis of each agency's mission, and will remain focused within the boundaries of each local federal unit. While the agencies do not have a clear sense of the total resources they need to effectively conduct their fire-fighting activities, the Forest Service and Interior have nonetheless made progress in acquiring more fire-fighting personnel and equipment with the additional funding received under the National Fire Plan. However, as of September 30, 2001, they had not reached the full level of preparedness they had identified as necessary to carry out the objectives of the plan. Most of the Interior agencies are likely to reach their full level of preparedness in fiscal year 2002, while the Forest Service and the Fish and Wildlife Service will not reach this level until 2003 or later. Prior to the initiation of the National Fire Plan, the Forest Service and Interior estimated they were at about 74 percent and about 83 percent, respectively, of their desired preparedness levels. To increase these levels, the agencies needed to hire, develop, and support additional fire managers and fire fighters and to procure more fire-fighting equipment. The funding received in fiscal year 2001 is designed to help the agencies achieve these goals. The agencies are making good progress in hiring additional personnel. As of September 30, 2001, the Forest Service had filled about 98 percent of its needed positions, and the Interior agencies, in aggregate, had filled over 83 percent of their positions.
Because the availability of experienced fire-fighting personnel was limited and the agencies were competing for the same personnel in many cases, the agencies were not able to hire all of the fire-fighting personnel identified as needed in fiscal year 2001. The agencies have initiated new recruiting and outreach programs and expect to hire the remaining personnel they need by the 2002 fire season. Table 2 shows the status of the agencies' efforts in acquiring personnel. Regarding equipment, by the end of fiscal year 2002, most of the Interior agencies are likely to have all the fire-fighting equipment they identified as needed for implementing the National Fire Plan. During fiscal year 2001, the Bureau of Land Management and the Bureau of Indian Affairs ordered the equipment they needed, but about 31 percent of the equipment will not be delivered until fiscal year 2002. This specialized equipment, such as fire engines and water tenders, had to be built after contracting for its purchase, which delayed its delivery. The Forest Service and the Fish and Wildlife Service have made much less progress in purchasing the equipment they said they needed to achieve their fire-fighting preparedness goals. The Forest Service did not include in its budget request all of the necessary funds to procure equipment and pay for associated costs. Forest Service officials told us that this incomplete request was an oversight on their part. This underestimate of equipment and associated costs resulted in a total budget shortfall of about $101 million in fiscal year 2001, according to Forest Service estimates. Consequently, the agency has not been able to procure hundreds of pieces of fire-fighting equipment--fire engines, bulldozers, water tenders, and trucks--and associated supplies for the equipment or cover expenses for some other operating costs that are required if the agency is to reach its full level of fire-fighting preparedness.
Until this equipment is acquired, a few fire managers are taking measures to compensate for these shortcomings, such as contracting for needed equipment with state and private suppliers. According to the Forest Service, the agency may not attain the level of fire-fighting capacity it originally envisioned in the National Fire Plan until fiscal year 2003 at the earliest. Like the Forest Service, the Fish and Wildlife Service is not certain when it will get the equipment it identified as needed to implement the National Fire Plan. In October 2000, the agency did not take the opportunity it had to request funds for equipment to carry out the plan's objectives. As a result, the agency did not have the approximately $10 million it estimated it needed to purchase 90 pieces of fire-fighting equipment it identified as necessary. According to Fish and Wildlife Service officials, they were not aware that they could request additional one-time funds to purchase more equipment. Fish and Wildlife Service officials also told us they have no plans to request additional funding for their equipment. In commenting on a draft of this report, the departments acknowledged that the full level of preparedness as identified under the National Fire Plan was not reached by the end of fiscal year 2001. They stated that the Forest Service and the Fish and Wildlife Service will reach this level in 2003 or early 2004. They also said that in order to maintain the full level of preparedness in 2003 and beyond, the funding level may need to increase to keep pace with inflation and with new standards and requirements for crew safety, initial attack effectiveness, and direct and indirect management oversight and support, such as salaries, aviation contracts, and facility maintenance. Even though they have received over $800 million to increase their fire-fighting capacity, the Forest Service and Interior have not yet identified the results they expect to achieve with these additional resources.
It will therefore be difficult to determine the extent to which these additional personnel and equipment have increased the level of fire-fighting preparedness. Both the Forest Service and Interior recognize the need to develop methods for determining the impact of the hundreds of millions of dollars provided to increase fire-fighting capacity. To facilitate such accountability, both the Forest Service and Interior have developed performance measures. However, the measures do not focus on the results to be achieved and are not consistent among the agencies. The Forest Service's performance measure is designed to provide information on the amount of personnel and equipment it has to respond to a fire. This information will only indicate the amount of resources the Forest Service is using to address its fire-fighting needs. It will not indicate whether the agency has improved the effectiveness of its fire fighting with the additional personnel and equipment. The Interior agencies, on the other hand, have a performance measure that focuses on the goals they expect to achieve with their fire-fighting resources. However, the performance measure they are using is not specifically tied to the increased fire-fighting resources provided under the National Fire Plan. Instead, the Interior agencies are using the same goal they had prior to receiving the additional resources provided to implement the plan. Specifically, the Interior agencies' objective is to contain 95 percent of all fires during initial attack. Even if the agencies' performance measures were more results-oriented, they would only fulfill the requirements of the national fire policy if they were also consistent with each other. However, the measures are not consistent. The agencies were unable to provide us with a rationale for why the measures are not consistent.
The Forest Service and Interior acknowledge that the development of a common set of results-oriented performance measures is critical to implementing the National Fire Plan's fire-fighting preparedness objectives. They are now working together to develop a common set of wildland fire management performance measures that will be results-oriented, measurable, valid, and connected to the goals contained in the National Fire Plan. However, agency officials estimate that the planned completion date for developing and implementing these measures will be late in fiscal year 2004--more than 4 years after the increased funding was provided. Until the implementation of the National Fire Plan in 2001, both the Forest Service and the Interior agencies used a similar method to account for their fire-fighting personnel costs. However, beginning in fiscal year 2001, the Forest Service changed its accounting method for these costs. As a result, the agencies do not now use a consistent approach for collecting and reporting on fire-fighting costs, which makes budget cost comparisons and analyses more difficult. When the Forest Service prepares its annual budget for wildland fire management activities, the costs for personnel normally assigned to managerial, administrative, and other staff positions in the fire program are budgeted for in the "Wildland Fire Preparedness" account. Personnel in these categories are also frequently assigned to help fight wildland fires during the fire season. When these staff were assigned to a wildland fire prior to fiscal year 2001, the first 8 hours of their workday--their base hours--were charged to the preparedness account, where the funds were originally budgeted. Any additional time spent working on wildland fires above their base hours was charged to the "Wildland Fire Suppression" account.
However, starting in fiscal year 2001, the first year of the National Fire Plan, the Forest Service directed its personnel to charge all of their time to the suppression account when assigned to a wildland fire. According to the director of program and budget analysis, the Forest Service made the accounting change to better reflect the cost of wildland fire suppression. We have previously supported this type of accounting for personnel costs because it better tracks how these costs are actually incurred rather than as budgeted. The change will reduce costs charged to the Forest Service's preparedness activities and increase costs charged to its suppression activities when compared with years past and with Interior's accounting for its costs charged to similar activities. Because the Forest Service and Interior now use different methods of accounting for the cost of personnel assigned to wildland fires, it will now be much more difficult for the Congress and other decisionmakers to compare and analyze budget and cost information on the fire preparedness and suppression activities of the agencies at a national level. It is important to note that this accounting change will likely affect the Forest Service's fire-fighting budgets in future years. Over time, this accounting change is likely to result in an overall increase in the cost of fighting wildland fires in the Forest Service. As more and more managerial and administrative personnel are assigned to fire suppression activities, the total costs for these activities will increase. Since suppression budgets are based on a 10-year rolling average of suppression costs, future suppression budgets will increase. This situation will also add to the difficulty of comparing and analyzing Forest Service and Interior fire activities over time.
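The budget effect described above--base-hour personnel costs shifting into the suppression account and then feeding a 10-year rolling average--can be illustrated with a small arithmetic sketch. The dollar figures below are hypothetical, chosen only to show the mechanism; they are not the agencies' actual costs.

```python
# Sketch (hypothetical figures): when base hours are recharged from the
# preparedness account to the suppression account, yearly suppression
# costs rise, and a budget set as a 10-year rolling average of those
# costs rises with them.

def rolling_average_budget(costs, window=10):
    """Next year's suppression budget = average of the last `window` years."""
    recent = costs[-window:]
    return sum(recent) / len(recent)

# Ten years of suppression costs (in millions) under the old accounting,
# where the first 8 base hours of assigned staff were charged to preparedness.
old_costs = [500] * 10

# Under the new accounting, base hours for personnel assigned to fires are
# also charged to suppression -- assume 60 million more per year.
shifted = 60
new_costs = [c + shifted for c in old_costs]

print(rolling_average_budget(old_costs))  # 500.0
print(rolling_average_budget(new_costs))  # 560.0
```

Because the average is taken over 10 years, the full budget increase phases in gradually: each year that the new accounting replaces an old year in the window, the average moves one-tenth of the way toward the higher level.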
To effectively reduce the risk of catastrophic fire, the Forest Service and Interior are engaged in a long-term effort to reduce the large buildup of underbrush and other vegetative fuels that have accumulated to dangerous levels over the past several decades. This will ultimately reduce the number of large catastrophic fires that occur annually. However, until the Forest Service and Interior make progress in this area, it is even more critical to have adequate levels of personnel and equipment available to fight the intense, quick-spreading wildland fires that characterize current conditions in many areas. As the National Fire Plan and its underlying policy envision, these fire-fighting preparedness efforts will be much more effective if the agencies involved coordinate their efforts. The federal agencies have made progress in enhancing their fire-fighting capacity, but much work remains. Most fire management plans have yet to be updated so that they are consistent with current policy requirements. Until then, the coordinated approach to fire fighting called for in the National Fire Plan--having the agencies' plans reach beyond individual administrative boundaries--will not be realized. Moreover, it may be 6 years before the agencies develop an integrated, more consistent planning and budget system that includes a single model that incorporates information from updated fire management plans. Without this system in place, the results of the models currently being used cannot be relied upon for effectively identifying fire-fighting personnel and equipment needs. While the agencies are developing these plans and a new planning and budgeting system, they cannot now measure the results achieved with their additional personnel and equipment. The agencies plan to have consistent, results-oriented performance measures in place by fiscal year 2004. Until then, the Congress and the public cannot readily compare results across agencies.
Accountability would be further enhanced if both the Forest Service and the Interior agencies were using the same accounting methods for collecting and reporting on fire preparedness and fire suppression costs. Since they are not, Congress and the public have no consistent basis for comparing or analyzing these costs or associated budget requests. For the most part, the agencies acknowledge the need for improvements in each of these areas and have plans to address them. We are concerned, however, that these improvements may not occur expeditiously. It has been 7 years since the establishment of the national fire policy, in which the agencies first acknowledged the need to address many of these issues. Nonetheless, they are only now--with the impetus provided by the National Fire Plan--developing implementation plans and strategies for addressing them. Given this history and the added need to make certain that the substantial increase in funding that has come with the plan is used most efficiently, it is critical that the agencies be held accountable for following through on their plans for improvements. Ensuring that this occurs will require sustained monitoring and oversight by top agency officials and the Congress. If and when these improvements are completed, the agencies and the Congress will have a more credible basis for determining fire-fighting preparedness needs. In order to better meet the objectives of the National Fire Plan and improve the Forest Service's and Interior's ability to identify their fire-fighting preparedness needs, we recommend that the secretaries of agriculture and of the interior require the heads of their respective fire agencies to ensure that ongoing initiatives to address weaknesses in their preparedness efforts are fully implemented in a timely and consistent manner across the agencies.
In particular, the agencies need to ensure that fire management plans are completed expeditiously for all burnable acres and are consistent with the national fire policy; establish a single planning and budgeting system, applicable to all fire agencies, to determine fire-fighting personnel and equipment needs in accordance with up-to-date fire management plans; and develop performance measures identifying the results to be achieved with the personnel and equipment obtained with the additional funding provided under the National Fire Plan. We also recommend that the secretary of the interior require the Interior agencies to change their method for allocating and reporting fire-fighting personnel costs--similar to the method now being used by the Forest Service--to better reflect the cost of wildland fire suppression. We provided a draft of this report to the departments of agriculture and of the interior for review and comment. The departments provided a consolidated response to our report. They generally agreed with our recommendations to better identify their fire-fighting preparedness needs and provided additional information on the initiatives being taken. However, in commenting on our recommendation dealing with the development of performance measures to identify the results they are achieving under the National Fire Plan, the departments indicated they had already developed such measures. We disagree. The departments acknowledge elsewhere in their response that more work is needed to establish common performance measures, and recent meetings with department officials have indicated that agreement on common measures has not yet been obtained.
In commenting on this report, the departments expressed concerns that our report (1) did not give the departments enough credit for the progress they have made to increase their fire-fighting capacity under the National Fire Plan; (2) suggests that by simply updating fire management plans, fire managers will then be allowed to implement "let burn" decisions; and (3) implies that allowing more fires to burn naturally will automatically provide greater public and fire fighter safety. With respect to the first issue, we acknowledge the difficulty of the departments' tasks under the National Fire Plan and, as noted in the report, recognize that the agencies have made progress in increasing their fire-fighting preparedness. We also agree it is important to look at results under the plan to place in proper perspective the issue of accountability in fire-fighting preparedness. However, 1 year after receiving $830 million in additional preparedness funding under the National Fire Plan in fiscal year 2001, the agencies are still putting out the same percentage of fires at initial attack. To us, it is reasonable to expect that with the substantial increase in preparedness funds and the increased resources that these funds allowed the agencies to acquire, the results achieved would have been greater than they were in the past year. Second, the departments stated that the full range of fire-fighting options outlined in a local unit's fire management plan, including a "let burn" option, can only be used when the overall land management plan provides for them. In this regard, they noted that in many cases land management plans have not been updated to reflect the full range of fire-fighting options as outlined in fire management plans. As a result, they contend that until the land management plans are updated, the fire management plans that are out of date cannot be revised to include all fire-fighting options, such as a "let burn" option.
However, according to the 2001 update to the national fire policy, "the existence of obsolete land management plans should not be reason for failure to complete or update Fire Management Plans." Third, the departments stated that our report appears to state that allowing more fires to burn naturally will automatically provide greater public and fire fighter safety. We disagree. Our report states that fire management plans provide fire managers with direction on the level of suppression needed and whether a fire should be allowed to burn as a natural event to regenerate ecosystems or reduce fuel loading in areas with large amounts of underbrush and other vegetative fuels. Where appropriate, we have incorporated the departments' position on the different issues discussed in the report. The departments' comments appear in appendix II. As arranged with your offices, unless you publicly announce its contents earlier, we plan no further distribution of this report until 30 days after the date of this letter. At that time, we will send copies of this report to the secretary of agriculture; the secretary of the interior; the chief of the Forest Service; the directors of the Bureau of Land Management, the National Park Service, and the Fish and Wildlife Service; the deputy commissioner of the Bureau of Indian Affairs; the director of the Office of Management and Budget; and other interested parties. We will make copies available to others upon request. This report will also be available on GAO's home page at http://www.gao.gov/. If you or your staff have any questions about this report, please contact me at (202) 512-3841.
The overall objective of this review was to determine how the federal land management agencies--the Forest Service within the Department of Agriculture and the Bureau of Land Management, National Park Service, Fish and Wildlife Service, and Bureau of Indian Affairs within the Department of the Interior--prepare for wildland fires while meeting key objectives of the National Fire Plan. A primary objective of the plan is to ensure an adequate level of fire-fighting preparedness for coming fire seasons. Specifically, to assess the effectiveness of the agencies' efforts to determine the amount of fire-fighting personnel and equipment needed, we reviewed the extent to which the agencies adopted fire management plans as required by the national fire policy and the types and scope of computer planning models that the agencies use to determine their desired level of fire-fighting preparedness. We discussed these issues with officials at the five agencies' headquarters offices and at the National Interagency Fire Center in Boise, Idaho; BLM state and district offices; selected national forests, national parks, and state offices; and the National Academy of Public Administration. We also obtained, reviewed, and analyzed supporting documentation, such as laws, regulations, policies, and reports on wildland fires. Table 3 shows the sites we visited. We selected these sites to (1) meet with National Interagency Fire Center officials and the Interior agencies' wildland fire managers, who are located in Boise, Idaho; (2) obtain geographical dispersion of sites between eastern and western states, although more western sites were selected because more wildland fires occur in those areas; and/or (3) visit sites identified by agency officials as having recent fire history or as being good examples of fire-fighting preparedness.
In addition, we selected more of the Forest Service's sites than sites from other agencies because the Forest Service receives most of the fire-related funding. To determine the status of the agencies' efforts to acquire additional fire-fighting resources, we contacted each of the five land management agencies to obtain information on the number of temporary and permanent positions acquired as of September 30, 2001, and compared this information with the number of positions needed to meet the agencies' desired level of fire-fighting resources. We also obtained information from these agencies on the amount of fire-fighting equipment obtained with the increase in funding that they had identified as needed to carry out the objectives of the National Fire Plan. To determine the results that the agencies expected to achieve with their additional fire-fighting resources as determined through performance measures, we obtained documentation from the land management agencies and discussed with agency officials their management practices, including how they measure their progress in meeting fire-fighting preparedness objectives under the National Fire Plan. Finally, to determine whether the Forest Service and Interior were consistently reporting their fire-fighting personnel costs, we obtained information on the practices the agencies use to report these costs and compared the Forest Service's and the Interior agencies' practices in accounting for their fire-fighting preparedness funds. We conducted our work from February 2001 through January 2002 in accordance with generally accepted government auditing standards. In addition to those named above, Paul Bollea; Frank Kovalak; Paul Lacey; Carol Herrnstadt Shulman; and, in special memory, our colleague and friend, John Murphy made key contributions to this report.
Each year, fires on federal lands burn millions of acres, and federal land management agencies spend hundreds of millions of dollars to fight them. Wildland fires also threaten communities adjacent to federal lands. The Departments of Agriculture (USDA) and the Interior, the lead federal agencies in fighting wildfires, jointly developed a long-term fire-fighting strategy in September 2000. Five federal land management agencies--the Forest Service, the Bureau of Land Management, the Bureau of Indian Affairs, the National Park Service, and the Fish and Wildlife Service--are working together to accomplish the plan's objectives. GAO found that the Forest Service and Interior have not effectively determined the amount of personnel and equipment needed to respond to and suppress wildland fires. Although the agencies have acquired considerably more personnel and equipment than were available in 2000, they have not acquired all of the resources needed to implement the new strategy. Despite having received substantial additional funding, the two agencies have not yet developed results-oriented performance measures. The Forest Service simply measures the amount of fire-fighting resources it will be able to devote to fire fighting at each location, regardless of risk. Without results-oriented performance measures, it is difficult to hold the Forest Service accountable for the results it achieves. The Forest Service and the Interior agencies use different methods to report fire-fighting personnel costs--an approach that is not in keeping with policies requiring coordination and consistency across all aspects of fire management, including accounting for fire-related costs.
The Gramm-Leach-Bliley Act eliminated many of the legislative barriers to affiliations among banks, securities firms, and insurance companies. One of the expected benefits of expanded affiliation across industries was to provide financial institutions with greater access--by sharing information across affiliates--to a tremendous amount of nonpublic personal information obtained from customers through normal business transactions. This greater access to customer information is important to financial institutions wishing to diversify and may give customers better product information than they would have otherwise received. At the same time, there are increasing concerns about how financial institutions use and protect their customers' personal information. Some financial industry observers have characterized the privacy provisions contained in GLBA as the most far-reaching set of privacy standards--pertaining to financial information and certain personal data--ever adopted by Congress. Title V of GLBA sets forth major privacy provisions under two subtitles, which apply to a wide range of financial institutions. Among other things, Subtitle A requires a financial institution to provide its customers a notice describing its privacy policies and practices and how information is disclosed to affiliates and nonaffiliated third parties. Financial institutions are required to provide consumers the opportunity to "opt out" of having their nonpublic personal information shared with nonaffiliated third parties, with certain exceptions. Subtitle A also limits the ability of financial institutions to reuse and redisclose nonpublic personal information about consumers that is received from nonaffiliated financial institutions. Subtitle B of GLBA makes it a crime for persons to obtain, attempt to obtain, or cause to be disclosed customer information from financial institutions by false or fraudulent means.
Subtitle B provides for both criminal penalties and civil administrative remedies through FTC and federal banking regulatory enforcement. Subtitle B places the primary responsibility for enforcing the subtitle's provisions with FTC. In addition, federal financial regulators are given administrative enforcement authority with respect to compliance by depository institutions under their jurisdiction. Under section 525 of Subtitle B, the banking regulators, NCUA, and SEC are required to review their regulations and guidelines and to make the appropriate revisions as necessary to deter and detect the unauthorized disclosure of customer financial information by false pretenses. Subtitle B contains five categories of exceptions to the prohibition on obtaining customer information by false pretenses. Specifically, there are exceptions for law enforcement agencies; financial institutions under specified circumstances, such as testing security procedures; insurance institutions investigating insurance fraud; public data filed pursuant to the securities laws; and state-licensed private investigators involved in collecting child support judgments. Pretext calling is one common method used to fraudulently obtain nonpublic customer financial information from a financial institution. Pretext calling often involves an information broker--a company that obtains and sells financial information and other data about individual consumers--contacting a bank and pretending to be a customer who has forgotten an account number. Pretext callers may also pose as law enforcement agents, social workers, potential employers, and other figures of authority. The pretext caller then obtains detailed account data--often including exact balances and recent transactions--and sells that information to lawyers, collection agencies, or other interested parties. Perhaps more importantly, pretext calling can lead to "identity theft." 
Generally, identity theft involves "stealing" another person's personal identifying information--Social Security number, date of birth, mother's maiden name, etc.--to fraudulently establish credit, run up debt, or take over existing financial accounts. The American Bankers Association (ABA) reported that its 1998 industry survey found that $3 out of $4 lost by a community bank to credit fraud was due to some form of identity theft. Consumers targeted by identity thieves typically do not know they have been victimized until the thieves fail to pay the bills or repay the loans. Identity thieves also buy account information from information brokers to engage in check and credit card fraud. A survey by the California Public Interest Research Group and Privacy Rights Clearinghouse found that fraudulent charges made on new and existing accounts in identity theft cases averaged $18,000. The Identity Theft and Assumption Deterrence Act of 1998 made identity theft a federal crime punishable, in most circumstances, by a maximum term of 15 years' imprisonment, a fine, and criminal forfeiture of any personal property used or intended to be used to commit the offense. It is too soon to assess the efficacy and adequacy of the remedies provided for in Subtitle B of Title V of the Gramm-Leach-Bliley Act of 1999. As of March 31, 2001, federal regulatory and enforcement agencies had not taken any enforcement actions or prosecuted any cases under this law. Federal agencies have taken initial regulatory steps to ensure that financial institutions establish appropriate safeguards designed to protect customer information. Financial institutions are required to be in compliance with the new regulations by July 1, 2001. Lastly, we found that there are limited data available to indicate the prevalence of fraudulent access to financial information or pretext calling. 
As of March 31, 2001, FTC had initiated a number of nonpublic investigations targeting pretexters but had not yet prosecuted any cases under the Subtitle B provisions that prohibit obtaining customer financial information through fraudulent methods. Thus, FTC officials told us that it was too soon to assess the efficacy and adequacy of the remedies of this law because they had not had any experience prosecuting under the statute. They stated that it would take at least 3 to 5 years before there would be sufficient case history to permit them to assess the usefulness of the statute. FTC officials stated that one key benefit of Subtitle B is that it clearly established pretext calling as a federal crime, making it easier for them to take enforcement actions against firms that use fraud to access financial information. Prior to the enactment of GLBA, FTC had undertaken one enforcement action against an information broker that was engaging in pretext calling. FTC pursued this case under its general statute, section 5(a) of the Federal Trade Commission Act, which provides that "unfair or deceptive acts or practices in or affecting commerce are declared unlawful." One of the five FTC commissioners issued a dissenting statement because he felt pretext calling did not clearly violate FTC's long-standing deception or unfairness standard. In June 2000, FTC settled the case; the settlement prohibited the broker from engaging in pretext calling and included a $200,000 payment, which was subsequently suspended on the basis of the defendants' inability to pay. FTC reported to Congress that its staff began a nonpublic investigation in June 2000 to test compliance with Subtitle B provisions that prohibit the use of fraudulent or deceptive means to obtain personal financial information. On January 31, 2001, FTC issued a press release regarding its "Operation Detect Pretext." 
As part of this operation, FTC's staff had conducted a "surf" of more than 1,000 Web sites and a review of more than 500 advertisements in the print media for firms that offered to conduct financial searches. FTC reported that it had identified approximately 200 firms that offered to obtain and sell asset or bank account information about consumers. FTC stated that it had sent notices to these 200 firms on January 26, 2001, advising them that their practices must comply with GLBA's restrictions as well as other applicable federal laws, including the Fair Credit Reporting Act. According to the press release, the notices also informed the firms that FTC would continue to monitor Web sites and print media advertisements offering financial searches to ensure that they complied with GLBA and all other applicable federal laws. As part of Operation Detect Pretext, FTC published a consumer alert entitled Pretexting: Your Personal Information Revealed that offers tips to consumers on protecting their personal information. On April 18, 2001, FTC filed suit to halt the operations of three information brokers who used false pretenses, fraudulent statements, or impersonation to illegally obtain consumers' confidential financial information, such as bank balances, and sell it. The Department of Justice had not prosecuted any cases involving pretext calling as of March 31, 2001. Department officials told us that in their experience, pretext calling is typically a component of a larger fraud scheme. They stated that they would normally prosecute under the larger fraud schemes, such as mail, wire, or bank fraud. They supported the new legislation and felt it provided them with sufficient enforcement authority to address the full criminal activity for related bank fraud cases. They said it was premature to comment on the adequacy of the criminal penalties provided in the act because they had no experience in prosecuting cases under this statute. 
They believed it would likely take several years before they would have adequate case history under this law to make any suggestions concerning the remedies contained in Subtitle B. Officials from the federal banking agencies, SEC, and NCUA all agreed that it was too soon to assess the efficacy and adequacy of the remedies in Subtitle B. None of these agencies had taken enforcement actions against financial institutions for violations of Subtitle B--which prohibits using fraudulent means to obtain personal financial information. Federal banking officials told us that they did not anticipate that there would be many circumstances in which they would use this law against a financial institution, unless an officer or employee of a financial institution was involved in the fraud. They stated that the financial institutions are typically one of the "victims" of pretext calling because the cost of the related crimes--credit card fraud or identity theft--is often borne by the financial institutions. They told us that they felt they had sufficient enforcement authority to take action against a bank officer or employee involved in fraudulent activities prior to the passage of Subtitle B and did not believe the statute gave them any additional enforcement authority. However, they supported the legislation because it explicitly makes fraudulent access to financial information a crime. Subtitle B of GLBA requires the federal banking agencies, NCUA, SEC, or self-regulatory organizations, as appropriate, to review their regulations and guidelines and prescribe such revisions as necessary "to ensure that financial institutions have policies, procedures, and controls in place to prevent the unauthorized disclosure of customer financial information and to deter and detect" fraudulent access to customer information. 
As of April 2001, the federal banking agencies and NCUA were coordinating their efforts to update the guidelines on pretext calling that they issued to financial institutions in the latter part of 1998 and early 1999. The earlier advisory was jointly prepared by the federal banking agencies, Federal Bureau of Investigation, U.S. Secret Service, Internal Revenue Service, and Postal Inspection Service. The advisory alerted institutions to the practice of pretext calling and warned institutions about the need to have strong controls in place to prevent the unauthorized disclosure of customer information. According to federal banking agency officials, they had discussed updating the guidelines to provide more information on identity theft and its relationship to pretext calling, but had not issued the updated guidelines as of April 2001. In addition, NCUA and the federal banking agencies issued guidelines for financial institutions relating to administrative, technical, and physical safeguards for customer records and information on January 30, 2001, and February 1, 2001. As discussed earlier, Subtitle A of GLBA requires the federal banking regulatory agencies, FTC, NCUA, SEC, and the state insurance regulators to establish standards for safeguarding customer information for the institutions that they regulate. Among other things, these standards are to establish safeguards to protect against unauthorized access to or use of such records or information that could result in substantial harm or inconvenience to any customer. For example, the guidelines issued by the banking agencies and NCUA require institutions to have controls designed to prevent employees from providing customer information to unauthorized individuals who may seek to obtain customer information through fraudulent means. 
Financial institutions under the jurisdiction of the federal banking agencies and NCUA are required to have information security programs that satisfy the requirements of the guidelines in place by July 1, 2001. Officials at the bank regulatory agencies and NCUA told us that they plan to include the new guidelines for safeguarding customer financial information in their examination procedures. On June 22, 2000, SEC adopted regulations that require, among other things, brokers, dealers, investment companies, and registered investment advisors to adopt policies and procedures that address administrative, technical, and physical safeguards for the protection of customer records and information. These policies and procedures must be reasonably designed to (1) ensure the security and confidentiality of customer records and information, (2) protect against any anticipated threats or hazards to the security or integrity of customer records and information, and (3) protect against unauthorized access to or use of customer records or information that could result in substantial harm or inconvenience to any customer. SEC stated that it had conducted preliminary examinations of securities firms' efforts to comply with these requirements and planned to include firms' compliance with the regulations as a formal component of its examination program as of July 2001--the mandatory compliance date. SEC did not plan to develop additional guidance on pretext calling because it concluded that its regulation on safeguarding customer financial information would satisfy the agency guidance requirements of Subtitle B. FTC had begun the rulemaking process to establish safeguarding standards for customer information but had not issued its proposed regulations as of March 1, 2001. 
FTC officials told us that they expect to issue their proposed regulations by July 1, 2001--the date when financial institutions regulated by the federal banking agencies, NCUA, and SEC are required to have their safeguards in place. Subtitle B does not require state insurance regulators to review their regulations and guidance to ensure that financial institutions under their jurisdiction have policies, procedures, and controls in place to prevent the unauthorized disclosure of customer financial information. However, Subtitle A does require the state insurance regulators to establish standards for safeguarding customer financial information. As of March 1, 2001, the National Association of Insurance Commissioners (NAIC) was discussing how to approach these standards, either through issuing regulations, similar to SEC, or through general guidelines, similar to the federal banking regulators. In addition, the states were still in the process of drafting laws and regulations to be in compliance with the disclosure, information-sharing, and opt-out requirements contained in Subtitle A. Officials from the federal and state agencies whom we contacted were not aware of any available data sources that would indicate the prevalence of fraudulent access to financial information. Law enforcement officials told us that they do not collect such information. Justice officials stated that they track the number of offenses filed under the statute, but no matters had been brought forward as of March 1, 2001. Representatives from privacy or consumer groups also told us they were unaware of any statistics or databases that track the prevalence of pretexting. To obtain an indicator of the prevalence of pretext calling, we requested Suspicious Activity Report (SAR) data from the Financial Crimes Enforcement Network (FinCEN). 
Although banks are not obligated to report pretext-calling attempts, they are generally required to file a SAR when they detect a known or suspected criminal violation of federal law or a suspicious transaction related to a money laundering activity or a violation of the Bank Secrecy Act. Banks are not required to file SARs until a certain dollar threshold has been met or exceeded. FinCEN officials told us that "false pretense"--their wording for pretext--is not part of the SAR data because it is not considered a criterion for filing a SAR, but it may be kept as secondary information contained in the narrative field as reported by the banks. At our request, in September 2000, FinCEN officials searched the narrative field of their database and found that only 3 of the 400,000 SARs in their database contained narrative regarding the use of false pretenses to obtain customer financial information. FinCEN subsequently advised us that recently completed research on SAR data for calendar year 2000 indicated an increase in bank reporting on identity theft during the year. FinCEN noted that it is possible there may be an attendant increase in narrative reporting on attempted fraudulent access to financial information. Representatives of the Interagency Bank Fraud Working Group whom we contacted also discussed potentially expanding the narrative section of the SARs to capture information on pretext calling and identity theft. In our effort to identify indicators of the impact of Subtitle B, we reviewed information from FTC's Identity Theft Clearinghouse Database and the federal financial regulators' consumer complaint databases. According to FTC staff, victims of identity theft typically did not know how their personal financial information was obtained, unless they had lost their wallets or family members or friends were involved. Therefore, it is unlikely these victims would be aware of whether someone had used pretexting to obtain their information. 
FTC reported that it had processed over 40,000 entries from consumers and victims of identity theft as of December 31, 2000. In about 88 percent of those entries, the victim had no relationship with the identity theft suspect; in about 12 percent, the victim had a personal relationship with the suspect. According to officials from the federal banking agencies, NCUA, and SEC, they received few consumer complaints related to financial privacy. They believed that consumers may be more likely to report potential cases of fraud to their banks or to law enforcement agencies first, rather than contacting the financial regulators. Thus, consumer complaints submitted to the federal regulators may not accurately reflect the prevalence of financial privacy violations. In addition, consumer complaint databases maintained by the regulators typically did not have a specific category to capture pretext-calling allegations, which are distinct from related incidents of fraud, such as credit card fraud. In October 2000, FDIC expanded its coding system to capture additional information related to financial privacy complaints. Pretexting is difficult to detect and is likely to be underreported. Many officials told us that pretexting was a common practice, especially among private investigators. According to many law enforcement officials we spoke with, crimes involving pretexting are particularly difficult to prove, and it was unlikely that pretexting would be reported or prosecuted as a single crime. If a pretexter is clever in his or her fraud scheme and successful in obtaining financial information, the financial institution is unaware that it was fooled into providing information. Often there is a time lag before victims of pretext calling suffer financial loss, and they may not be aware of how their financial information was obtained. 
According to law enforcement officials we spoke with, offenders using fraud to access financial information are generally detected as part of a larger crime, such as credit card fraud, identity theft, or other bank fraud. An increase in related crimes, although not directly correlated to pretext calling, may be a possible indication of the prevalence of fraudulent access to financial information. For example, the number of SAR filings by the banks related to check fraud, debit and credit card fraud, false statement, and wire transfer fraud continued to increase from 1998 to 1999, according to the October 2000 report by the Bank Secrecy Act Advisory Group. As stated previously, more time and experience are needed to assess the efficacy and adequacy of the remedies contained in Subtitle B regarding fraudulent access to financial information. Therefore, we are not making any recommendations for additional legislation or regulatory actions. During our consultations with representatives from FTC, the federal banking agencies, NCUA, SEC, and federal and state enforcement agencies and insurance regulators, we obtained their views about the efficacy and adequacy of the subtitle's other provisions. Some federal and state officials and representatives from consumer and privacy groups we contacted offered suggestions regarding possible changes to Subtitle B provisions, which are presented below. As discussed earlier, we did not evaluate how practical these suggestions were since we found no consensus on these issues. These suggestions reflect the continued concerns and issues raised by FTC staff and the privacy and consumer groups with whom we spoke. FTC staff and some state officials suggested that states be allowed to take enforcement actions for violations of Subtitle B provisions. According to these FTC staff and state officials, this would allow the states to augment the federal resources used to enforce compliance with the Subtitle B prohibition against pretext calling. 
Earlier versions of the House and Senate bills that were the basis for Subtitle B contained provisions that provided for state actions for injunctive relief or for recovering damages of not more than $1,000 per violation. These provisions were subsequently eliminated in the House and Conference versions of the legislation. FTC staff stated that the additional resources of the state attorneys general would be particularly helpful in enforcing compliance by some of the smaller information brokers that may otherwise escape detection or monitoring. According to some of the state officials we contacted, allowing state actions under the federal statute would increase the deterrent effects of the legislation. However, other state officials stated that they did not expect that providing states with enforcement authority under this statute would result in significantly greater enforcement activity due to resource limitations at the state enforcement level. Some of the consumer and privacy groups suggested that a private right of action provision be added to allow the consumers who were the victims of pretext calling to obtain financial compensation from the perpetrators of the violations. Like the state enforcement action provision, earlier House and Senate versions of Subtitle B contained provisions, which were subsequently eliminated, that would have allowed for civil lawsuits by individuals and financial institutions. These provisions recognized that pretext-calling victims will, in some instances, have a stronger incentive to proceed against an information broker or the broker's client than would a law enforcement agency or prosecutor operating with limited resources and forced to juggle competing priorities, particularly in those cases in which the amount of monetary damages is minimal. 
According to some of the state officials we contacted, the possibility of civil lawsuits would potentially increase the penalties for violating the statute's provisions and, thus, help to deter such criminal activities. However, some officials did not agree with this suggestion and stated that a private right of action could also result in unintended consequences, such as frivolous lawsuits and overcrowded court dockets. There were differing suggestions made regarding the provision in the statute that allows private investigators to use pretext calling under certain conditions. The statute allows state-licensed private investigators to use pretext calling to collect child support from persons adjudged to have been delinquent by a federal or state court and if authorized by an order or judgment of a court of competent jurisdiction. The exception for state-licensed private investigators is nullified if prohibited by another federal or state law or regulation. Some consumer and privacy representatives stated that the exception was too broad and could result in potential abuse. On the other hand, one of the trade groups for private investigators wanted Congress to amend Subtitle B to allow the use of pretexting as an investigative tool to locate hidden assets when investigators contact judgment debtors or persons who have committed fraud. According to this trade group, one of the unintended consequences of Subtitle B is that it makes it easier for criminals and judgment debtors to hide their assets from lawful collection. 
We provided a draft of this report to the Chairman of the Federal Trade Commission, the Attorney General, the Secretary of the Treasury, the Chairman of the Federal Deposit Insurance Corporation, the Chairman of the Federal Reserve Board, the Comptroller of the Currency, the Director of the Office of Thrift Supervision, the Acting Chairman of the National Credit Union Administration, the Chair of the National Association of Insurance Commissioners, and the Acting Chairman of the Securities and Exchange Commission for their review and consultation. The Federal Trade Commission, Treasury, Federal Deposit Insurance Corporation, Federal Reserve Board, Office of the Comptroller of the Currency, NCUA, and SEC agreed with our report's overall message and provided technical comments, which we incorporated into the appropriate sections of this report. The Office of Thrift Supervision, Justice, and NAIC agreed with our overall message and did not provide any comments on our report. In commenting on our draft report, the Financial Crimes Division of the U.S. Secret Service expressed concern over an increase in attacks directed at on-line service databases that ultimately contain personal financial information, such as credit card numbers, Social Security numbers, etc. The Secret Service also emphasized that it supports any steps taken toward deterring individuals from attempting attacks directed at any institution's infrastructure for the purposes of obtaining financial information. Although we acknowledge these concerns and the Secret Service's support for securing the privacy of financial information on-line, our study did not focus on on-line information security. We are sending copies of this report to the requesting congressional committees. We are also sending copies to the Honorable Robert Pitofsky, Chairman, Federal Trade Commission; the Honorable John Ashcroft, the Attorney General; the Honorable Paul H. 
O'Neill, Secretary of the Treasury; the Honorable Donna Tanoue, Chairman, the Federal Deposit Insurance Corporation; the Honorable Alan Greenspan, Chairman, the Federal Reserve Board of Governors; the Honorable John D. Hawke, Jr., Comptroller of the Currency; the Honorable Ellen Seidman, Director, the Office of Thrift Supervision; the Honorable Dennis Dollar, Acting Chairman, the National Credit Union Administration; the Honorable Kathleen Sebelius, Chair, the National Association of Insurance Commissioners; and the Honorable Laura S. Unger, Acting Chairman, the Securities and Exchange Commission. If you or your staff have any questions on this report, please contact me at (202) 512-8678 or Harry Medina at (415) 904-2000. Key contributors to this report were Debra R. Johnson, Nancy Eibeck, Shirley A. Jones, and Charles M. Johnson, Jr. To determine the efficacy and adequacy of the remedies provided by the Gramm-Leach-Bliley Act of 1999 (GLBA) in addressing attempts to obtain financial information by false pretenses, we interviewed officials from the Department of Justice, the Department of the Treasury, the Federal Deposit Insurance Corporation, the Federal Reserve Board, the Federal Trade Commission (FTC), the National Credit Union Administration, the Office of the Comptroller of the Currency, the Office of Thrift Supervision, and the Securities and Exchange Commission. Within Justice, we interviewed officials representing its Criminal and the Civil Divisions, the Federal Bureau of Investigation, and the Executive Office of the United States Attorneys. In addition, we talked with officials at seven U.S. attorney offices: (1) Eastern District of New York, (2) Southern District of New York, (3) Central District of California, (4) Northern District of California, (5) District of Massachusetts, (6) District of Minnesota, and (7) District of Colorado. The officials at the U.S. 
attorney offices we spoke with are primarily responsible for overseeing any federal prosecution of financial crimes that occur in their respective districts. We selected these offices because they were located in states that had been identified as being particularly active regarding consumer financial privacy. We also consulted with a number of state officials located in those same five states. Specifically, we interviewed staff from the state insurance regulatory agency and the attorney general's office located in California, Colorado, Massachusetts, Minnesota, and New York. In addition, we interviewed representatives of the National Association of Insurance Commissioners. Within Treasury, we talked with officials from its Office of Financial Institutions, Office of Enforcement, Financial Crimes Enforcement Network, Internal Revenue Service, and U.S. Secret Service. We interviewed FTC staff from the Bureau of Consumer Protection who monitor compliance of financial institutions under FTC's jurisdiction and FTC officials responsible for designing and implementing "Operation Detect Pretext," and we reviewed relevant FTC documents on FTC's enforcement activities related to information brokers. We also examined the regulations and guidelines developed by the Federal Deposit Insurance Corporation, the Federal Reserve Board, FTC, the National Credit Union Administration, the Office of the Comptroller of the Currency, the Office of Thrift Supervision, and the Securities and Exchange Commission related to their implementation of the privacy provisions of GLBA. In addition, we requested and reviewed data from the various agencies regarding enforcement activity and consumer complaints related to fraudulent access to financial information. 
To identify suggestions for additional legislation or regulatory actions with respect to fraudulent access to financial information, we obtained the viewpoints of the federal and state agencies' officials we met with and interviewed a number of consumer and privacy groups that have been active in the area of financial privacy. Specifically, we interviewed representatives of the Center for Democracy and Technology, the Consumer Federation of America, Consumers Union, Eagle Forum, the Electronic Privacy Information Center, the Privacy Rights Clearinghouse, Privacy Times, the U.S. Public Interest Research Group, and the California Public Interest Research Group. In addition, we talked with the American Bankers Association; the Association of Credit Bureaus; the North American Securities Administrators Association, Inc.; and the National Council of Investigation and Security Services, which represents the investigation and guard industry. We conducted our work in Washington, D.C.; San Francisco, CA; and New York City, NY, between August 2000 and April 2001, in accordance with generally accepted government auditing standards. | This report provides information on (1) the efficacy and adequacy of remedies provided by the Gramm-Leach-Bliley Act of 1999 in addressing attempts to obtain financial information by false pretenses and (2) suggestions for additional legislation or regulatory action to address threats to the privacy of financial information held by financial institutions. As of March 2001, federal regulatory and enforcement agencies had not taken any enforcement actions or prosecuted any cases under Subtitle B. The Federal Trade Commission (FTC) and the Department of Justice are still in the process of taking steps to ensure that the financial institutions that they regulate have reasonable controls to protect against fraudulent access to financial information. 
Although all of the federal regulators and privacy experts whom GAO contacted agreed that more time and experience are needed to determine if Subtitle B remedies adequately address fraudulent access to financial information, FTC staff and privacy experts suggested legislative changes to Subtitle B. GAO did not evaluate the potential impact or practicality of these suggestions because it found no consensus on these ideas. | 6,215 | 216 |
According to senior SBA officials in headquarters and the field, several aspects of the current organizational alignment contribute to the challenges faced by SBA management. The problem areas include cumbersome communication links between headquarters and field units; complex, overlapping organizational relationships; confusion about the district offices' primary customer; and a field structure not consistently matched with mission requirements. According to the agency scorecard report for SBA, while SBA recognizes the need to restructure, little progress has been made to date. In response to our findings and additional challenges identified by OMB and the SBA Inspector General, SBA drafted a 5-Year Workforce Transformation Plan. The 1990s realignment--in which the regions were downsized, but not eliminated, and the Office of Field Operations was created, but never fully staffed--resulted in the cumbersome communication links between headquarters and field units according to senior SBA officials in headquarters and the field. The Office of Field Operations had fewer than 10 staff at the time of our review, and senior SBA officials told us that it would be impossible for such a small office to facilitate the flow of information between headquarters and district offices as well as was done by the 10 regional offices when each region had its own liaison staff. As a result, headquarters program offices sometimes communicate with the district offices directly and they sometimes go through the Office of Field Operations. To further complicate communication, the regional offices are still responsible for monitoring goals and coordinating administrative priorities to the district locations. Officials described how these multiple lines of communication have led to district staff being on the receiving end of conflicting or redundant requests. 
While some SBA officials felt that the regions had a positive effect on communication between headquarters and the districts, others felt that the regions were an unnecessary layer of management. The SBA Inspector General's office found similar problems with communication within SBA when it conducted management challenge discussion groups with almost 50 senior officials from SBA headquarters, regional, and district offices. SBA has recognized that as it transforms itself, it needs to make the lines of communication between the districts, regions, and headquarters clearer to help bring about quick, effective decision-making. SBA plans to increase the responsibilities of the regional offices, perhaps by adding a career deputy regional administrator to assist the Regional Administrator in overseeing the district offices. Under SBA's draft plan, the deputy would also work closely with the Office of Field Operations to coordinate program delivery in the field. We also found evidence of complex, overlapping organizational relationships, particularly among field and headquarters units. For example, district staff working on SBA loan programs report to their district management, while loan processing and servicing center staff report directly to the Office of Capital Access in headquarters. Yet, district office loan program staffs sometimes need to work with the loan processing and servicing centers to get information or to expedite loans for lenders in their district. Because loan processing and servicing centers report directly to the Office of Capital Access, requests that are directed to the centers sometimes must go from the district through the Office of Capital Access then back to the centers. District managers and staff said that sometimes they cannot get answers to questions when lenders call and that they have trouble expediting loans because they lack authority to direct the centers to take any action. 
Lender association representatives said that the lines of authority between headquarters and the field can be confusing and that practices vary from district to district. Figure 1 depicts the variety of organizational relationships we found between SBA headquarters and field units. SBA plans to eliminate the current complicated overlapping organizational relationships between field organizations and headquarters organizations by consolidating functions and establishing specific lines of authority. SBA's draft transformation plan states that this effort will reduce management layers and provide a more efficient management structure. Specifically, SBA plans to further centralize loan processing, servicing, oversight, and liquidation functions; eliminate area offices for surety bonds and procurements by making regional or district offices responsible; and move oversight for entrepreneurial development programs to district offices. We found disagreement within SBA over the primary customer of the district offices. Headquarters executives said that the district offices primarily serve small businesses, while district office officials told us that their primary clients are lenders. The headquarters officials said that the role of the district office was in transition and that, because many lending activities had been centralized, the new role for the district offices was to work with small businesses. However, the district office managers said that their performance ratings were weighted heavily on aspects of loan activity. Moreover, there is only one program--8(a) business development--through which district offices typically work directly with small businesses, further reinforcing the perception of the district managers that lenders rather than small businesses are their primary customers. 
According to SBA's transformation plan, the mission of its districts will become one of marketing SBA's continuum of services, focusing on the customer, and providing entrepreneurial development assistance. SBA stated that over the next 5 years, it is fully committed to making fundamental changes at the district level, changes that have been discussed for years, but have never been fully implemented. To begin this change, SBA plans to test specific strategies for focusing district offices' goals and efforts on outreach and marketing of SBA services to small businesses and on lender oversight in three offices during fiscal year 2002. SBA plans to implement the results in 10-20 districts in fiscal year 2003. As part of this change, SBA will need to carefully consider how the new mission of its district offices will affect the knowledge, skills, and abilities--competencies--district staff will need to be successful in their new roles. If competency gaps are identified, SBA will need to develop recruitment, training, development, and performance management programs to address those gaps. SBA managers said that, in some cases, the current field structure does not consistently match mission requirements. For example, the creation of loan processing and servicing centers moved some, but not all, loan- related workload out of the district offices. District offices retained responsibility for the more difficult loans and loans made by infrequent lenders. Similarly, the regional offices were downsized, but not eliminated during the 1990s. In addition, they said that some offices and centers are not located to best accomplish the agency's mission. For example, Iowa has two district offices located less than 130 miles apart, and neither manages a very large share of SBA's lending program or other workload. SBA also has a loan-related center located in New York City, a very high- cost area where it has trouble attracting and retaining staff. 
Figure 2 shows the locations of SBA offices around the country. SBA officials also stressed that congressional direction has played a part in SBA's current structure. SBA officials pointed out that Congress has created many new offices, programs, aspects of existing programs, and pilot projects and has prescribed reporting relationship, grade, and/or type of appointment for several senior SBA officials. We found 78 offices, programs, or program changes that were created by laws since 1961, with most of the changes occurring in the 1980s and 1990s. Eleven SBA staff positions and specific reporting relationships were also required by law. In its transformation plan, SBA discusses its difficulty with matching its field structure with mission requirements and states that in order for the field structure to reflect the new mission and customer focus, consolidation of functions and the elimination or reduction of redundant offices may be necessary. The result of consolidations will be a streamlined organization with reduced management layers and an increased span of control for the field organizations that remain. For example, over the course of the 5-year plan, SBA plans to consolidate all loan processing, servicing, and liquidation into fewer centers, but give them an expanded role for handling all the functions currently carried out in the district offices. Integrating personnel, programs, processes, and resources to support the most efficient and effective delivery of services--organizational alignment--is key to maximizing an agency's performance and ensuring its accountability. The often difficult choices that go into transforming an organization to support its strategic and programmatic goals have enormous implications for future decisions. 
Our work has shown that the major elements that underpin a successful transformation--and that SBA should consider employing--include strategic planning; strategic human capital management; senior leadership and accountability; alignment of activities, processes, and resources to support mission achievement; and internal and external collaboration. Proactive organizations employ strategic planning to determine and reach agreement on the fundamental results the organization seeks to achieve, the goals and measures it will set to assess programs, and the resources and strategies it will need to achieve its goals. Strategic planning is used to drive programmatic decision-making and day-to-day actions and, thereby, help the organization be proactive, able to anticipate and address emerging threats, and take advantage of opportunities, rather than remain reactive to events and crises. Leading organizations, therefore, understand that strategic planning is not a static or occasional event, but a continuous, dynamic, and inclusive process. Moreover, it can guide decision-making and day-to-day activities. According to the agency scorecard report, SBA has not articulated a clear vision of what role it should fill in the marketplace. In our review of SBA's fiscal year 2000 performance report and fiscal year 2002 performance plan, we reported that we had difficulty assessing SBA's progress in achieving its goals because of weaknesses in its performance measures and data.We said that SBA should more clearly link strategies to measurable performance indicators, among other things. SBA said it has made adjustments to its managing for results process and now has identified specific performance parameters that must be met. Additionally, SBA recognizes the need for its workforce transformation plan and 5-Year Strategic Plan to complement each other. 
People--or human capital--are an organization's most important asset and define its character, affect its capacity to perform, and represent its knowledge base. We have recently released an exposure draft of a model of strategic human capital management that highlights the kinds of thinking that agencies should apply and steps they can take to manage their human capital more strategically. The model focuses on four cornerstones for effective human capital management--leadership; strategic human capital planning; acquiring, developing, and retaining talent; and results-oriented organizational cultures--and a set of associated critical success factors that SBA and other federal agencies may find useful in helping to guide their efforts. In its workforce transformation plan, SBA said that it recognizes that employees are its most valuable asset. It plans to emphasize the importance of human capital by clearly defining new agency functions and identifying and developing the skills and competencies required to carry out the new mission. SBA also plans, beginning in fiscal year 2002, to conduct a comprehensive skill and gap analysis for all employees. In addition, SBA will increase its emphasis on its two succession planning programs, the Senior Executive Service Candidate Development Program and the District Director Development Program, to recruit qualified individuals for future leadership roles. SBA also said that it plans to increase the number of professional development opportunities for employees to ensure that they can build missing competencies. The importance of senior leadership and commitment to change is essential. Additionally, high performing organizations have recognized that a key element of an effective performance management system is to create a "line of sight" that shows how individual responsibilities and day-to-day activities are intended to contribute to organizational goals. 
In addition to creating "lines of sight," a performance management system should encourage staff to focus on performing their duties in a manner that helps the organization achieve its objectives. The SBA Administrator has demonstrated his commitment to transforming SBA by tasking his Deputy Administrator and Chief Operating Officer with coordinating the implementation of SBA's 5-year workforce transformation plan. He also said that the transformation plan will complement the agency's 5-Year Strategic Plan and that SBA's successes will be measured by the successes of its clients. These are important steps in aligning expectations within the agency toward agency goals. As SBA begins to implement its transformation plan, it will also be important to be certain that agency goals are reflected in the performance objectives and ratings of SBA's senior executives and the performance appraisal systems for lower-level employees. Sustained senior management attention to implementation of the plan and support from key internal and external stakeholders will be important ingredients in the ultimate success or failure of SBA's transformation. An organization's activities, core processes, and resources must be aligned to support its mission and help it achieve its goals. Leading organizations start by assessing the extent to which their programs and activities contribute to fulfilling their mission and intended results. They often find, as our work suggested, that their organizational structures are obsolete and that levels of hierarchy or field-to-headquarter ratios must be changed. Similarly, as priorities change, resources must be moved and workforces redirected to meet changing demands. According to the President's Management Agenda, while SBA recognizes the need to restructure, little progress has been made to date and SBA has not translated the benefits of asset sales and technological improvements into human resource efficiencies. 
In response, SBA drafted a 5-Year Workforce Transformation Plan intended to adjust its programs and delivery mechanisms to reflect new ways of doing business and the changing needs of its clients. SBA said that it plans to continue with asset sales, to enhance technology by using contractors, and to use technology to move work to people--more of whom will be deployed at smaller facilities in the future. There is also a growing understanding that all meaningful results that agencies hope to achieve are accomplished through networks of governmental and nongovernmental organizations working together toward a common purpose. Internally, leading organizations seek to provide managers, teams, and employees at all levels the authority they need to accomplish programmatic goals and work collaboratively to achieve organizational outcomes. Communication flows up and down the organization to ensure that line staffs have the ability to provide leadership with the perspective and information that the leaders need to make decisions. Likewise, senior leaders keep the line staff informed of key developments and issues so that the staff can best contribute to achieving organizational goals. SBA has long understood the need for collaboration. In the late 1980s, SBA shifted its core functions of direct loan making and entrepreneurial assistance to reliance on resource partners to deliver SBA programs directly. This shift allowed SBA to greatly increase its loan volume and the number of clients served. However, SBA has lost much of its direct connection with its small business owner clients. SBA has only recently begun to develop the appropriate oversight tools for its resource partners and the appropriate success measures for its programs and staff. Mr. Chairman, this concludes my prepared statement. I would be pleased to respond to any questions that you or other Members of the Subcommittee may have at this time. 
| The Small Business Administration (SBA) has made organizational structure and service delivery changes during the past 10 years. However, ineffective lines of communication, confusion over the mission of district offices, complicated and overlapping organizational relationships, and a field structure not consistently matched with mission requirements all combine to impede SBA staff efforts to deliver services effectively. SBA's structural inefficiencies stem in part from realignment efforts during the mid-1990s that changed SBA's functions but left aspects of the previous structure intact, congressional influence over the location of field offices and centers, and legislative requirements such as specified reporting relationships. In response to GAO's findings and additional challenges identifies by the Office of Management and Budget and the SBA Inspector General, SBA recently announced a draft 5-year workforce transformation plan that discusses many of GAO's findings regarding the difficulties posed by its current structure. Organizational alignment is crucial if an agency is to maximize its performance and accountability. As SBA executes its workforce transformation plan, it should employ strategies common to successful transformation efforts both here and abroad. Successful efforts begin with instilling senior-level leadership, responsibility, and accountability for organizational results and transformation efforts. Organizations that have successful undertaken transformation efforts also typically use strategic planning and human capital management, alignment of activities, processes, and resources, and internal and external collaboration to underpin their efforts. | 3,127 | 291 |
Influenza is more severe than some other viral respiratory infections, such as the common cold. Most people who contract influenza recover completely in 1 to 2 weeks, but some develop serious and potentially life- threatening medical complications, such as pneumonia. People aged 65 and older, people of any age with chronic medical conditions, children younger than 2 years, and pregnant women are generally more likely than others to develop severe complications from influenza. Vaccination is the primary method for preventing influenza and its more severe complications. Produced in a complex process that involves growing viruses in millions of fertilized chicken eggs, influenza vaccine is administered annually to provide protection against particular influenza strains expected to be prevalent that year. Experience has shown that vaccine production generally takes 6 or more months after a virus strain has been identified; vaccines for certain influenza strains have been difficult to mass-produce. After vaccination, it takes about 2 weeks for the body to produce the antibodies that protect against infection. According to CDC recommendations, the optimal time for vaccination is October through November, because the annual influenza season typically does not peak until January or February. Thus, in most years vaccination in December or later can still be beneficial. At present, two vaccine types are recommended for protection against influenza in the United States: an inactivated virus vaccine injected into muscle and a live virus vaccine administered as a nasal spray. The injectable vaccine--which represents the large majority of influenza vaccine administered in this country--can be used to immunize healthy individuals and those at highest risk for complications, including those with chronic illness and those aged 65 and older, but the nasal spray vaccine is currently approved for use only among healthy individuals aged 5 to 49 years who are not pregnant. 
Vaccine manufacture and purchase take place largely within the private sector: for the 2004-05 influenza season, two companies (one producing the injectable vaccine and one producing the nasal spray) manufactured vaccine for the U.S. market. Although vaccination is the primary strategy for protecting individuals who are at greatest risk of serious complications and death from influenza, antiviral drugs can also contribute to the treatment and prevention of influenza. Four antiviral drugs have been approved for treatment. If taken within 2 days after symptoms begin, these drugs can reduce symptoms and make someone with influenza less contagious to others. Three of the four antiviral drugs are also approved for prevention; according to CDC, they are about 70 to 90 percent effective for preventing illness in healthy adults. HHS has primary responsibility for coordinating the nation's response to public health emergencies. As part of its mission, the department has a role in the planning needed to prepare for and respond to an influenza pandemic. One action the department has taken is to develop a draft national pandemic influenza plan, titled Pandemic Influenza Preparedness and Response Plan, which was released in August 2004 for a 60-day comment period. Within HHS, CDC is the principal agency for protecting the nation's health and safety. CDC's activities include efforts to prevent and control diseases and to respond to public health emergencies. CDC and its Advisory Committee on Immunization Practices (ACIP) recommend which population groups should be targeted for vaccination each year and, when vaccine supply allows, recommend that any person who wishes to decrease his or her risk of influenza-like illness be vaccinated. FDA, another HHS agency, also plays a role in preparing for the annual influenza season and for a potential pandemic. FDA is responsible for ensuring that new vaccines and drugs are safe and effective. 
The agency also regulates and licenses vaccines and antiviral agents. HHS has limited authority to control vaccine production and distribution directly; influenza vaccine supply and marketing are largely in the hands of the private sector. Although the Public Health Service Act authorizes the Secretary of HHS to "take such action as may be appropriate" to respond to a public health emergency, as determined and declared by the Secretary, it is not clear whether or to what extent the Secretary could directly influence the manufacture or distribution of influenza vaccine to respond to an influenza pandemic. The appropriateness of the Secretary's response would depend on the nature of the public health emergency, for example on the available evidence relating to a pandemic. According to a senior HHS official involved in HHS emergency preparedness activities, manufacturers of vaccine for the U.S. market have agreed in principle to switch to production of pandemic influenza vaccine should the need arise and proper compensation and indemnification be provided; therefore, he said, it would probably be unnecessary for the federal government to nationalize vaccine production, although the federal government has the legal authority to do so if circumstances warrant it. For the 2004-05 influenza season, CDC estimated as late as September 2004 that about 100 million doses of vaccine would be available for the U.S. market. CDC and ACIP recommended vaccination for about 185 million people, including roughly 85 million people at high risk for complications. On October 5, 2004, however, one manufacturer announced that it could not provide its expected production of 46-48 million doses--roughly half of the U.S. supply of expected vaccine. 
Because a large proportion of vaccine produced by the other major manufacturer of injectable vaccine had already been shipped before October 5, 2004, about 25 million doses of injectable vaccine for high-risk individuals and others, and about 1 million doses of the nasal spray vaccine for healthy people, were available after the announcement to be distributed to Americans who wanted an influenza vaccination. Preparing for and responding to an influenza pandemic differ in several respects from preparing for and responding to a typical influenza season. For example, past influenza pandemics have affected healthy young adults who are not typically at high risk for complications associated with influenza, and a pandemic could result in an overwhelming burden of ill persons requiring hospitalization or outpatient medical care. In addition, the demand for vaccine may be greater in a pandemic. Challenges remain in planning for purchase and distribution of vaccine and defining priority groups in the event of a pandemic. HHS has not finalized planning for an influenza pandemic, leaving unanswered questions about the nation's ability to prepare for and respond to such an outbreak. For the past 5 years, we have been urging HHS to complete its pandemic influenza plan. The document remains in draft form, although federal officials said in June 2005 that an update of the plan is being completed and is expected to be available in summer 2005. Key questions about the federal role in purchasing and distributing vaccines during a pandemic remain, and clear guidance on potential groups that would likely have priority for vaccination is lacking in the current draft plan. One challenge is that the draft pandemic plan does not establish the actions the federal government would take to purchase or distribute vaccine during an influenza pandemic. 
Rather, it describes options for vaccine purchase and distribution, which include public-sector purchase of all pandemic influenza vaccine; a mixed public-private system where public-sector supply may be targeted to specific priority groups; and maintenance of the current largely private system. The draft plan does not specifically recommend any of these options. According to the draft plan, the federal government's role may change over the course of a pandemic, with greater federal involvement early, when vaccine is in short supply. Noting that several uncertainties make planning vaccination strategies difficult, the draft plan states that national, state, and local planning needs to address possible contingencies, so that appropriate strategies are in place for whichever situation arises. If public-sector vaccine purchase is an option, establishing the funding sources, authority, or processes to do so quickly may be needed. During the 2004-05 shortage, some state health officials reported problems with states' ability, with regard to both funding and the administrative process, to purchase influenza vaccine. For example, during the effort to redistribute vaccine to locations of greatest need, the state of Minnesota tried to sell its available vaccine to other states seeking additional vaccine for their high-risk populations. According to federal and state health officials, however, certain states lacked the funding or authority under state law to purchase the vaccine when Minnesota offered it. In response to problems encountered during the 2004-05 shortage, the Association of Immunization Managers proposed in 2005 that federal funds be set aside for emergency purchase of vaccine by public health agencies and that cost not be a barrier in acquiring vaccine to distribute to the public. 
Although an influenza pandemic may differ from an annual influenza season, experience during the 2004-05 shortage illustrates the importance of having a distribution plan in place ahead of time to prevent delays when timing is critical: Collaborating with stakeholders to create a workable distribution plan is time consuming. After the October 5, 2004, announcement of the sharp reduction in influenza vaccine supply, CDC began working with the sole remaining manufacturer of injectable vaccine on plans to distribute this manufacturer's remaining supply to providers across the country. The plan had two phases and benefited from the manufacturer's voluntary agreement to share proprietary information to help identify geographic areas of greatest need for vaccine. The first phase, which began in October 2004, filled or partially filled orders from certain provider types, including state and local public health departments and long-term care facilities. The second phase, which began in November 2004, used a formula to apportion the remaining doses across the states according to each state's estimated percentage of the national unmet need. States could then allocate doses from their apportionment to providers and facilities, which would purchase the vaccine through a participating distributor. The state ordering process under the second phase continued through mid-January. Health officials in several states commented on the late availability of this vaccine; officials in one state, for example, remarked that the phase two vaccine was "too much, too late." Identifying priority groups in local populations also takes time. Federal, state, and local officials need to have information on the population of the priority groups and the locations where they can be vaccinated to know how, where, and to whom to distribute vaccine in the event of an influenza pandemic. 
During the 2004-05 influenza season, federal officials developed a distribution plan to allocate a limited amount of vaccine, but the states also had to determine how much vaccine was needed and where to distribute it within their own borders. For example, state health officials in Florida did not know exactly how many high-risk individuals needed vaccination, so they surveyed long-term care facilities and private providers to estimate the amount of vaccine needed to cover high-risk populations. It took nearly a month for state officials to compile the results of the surveys, to decide how many doses needed to be distributed to local areas, and to receive and ship vaccine to the counties. Distributing the vaccine to a state or locality is not the same as administering the vaccine to an individual. Once vaccine has been distributed to a state or local agency, individuals living in those areas still need to be vaccinated. Vaccinating a large number of people is challenging, particularly when demand exceeds available supply. For example, during the 2004-05 influenza season, many places giving vaccinations right after the shortage was announced were overwhelmed with individuals wanting to be vaccinated. Certain local public health departments in California, including the Santa Clara County Public Health Department, provided chairs and extra water for people waiting in long lines outdoors in warm weather. Fear of a more virulent pandemic influenza strain could exacerbate such scenarios. A number of states reported that they did not have the capacity to immunize large numbers of people and partnered with other organizations to increase their capacity. 
For example, in 2004-05, according to state health officials in Florida, county health departments, including those in Orange and Broward Counties, worked with a national home health organization to immunize high-risk individuals by holding mass immunization clinics and setting up clinics in providers' offices to help administer available vaccine quickly. Other locations, including the local health department in Portland, Maine, held lotteries for available vaccine; according to local health officials, however, administrative time was required to arrange and publicize the lottery. HHS's draft pandemic plan does not define priority groups for vaccination, although the plan states that HHS is developing an initial list of suggested priority groups and soliciting public comment on the list. The draft plan instructs the states to define priority groups for early vaccination and indicates that as information about virus severity becomes available, recommendations will be formulated at the national level. According to the plan, setting priorities will be iterative, tied to vaccine availability and the pandemic's progression. Without agreed-upon identification of potential priority groups in advance, however, problems can arise. During the 2004-05 season, for example, CDC and ACIP acted quickly on October 5, 2004, to narrow the priority groups for available vaccine, giving the narrowed groups equal importance. In some places, however, there was not enough available vaccine to cover everyone in these narrowed priority groups, so states set their own priorities among these groups. Maine, for example, excluded health care workers from the state's early priority groups because state officials estimated that there was not enough vaccine to cover everyone in CDC and ACIP's priority groups. 
Another challenge in responding to a pandemic will be to clearly communicate information about the situation and the nation's response plans to public health officials, providers, and the public. Experience during the 2004-05 vaccine shortage illustrates the critical role communication plays when information about vaccine supply is unclear.

Communicating a consistent message and clearly explaining any apparent inconsistencies. In a pandemic, clear communication on who should be vaccinated will be important, particularly if the priority population differs from those targeted for annual influenza vaccination, or if the priority groups in one area of the country differ from those in others. During the 2004-05 influenza season, health officials in Minnesota reported that some confusion resulted when the state determined that vaccine was sufficient to meet demand among the state's narrower priority groups and made vaccine available to other groups, such as healthy individuals aged 50-64 years, earlier than recommended by CDC. Health officials in California reported a similar situation. State health officials pointed out that in mid-December, local radio stations in California were running two public service announcements--one from CDC advising those 65 and older to be vaccinated and one from the California Department of Health Services advising those 50 and older to be vaccinated. State officials emphasized that these mixed messages created confusion.

Communicating information from a primary source. Having a primary and timely source of information will be important in a pandemic. In the 2004-05 influenza season, individuals seeking vaccine could have found themselves in a communication loop that provided no answers.
For example, CDC advised people seeking influenza vaccine to contact their local public health department; in some cases, however, individuals calling the local public health department would be told to call their primary care provider, and when they called their primary care provider, they would be told to call their local public health department. This lack of a reliable source of information led to confusion and possibly to high-risk individuals' giving up and not receiving the protection of an annual influenza vaccination.

Recognizing that different communication mechanisms are important and require resources. Another challenge in communicating plans in the event of a pandemic will be to ensure that the communication mechanisms used reach all affected populations. During the 2004-05 influenza season, public health officials reported the importance of different methods of communication. For example, officials from the Seattle-King County Public Health Department in Washington State reported that it was important to have a hotline as well as information posted on a Web site, because some seniors calling Seattle-King County's hotline reported that they did not have access to the Internet. According to state and local health officials, however, maintaining these communication mechanisms took time and strained personnel resources. In Minnesota, for example, to supplement state employees, the state health department asked public health nursing students to volunteer to staff the state's influenza vaccine hotline.

Educating health care providers and the public about all available vaccines.
For the 2004-05 season, approximately 3 million doses of nasal spray vaccine were ultimately available for vaccinating healthy individuals aged 5-49 years who were not pregnant, including some individuals (such as health care workers in this age group and household contacts of children younger than 6 months) in the priority groups defined by CDC and ACIP, yet some of these individuals were reluctant to use this vaccine because they feared that the live virus in the nasal spray could be transmitted to others. State health officials in Maine, for example, reported that the state purchased about 1,500 doses of the nasal spray vaccine for their emergency medical service personnel and health care workers, yet administered only 500 doses. Challenges in ensuring an adequate and timely supply of influenza vaccine and antiviral drugs--which can help prevent or mitigate the number of influenza-related deaths until a pandemic influenza vaccine becomes available--may be exacerbated during an influenza pandemic. Particularly given the time needed to produce vaccines, influenza vaccine may be in short supply, or not widely available at all, during the initial stages of a pandemic. According to CDC, maintaining an abundant annual influenza vaccine supply is critically important for protecting the public's health and improving our preparedness for an influenza pandemic. The shortages of influenza vaccine in 2004-05 and previous seasons have highlighted the fragility of the influenza vaccine market and the need for its expansion and stabilization. In its budget request for fiscal year 2006, CDC reports that it plans to take steps to ensure an expanded influenza vaccine supply. The agency's fiscal year 2006 budget request includes $30 million for CDC to enter into guaranteed-purchase contracts with vaccine manufacturers to ensure the production of bulk monovalent influenza vaccine.
If supplies fall short, this bulk product can be turned into a finished trivalent influenza vaccine product for annual distribution. If supplies are sufficient, the bulk vaccine can be held until the following year's influenza season and developed into finished vaccines if the bulk products maintain their potency and the circulating strains remain the same. According to CDC, this guarantee will help expand the influenza market by providing an incentive to manufacturers to expand capacity and possibly encourage additional manufacturers to enter the market. In addition, CDC's fiscal year 2006 budget request includes an increase of $20 million to support influenza vaccine purchase activities. In the event of a pandemic, before a vaccine is available or during a period of limited vaccine supply, use of antiviral drugs could have a significant effect. Antiviral drugs can be used against all strains of pandemic influenza and, because they can be manufactured and stored before they are needed, could be available both to prevent illness and, if administered within 48 hours after symptoms begin, to treat it. Like vaccine, antiviral drugs take several months to produce from raw materials, and according to one antiviral drug manufacturer, the lead time needed to scale up production capacity and build stockpiles may make it difficult to meet any large-scale, unanticipated demand immediately. HHS's National Vaccine Program Office also reported that in a pandemic, the manufacturing capacity and supply of antiviral drugs is likely to be less than the global demand. For these reasons, the National Vaccine Program Office reported that analysis is under way to determine optimal strategies for antiviral drug use when supplies are suboptimal; the office also noted that antiviral drugs have been included in the national stockpile. HHS has purchased more than 7 million doses of antiviral drugs for the national stockpile.
Nevertheless, this stockpile is limited, and it is unclear how much will be available in the event of a pandemic, given existing production capacity. Moreover, some influenza virus strains can become resistant to one or more of the four approved influenza antiviral drugs, and thus the drugs may not always work. For example, the avian influenza virus strain (H5N1) identified in human patients in Asia in 2004 and 2005 has been resistant to two of the four existing antiviral drugs. The lack of sufficient hospital and workforce capacity is another challenge that may affect response efforts during an influenza pandemic. This lack of capacity could be more severe during an influenza pandemic than during other disasters, such as a tornado, a hurricane, or an intentional release of a bioterrorist agent, because a pandemic would likely have widespread and sustained effects. Public health officials we spoke with said that a large-scale outbreak, such as an influenza pandemic, could strain the available capacity of hospitals by requiring entire hospital sections, along with their staff, to be used as isolation facilities. In addition, most states lack surge capacity--the ability to respond to the large influx of patients that occurs during a public health emergency. For example, few states reported having the capacity to evaluate, diagnose, and treat 500 or more patients involved in a single incident. In addition, few states reported having the capacity to rapidly establish clinics to immunize or treat large numbers of patients. Moreover, shortages in the health care workforce could occur during an influenza pandemic because higher disease rates could result in high rates of absenteeism among workers who are likely to be at increased risk of exposure and illness or who may need to care for ill family members.

Important challenges remain in the nation's preparedness and response should an influenza pandemic occur in the United States.
As we learned in the 2004-05 influenza season, when vaccine supply, relative to demand, is limited, planning and effective communication are critical to ensure timely delivery of vaccine to those who need it. HHS's current draft plan lacks some key information for planning our nation's response to a pandemic. It is important for the federal government and the states to work through critical issues--such as how vaccine will be purchased, distributed, and administered; which population groups are likely to have priority for vaccination; what communication strategies are most effective; and how to address issues related to vaccine and antiviral supply and hospital and workforce capacity--before we are in a time of crisis. Although HHS contends that agency flexibility is needed during a pandemic, until key federal decisions are made, public health officials at all levels may find it difficult to plan for an influenza pandemic, and the timeliness and adequacy of response efforts may be compromised. Mr. Chairman, this concludes my prepared statement. I would be happy to respond to any questions you or other Members of the Committee may have at this time. For further information about this testimony, please contact Marcia Crosse at (202) 512-7119. Jennifer Major, Nick Larson, Gay Hee Lee, Kim Yamane, George Bogart, and Ellen W. Chu made key contributions to this statement.

Influenza Pandemic: Challenges Remain in Preparedness. GAO-05-760T. Washington, D.C.: May 26, 2005.

Flu Vaccine: Recent Supply Shortages Underscore Ongoing Challenges. GAO-05-177T. Washington, D.C.: November 18, 2004.

Emerging Infectious Diseases: Review of State and Federal Disease Surveillance Efforts. GAO-04-877. Washington, D.C.: September 30, 2004.

Infectious Disease Preparedness: Federal Challenges in Responding to Influenza Outbreaks. GAO-04-1100T. Washington, D.C.: September 28, 2004.

Emerging Infectious Diseases: Asian SARS Outbreak Challenged International and National Responses. GAO-04-564.
Washington, D.C.: April 28, 2004.

Public Health Preparedness: Response Capacity Improving, but Much Remains to Be Accomplished. GAO-04-458T. Washington, D.C.: February 12, 2004.

Infectious Diseases: Gaps Remain in Surveillance Capabilities of State and Local Agencies. GAO-03-1176T. Washington, D.C.: September 24, 2003.

Severe Acute Respiratory Syndrome: Established Infectious Disease Control Measures Helped Contain Spread, but a Large-Scale Resurgence May Pose Challenges. GAO-03-1058T. Washington, D.C.: July 30, 2003.

SARS Outbreak: Improvements to Public Health Capacity Are Needed for Responding to Bioterrorism and Emerging Infectious Diseases. GAO-03-769T. Washington, D.C.: May 7, 2003.

Infectious Disease Outbreaks: Bioterrorism Preparedness Efforts Have Improved Public Health Response Capacity, but Gaps Remain. GAO-03-654T. Washington, D.C.: April 9, 2003.

Bioterrorism: Preparedness Varied across State and Local Jurisdictions. GAO-03-373. Washington, D.C.: April 7, 2003.

Global Health: Challenges in Improving Infectious Disease Surveillance Systems. GAO-01-722. Washington, D.C.: August 31, 2001.

Flu Vaccine: Steps Are Needed to Better Prepare for Possible Future Shortages. GAO-01-786T. Washington, D.C.: May 30, 2001.

Flu Vaccine: Supply Problems Heighten Need to Ensure Access for High-Risk People. GAO-01-624. Washington, D.C.: May 15, 2001.

Influenza Pandemic: Plan Needed for Federal and State Response. GAO-01-4. Washington, D.C.: October 27, 2000.

West Nile Virus Outbreak: Lessons for Public Health Preparedness. GAO/HEHS-00-180. Washington, D.C.: September 11, 2000.

Global Health: Framework for Infectious Disease Surveillance. GAO/NSIAD-00-205R. Washington, D.C.: July 20, 2000.

This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO.
However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.
The purpose of DOE's contract with ORAU is to provide management and direction of programs through ORISE that maintain and advance science and education capabilities supporting DOE's strategic goals in the areas of defense, energy, science, and the environment. To support these goals, ORAU carries out a range of activities for DOE, including administering workforce development programs to help ensure the future availability of scientists and engineers. These workforce development programs are intended to encourage individuals to enter STEM careers, complement students' academic programs, provide faculty with state-of-the-art information to use in the classroom, and develop a pool of talent from which federal agencies can draw for future employment. ORAU groups its workforce development activities into the following three categories:

Research participation program: This program provides research experiences to students, postgraduates, faculty, and other participants. These activities make up the ORISE program.

Fellowships and scholarships: Among other things, these programs provide financial assistance for students to obtain academic degrees in areas related to the sponsoring agency's mission.

Events, academies, and competitions: These programs, such as the National Science Bowl, a nationwide middle- and high-school science and mathematics competition, are designed to encourage participation in scientific and technological fields.

In fiscal year 2014, most of the federal agency expenditures on workforce development activities administered by ORAU were for the ORISE program. In that year, federal agencies expended $193.8 million on the ORISE program, followed by $3.2 million for fellowships and scholarships, and $1.7 million for events, academies, and competitions. These expenditures supported 5,854 research participation appointments, 72 scholarships and fellowships, and 1,191 special event participants.
ORISE research participants engage in a variety of subject areas, such as climate change, weather impacts on military projects, infectious and chronic diseases, computer simulations of potential terrorist attacks or natural disasters, and conservation measures for fish and wildlife. (See app. I for additional examples, by sponsoring agency, of project subject areas in fiscal year 2014.) Research participants may also be involved in developing briefing materials on their research for agency leadership, publishing the results of their research, participating in conferences, and obtaining other research-related experiences. Research participants are not considered federal employees or federal contractors and do not receive a salary. They instead receive stipends and certain other expenses to defray their costs of living during their appointments. Participant appointments can be full- or part-time and can last for weeks, such as in the case of a 10- to 12-week summer program, or years, such as in the case of a 1-year postgraduate program renewable for up to 4 additional years. From fiscal year 2010 through fiscal year 2014, DOE and other sponsoring agencies expended a total of $776.4 million for the ORISE program, with DOD, DOE, and HHS accounting for the majority of the expenditures. Over that period, annual program expenditures increased by 73 percent, and the number of annual appointments rose by 42 percent. Stipends accounted for the largest portion of agencies' expenditures. Sponsoring agency expenditures per appointment varied, affected by factors such as the length of research participants' appointments and the program support services sponsoring agencies had ORAU perform. During fiscal years 2010 through 2014, sponsoring agencies, which included 11 departments and other federal agencies, expended a total of $776.4 million for the ORISE program.
DOD, HHS, and DOE collectively had the highest expenditures for the program (over 87 percent) over that period and had the highest number of appointments in fiscal year 2014 (over 88 percent). Within DOD, the Army was the primary component that sponsored ORISE research participants, accounting for 77 percent of DOD expenditures over the 5-year period and 70 percent of appointments in fiscal year 2014. Within HHS over the same time periods, the Food and Drug Administration (FDA) and Centers for Disease Control and Prevention (CDC) accounted for about 59 percent and 32 percent of expenditures, respectively, and about 53 percent and 36 percent of appointments. See figure 1 below and appendix II for further information on agencies' expenditures and numbers of appointments. Sponsoring agencies' total annual expenditures increased from $112.3 million in fiscal year 2010 to $193.8 million in fiscal year 2014, a 73 percent increase (61 percent when adjusted for inflation), and the number of appointments grew from 4,128 to 5,854, a 42 percent increase (see fig. 2 below and app. II for further information). An ORAU official who maintains data on appointments attributed the growth in the number of appointments to an increase in the program's popularity, which led to the addition of new sponsoring agencies and increases in the number of appointments per sponsoring agency. Agency component officials we interviewed cited a variety of reasons for wanting to sponsor ORISE research participants, including access to the ORISE program's recruiters and network of connections; administrative support from the ORISE program that the sponsoring agencies could not easily supply themselves; and the speed, flexibility, and relatively low overhead cost of the ORISE program. For example, an official who managed the research participation program at the U.S.
Army Medical Research Institute of Infectious Diseases told us that hiring and managing staff to administer their own program would cost more than the overhead that they pay for the ORISE program. The average total expenditure per appointment in the ORISE program also increased from fiscal year 2010 through fiscal year 2014, from about $27,200 per appointment in fiscal year 2010 to about $33,100 per appointment in fiscal year 2014. Expenditures per appointment may have risen for a variety of reasons, such as changes in the average education level of research participants and the average length of their appointments. For example, the proportions of appointments at different education levels in fiscal year 2014 shifted compared to the proportions in fiscal year 2010, with recent graduate and postdoctoral appointments increasing 65 percent and 68 percent, respectively, while undergraduate appointments increased 12 percent. An ORAU official said that postgraduate appointments generally command higher stipends than undergraduate appointments. For example, according to information provided by FDA's Center for Drug Evaluation and Research, monthly stipends at their center could be as high as $2,897 for currently enrolled undergraduate students and as high as $7,569 for postgraduates with PhD degrees. The official said that postgraduate appointments at their center also last longer than undergraduate appointments, resulting in higher expenditures per appointment. From fiscal year 2010 through fiscal year 2014, stipends--funds paid to research participants to defray their costs of living during their appointments--comprised the majority of agencies' expenditures for the ORISE program.
Sponsoring agencies' other expenditures for the program included the following categories of expenses:

Travel and other research participant expenses: Funds paid to research participants to cover particular expenses not covered by their stipends, such as expenses for travel to conferences or other appointment-related destinations.

Program support and overhead: Funds paid to DOE to cover ORAU's expenses for administering the appointment of research participants at agencies. These expenses included (1) program support expenses--direct expenses for services ORAU provides to agencies, such as managing recruitment activities--and (2) general and administrative expenses--indirect expenses, such as building expenses, paid by agencies as a fixed percentage (negotiated by DOE and ORAU) of total expenditures on the ORISE program.

Federal administrative and security charges: Fees paid to DOE by other sponsoring agencies, including (1) a federal administrative charge of 3 percent of an agency's total expenditures on the ORISE program to offset DOE's administrative expenses for work conducted on behalf of other agencies and (2) a charge applied to Strategic Partnership Projects to supplement DOE support for safeguards and security expenses.

Figure 3 shows the percentage of expenditures for each category of expense. Sponsoring agencies' expenditures per appointment in the ORISE program varied among agencies. In each year from fiscal year 2010 through fiscal year 2014, the lowest average expenditure per appointment for a sponsoring agency was $14,396 or less, and the highest average expenditure per appointment was $42,996 or more. For example, in fiscal year 2014, the Department of the Interior expended an average of $12,246 per appointment, while the Environmental Protection Agency expended an average of $44,099 per appointment. The proportions expended for different categories of expenses also varied.
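To illustrate how these expense categories combine for a single appointment, the following is a simplified, hypothetical sketch. Only the 3 percent federal administrative charge is stated in this report; the 5 percent G&A rate and all dollar amounts are invented placeholders (the actual G&A rate is negotiated by DOE and ORAU), and G&A is modeled here as a simple markup on direct expenses:

```python
# Hypothetical illustration of the ORISE expense categories described above.
# The 5% G&A rate and dollar amounts are placeholders; only the 3% federal
# administrative charge is stated in the report.
def orise_cost_sketch(stipend, travel, program_support, ga_rate=0.05):
    direct = stipend + travel + program_support
    # G&A: indirect expenses charged as a fixed percentage, modeled here
    # as a markup on direct expenses for simplicity.
    ga = direct * ga_rate
    subtotal = direct + ga
    # Federal administrative charge: 3% of the agency's total expenditures,
    # paid to DOE by sponsoring agencies other than DOE.
    federal_admin = subtotal * 0.03
    return {
        "stipend": stipend,
        "travel": travel,
        "program_support": program_support,
        "general_and_administrative": ga,
        "federal_admin_charge": federal_admin,
        "total": subtotal + federal_admin,
    }

costs = orise_cost_sketch(stipend=30_000, travel=1_000, program_support=4_000)
print(f"Total: ${costs['total']:,.2f}")
# Total: $37,852.50
```

The sketch also shows why per-appointment totals differ across agencies: stipend levels, reimbursed expenses, and the program support services selected all feed into the total.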
For example, data provided by ORAU showed that the proportion agencies expended on stipends ranged from 69 percent to 88 percent in fiscal year 2014. We identified the following factors that contributed to per-appointment expenditures varying among agencies:

Research participants' appointment terms. Differences in the terms that sponsoring agency components set for research participants' appointments contributed to variation in expenditures per appointment. Some appointments lasted for days, weeks, or months, while others lasted for a full year or more. For example, FDA's National Center for Toxicological Research's Summer Student Research Program placed research participants in a 10-week summer program. In contrast, the National Library of Medicine's Associate Fellowship Program placed research participants in 1- or 2-year residency programs. In addition, some appointments were full-time, while others were part-time.

Methods of setting stipends. Officials at sponsoring agency components reported that they used differing methods to set research participants' stipends. ORAU officials said that they sometimes provided advice to the agencies, but that the agencies ultimately set their own stipends. Almost all of the officials we interviewed at sponsoring agency components said that they considered applicants' education levels when setting stipends, but they varied in the other factors they considered. For example, some used the Office of Personnel Management's General Schedule pay scale, but others did not. The officials also differed in the extent to which they considered other factors, including prior work experience, salaries in the private and government sectors, stipends received by research participants in other programs, and geographic location.
Some of the officials said that they set fixed stipends for all research participants, but others said that they determined stipends individually or made exceptions to fixed stipends when attempting to fill particular appointments.

Other expenses covered. Sponsoring agency components chose to reimburse their research participants for different types and amounts of expenses not covered by their stipends. For example, in fiscal year 2014, the Air Force expended an average of $492 per appointment to pay for participants' travel expenses, while the Environmental Protection Agency expended an average of $1,387 per appointment for that purpose. Other expenses that can vary among sponsoring agency components include payment of research participants' tuition and fees at their academic institutions; reimbursement for the costs of moving to a research site; allowances for housing at research sites; payment of visa processing fees for foreign research participants; and purchase of safety equipment, books, and research supplies.

Services performed by ORAU. Sponsoring agency components selected from and paid for many different services performed by ORAU. These services included managing recruitment activities, processing applications, making and monitoring appointments, designing and implementing program enhancements, paying stipends, administering other research participant expenses and insurance, managing domestic and foreign travel, analyzing and providing financial reports, developing and administering program goals and objectives, handling immigration status issues, and other tasks. Agencies' selections of these services determined the amount that they paid in program support expenses for each of their appointments.

According to DOE officials, the ORISE program consists of a set of distinct activities, or separate programs, that ORAU carries out on behalf of DOE and other sponsoring agency components.
As a result, DOE considers responsibility for assessing the effectiveness of ORISE program activities to be dispersed among the sponsoring agencies, each of which may have separate objectives for sponsoring research participants. Sponsoring agency components we reviewed use questionnaires and other methods to assess how well the program is working. Responsibility for ensuring research participants do not perform inherently governmental functions is also dispersed among sponsoring agencies. However, documents provided by DOE, DOD, and HHS components to research participants, coordinators, and mentors contain varying levels of detail on the prohibition on nonfederal employees performing inherently governmental functions. Without detailed guidance, sponsoring agencies have limited assurance that the prohibition is being followed. In May 2013, the National Science and Technology Council, which coordinates executive branch science and technology policy, released its 5-year strategic plan for STEM education, which stated that federal agencies would focus on building and using evidence-based approaches to evaluate the federal investment in STEM education. DOE officials told us that, because the ORISE program consists of separate activities that ORAU carries out on behalf of DOE and other sponsoring agencies, these agencies choose whether to assess the effectiveness of ORISE program activities as part of their other investments in STEM education. As a result, other than periodically evaluating ORAU's performance (with input from sponsoring agencies) under its contract to determine ORAU's award fee, DOE does not assess the overall effectiveness of the activities that ORAU carries out under the ORISE program, according to a DOE official. 
For example, the official said DOE does not assess how ORISE program activities at other sponsoring agencies contribute to the ORISE program's objective to enhance the quantity, quality, and diversity of the future scientific and engineering workforce and to increase the scientific and technical literacy of the U.S. citizenry. Sponsoring agency components establish their own objectives for sponsoring research participants and decide whether and how to assess the extent to which the ORISE program meets those objectives, according to DOE officials. Some but not all DOE, DOD, and HHS components have used questionnaires, and some components have used other methods to assess how well the ORISE program is working in the short term, such as over the course of a research participant's appointment. In particular, some components use questionnaires developed with assistance from ORAU and administered to research participants, and sometimes to mentors. ORISE program coordinators and other officials at sponsoring agency components described other methods they use to assess the program, such as asking research participants about their experiences and monitoring the progress of research participants' research projects, research participants' publications and presentations related to their research, and the number of current agency employees who were past ORISE research participants. In addition, one of the program support functions that ORAU can offer to sponsoring agencies, for the cost of the service, is performing an assessment of ORISE program effectiveness. In response to a request from the sponsoring agency component, ORAU performed such an assessment for the Joint Prisoner of War/Missing in Action Accounting Command and issued a report in August 2014. The methods being used by the sponsoring components we reviewed assess how well the program is meeting the short-term needs of research participants and mentors.
For example, some research participant questionnaires included questions about research participants' satisfaction with their assignment, training, mentoring, stipends, and program administration. DOD mentor questionnaires include questions on reasons for renewing a research participant's appointment and research participants' skills and knowledge. A DOE Office of Science official told us that DOE is working with other agencies to develop methods for assessing the long-term outcomes of STEM education efforts, such as the ORISE program increasing the diversity of the STEM workforce. The official noted that, without such methods, they face challenges in assessing the long-term effectiveness of the ORISE program. For example, according to the official, such challenges include developing methods to track research participants over the course of their careers and determining the extent to which a participant's degree of success in a STEM field is a result of the ORISE program as opposed to other educational experiences. In 2011, OMB's Office of Federal Procurement Policy issued guidance to assist agency officers and employees in ensuring that only federal employees perform work that is inherently governmental or otherwise needs to be reserved to the public sector. This guidance directs agencies to develop and maintain internal procedures; take appropriate steps to help employees understand and meet their responsibilities; and periodically evaluate the effectiveness of their internal management controls for reserving work for federal employees. In accordance with OMB's guidance, agencies that sponsor ORISE research participants are responsible for ensuring that research participants at their agencies do not perform inherently governmental functions. 
Documents we reviewed that are issued by DOE, DOD, and HHS regarding research participants at their agencies, and that are used by sponsoring agency components' coordinators, mentors, and research participants, varied in their level of detail on activities considered inherently governmental functions. For example, within HHS, ORISE program handbooks from FDA's Center for Veterinary Medicine and Center for Drug Evaluation and Research included examples of activities research participants should not perform, such as serving as a drug, device, safety, or facilities reviewer. Similarly, within DOD, the research participant appointment letters used by the Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics included detailed guidance, such as a statement that research participants should not accept policy, budget, or program management authority. In contrast, sample appointment letters that we reviewed used by HHS's CDC and DOD's U.S. Army Environmental Command stated only that the research participant will not enter into an employee/employer relationship with ORISE, DOE, or any other office or agency, and did not specifically cite the inherently governmental activities prohibition. The terms of appointment developed by ORAU and used by DOE, DOD, and HHS to make research participant appointments state only that the appointment is an educational experience and not a contract of employment. Statements of work agreed upon as part of interagency agreements between DOE and sponsoring agencies also varied in their level of detail about activities considered to be inherently governmental functions. For example, a statement of work for the CDC stated that ORISE research participation projects should not include activities reserved for federal employees, such as those involving budget or program management authority.
In contrast, a statement of work for the National Institutes of Health did not include this level of detail, stating only that individuals selected for appointments do not become employees. DOE and other sponsoring agency officials noted that ORISE research participants are assigned to research projects that generally do not involve inherently governmental functions. A DOE Office of Science official said that the research focus of most ORISE appointments reduced the risk of those research participants performing inherently governmental functions. However, GAO found that some research participants' projects involve activities that are closely associated with inherently governmental functions, such as participating in policy and strategic planning meetings, which may increase the risk of the participants performing inherently governmental functions. The DOE Office of Science official described how, in such cases, DOE provided more detailed briefings on inherently governmental functions for certain research participants, as well as briefings for their mentors. However, officials at other sponsoring agency components we interviewed did not describe providing such briefings as a standard practice for coordinators, mentors, or research participants. For example, the position description for a research participant in the DOD Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics included participating in policy and strategic planning meetings, but officials at this DOD component did not describe providing briefings on inherently governmental functions, increasing the importance of written guidance. 
Not having detailed guidance increases the risk that coordinators responsible for managing overall participation in the program and mentors responsible for directing research participants' day-to-day activities may overlook the possibility of research participants engaging in inherently governmental functions, especially in cases where participants' activities are closely associated with inherently governmental functions. Development of detailed guidance could help sponsoring agencies fulfill their responsibilities as identified in OMB's Office of Federal Procurement Policy guidance on inherently governmental functions. By providing hands-on research experiences in government agencies for students, postgraduates, and faculty, the ORISE research participation program makes an important contribution to federal efforts to help prepare students and teachers for careers in STEM fields. Responsibility for administering the program is dispersed among agencies that sponsor research participants. In particular, agencies are responsible for ensuring that research participants do not perform inherently governmental functions--for example, by developing guidance and other documents for research participants, coordinators, and mentors. Having this responsibility allows agencies to tailor guidance on inherently governmental functions to the features of the ORISE program at their agencies, such as the types of projects to which research participants are assigned. However, the level of detail in documents currently used by DOE, DOD, and HHS varies, with some documents describing specific types of activities that are inherently governmental functions and others only providing general statements that research participants are not federal government employees. 
More detailed guidance can help ORISE coordinators, mentors, and research participants ensure that they are adhering to the prohibition on research participants, as nonfederal government employees, performing inherently governmental functions. We recommend that the Secretaries of Energy, Defense, and Health and Human Services develop detailed guidance to ensure that ORISE program coordinators, mentors, and research participants are fully informed of the prohibition on nonfederal employees performing inherently governmental functions. We provided a draft of this report to DOE, DOD, and HHS for their review and comment. In their written comments, reproduced in appendices III through V, DOE, DOD, and HHS concurred with our recommendation. DOE and HHS also provided technical comments, which we incorporated as appropriate. In their written comments, DOE, DOD, and HHS described the measures they will take to implement our recommendation on inherently governmental functions. In particular, DOE stated that it plans to provide detailed guidance to all relevant parties involved in DOE-sponsored research participation activities administered through ORISE within 180 days, following consultation with relevant DOE offices. DOD stated that detailed guidance will be developed to further ensure that those connected with the ORISE program are fully informed of the prohibition on nonfederal employees performing inherently governmental functions. HHS stated that it is developing an agency-wide policy, including a section on inherently governmental functions, that will provide guidance to agency program coordinators, mentors, and research participants. In its letter and technical comments, DOE stated that the draft report did not reflect detailed discussions we had with DOE officials regarding inherently governmental functions.
In addition, DOE stated that the draft report significantly understated the extent to which DOE communicates the prohibition of inherently governmental functions to sponsored participants and agency mentors. We do not believe our report understates DOE's efforts. For example, our report includes a discussion of the detailed briefings that DOE Office of Science officials provide on inherently governmental functions to research participants selected for a program designed to expose the participants to federal policymaking. Other DOE, DOD, and HHS sponsoring agency components we interviewed did not describe a similar practice for their coordinators, mentors, or research participants. Our report's discussion of these briefings, as well as of documents issued by DOE for the ORISE program, reflects the extent of communications on inherently governmental functions that DOE provided to us. We are sending copies of this report to the appropriate congressional committees; the Secretaries of Energy, Defense, and Health and Human Services; and other interested parties. In addition, the report is available at no charge on the GAO website at http://www.gao.gov. If you or your staff members have any questions about this report, please contact me at (202) 512-3841 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made key contributions to this report are listed in appendix VI.

Appendix I: Examples of Subject Areas for ORISE Research Participant Projects in Fiscal Year 2014

Examples of subject areas:
Improving prevention and treatments of emerging foreign animal diseases, climate change impacts on forests, porcine epidemic diarrhea virus, and sensor networks on variable rate irrigation systems.
Laser systems, nanomaterials, acoustics, neurobiology, additive manufacturing, civil engineering, cognitive modeling, intelligent sensors, exercise science, and visual analytics.
Weather impacts on military projects, cognitive function and psychological performance of soldiers, health promotion and wellness, environmental medicine, improving military field equipment, and science and technology policy.
Infectious disease or deployment health surveillance, clinical and health care epidemiology, optical coherence tomography, and aeromedical studies.
Forensic sciences, human immunodeficiency virus/acquired immune deficiency syndrome prevention, public health and preventive medicine studies, and science and technology policy.
Neutron scattering, fusion energy, efficiency of renewable energy sources, computational sciences, materials sciences, process controls of advanced power systems, gas sensors and high temperatures, improving extraction of earth elements, quantum computing, biofilms and biotechnology, advanced manufacturing (carbon fiber), climate change, and science and technology policy.
Infectious diseases (e.g., influenza, sexually transmitted, food borne, vector borne, respiratory), chronic diseases (e.g., heart, obesity, cancer), environmental health, toxic substances, health statistics, and public health preparedness.
Toxicology, food safety, drug evaluation and testing, biological therapeutics, tobacco products, blood products, medical devices, biotechnology products, translational sciences, women's health, vaccines, cell and gene therapies, and regulatory science.
Localization of proteins using molecular markers, gene regulatory effects in cancer, medical informatics, and central nervous system injuries.
Public health economics, population based model testing, clinical care models, minority health, women's health, tobacco prevention initiatives, national human immunodeficiency virus/acquired immune deficiency syndrome strategies, and geospatial analysis of underserved populations.
Encryption for criminal databases, improving materials for coastal bridges, computer simulations of potential terrorist attacks or natural disasters, brain-like modeling systems, searchable databases of potential threats, human trafficking, and detecting and identifying explosive-related threats.
Data analysis of housing and urban development impacts and value on communities.
Department of the Interior: Data collection and surveys related to conservation measures for fish and wildlife.
Clean energy and climate change policy and analyses in the international economy, and building efficiencies.
Climate change, software codes for aerial sampling systems, urban ecosystems, nanoparticles and surface coating, waste disposal, safety of water supply, and biomarkers for environmental contaminants.
Juvenile prostitution and child abduction, causes of postmortem hair root banding, forensic applications of isotopes, crimes against adults, and identification of facial phenotypic markers.

Note: Includes Office of the Secretary, Office of Diversity Management and Equal Opportunity, National Geospatial-Intelligence Agency, Defense Threat Reduction Agency, Defense Prisoner of War/Missing in Action Accounting Command, and U.S. Southern Command.
Note: Includes Office of the Secretary and Health Resources and Services Administration, and Center for Medicare and Medicaid Innovation.

The following tables detail federal agencies' expenditures for and research participant appointments they sponsored as part of the Oak Ridge Institute for Science and Education (ORISE) research participation program for fiscal years 2010 through 2014. Table 1 identifies agencies' total annual expenditures for their involvement in the ORISE program. Table 2 identifies the numbers of appointments at each agency for each year.
Table 3 details each agency's total expenditures for the ORISE program for fiscal years 2010 through 2014 by type of expense, including stipends, travel, other research participant expenses, program support and overhead, and federal administrative and security charges. In each table, the three agencies that account for the largest share of expenditures and appointments--the Department of Health and Human Services (HHS), the Department of Defense (DOD), and the Department of Energy (DOE)--are broken out into component agencies that sponsored research participants. In addition to the individual named above, Joseph Cook (Assistant Director), Sherri Doughty, Ellen Fried, Tobias Gillett, Kirsten Lauber, Gerald Leverich, Cynthia C. Norris, Stephanie Shipman, Kathryn Smith, Jeanette Soares, Sara Sullivan, and Thema Willette made key contributions to this report.

The ORISE research participation program seeks to enhance the future scientific and engineering workforce by providing students, postgraduates, and faculty with hands-on research experiences in federal agencies. The program is administered by a DOE contractor, and other agencies sponsor research participants via interagency agreements with DOE. Research participants engage in a variety of projects at DOE and other sponsoring agencies, but they are not considered federal government employees and thus are prohibited from performing inherently governmental functions. GAO was asked to review the ORISE research participation program. This report examines (1) program expenditures by all sponsoring agencies and (2) selected agencies' assessments of program effectiveness and their guidance on inherently governmental functions.
GAO reviewed program data for fiscal years 2010-2014, the five most recent years for which data were available; examined program policies and guidance at DOE, DOD, and HHS, the three agencies that sponsored the most participants in fiscal year 2014; and interviewed officials at those three agencies. For fiscal years 2010 through 2014, the 11 departments and other federal agencies that sponsor research participants collectively expended $776.4 million for activities carried out through the Oak Ridge Institute for Science and Education (ORISE) research participation program (ORISE program). The three agencies with the highest expenditures for the program over the 5-year period were the Department of Energy (DOE), which oversees the contractor managing ORISE, and the Department of Defense (DOD) and Department of Health and Human Services (HHS), which both sponsor research participants via interagency agreements with DOE. Expenditures increased 73 percent over that period, and the number of appointments increased 42 percent. Stipends accounted for 82 percent of expenditures over that period, with the remainder going to other participant expenses, overhead and program support, and administrative and security charges. Agencies' expenditures per appointment varied for several reasons, such as differences in methods of setting stipends. Components within DOE, DOD, and HHS that sponsor research participants have performed some assessments of the short-term effectiveness of the ORISE program, but provide varying levels of detail to agencies' employees and research participants about inherently governmental functions--those functions that are so intimately related to the public interest as to require performance by federal government employees. Program effectiveness. Sponsoring agency components establish their own objectives for research participants and can decide whether and how to assess the extent to which the ORISE program meets those objectives. 
DOE, DOD, and HHS components have used questionnaires and other methods to assess how well the ORISE program meets the short-term needs of research participants and of the agency staff who oversee their activities. Agencies also face challenges in assessing the program's long-term effectiveness; for example, they do not have methods to track research participants over their careers to determine the extent to which participants' success is a result of the program. DOE has worked with other agencies on developing ways to address such challenges. Inherently governmental functions. Federal guidance directs agencies to develop internal procedures to ensure that only federal employees perform inherently governmental functions. DOE, DOD, and HHS sponsoring components' guidance for research participants that GAO reviewed had varying levels of detail on inherently governmental functions. Officials at these agencies said that research participants' projects generally do not involve inherently governmental functions, but GAO found that some research participants' projects involve activities that are closely associated with inherently governmental functions, such as participating in certain policy and strategic planning meetings, which may increase the risk of the participants performing inherently governmental functions. Development of detailed guidance could help sponsoring components reduce this risk and help officials better ensure adherence to the federal guidance on inherently governmental functions. GAO recommends that DOE, DOD, and HHS develop detailed guidance to inform their employees and research participants about inherently governmental functions. DOE, DOD, and HHS concurred with the recommendation and said they will take additional measures to provide detailed guidance to relevant parties.
The National Industrial Security Program was established in 1993 for the protection of classified information. DSS administers the National Industrial Security Program on behalf of DOD and 23 other federal departments and agencies. DSS is responsible for providing oversight, advice, and assistance to more than 11,000 U.S. contractor facilities that are cleared for access to classified information. Contractor facilities can range in size, be located anywhere in the United States, and include manufacturing plants, laboratories, and universities. About 221 industrial security representatives work out of 25 DSS field offices across the United States and serve as the primary points of contact for these facilities. DSS is responsible for ensuring that these contractors meet requirements to safeguard classified information under the National Industrial Security Program. Contractors must have facility security clearances under this program before they can work on classified contracts. To obtain a facility security clearance, contractors are required to self-report foreign business transactions on a Certificate Pertaining to Foreign Interests form. Examples of such transactions include foreign ownership of a contractor's stock, a contractor's agreements or contracts with foreign persons, and whether non-U.S. citizens sit on a contractor's board of directors. DSS's industrial security representatives provide guidance to contractors on filling out the certificate. If a contractor declares no foreign business transactions on the certificate, DSS places the certificate in the contractor's file located in the field. Contractors with facility security clearances are required to complete and resubmit the certificate when they have changes in foreign business transactions to report, and at least every 5 years even if no foreign transactions take place. Because a U.S.
company can own a number of contractor facilities, the corporate headquarters or another legal entity within that company is required to complete the certificate. When contractors declare foreign transactions on their certificates and notify DSS, industrial security representatives are responsible for ensuring that contractors properly identify all relevant foreign business transactions. They are also required to collect, analyze, and verify pertinent information about these transactions. For example, by examining various corporate documents, the industrial security representatives can determine corporate structures and ownership and identify key management officials. The representatives may consult with DSS counterintelligence officials, who can provide information about threats to U.S. classified information. If contractors' answers on the certificates indicate that foreign transactions meet certain DSS criteria or exceed thresholds, such as the percentage of company stock owned by foreign persons, the representatives forward these FOCI cases to DSS headquarters. DSS headquarters works with contractors to determine what, if any, protective measures are needed to reduce the risk of foreign interests gaining unauthorized access to U.S. classified information. DSS field staff are then responsible for monitoring contractor compliance with these measures. Figure 1 shows highlights of the FOCI process. On a case-by-case basis, DSS headquarters can approve the use by contractors of one of six types of protective measures: voting trust agreements, proxy agreements, special security agreements, security control agreements, board resolutions, and limited facility clearances. These protective measures are intended to insulate contractor facilities from undue foreign control and influence and to reduce the risk of unauthorized foreign access to classified information. 
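The routing decision described above — forwarding a case from the field to DSS headquarters when certificate answers meet certain criteria or exceed thresholds — can be sketched as a simple rule check. A minimal illustration follows; the field names and the 5 percent ownership threshold are hypothetical stand-ins, not actual criteria from the Certificate Pertaining to Foreign Interests form.

```python
# Illustrative sketch only: the certificate fields and thresholds below
# are hypothetical, not actual DSS criteria.

def requires_headquarters_review(certificate: dict) -> bool:
    """Return True if any reported foreign business transaction meets
    the (hypothetical) criteria for forwarding a FOCI case from a
    field office to DSS headquarters."""
    if certificate.get("foreign_stock_ownership_pct", 0) >= 5.0:
        return True
    if certificate.get("foreign_board_members", 0) > 0:
        return True
    if certificate.get("contracts_with_foreign_persons", False):
        return True
    return False

# Example: a contractor reporting 12 percent foreign stock ownership
cert = {"foreign_stock_ownership_pct": 12.0,
        "foreign_board_members": 0,
        "contracts_with_foreign_persons": False}
print(requires_headquarters_review(cert))  # True
```

Cases that trip any rule would go to headquarters, which then works with the contractor on protective measures; cases that trip none would remain in the field file.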
Protective measures vary in the degree to which foreign entities are insulated from classified information and are not intended to deny foreign owners the opportunity to pursue business relationships with their U.S.-based contractor facilities working on classified contracts. Table 1 provides a general description of each of these protective measures. In addition to these measures, DSS can also require contractors to take certain actions to mitigate specific FOCI situations such as termination of loan agreements or elimination of debt owed to a foreign entity. For contractors operating under voting trust, proxy, special security, or security control agreements, industrial security representatives are supposed to conduct annual FOCI meetings with contractor staff who are responsible for ensuring compliance with these protective measures. In preparation for these annual meetings, contractors are required to produce and submit to DSS annual FOCI compliance reports that can describe specific acts of noncompliance with protective measures, changes in organizational structure or changes in security procedures at the contractor, and other issues that have occurred over the course of a year. Industrial security representatives should then review the reports to determine how contractors are fulfilling their obligations under the protective measures. In addition, DSS generally conducts security reviews annually for facilities that store classified information or every 18 months for facilities that do not have classified information on site. However, for contractors operating under voting trust, proxy, special security, or security control agreements, industrial security representatives are required to conduct a security review every 12 months whether the contractor has classified information on site or not. 
These reviews are designed to determine security vulnerabilities and contractor compliance with National Industrial Security Program requirements and to evaluate the overall quality of the facility's security program, including compliance with protective measures to mitigate FOCI. DSS will not grant a new facility security clearance to a contractor until all relevant FOCI have been mitigated. In addition, DSS shall suspend an existing clearance if FOCI at a contractor facility has not been mitigated. A contractor with a suspended facility clearance can continue to work on an existing classified contract unless the government contracting office denies access to the existing contract. In addition, the contractor cannot be awarded a new classified contract until the clearance is restored. DSS does not systematically ask for, collect, or analyze foreign business transactions in a manner that helps it properly oversee contractors entrusted with U.S. classified information, nor does DSS aggregate and analyze information to determine the overall effectiveness of its oversight of FOCI contractors. Notably, DSS does not know if contractors are reporting foreign business transactions as they occur and lacks knowledge about how much time a contractor facility with unmitigated FOCI has access to classified information. Figure 2 shows a general description of gaps in DSS knowledge about the FOCI process. Furthermore, DSS field staff said they lack research tools and sufficient training regarding the subject of foreign transactions and have indicated challenges with regard to staff turnover. DSS does not systematically ask for information that would allow it to know if contractors are reporting certain foreign business transactions when they occur, which begins the process for reducing FOCI-related security risks. DSS industrial security representatives are responsible for advising contractors that timely notification of foreign business transactions is essential. 
The National Industrial Security Program Operating Manual requires contractors with security clearances to report any material changes to foreign business transactions previously reported to DSS but does not specify a time frame for doing so. DSS is dependent on contractors to self-report transactions by filling out the Certificate Pertaining to Foreign Interests form, but this form does not ask contractors to provide specific dates for when foreign transactions took place. In addition, DSS does not compile or analyze how much time passes before DSS becomes aware of foreign business transactions. DSS field staff told us that some contractors report foreign business transactions as they occur, while others report transactions months later, if at all. During our review, we found a few instances in which contractors were not reporting foreign business transactions when they occurred. One contractor did not report FOCI until 21 months after awarding a subcontract to a foreign entity. Another contractor hired a foreign national as its corporate president but did not report this transaction to DSS, and DSS did not know about the FOCI change until 9 months later, when the industrial security representative came across the information on the contractor's Web site. In another example, DSS was not aware that a foreign national sat on a contractor's board of directors for 15 months until we discovered it in the process of conducting our audit work. Without timely notification from contractors, DSS cannot track when specific foreign business transactions took place and therefore is not in a position to take immediate action so that FOCI is mitigated, if necessary. In addition, DSS does not determine the time elapsed from reporting of foreign business transactions by contractors with facility clearances to the implementation of protective measures or when suspensions of facility clearances occur.
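The reporting lag described above could only be measured if the certificate captured the date a transaction occurred alongside the date it was reported. A minimal sketch of such a computation, using entirely hypothetical dates chosen to mirror the 21-month example:

```python
# Illustrative only: computes how long a foreign business transaction
# went unreported, assuming both dates were captured (which, per the
# report, the certificate form currently does not do). Dates are
# hypothetical.
from datetime import date

transaction_date = date(2004, 1, 15)  # when the subcontract was awarded
reported_date = date(2005, 10, 15)    # when DSS learned of it

lag_days = (reported_date - transaction_date).days
print(lag_days // 30)  # roughly 21 months
```

Aggregating such lags across contractors is the kind of analysis the report notes DSS does not perform.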
Without protective measures in place, unmitigated FOCI at a cleared contractor increases the risk that foreign interests can gain unauthorized access to U.S. classified information. During our review, we found two cases in which contractors appeared to have operated with unmitigated FOCI before protective measures were implemented. For example, officials at one contractor stated they reported to DSS that their company had been acquired by a foreign entity. However, the contractor continued operating with unmitigated FOCI for at least 6 months. In the other example, a foreign-purchased contractor continued operating for 2 months with unmitigated FOCI. Contractor officials in both examples told us that their facility clearances were not suspended. According to the National Industrial Security Program Operating Manual, DSS shall suspend the facility clearance of a contractor with unmitigated FOCI. DSS relies on field office staff to make this determination. Because information on suspended contractors with unmitigated FOCI is maintained in the field, DSS headquarters does not determine at an aggregate level the extent to which and under what conditions it suspends contractors' facility clearances due to unmitigated FOCI. DSS does not centrally collect and analyze information to determine the magnitude of contractors under FOCI and assess the effectiveness of its oversight of those contractors. For example, DSS does not know how many contractors under FOCI are operating under all types of protective measures and, therefore, does not know the extent of potential FOCI- related security risks. Although DSS tracks information on contractors operating under some types of protective measures, it does not centrally compile data on contractors operating under all types of protective measures. 
Specifically, DSS headquarters maintains a central repository of data on contractors under voting trust agreements, proxy agreements, and special security agreements--protective measures intended to mitigate majority foreign ownership. However, information on contractors under three other protective measures--security control agreements, limited facility clearances, and board resolutions--is maintained in paper files in the field offices. DSS does not aggregate data on contractors for all six types of protective measures and does not track and analyze overall numbers. In addition, DSS does not conduct overall analysis of foreign business transactions reported by contractors on their Certificate Pertaining to Foreign Interests forms or maintain aggregate information for contractors' responses. Consequently, DSS does not know the universe of FOCI contractors operating under protective measures, and DSS cannot determine the extent to which contractors under FOCI are increasing or if particular types of foreign business transactions are becoming more prevalent. This information would help DSS target areas for improved oversight. According to DSS officials, centralizing and tracking information on contractors under all types of measures would require more resources because information is dispersed in paper files in DSS field offices around the country. DSS does not systematically compile and analyze trends from its oversight functions to identify overall compliance trends or concerns with implementation of protective measures by contractors. DSS industrial security representatives are responsible for ensuring compliance of FOCI contractors under certain protective measures through annual FOCI meetings where they discuss contractors' compliance reports. Industrial security representatives notify headquarters of the results of the meetings and place compliance reports and their own assessments in paper files located in field offices.
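The central tally the report finds missing could, in principle, be as simple as aggregating counts of contractors by protective measure across field offices once the records are digitized. A minimal sketch, with hypothetical office names and records:

```python
# Illustrative only: aggregating FOCI protective-measure records across
# field offices -- the kind of central tally the report notes DSS lacks.
# Office names and records are hypothetical.
from collections import Counter

# Each record: (field_office, protective_measure)
records = [
    ("Boston", "special security agreement"),
    ("Boston", "board resolution"),
    ("Dallas", "proxy agreement"),
    ("Dallas", "special security agreement"),
    ("Seattle", "limited facility clearance"),
]

totals = Counter(measure for _office, measure in records)
print(totals["special security agreement"])  # 2
```

With such totals, headquarters could see at a glance how many contractors operate under each of the six protective measures and whether particular measures are becoming more prevalent.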
However, DSS headquarters does not use annual compliance reports to assess trends to evaluate overall effectiveness of the FOCI process. Finally, the use of protective measures at FOCI contractor facilities was designed in part to counter attempts to gather classified information through unauthorized means. DSS does not assess trends from its own counterintelligence data or information gathered by other intelligence agencies to evaluate whether protective measures are effectively mitigating FOCI risk across the board. For example, a 2004 DSS counterintelligence report states that foreign information targeting through e-mail and Internet communication and collection methods is on the rise. However, according to DSS officials, not all protective measures at FOCI contractors include provisions to monitor e-mail or other Internet traffic. By assessing counterintelligence trends to analyze the effectiveness of protective measures in countering foreign information collection attempts, DSS could identify weaknesses in its protective measures and adjust them accordingly. DSS's field staff face numerous challenges: complexities in verifying FOCI cases, limited tools to research FOCI transactions, insufficient FOCI training, staff turnover, and inconsistencies in implementing guidance on FOCI cases. For industrial security representatives, verifying if a contractor is under FOCI is complex. Industrial security representatives cited various difficulties verifying FOCI information. To verify if a contractor is under FOCI, industrial security representatives are required to understand the corporate structure of the legal entity completing the Certificate Pertaining to Foreign Interests form and evaluate the types of foreign control or influence that exist for each entity within a corporate family. 
DSS officials informed us that tracing strategic company relationships, country of ownership, and foreign affiliations and suppliers, or reviewing corporate documentation--such as loan agreements, financial reports, or Securities and Exchange Commission filings--is complicated. For example, representatives are required to verify information on stock ownership by determining the distribution of the stock among the stockholders and the influence or control the stockholders may have within the corporation. This entails identifying the type of stock and the number of shares owned by the foreign person(s) to determine their authority and management prerogatives, which DSS guidance indicates may be difficult to ascertain in certain cases. According to DSS field officials, verifying information is especially difficult when industrial security representatives have limited exposure to FOCI cases. In some field offices we visited, industrial security representatives had few or no FOCI cases and, therefore, had limited knowledge about how to verify foreign business transactions. Some industrial security representatives in one field office told us they do not always have the tools needed to verify if contractors are under FOCI. As part of their review process, industrial security representatives are responsible for verifying what a contractor reports on its Certificate Pertaining to Foreign Interests form and determining the extent of foreign interests in the company. Industrial security representatives conduct independent research using the Internet or return to the contractor for more information to evaluate the FOCI relationships and hold discussions with management officials, such as the chief financial officer, treasurer, and legal counsel. DSS headquarters officials told us that additional information sources, such as the Dun and Bradstreet database of millions of private and public companies, are currently not available in the field. 
However, some industrial security representatives stated that such additional resource tools would be beneficial for verifying complex FOCI information. In addition, industrial security representatives stated they lacked the training and knowledge needed to better verify and oversee contractors under FOCI. For example, DSS does not require its representatives to have financial or legal training. While some FOCI training is provided, representatives largely depend on DSS guidance and on-the-job training to oversee a FOCI contractor. In so doing, representatives work with more experienced staff or seek guidance, when needed, from DSS headquarters. In a 1999 review, DSS recognized that recurring training was necessary to ensure industrial security representatives remain current on complex FOCI issues and other aspects of the FOCI process. DSS headquarters officials said that they have held regionwide meetings where they discussed FOCI case scenarios and responded to questions about the FOCI process. However, we found that the training needs on complex FOCI issues are still a concern to representatives. In fact, many said they needed more training to help with their responsibility of verifying FOCI information, including how to review corporate documents, strategic company relationships, and financial reports. DSS field officials said the DSS training institute currently offers a brief training unit on FOCI covering basic information. DSS established a working group of DSS field and headquarters staff to look at ways to improve the training program, including more specific FOCI training. The group submitted recommendations in March 2005 to field managers for their review. DSS is also planning to work with its training institute to develop additional FOCI courses to better meet the needs of the industrial security representatives. 
According to field staff, industrial security representatives operate in an environment of staff turnover, which can affect their in-depth knowledge of FOCI contractors. Officials from one-third of the field offices we reviewed noted staff retention problems. DSS officials at two of these field offices said that in particular they have problems retaining more experienced industrial security representatives. Field officials said that when an industrial security representative retires or leaves, the staff member's entire workload is divided among the remaining representatives, who already have a substantial workload. In addition, DSS guidance advises field office officials to rotate contractor facilities among industrial security representatives every 3 years, if possible, as a means of retaining DSS independence from the contractors. DSS officials told us the rotation can actually occur more frequently because of staff turnover. DSS headquarters officials said they are forming a working group to help improve staff retention in the field. Compounding these challenges are inconsistencies among field offices in how industrial security representatives said they understood and implemented DSS guidance for reviewing contractors under FOCI. For example, per DSS guidance, security reviews and FOCI meetings should be performed every 12 months for contractors operating under special security agreements, security control agreements, voting trust agreements, and proxy agreements. However, we found that some industrial security representatives were inconsistent in implementing the guidance. For example, one representative said a contractor under a special security agreement was subject to a security review every 18 months because the contractor did not store classified information on-site. 
In addition, two industrial security representatives told us they did not conduct annual FOCI meetings for contractors that were operating under a proxy agreement and security control agreement, respectively. We also found that industrial security representatives varied in their understanding or application of DSS guidance for when they should suspend a contractor's facility clearance when FOCI is unmitigated. The guidance indicates that when a contractor with a facility clearance is determined to be under FOCI that requires mitigation by DSS headquarters, the facility security clearance shall be suspended until a protective measure is implemented. However, we were told by officials in some field offices that they rarely suspend clearances when a contractor has unmitigated FOCI as long as the contractor is demonstrating good faith in an effort to provide documentation to DSS to identify the extent of FOCI and submits a FOCI mitigation plan to DSS. Officials in other field offices said they would suspend a contractor's facility clearance once they learned the contractor had unmitigated FOCI. The protection of classified information has become increasingly important in light of the internationalization of multibillion-dollar cooperative development programs, such as a new-generation fighter aircraft, and a growing number of complex cross-border industrial arrangements. Although such developments offer various economic and technological benefits, there can be national security risks when foreign companies control or influence U.S. contractors with access to classified information. Given the growing number of DOD contractors with connections to foreign countries, it is critical for DSS to ensure that classified information is protected from unauthorized foreign access. In carrying out its responsibilities, DSS is dependent on self-reported information from the contractors about their foreign activities, creating vulnerabilities outside of DSS's control. 
Within this environment, unless DSS improves the collection and analysis of key information and provides its field staff with the training and tools they need to perform FOCI responsibilities, DSS will continue to operate without knowing how effective its oversight is at reducing the risk of foreign interests gaining unauthorized access to U.S. classified information. To improve knowledge of the timing of foreign business transactions and reduce the risk of unauthorized foreign access to classified information, we recommend that the Secretary of Defense direct the director of DSS to take the following three actions: clarify when contractors need to report foreign business transactions; determine how contractors should report and communicate dates of specific foreign business transactions to DSS; and collect and analyze when foreign business transactions occurred at contractor facilities and when protective measures were implemented to mitigate FOCI. To assess overall effectiveness of DSS oversight of contractors under FOCI, we recommend that the Secretary of Defense direct the director of DSS to take the following three actions: collect and analyze data on contractors operating under all protective measures as well as changes in types and prevalence of foreign business transactions reported by contractors; collect, aggregate, and analyze the results of annual FOCI meetings, contractors' compliance reports, and data from the counterintelligence community; and develop a plan to systematically review and evaluate the effectiveness of the FOCI process. 
To better support industrial security representatives in overseeing contractors under FOCI, we recommend that the Secretary of Defense direct the director of DSS to formulate a human capital strategy and plan that would encompass the following two actions: evaluate the needs of representatives in carrying out their FOCI responsibilities and determine and implement changes needed to job requirements, guidance, and training to meet FOCI responsibilities; and explore options for improving resource tools and knowledge-sharing efforts among representatives. In commenting on a draft of our report, DOD disagreed with our conclusions that improvements are needed to ensure sufficient oversight of contractors under FOCI, and it also disagreed with our recommendations to improve oversight. Overall, DOD's comments indicate that it believes that the actions DSS takes when it learns of FOCI at contractors are sufficient. However, DOD has not provided evidence necessary to support its assertions. In fact, we found two cases in which contractors appeared to have operated with unmitigated FOCI before protective measures were put into place. Unmitigated FOCI at contractors increases the risk that foreign interests can gain unauthorized access to U.S. classified information. Further, DOD states that we did not establish a link between collecting and analyzing FOCI data and the effectiveness of DSS's oversight or the protection of classified information. We found that DSS lacks fundamental FOCI information--including information on the universe of FOCI contractors and trends in overall contractor compliance with protective measures--that is needed to determine the effectiveness of the FOCI process and the sufficiency of oversight. Ultimately, without making this determination, DSS cannot adequately ensure it is taking necessary steps to reduce the risk of foreign interests gaining unauthorized access to classified information. 
Unless our recommendations are implemented, we are concerned that DSS will continue to operate on blind faith that its FOCI process is effective and its oversight is sufficient. DOD did not concur with seven of our recommendations and only partially concurred with the eighth. Regarding our first three recommendations, which aim to improve DSS's knowledge of the timing of foreign business transactions and reduce the risk of unauthorized foreign access to classified information, DOD argues that having such information will not help protect classified information. However, as we noted in our report, without this information, DSS is not in a position to know when FOCI transactions occur so that timely protective measures can be implemented to mitigate FOCI as needed--the purpose of the FOCI process. Regarding our next three recommendations, which aim to enable DSS to assess the overall effectiveness of its oversight of contractors under FOCI, DOD argues that it does not need to collect and analyze information on the universe of contractors under FOCI and trends in foreign business transactions, or aggregate compliance and counterintelligence information. However, without this information, DSS limits its ability to identify vulnerabilities in the FOCI process and to target areas for improving oversight of contractors, including potential changes to protective measures. DOD also argues that it has three mechanisms to systematically evaluate DSS's processes: DSS's Inspector General, a management review process for industrial security field office oversight, and a standards and quality program. However, DOD has not provided evidence in its comments that these mechanisms are focused on systematically reviewing and evaluating the effectiveness of the FOCI process. 
Regarding our last two recommendations--to formulate a human capital strategy and plan that would better support industrial security representatives in overseeing FOCI contractors--DOD does not believe that its industrial security representatives need additional support. DOD supports this belief with two points. First, DOD states that because less than 3 percent of the approximately 12,000 cleared companies overseen by DSS have any FOCI mitigation, most DSS industrial security representatives do not oversee such contractors. Yet it is unclear how DOD arrived at these figures because DSS does not collect and analyze information on all contractors operating under protective measures. Regardless of the number of these contractors, industrial security representatives must have adequate support--including training and guidance--to verify if contractors are under FOCI and to ensure contractors comply with any protective measures put in place. In the course of our review, we found that industrial security representatives are not sufficiently equipped to fulfill their FOCI responsibilities. Second, DOD noted that DSS is under new leadership and is exploring operational improvements as well as implementing a new industrial security information management system. While it is too early to assess the effect of these proposals, it is also unclear how these efforts will bring about any needed changes to industrial security representatives' job requirements, guidance, tools, and training. As we concluded in our report, DSS's dependence on self-reported information from contractors about their foreign activities creates vulnerabilities outside of DSS's control. Given these vulnerabilities, it is imperative that DSS improve the collection and analysis of key information on the FOCI process and provide its industrial security representatives with the training and tools they need to perform their FOCI responsibilities. 
If DSS continues to operate without knowing how effective its oversight is and does not support the representatives in carrying out their FOCI responsibilities, then the value of DSS's management and the FOCI process should be open for further examination. Therefore, we did not modify our recommendations. DOD also provided technical comments, which we addressed. DOD's letter is reprinted in appendix II, along with our evaluation of its comments. We are sending copies of this report to interested congressional committees; the Secretary of Defense; the Director, Defense Security Service; the Assistant to the President for National Security Affairs; and the Director, Office of Management and Budget. We will make copies available to others upon request. In addition, this report will be available at no charge on the GAO Web site at http://www.gao.gov. If you have any questions about this report, please contact me at (202) 512-4841. Major contributors to this report are Anne-Marie Lasowski, Maria Durant, Ian A. Ferguson, Suzanne Sterling, Kenneth E. Patton, Lily J. Chin, and Karen Sloan. To assess the Defense Security Service's (DSS) process for determining and overseeing contractors under foreign ownership, control, or influence (FOCI), we reviewed Department of Defense (DOD) regulations and guidance on FOCI protective measures included in the National Industrial Security Program Operating Manual, and the Industrial Security Operating Manual, as well as DSS policies, procedures, and guidance for verifying contractors under FOCI and for overseeing them. We discussed with DSS officials at headquarters and field locations how they use DSS guidance to oversee FOCI contractors. We also discussed DSS roles and responsibilities for headquarters and field staff and challenges in overseeing contractors that report FOCI and the use of FOCI information to evaluate effectiveness of the process. 
We reviewed DSS training materials to learn about the type of training DSS offers industrial security representatives in meeting their FOCI responsibilities. We also examined FOCI studies conducted by DSS to determine the results of earlier DSS reviews of the FOCI process. We visited nine field offices that varied in how many FOCI contractors they monitored and in their geographic location. Through discussions with DSS officials at headquarters in Alexandria, Virginia, and from nine field offices, we identified FOCI contractors operating under various protective measures and examined DSS actions to verify FOCI and oversee the implementation of protective measures at contractor facilities. We collected information on a nonrepresentative sample of 27 contractor facility case files reviewed by DSS for FOCI. In addition, we visited 8 of the 27 contractor facilities and spoke with security officials, corporate officers, and board members to obtain additional clarification on the types of protective measures and the FOCI process. We spoke with DSS headquarters and field staff regarding actions taken to implement protective measures and reviewed supporting documentation maintained by DSS and contractor facilities. During our visits to nine field offices, we discussed the contents of selected contractor facility file folders to understand how DSS oversees contractors' implementation of protective measures, determines unmitigated FOCI, and assesses the effectiveness of the FOCI process. Because we did not take a statistical sample of case files, the results of our analyses cannot be generalized. However, we confirmed that the data used to select the files that we reviewed were consistent with the information in the facility files that we reviewed. The following are GAO's comments on the Department of Defense's letter dated June 29, 2005. 1. 
It is unclear how DOD came to the conclusion that our report lacks an understanding of the national policy governing contractors' access to classified information, given that our description of the policy and process in the background of our report is taken directly from documentation provided by DSS. Further, DOD did not provide in its technical comments any suggested amendments to remove perceived misunderstandings from our report. 2. Cleared U.S. citizens need not break the law for foreign interests to gain unauthorized access to classified information or adversely affect performance of classified contracts. Classified information can be at risk when foreign nationals at a cleared FOCI contractor facility are not identified and timely protective measures are not established to mitigate their influence. 3. DOD's position that there is little in our report that would enable DSS to improve the FOCI process or justify the cost of implementing our recommendations underscores the department's failure to grasp the gravity of our findings. DOD has neither systematically evaluated the effectiveness of its FOCI process nor identified opportunities to strengthen its oversight for contractors under FOCI. Our recommendations specifically target correcting these weaknesses. Further, raising concerns about cost without evaluating the effectiveness of its FOCI process is shortsighted. 4. According to the National Industrial Security Program Operating Manual, contractors are required to report material changes to FOCI information previously reported and every 5 years, even if no change occurs. We added a footnote to further clarify the definition of foreign business transactions used in our report. 5. DOD's response concerning self-reporting underscores the department's complacency regarding its responsibility to take actions needed to prevent foreign interests from gaining unauthorized access to U.S. classified information. 
While we recognize that DSS is dependent on self-reporting and that some vulnerabilities are outside of DSS's control, there are numerous steps DOD could take to mitigate these vulnerabilities. For example, if DSS implemented our recommendation to clarify when reporting should occur and require reporting dates when specific foreign business transactions took place, then DSS could monitor whether contractors are reporting foreign transactions on time and put mitigation measures in place, as appropriate. 6. While DOD maintains that contractors are to report material changes concerning FOCI information as they occur, we found that the National Industrial Security Program Operating Manual does not state this. As we reported, DSS field staff told us that while some contractors report transactions as they occur, some do not report transactions until months later, if at all. Specifying a time frame for contractors could result in more timely reporting of these transactions. 7. As we reported, the FOCI process begins when a contractor reports FOCI information. Having information on when foreign transactions occur would enable DSS to take timely action to impose safeguards or restrictions authorized by the National Industrial Security Program Operating Manual. 8. Unmitigated FOCI at a cleared contractor increases the risk that foreign interests can gain unauthorized access to U.S. classified information. During our review, we found two cases in which contractors appeared to have operated with unmitigated FOCI before protective measures were put in place. Therefore, it is important to know the length of time between when a foreign transaction occurs and when protective measures are put in place to mitigate FOCI. 9. 
According to the National Industrial Security Program Operating Manual, a contractor under FOCI with an existing facility clearance shall have its clearance suspended or revoked unless protective measures are established to remove the possibility of unauthorized access to classified information or adverse effects on the performance of classified contracts. DOD's characterization of DSS having the option to suspend the clearance of contractors with unmitigated FOCI seems to differ from what is stated in the manual. 10. It is unclear why DOD does not see the value in collecting information on contractors operating under all six protective measures, when DSS already centrally collects information on contractors operating under three measures. DSS cannot assess the overall effectiveness of its FOCI process unless it has a complete and accurate account of contractors operating under all types of protective measures. 11. It is unclear how DOD determined that less than 3 percent of its cleared contractors are operating under all six protective measures because DSS does not centrally collect and analyze this information for all six measures. In addition, the most recent information provided to us by DSS indicated that there are about 11,000 contractor facilities participating in the National Industrial Security Program, rather than the 12,000 cited in DOD's comments. Further, DOD did not provide technical comments to revise the number of contractor facilities stated in our report. 12. Industrial security representatives may use the results of annual meetings, compliance reports, and counterintelligence data to assess an individual contractor's security posture. However, as stated in our report, DSS does not systematically compile and analyze trends from these oversight activities. 
Aggregating overall compliance and counterintelligence trends is valuable because it would allow DSS to identify actual or potential weaknesses, evaluate effectiveness, and take actions as needed to improve its FOCI process. 13. Citing how long the program has been in existence misses the point, and DOD does not provide evidence that the needs of representatives are well known. As we reported, industrial security representatives face numerous challenges in carrying out their FOCI responsibilities, which forms the basis of our recommendation to evaluate the needs of the representatives. Assessing their needs is particularly important given the increasingly complex environment--characterized by international cooperative defense programs and a growing number of cross-border defense industrial relationships--in which industrial security representatives work. 14. As stated in our report, industrial security representatives told us they lacked the training and knowledge they needed to verify complex FOCI cases and oversee contractors under FOCI.

The Department of Defense (DOD) is responsible for ensuring that U.S. contractors safeguard classified information in their possession. DOD delegates this responsibility to its Defense Security Service (DSS), which oversees more than 11,000 contractor facilities that are cleared to access classified information. Some U.S. contractors have foreign connections that may require measures to be put into place to reduce the risk of foreign interests gaining unauthorized access to classified information. In response to a Senate report accompanying the National Defense Authorization Act for Fiscal Year 2004, GAO assessed the extent to which DSS has assurance that its approach provides sufficient oversight of contractors under foreign ownership, control, or influence (FOCI). DSS's oversight of contractors under FOCI depends on contractors self-reporting foreign business transactions such as foreign acquisitions. 
As part of its oversight responsibilities, DSS verifies the extent of the foreign relationship, works with the contractor to establish protective measures to insulate foreign interests, and monitors contractor compliance with these measures. In summary, GAO found that DSS cannot ensure that its approach to overseeing contractors under FOCI is sufficient to reduce the risk of foreign interests gaining unauthorized access to U.S. classified information. First, DSS does not systematically ask for, collect, or analyze information on foreign business transactions in a manner that helps it properly oversee contractors entrusted with U.S. classified information. In addition, DSS does not collect and track the extent to which classified information is left in the hands of a contractor under FOCI before measures are taken to reduce the risk of unauthorized foreign access. During our review, we found instances in which contractors did not report foreign business transactions to DSS for several months. We also found a contractor under foreign ownership that appeared to operate for at least 6 months with access to U.S. classified information before a protective measure was implemented to mitigate foreign ownership. Second, DSS does not centrally collect and analyze information to assess its effectiveness and determine what corrective actions are needed to improve oversight of contractors under FOCI. For example, DSS does not know the universe of all contractors operating under protective measures, the degree to which contractors are complying overall with measures, or how its oversight could be strengthened by using information such as counterintelligence data to bolster its measures. Third, DSS field staff face a number of challenges that significantly limit their ability to sufficiently oversee contractors under FOCI. 
Field staff told us they lack research tools and training to fully understand the significance of corporate structures, legal ownership, and complex financial relationships when foreign entities are involved. Staff turnover and inconsistencies over how guidance is to be implemented also detract from field staff's ability to effectively carry out FOCI responsibilities. | 7,279 | 576 |
EPA relies heavily on grants to carry out its environmental mission; over one half of its $7.6 billion budget for fiscal year 2000 was provided for grants. Grants are used (1) to financially support continuing environmental programs administered by state and local governments and (2) to fund other environmental projects. During fiscal year 1999, EPA awarded $1.8 billion for continuing environmental programs and $716 million for environmental projects--the subject of this report. Grants are funded by EPA's headquarters offices, such as the Office of Research and Development and Office of Air and Radiation, and by EPA regional offices. The administration of these grants (from activities prior to the award through the closeout of completed or inactive grants) has been delegated to EPA's Grants Administration Division and 10 regional Grants Management Offices. EPA carries out its grant programs within the framework of the strategic goals and objectives contained in its strategic plan. The plan sets forth 10 goals with 41 objectives and 123 subobjectives that cover its major programs, such as those for clean air, clean water, and pesticides. For example, EPA's clean air goal has 4 objectives and 14 subobjectives. One of the four objectives is "Attain National Ambient Air Quality Standards for Ozone and Particulate Matter." This objective in turn has several subobjectives, including "National Ambient Air Quality Standards for Ozone." Once potential grantees submit their grant applications, EPA officials review them. If the grant application is approved, the grantee is awarded the grant and funds are made available for the purposes specified in the grant. In connection with the grant award, EPA's program office officials determine how the grant will support a particular strategic goal, objective, and subobjective. In fiscal year 1999, EPA began coding new grant awards by "program result codes," which are aligned with goals, objectives, and subobjectives. 
Before 1999, EPA officials assigned "program element codes" to grant awards, which reflected the program and EPA office awarding the grant. EPA awards grants to organizations and individuals under regulations that establish uniform administrative requirements throughout the agency. The regulations cover a range of grant activities--from those prior to the award through the closeout of completed or inactive grants--and a variety of topics, such as grantee reporting requirements and allowable uses of grant funds. Particular regulations cover grants to institutions of higher education, hospitals, and nonprofit organizations (40 C.F.R. part 30), as well as assistance to state, local, and Indian tribal governments (40 C.F.R. part 31). Other EPA regulations cover grants under specific programs, such as Superfund (40 C.F.R. part 35, subpart O), and specific types of assistance, such as fellowships (40 C.F.R. part 46). EPA regulations authorize the agency to deviate from certain regulations on a case-by-case basis. We previously reported that EPA used this deviation authority extensively to close out inactive grants without following certain closeout requirements. EPA awarded about 17,000 project grants totaling $2.8 billion in fiscal years 1996 through 1999. Project grant funds were concentrated in five categories--investigations, surveys or studies; research; Superfund site cleanup support; the senior environmental employment program; and training--which accounted for $2.3 billion, or 80 percent of all funds. The grants were also concentrated by the type of recipient: nonprofit organizations, state or local governments, and colleges or universities received approximately 89 percent of the total project grant amount. In fiscal year 1996 through fiscal year 1999, project grants focused on (1) investigations, surveys, or studies; (2) research; (3) Superfund site cleanup support; (4) the senior environmental employment program; and (5) training.
The remaining project grants were awarded in 37 other EPA areas, such as the Hardship Grants Program for Rural Communities and the Great Lakes National Program. (See app. I for the number and value of all project grants, fiscal years 1996 through 1999). As shown in figure 1, grants for investigations, surveys, and studies accounted for the single largest category--about 30 percent of all grant dollars awarded. A brief description of these categories follows. EPA awarded $851.8 million in grants for investigations, surveys, or studies for fiscal years 1996 through 1999. These grants were provided for a wide range of activities supporting investigations, surveys, studies, and special purpose assistance in the areas of air and water quality, hazardous waste, toxic substances, and pesticides. These grants are also used for evaluating economic or social consequences related to environmental strategies and for other efforts to support EPA environmental programs. Finally, the grants are used to identify, develop, or demonstrate pollution control techniques or to prevent, reduce, or eliminate pollution. The following examples illustrate the variety of activities funded by these grants: In February 1999, EPA awarded a $10,000 grant to Monitor International, a nonprofit organization located in Annapolis, Maryland, to develop a feasibility study and action plan for a science and education center in Indonesia. In August 1999, EPA awarded a $1.5 million grant to the West Virginia University Research Corporation, National Research Center for Coal and Energy. With the grant funds the center was to provide technical assistance, outreach, a library of databases, maintenance of a Web site, and publications on the design, implementation, and maintenance of alternative wastewater treatment and collection systems for small communities. EPA awarded research project grants totaling $690.9 million. 
Generally, these grants were to fund laboratory and other research into a variety of environmental problems, such as air pollution and its impact on asthma. For example, EPA awarded a $4.6 million grant to the University of New Orleans in September 1999 for research and development on technical solutions to waste management problems faced by the academic, industrial, and governmental communities. EPA awarded about $408.8 million in grants to states and other government entities and to nonprofit organizations to conduct cleanup activities at specific hazardous waste sites and to implement the requirements of the Superfund program. For example, in September 1999, EPA awarded a $1.5 million grant to the Wisconsin Department of Natural Resources to complete an investigation and study at a waste site in order to select a cleanup remedy for controlling the risks to human health and the environment. The Senior Environmental Employment program, for which EPA makes grants authorized by the Environmental Programs Assistance Act of 1984, accounted for approximately $199.1 million. Under this program, EPA awards cooperative agreements to organizations to enable individuals 55 or older to provide technical assistance to federal, state, or local environmental agencies for pollution prevention, abatement, and control projects. For example, in September 1999, EPA awarded a $1.3 million grant to the National Older Worker Career Center to provide general support to EPA's staff within the Office of Pesticides Program. EPA awarded $108.3 million in training grants to government, educational, and nonprofit entities, which provide environment-related training on a variety of topics. For example, EPA awarded a $1.5 million grant in July 1999 to North Carolina State University to provide state-of-the-art training courses on the Clean Air Act Amendments.
Nonprofit organizations, state or local governments, or colleges and universities received most project grant dollars awarded by EPA in fiscal years 1996 through 1999, as table 1 shows. Nonprofit organizations received the largest portion of project grant dollars ($741.8 million, or 33 percent of the total), and the majority of these funds were provided to support investigations, the senior environmental employment program, and research. State or local governments received the next largest amount, with most of these funds provided for Superfund site cleanup support or for investigations. Colleges and universities also received a significant amount of project grant funds, the majority of which was for research. For-profit organizations, individuals, and other government entities, such as water district authorities, also received project grant funds. In October 1998, EPA began designating grant awards to indicate which Results Act goal, objective, and subobjective each grant supported. EPA intended to account for all new obligations by using a program results code (PRC) that aligned with the agency's strategic goals, objectives, and subobjectives. (Previously, EPA accounted for grant funds by using program element codes, which identified the program and EPA office that awarded the grant.) PRCs allow EPA to account for its grant award amounts by goal, objective, and subobjective. EPA project officers assign codes to a grant after deciding which grants to award. Approximately 82 percent of the $1.4 billion in project grants EPA awarded in fiscal years 1999 and 2000 that were assigned a PRC was concentrated in 4 of EPA's 10 goals: clean air, clean and safe water, waste management, and sound science. For 7 of the 100 grants we reviewed, the relationship between the activities funded by the grant and the goal(s), objective(s), and subobjective(s) that EPA identified was not clear.
EPA officials explained that for six of these grants the definitions of the goals, objectives, and subobjectives were sufficiently broad to encompass the activities funded by the grants, and agreed that one grant had been assigned the incorrect subobjective. The grant award process involves several steps before funds are provided to the grantee. EPA may solicit grant proposals from potential grantees, or grantees may submit unsolicited grant proposals to EPA. In either situation, the grant proposal details the grant's purpose, amount, and time frame. EPA officials review the grant proposals and frequently discuss them with the submitting entity--a process that may result in modifications to the scope of activities, funding amount, or time period. Once EPA reaches a final decision to fund a grantee, it provides the grantee a commitment letter. In preparing the final grant award document, EPA makes several determinations regarding the authority for the grant activities, the funding authority for the grant, and the PRC specifying the relevant Results Act goal, objective, and subobjective. The PRC is entered into EPA's automated systems to record the obligation of funds under the goals. Because some grants fund a variety of activities, more than one PRC may be designated for a particular grant. According to EPA officials, the designation of a PRC identifying the goal, objective, and subobjective to be supported by the grant is part of the grant award. In practice, EPA designates Results Act goal(s), objective(s), and subobjective(s) after the decision has been made to award a particular grant. EPA assigned PRCs to approximately $1.2 billion of the project grants made in fiscal years 1999 and 2000. Most of these funds aligned with the agency goals for waste management ($438.7 million), clean and safe water ($298.1 million), sound science ($146.8 million), and clean air ($119.2 million).
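The kind of roll-up that PRC coding makes possible can be sketched in a few lines. The grant numbers, goal labels, and dollar amounts below are invented for illustration; they are not EPA data, and the field names are our own.

```python
# Hypothetical sketch: rolling up grant obligations by the strategic goal
# encoded in each award's program results code (PRC). All data invented.
awards = [
    {"grant": "X-0001", "prc_goal": "clean_air",        "obligated": 1.5},
    {"grant": "X-0002", "prc_goal": "waste_management", "obligated": 4.6},
    {"grant": "X-0003", "prc_goal": "clean_air",        "obligated": 0.5},
]

by_goal = {}
for award in awards:
    # Sum obligations (here, $ millions) under each goal code.
    by_goal[award["prc_goal"]] = by_goal.get(award["prc_goal"], 0.0) + award["obligated"]

print(by_goal)
```

Because each obligation carries its goal, objective, and subobjective in the PRC, the same aggregation works at any of the three levels.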
Figure 2 shows the distribution of these grant dollars among Results Act goals for fiscal years 1999 and 2000. The remaining $222 million in project grant funds assigned PRC codes were aligned with one of EPA's six other strategic goals--safe food; preventing pollution and reducing risk in communities, homes, workplaces and ecosystems; reduction of global and cross-border environmental risks; expansion of Americans' right to know about their environment; a credible deterrent to pollution and greater compliance with the law; and effective management. For 7 of the 100 grants that we reviewed, the funded grant activities did not appear to match the EPA activities defined for the assigned PRC code. More specifically, two of the grants were not clearly related to any EPA goals, objectives, or subobjectives; three grants were clearly related to the indicated goals, but not the objectives and subobjectives; and two grants were related to the indicated goals and objectives, but not the subobjectives. A brief description of these grants follows. In June 1999, EPA awarded a $2.5 million grant to the Brownsville Public Utilities Board in Texas to support specific planning, engineering, environmental, and legal activities related to the development and construction of a dam and reservoir project. The PRC indicated that the grant was to support the Results Act subobjective of working with states and tribes to ensure reporting consistency under the Clean Water Act and Safe Drinking Water Act. In June 1999, EPA awarded a $2 million grant to the University of Missouri to conduct research on the economic, social, biological, physical, and ecological benefits of tree farming. The PRC indicated that the grant was to support the Results Act objective of promoting and implementing sector-based environmental management approaches that achieve superior environmental results at less cost than through conventional approaches. 
In August 1999, EPA awarded a $20,000 grant to the Urban Land Institute to conduct a conference on smart growth that was coded for Clean and Safe Water goal activities, such as watershed assessment and protection, coastal and marine protection, water quality criteria and standards, or Chesapeake Bay and Gulf of Mexico activities. In January 2000, EPA awarded a $228,000 grant to Michigan State University to examine public opinions regarding the value of wetland ecosystems. The PRC indicated that the grant was to support the Results Act subobjective of cleaning up contaminants that are associated with high-priority human health and environmental problems. In May 2000, EPA awarded a $64,000 grant to Science Services, a nonprofit organization located in Washington, D.C., for hosting an international science and engineering fair for high school students competing for monetary science awards. The PRC indicated that the grant was to support the Results Act goal of supporting research in global climate change. In June 2000, EPA awarded an $8,000 grant to Environmental Learning for Kids, Denver, Colorado, to educate culturally diverse families about environmental issues; activities included overnight camping trips and monthly outdoor workshops. The PRC indicated that the grant was to support the Results Act objective for activities related to providing training to teachers for making presentations to grades K-12. In June 2000, EPA awarded a $5,000 grant to Southwest Youth Corps in Colorado to support the organization and management of the Conservation Corps. The primary purpose of this grant was to train young adults on environmental issues. The PRC indicated that the grant was to support the Results Act objective of providing activities related to training teachers on making presentations to grades K-12.
EPA officials explained that the project officer had assigned an incorrect subobjective to the grant EPA awarded to Michigan State University to examine public opinion on the value of wetland ecosystems. EPA believes that the definitions of the goals, objectives, and subobjectives for the other six grants were sufficiently broad to encompass the activities funded by the grants. According to EPA officials, it would be impossible, when defining Results Act goals, objectives, and subobjectives, to list every activity that could apply. However, they stated that it was important to designate the correct PRC for grant activities. EPA approved at least one deviation from its regulations for 25 of the 100 grants we reviewed, and for 15 grants EPA authorized more than one deviation. Most of the deviations were made on a case-by-case basis to waive requirements relating to grant budget periods, matching fund requirements, or other regulations. Individual deviation decision memoranda contained in the grant files documented these decisions. Deviations from regulations for 6 grants, made under EPA's Science to Achieve Results (STAR) program, were not determined on a case-by-case basis. The STAR fellowship grant program, which is administered by EPA's Office of Research and Development (ORD), by design provides grants with greater dollar amounts and longer time periods than allowed by EPA's regulations. According to an EPA official, the STAR program, which began in 1995, is EPA's largest fellowship program in terms of dollars and number of fellowships. According to ORD officials, the program was designed to be consistent with other federal fellowship programs for scientists. STAR fellowship grants deviate from EPA's grant regulations governing fellowships in three ways: While the regulations place a limit of $750 on grant funds that can be used to purchase books and supplies, STAR fellowship grants provide up to $5,000 for this purpose. 
The regulations limit fellowships to 1 year, while STAR fellowships provide up to 2 years for master's degree students and up to 3 years for doctoral students. The regulations stipulate that grant funds may be used for purchasing books and supplies if provided directly to the student; however, STAR fellowship grant funds are used to directly pay the educational institution for these items. EPA does not track the number of deviations it makes. However, regulations require that the authority for each deviation be documented in the appropriate grant file. The agency awarded 471 STAR fellowship grants in fiscal years 1996 through 1999, totaling $34.1 million in funding. EPA prepared and processed a request for deviation for each of these grants. ORD officials stated that they wanted the STAR fellowship program to parallel a National Science Foundation fellowship program, which authorizes greater funding levels and longer funding periods than allowed by EPA's regulations. They also stated that they thought providing payments for books and supplies directly to an institution would provide better stewardship and control over the funds and ensure funds were used for authorized purposes. The officials stated that, rather than amending the regulations solely for the STAR program, which they considered time-consuming and a low priority, they opted to use deviations in awarding the grants and currently do not have staff in place to work on amending the regulations. They acknowledged, however, that the regulations are outdated and should be reviewed for possible revision. The other deviations we reviewed had been made on a case-by-case basis: Eleven of these deviations involved EPA waiving a requirement that the grant budget date and the project period ending date coincide.
For example, in January 1999, EPA amended a grant awarded in March 1997 to the Northeast States for Coordinated Air Use Management to provide an additional $200,000 for research in establishing an ambient air monitoring network for mercury deposition within New England. The project period and the budget period ending dates were changed from March 1999 to March 2001, deviating from EPA's regulations, which require that the budget period not exceed 2 years from the award date. EPA approved the deviation, allowing the grantee to expand the number of sampling sites to obtain a better measurement of the pollution problem. EPA made nine deviations that waived the grantee matching funding requirement for the grant. For example, in September 1999, EPA awarded a $4.6 million grant to the University of New Orleans to fund the University Urban Waste Management and Research Center, which provides research and technical assistance to cities with wet weather conditions typical of coastal areas. EPA waived the minimum 5-percent nonfederal matching share requirement for the university. However, this deviation proved unnecessary because the regulation requiring matching funds had been repealed in 1996. Unaware of the change in regulations, EPA officials continued to grant deviations for a matching fund requirement well into fiscal year 2000. Appendix II details the deviations EPA made for the grants we reviewed, aside from those associated with the STAR fellowship program. EPA has extensively used its deviation authority for STAR fellowship grants, citing the time and resources that would be needed to amend its regulations. While amending the grant regulations would entail a time and resource cost in the short term, EPA's regulations are intended to provide consistency and transparency for the agency's grant activities and should reasonably reflect actual practices in the agency's grant programs.
In this case, the regulations do not reflect the actual practice in the STAR fellowship grant program--EPA's largest fellowship grant program--which routinely awards more money for longer periods of time than is authorized by EPA's fellowship regulations. Consistency between regulations and practice could be achieved by amending either EPA's grant regulations or the practices of the STAR fellowship program. To ensure that EPA's fellowship regulations are consistent with actual practices, we recommend that the Administrator of EPA direct the Assistant Administrator for Administration and Resources Management to include in future amendments to its fellowship regulations the funding amounts, time periods, and payment methods that will meet the needs of the STAR fellowship grant program. We provided EPA with a draft of this report for review and comment. The agency agreed with the findings in the report and suggested several changes to improve clarity, which we incorporated into the report where appropriate. EPA agreed with our recommendation to update the fellowship regulation and plans to establish a workgroup to ensure that the regulation reflects the current requirements of the STAR fellowship program. We conducted our review from May 2000 through March 2001 in accordance with generally accepted auditing standards. Our scope and methodology are presented in appendix III. We are sending copies of this report to appropriate congressional committees; interested Members of Congress; the Honorable Christine Todd Whitman, Administrator, Environmental Protection Agency; and other interested parties. We will also make copies available to others on request. Should you or your staff need further information, please call me at (202) 512-3841. Key contributors to this report were E. Odell Pace, Jill A. Roth, John A. Wanska, and Richard P. Johnson.
Appendix II: Listing of Deviations on Other Than STAR Fellowship Grants. The deviations allowed were as follows:
- Research grantees were allowed to have the budget period of the grants coincide with the project period end date. In some cases, this deviation allowed an extension beyond EPA's regulatory limits.
- State and local grantees were not required to provide 5 percent in nonfederal matching funds.
- Grantees were allowed to incur costs prior to the award of the grants.
- Grantees were allowed to deviate from numerous requirements (40 CFR 35.6230(b) and 40 CFR 35.6250(a); 40 CFR 35.6650(b)(2), (3), and (4)).
- Grantee was not required to include a comparison of (1) the percentage of the project completed to the project schedule; (2) estimated funds spent to date to planned expenditures; and (3) the estimated time and funds needed to complete the work to the time and funds remaining.
- Grantee was allowed to change the scope or objective of the project without prior EPA approval.
- Grantee was not required to submit a list of sites at which it planned to take remedial action.
- Grantee was not required to submit a non-site-specific budget for the support activities funded.
- Grantee was allowed to have the budget period of the grant coincide with the project period.
- Grantee was not required to submit a quality assurance plan.
To determine the activities funded by project grants, we identified EPA project grants and then analyzed automated information, taken from EPA's Grants Information Control System, on grant dollar amounts and grantee type, which we obtained from EPA's Office of Inspector General. To determine how project grants align with EPA's Results Act goals and objectives, we identified goals and objectives for all project grants awarded in fiscal years 1999 and 2000 from the automated data. We interviewed EPA headquarters and regional officials, including individual project grant officers, regarding how goals and objectives are identified in EPA's grant award process.
From a universe of 4,717 grants awarded in fiscal years 1999 and 2000, we selected a random sample of 100 grants. We reviewed supporting documentation for these grants and interviewed cognizant EPA officials to assess whether the funded activities were consistent with the activities for the goal(s) and objective(s) that EPA identified as being supported by the grant. To determine the extent EPA used its authority to deviate from regulations, we reviewed the same 100 randomly selected grants. In cases where deviations occurred, we obtained additional information regarding the reasons for the deviation. We interviewed EPA officials to determine the circumstances and frequency for using deviations in general and for the specific grants we selected.
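The sampling step described above can be sketched in a few lines. The grant identifiers and the fixed seed are our own illustration of a simple random draw without replacement, not the actual frame the review used.

```python
# Sketch of drawing a simple random sample of 100 grants from a universe
# of 4,717. Grant IDs are synthetic; a seeded generator makes the draw
# repeatable for auditability.
import random

universe = [f"grant_{i:04d}" for i in range(4717)]
rng = random.Random(42)               # fixed seed: repeatable draw
sample = rng.sample(universe, 100)    # sampling without replacement

print(len(sample), len(set(sample)))  # 100 distinct grants
```

Sampling without replacement guarantees no grant file is reviewed twice, which is what `random.sample` provides.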
VA has two basic cash disability benefits programs. The compensation program pays monthly benefits to eligible veterans who have service- connected disabilities (injuries or diseases incurred or aggravated while on active military duty). The payment amount is based on the veteran's degree of disability, regardless of employment status or level of earnings. By contrast, the pension program assists permanently and totally disabled wartime veterans under age 65 who have low incomes and whose disabilities are not service-connected. The payment amount is determined on the basis of financial need. VBA and the Board process and decide veterans' disability claims and appeals on behalf of the Secretary. The claims process starts when veterans submit claims to one of VBA's 57 regional offices. (See app. I for the overall flow of claims and appeals processing.) By law, regional offices must assist veterans in supporting their claims. For example, for a compensation claim, the regional office obtains records such as the veteran's existing service medical records, records of relevant medical treatment or examinations provided at VA health-care facilities, and other relevant records held by a federal department or agency. If necessary, the regional office arranges a medical examination for the claimant or obtains a medical opinion about the claim. 
The regional office adjudicator then must analyze the evidence for each claimed impairment (veterans claim an average of about five impairments per claim); determine whether each claimed impairment is service-connected (VA grants service connection for an average of about three impairments per claim); apply VA's Rating Schedule, which provides medical criteria for rating the degree to which each service-connected impairment is disabling (disability ratings can range from zero to 100 percent, in 10-percent increments); determine the overall disability rating that results from the combination of service-connected impairments suffered by the veteran; and notify the veteran of the decision. If a veteran disagrees with the regional office's decision, he or she begins the appeals process by submitting a written Notice of Disagreement to the regional office. During fiscal years 1999-2000, the regional offices annually made an average of about 616,000 decisions involving disability ratings, and veterans submitted Notices of Disagreement in about 9 percent of these decisions. Veterans can disagree with decisions for reasons other than the outright denial of benefits that occurs, for example, in a compensation case when a regional office decides an impairment claimed by a veteran is not service-connected. The veteran also may believe the severity rating assigned to a service-connected impairment is too low and ask for an increase in the rating. In response to a Notice of Disagreement, the regional office provides a further written explanation of the decision, and if the veteran still disagrees, the veteran may appeal to the Board. During fiscal years 1999-2000, about 48 percent of the veterans who filed Notices of Disagreement in decisions involving disability ratings went on to file appeals with the Board. In fiscal year 2001, VBA began nationwide implementation of the Decision Review Officer position in its regional offices.
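The "combination" step is not simple addition: under VA's combined-ratings method (38 C.F.R. 4.25), each additional impairment is applied to the veteran's remaining, non-disabled capacity, and the result is rounded to the nearest 10 percent. The sketch below is our own reading of that arithmetic, not VA's implementation; the function name and rounding rule are assumptions.

```python
# Hedged sketch of the combined-ratings arithmetic in 38 C.F.R. 4.25:
# each service-connected rating reduces remaining capacity
# multiplicatively, and the combined value is rounded to the nearest
# 10 percent to yield the overall disability rating.
def combined_rating(ratings):
    remaining = 100.0
    for r in sorted(ratings, reverse=True):  # apply most disabling first
        remaining *= (100 - r) / 100.0
    combined = 100.0 - remaining
    return int(combined / 10.0 + 0.5) * 10   # round half up to nearest 10

print(combined_rating([50, 30]))  # 50 and 30 combine to 65, rounded to 70
```

For example, ratings of 50 and 30 do not combine to 80: the 30-percent rating applies only to the remaining 50 percent of capacity, giving a combined value of 65, which rounds to 70.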
Now, before appealing to the Board, a veteran may ask for a review by a Decision Review Officer, who is authorized to grant the contested benefits based on the same case record that the regional office relied on to make the initial decision. VBA believes this process will result in fewer appeals being filed with the Board. Located in Washington, D.C., the Board is an administrative body whose members are attorneys experienced in veterans' law and in reviewing benefits claims. The Board's members are divided into four decision teams, with each team having up to 15 Board members and 61 staff attorneys. Each team has primary responsibility for reviewing the appeals that originate in an assigned group of regional offices. Board members' decisions must be based on the law, regulations, precedent decisions of the courts, and precedent opinions of VA's General Counsel. During the Board's appeals process, the veteran or the veteran's representative may submit new evidence and request a hearing. During fiscal years 1999 and 2000, for all VA programs, the Board annually decided an average of about 35,700 appeals, of which about 32,900 (92 percent) were disability compensation cases. The average appealed compensation case contains three contested issues. As a result, in some cases, the Board member may grant the requested benefits for some issues but deny the requested benefits for others. During fiscal years 1999 and 2000, the Board in its initial decisions on appealed compensation cases granted at least one of the requested benefits in about 24 percent of the cases. In some instances, the Board member finds a case is not ready for a final decision and returns (or remands) the case to the regional office to obtain additional evidence and reconsider the veteran's claim. During fiscal years 1999 and 2000, respectively, the Board in its initial decisions on appealed compensation cases remanded 38 percent and 34 percent of the cases. 
After obtaining additional evidence for remanded cases, if the regional office still denies the requested benefits, it resubmits the case to the Board for a final decision. If the Board denies benefits or grants less than the maximum benefit available under the law, veterans may appeal to the U.S. Court of Appeals for Veterans Claims. The court is not part of VA and not connected to the Board. During fiscal years 1999 and 2000, veterans filed appeals with the court in an estimated 10 percent of the Board's decisions. Unlike the Board, the court does not receive new evidence, but considers the Board's decision, briefs submitted by the veteran and VA, oral arguments, if any, and the case record that VA considered and that the Board had available. The court may dismiss an appeal on procedural grounds such as lack of jurisdiction, but in the cases decided on merit, the court may affirm the Board's decision (deny benefits), reverse the decision (grant benefits), or remand the decision back to the Board for rework. During fiscal years 1999 and 2000, the court annually decided on merit an average of about 1,800 appealed Board decisions, and in about 67 percent of these cases, the court remanded or reversed the Board's decisions in part or in whole. Under certain circumstances, a veteran who disagrees with a decision of the court may appeal to the U.S. Court of Appeals for the Federal Circuit and then to the Supreme Court of the United States. In fiscal year 1998, the Board established the first quantitative quality assurance program to evaluate and score the accuracy of its decisions and to collect data to identify areas where the quality of decision-making needs improvement. The accuracy measure used by the Board understates its true accuracy rate because the Board's accuracy rate calculations include certain deficiencies that would not result in either a reversal or a remand by the court.
Even so, the Board's quality assurance program does not capture certain data that potentially could help improve the quality of the Board's decisions. Such data include information identifying the specific medical issues involved in cases where a disability decision was judged as being in error. Having such data could enhance the Board's ability to target training for its decision makers. On the basis of the results of the quality assurance program it established in fiscal year 1998, the Board estimated that 89 percent of its decisions were accurate (or "deficiency-free"). Using these results as a baseline, VA established performance accuracy goals for the Board. One of the Board's strategic performance goals is to make deficiency-free decisions 95 percent of the time. To calculate its estimated overall accuracy rate, the Board does quality reviews of selected Board decisions. We reviewed the Board's methods for selecting random samples and calculating accuracy rates and concluded that the number of decisions reviewed by the Board was sufficient to meet the Board's goal for statistical precision in estimating its accuracy rate. However, we brought to the Board's attention some issues that caused the Board to fall short of proper random sampling and accuracy rate calculation methods, such as not ensuring that decisions made near the end of the fiscal year are sampled or that the results from quality reviews are properly weighted in the accuracy rate calculation formula. We do not believe the overall accuracy rate reported by the Board for fiscal year 2001 would have been materially different if these methodological issues had been corrected earlier; however, if not corrected, these issues potentially could lead to misleading accuracy rate calculations in the future. The Board agreed in principle to correct these issues. As of June 2002, the Board had not yet instituted corrective actions. 
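The sampling and weighting issues GAO raised can be made concrete with a short sketch; the strata and figures below are invented for illustration and are not the Board's actual review data. If decisions made near the end of the fiscal year are sampled at a different rate than earlier decisions, pooling the raw sample results skews the estimate toward the oversampled stratum unless each stratum's results are weighted by its share of all decisions issued.

```python
# Illustrative strata: (label, decisions issued, decisions sampled, sample errors).
# All figures are invented for this sketch -- not the Board's actual data.
strata = [
    ("Oct-Jun decisions", 24_000, 480, 48),  # 10% error rate in sample
    ("Jul-Sep decisions", 8_000, 80, 16),    # 20% error rate in sample
]

total_issued = sum(issued for _, issued, _, _ in strata)

# Naive pooled accuracy: ignores that strata were sampled at different rates.
sampled = sum(s for _, _, s, _ in strata)
errors = sum(e for _, _, _, e in strata)
naive_accuracy = 1 - errors / sampled

# Weighted accuracy: each stratum's accuracy is weighted by its share of
# all decisions issued, as proper stratified estimation requires.
weighted_accuracy = sum(
    (issued / total_issued) * (1 - e / s) for _, issued, s, e in strata
)

print(f"pooled:   {naive_accuracy:.3f}")     # 0.886
print(f"weighted: {weighted_accuracy:.3f}")  # 0.875
```

With these invented figures the pooled rate overstates accuracy by about a percentage point; the direction and size of the bias depend entirely on which strata are over- or undersampled, which is why GAO flagged the issue even though the fiscal year 2001 effect appeared immaterial.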
According to VA's performance reports, the Board has come close but has not achieved its annual interim goals for accuracy (see table 1). However, in calculating its reported accuracy rates, the Board includes deficiencies that are not "substantive"--that is, they would not be expected to result in either a remand by the court or a reversal by the court. Consequently, the reported accuracy rates understate the Board's level of accuracy that would result if only substantive deficiencies were counted in the calculation. Under its quality assurance program, the Board's quality reviewers assess the accuracy of selected decisions on the basis of six critical areas (see table 2). One error (or deficiency) in any of these six areas means that a decision fails the quality test. However, according to the Board, all six areas would include certain deficiencies that are not substantive. In particular, according to the Board, most deficiencies in the "format" category are not substantive. In fiscal year 2001, the format category accounted for about 38 percent of all recorded deficiencies. At our request, the Board recalculated its accuracy rate for fiscal year 2001, excluding format deficiencies, and the resulting accuracy rate was 92 percent, as compared with the reported accuracy rate of 87 percent. Excluding all other nonsubstantive deficiencies presumably would have resulted in an even higher accuracy rate. In contrast with the Board, beginning in fiscal year 2002, VBA no longer includes nonsubstantive deficiencies in its accuracy rate calculations; however, it continues to monitor them. VBA took this action based on a recommendation by the 2001 VA Claims Processing Task Force, which said that mixing serious errors with less significant deficiencies can obscure what is of real concern. The Board's quality review program subdivides the six critical areas shown in table 2 into 31 subcategories. 
For example, if a quality reviewer classifies an error as stemming from "reasons and bases," the reviewer must then indicate whether the error was due to misapplying legal authority, failing to apply appropriate legal authority, using an incorrect standard of proof, or providing an inadequate explanation for the decision. This information is recorded in the Board's quality review database, providing the Board with data that can be analyzed to identify training needed to improve quality. However, the Board does not record in its quality review database any information on the specific issue that prompted the appeal (such as whether a disability is service-connected) or the specific medical impairment to which an error is related. For example, a quality reviewer might find an error in a Board decision for an appeal that involved four separate medical impairments--two for which the veteran had requested service connection and two others for which he had requested a disability rating increase. On the basis of information that the quality review database currently captures, however, the Board could not determine which of the four impairments the error was related to, nor could the Board determine whether the error was related to a request for service connection or an increased disability rating. This is not the case, however, for Board decisions remanded by the Court of Appeals for Veterans Claims. For these cases, the Board maintains a separate database with information on the reasons that the court remands decisions back to the Board for rework. For each issue that the court remands in a compensation case, the Board records in the database such information as: (1) whether the issue involved a request for service connection or an increased rating, (2) the diagnostic code of the impairment involved in each issue, and (3) the reason for the remand.
According to Board officials, being able to analyze the court's reasons for remands by type of decisional issue and type of impairment enhances the Board's ability to reduce remands from the court through appropriate training. VBA and the Board recognize that in some cases, different adjudicators reviewing the same evidence can make differing judgments on the meaning of the evidence, without either decision necessarily being wrong. In such cases, VBA and the Board instruct quality reviewers not to record an error. A hypothetical case provided by the Board furnishes an example. In this case, a veteran files a claim in 1999 asserting he suffered a back injury during military service but did not seek medical treatment at that time. One of the veteran's statements says he injured his back during service in 1951, but another says he injured his back in 1953. An adjudicator may find that this discrepancy in dates adversely affects the claimant's credibility about whether an injury actually occurred in service, but the quality reviewer may consider the discrepancy to be insignificant. Where such judgments are involved, the Board's and VBA's quality review programs recognize that variations in judgment are to be expected and are acceptable as long as the degree of variation is within reason. (App. II provides other examples of difficult judgments that could result in decision-making variations and explains VA's "benefit-of-the-doubt" rule.) The Board and VBA, however, differ in their approaches to collecting information about cases where this type of variation occurs. In such instances, the Board's quality reviewers note why they believe an alternative decision could have been made and send the explanation to the deciding Board member. However, they do not enter any of this information in the quality review database.
In contrast, VBA recently instructed its quality reviewers to enter such information in the VBA quality review database, even though no error is recorded in the database. VBA believes that by identifying and analyzing cases in which quality reviewers believed the adjudicator's judgment was pushing against the boundary of reasonableness, it potentially can identify opportunities to improve the quality of decision making by improving training. Even though evidence suggests decision making across regional office and Board adjudicators may not be consistent, VA does not systematically assess decision-making consistency to determine the degree of variation that occurs for specific impairments and to provide a basis for identifying steps that could be taken, if considered necessary, to reduce such variation. In its 2003 performance plan, VA acknowledged that veterans are concerned about the consistency of disability claims decisions across the 57 regional offices. In a nationwide comparison, VBA projected in its fiscal year 2001 Annual Benefits Report that the average compensation payments per disabled veteran in fiscal year 2002 would range from a low of $5,783 in one state to a high of $9,444 in another state. According to a VBA official, this disparity in average payments per veteran might be due in part to demographic factors such as differences in the average age of veterans in each state. However, this disparity in average payments per veteran also raises the possibility that when veterans in the same age group submit claims for similar medical conditions, the regional office in one state may tend to give lower disability ratings than the regional office in another state.
Indeed, in 1997, the National Academy of Public Administration reviewed disability claims processing and said VA needed to identify the degree of decision-making variation expected for specific medical issues, set consistency standards, and measure the level of consistency as part of the quality review process or through testing of control cases in multiple regional offices. Furthermore, in 2001, VA's Claims Processing Task Force said there was an apparent lack of uniformity among regional offices in interpreting and complying with directives from VA headquarters and that VA's regulations and the procedures manual for regional offices were in dire need of updating. The task force concluded that there was no reasonable assurance that claims decisions would be made as uniformly and fairly as possible to the benefit of the veteran. Even though such concerns and issues exist, VA does not systematically assess the decision-making consistency of regional office adjudicators. Similarly, VA does not assess consistency between decisions made by regional offices and the Board even though evidence suggests this issue may warrant VA's attention. Because veterans may submit new evidence during the appeals process, one might assume that the Board generally grants benefits denied by regional offices due to the impact of such new evidence. However, an analysis in 1997 of about 50 decisions in which the Board had granted benefits previously denied by regional offices yielded a different viewpoint. Staff from both VBA and the Board reviewed these cases and concluded that most of these Board decisions to grant benefits had been based on the same evidence that the regional offices had considered in reaching their decisions to deny benefits. The reviewers characterized the reason for the Board members' decisions to grant benefits as a difference of opinion between the Board members and regional office adjudicators in the weighing of evidence.
Furthermore, even in remanded compensation cases for which regional offices have obtained new evidence in accordance with the Board's remand instructions and then again denied the benefits, the Board generally has granted benefits in about 26 percent of these cases after they have been resubmitted for a final decision. This seems to indicate that, in these particular cases, Board members in some way differed with regional office adjudicators on the impact of the new evidence obtained by the regional offices before resubmitting the remanded cases to the Board. Available evidence also provides indications that the issue of variations in decision making among the Board members themselves may warrant VA's attention in studies of consistency. Historically, there have been variances in the rates at which the Board's four decision teams have remanded decisions to regional offices for rework. No systematic study has been done to explain the variances in remand rates. Board officials said that it is their perception that the remand rates vary among the Board's decision teams because the quality of claims processing varies among the regional offices for which each team is responsible. Similar concerns about consistency of claims adjudication in the Social Security Administration (SSA) have prompted SSA to begin taking steps to assess consistency issues in its disability program. As we reported in 1997, SSA's primary effort to improve consistency has focused on decision-making variations between its initial and appellate levels. To gather data on variations between these two levels, SSA instituted a system in 1993 under which it selects random samples of final decisions made by administrative law judges and reviews the entire decisional history of each case at both the initial and appellate levels. The reviewers examine adjudicative and procedural issues to address broad program issues such as whether a claim could have been allowed earlier in the process.
Data captured through this system have provided a basis for taking steps to clarify decision-making instructions and provide training designed to improve consistency between the initial and appellate levels. However, no systematic evaluations have been done to determine the effectiveness of these actions. In its January 2001 disability management plan, SSA said that it needed to take further steps to promote uniform and consistent disability decisions across all geographic and adjudicative levels. Opportunities exist to improve the quality of the Board's reporting of accuracy and decision making. The Board includes nonsubstantive deficiencies in its accuracy rate calculation. By doing so, the Board may be obscuring what is of real concern. In addition, the Board's quality assurance database does not capture data on specific medical disability issues related to the reasons for errors found in Board decisions. Also, in contrast with VBA, the Board's quality assurance program does not collect information on cases in which quality reviewers do not charge errors but have differences of opinion with judgments made by Board members. We believe that analysis of such data could lead to improvements in quality through improved training or by clarifying regulations, procedures, and policies. Furthermore, because variations in decision making are to be expected due to the difficult judgments that adjudicators often must make, one must ask the questions: For a given medical condition, how much variation in decision making exists and does the degree of variation suggest that VA should take steps to reduce the level of variation? VA, however, does not assess variation in decision making. 
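The effect of counting nonsubstantive deficiencies can be illustrated with a hedged sketch; the sample decisions and deficiency categories below are hypothetical, not the Board's fiscal year 2001 review data. Recomputing the pass rate with a chosen category excluded shows how the headline accuracy rate shifts, much as the Board's reported rate moved from 87 to 92 percent when format deficiencies were excluded.

```python
# Hypothetical quality-review results; category names mirror the report's
# discussion, but the sample itself is invented for illustration.
reviews = [
    {"case": 1, "deficiencies": []},
    {"case": 2, "deficiencies": ["format"]},
    {"case": 3, "deficiencies": ["reasons and bases"]},
    {"case": 4, "deficiencies": ["format", "due process"]},
    {"case": 5, "deficiencies": []},
]

def accuracy(reviews, excluded=frozenset()):
    """Share of decisions with no deficiency outside the excluded set."""
    passing = sum(1 for r in reviews if not (set(r["deficiencies"]) - excluded))
    return passing / len(reviews)

print(accuracy(reviews))                       # 0.4: every deficiency counts
print(accuracy(reviews, excluded={"format"}))  # 0.6: format deficiencies excluded
```

Note that a decision with both a format deficiency and a substantive one (case 4 here) still fails after the exclusion, which is why excluding a category that accounts for 38 percent of deficiencies raises the rate by less than 38 points.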
None of the quality review efforts of either VBA or the Board are designed to systematically assess the degree to which veterans with similar medical conditions and circumstances may be receiving different decisional outcomes or to help identify steps that could reduce such variation if necessary. Without ongoing systematic assessments of consistency across the continuum of decision making, VA cannot adequately assure veterans that they can reasonably expect to receive consistent treatment of their claims across all decision-making levels in VA. We recognize that our recommendations will have to be implemented within the context of VA's current major efforts to reduce a large and persistent backlog of disability claims and appeals and to reduce the average processing time. Nevertheless, we believe it is critical that VA take the necessary steps to support improvements in training and in regulations, procedures, or policies that could enhance the quality of disability decision making across the continuum of adjudication and to help provide adequate assurance to veterans that they will receive consistent and fair decisions as early as possible in the process. Indeed, maintaining and improving quality should be of paramount concern while implementing a major effort to reduce backlogs and processing time. Accordingly, we recommend that the Secretary of VA direct the Chairman of the Board of Veterans' Appeals to: Revise the quality assurance program so that, similar to VBA, the calculation of accuracy rates will take into account only those deficiencies that would be expected to result in a reversal of a Board decision by the U.S. Court of Appeals for Veterans Claims or result in a remand by the court. 
Revise the Board's quality assurance program to record information in the quality review database that would enable the Board to systematically analyze case-specific medical disability issues related to specific errors found in Board decisions in the same way that the Board is able to analyze the reasons that the court remands Board decisions. Monitor the experience of VBA's quality assurance program in collecting and analyzing data on cases in which VBA's quality reviewers do not record errors but have differences of opinion with regional office adjudicators in the judgments made to reach a decision. If VBA finds that the analysis of such data helps identify training that can improve the quality of decision making, the Board should test such a process in its quality assurance program to assess whether it would enable the Board to identify training that could improve the quality of Board decisions. We also recommend that the Secretary direct the Under Secretary for Benefits and the Chairman of the Board of Veterans' Appeals to jointly establish a system to regularly assess and measure the degree of consistency across all levels of VA adjudication for specific medical conditions that require adjudicators to make difficult judgments. For example, VA could develop sets of hypothetical claims for specific medical issues, distribute such hypothetical claims to multiple adjudicators at all decision-making levels, and analyze variations in outcomes for each medical issue. Such a system should provide data to determine the degree of variation in decision making and provide a basis to identify ways, if considered necessary, to reduce such variation through training or clarifying and strengthening regulations, procedures, and policies. Such a system should also assess the effectiveness of actions taken to reduce variation. 
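One hedged way to quantify the variation such a system would measure is sketched below; the ratings are invented for illustration and assume the hypothetical-claims approach described above, in which many adjudicators decide the same claim. Simple summary statistics, such as agreement with the modal outcome and the spread of assigned ratings, would give VA a baseline against which to judge whether variation for a given medical condition is within reason.

```python
from statistics import mode, pstdev

# Disability ratings (percent) assigned by ten adjudicators who reviewed
# the same hypothetical claim; the ratings are invented for illustration.
ratings = [30, 30, 40, 30, 50, 30, 40, 30, 30, 40]

modal = mode(ratings)                                        # most common outcome
agreement = sum(r == modal for r in ratings) / len(ratings)  # share at the mode
spread = pstdev(ratings)                                     # dispersion in points

print(f"modal rating: {modal}%")         # 30%
print(f"agreement:    {agreement:.0%}")  # 60%
print(f"spread (SD):  {spread:.1f}")     # 6.7
```

Tracked over time and by medical condition, these measures would also show whether training or clarified regulations actually reduce variation, which is the effectiveness assessment the recommendation calls for.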
If departmental consistency reviews reveal any systematic differences among VA decision makers in the application of disability law, regulations, or court decisions, the Secretary should, to the extent that policy clarifications by VBA cannot resolve such differences, direct VA's General Counsel to resolve these differences through precedent legal opinions if possible. We received written comments on a draft of this report from VA (see app. III). In its comments, VA concurred fully or in principle with our recommendations. With regard to our first recommendation, VA said that the Board intends to revise its quality review system to count only substantive errors for computational and benchmarking purposes but will continue to track all errors. On the basis of VA's comments, we also modified the report to accurately reflect the standard of review employed by the U.S. Court of Appeals for Veterans Claims in reviewing Board decisions. With regard to our second recommendation, VA said that it would use its Veterans Appeals Control Locator System to gather information on case-specific medical disability issues related to specific errors found in Board decisions. VA questioned our basis for concluding that tracking such information will yield useful data for improving the adjudication system. As stated in the draft report, we based our recommendation on the fact that the Board has already concluded that such information is beneficial for analyzing the reasons for remands from the Court of Appeals for Veterans Claims. With regard to our third recommendation, VA said representatives of the Board and VBA will meet so that a system may be established for the Board to access and review VBA's methodology for assessing, reporting, and evaluating instances of "difference of opinion" between the quality reviewer and the decision maker. In its comments, VA concurred in principle with our fourth recommendation. 
VA agreed that consistency is an important goal and acknowledged that it has work to do to achieve it. However, VA was silent on how it would measure consistency for specific medical conditions that require adjudicators to make difficult judgments. Instead, VA described the kinds of actions underway that it believes will generally reduce inconsistency. While we support these efforts, we maintain that without a way to evaluate and measure consistency, VA will be unable to determine the extent to which such efforts actually improve consistency of decision-making across all levels of VA adjudication now and over time. Neither will VA have information needed to identify ways to reduce decision-making variations for specific medical conditions, if considered necessary. As agreed with your office, unless you publicly announce its contents earlier, we plan no further distribution of this report until 30 days after its issue date. At that time, we will send copies of this report to the Secretary of the Department of Veterans Affairs, appropriate congressional committees, and other interested parties. We will also make copies available to others upon request. In addition, the report will be available at no charge on the GAO Web site at http://www.gao.gov. If you have questions about this report, please call me on (202) 512-7101 or Irene Chu on (202) 512-7102. Other key contributors were Ira Spears, Steve Morris, Patrick diBattista, and Mark Ramage.

57 Regional Offices Decide Claims and Notify Veterans of Decisions (estimated disposition of 100,000 compensation claims filed with regional offices):
1. Veterans either agree with regional office decisions or take no further action in 90,880 cases.
2. Veterans submit Notices of Disagreement to regional offices in 9,120 cases. In 3,657 of these cases, veterans go on to file appeals with the Board.

Board of Veterans' Appeals: Board Members Review Regional Office Decisions Appealed by Veterans (estimated disposition of 3,657 compensation cases appealed to the Board):
3. Board remands 1,311 cases to regional offices to develop further evidence and reconsider their decisions (Board remands 211 of these cases twice).
4. Board grants at least one requested benefit in 1,153 cases (Board makes 269 of these grants after regional offices resubmit remanded cases).
5. Board denies all benefits in 1,956 cases (Board makes 494 of these denials after regional offices resubmit remands).
6. Regional offices obtain more evidence but deny requested benefits in 839 cases and resubmit these cases to the Board for a final decision (of the 211 cases remanded twice, regional offices deny benefits in 135 and resubmit them to the Board).
7. Regional offices obtain more evidence and grant requested benefits in 339 cases (47 of these 339 grants occur after the second remand).
8. Veterans withdraw or regional offices close 209 cases (28 of these 209 withdrawals or closures occur after the second remand).
9. Veterans appeal 307 cases to the U.S. Court of Appeals for Veterans Claims.

U.S. Court of Appeals for Veterans Claims: Court Reviews Board Decisions Appealed by Veterans (estimated disposition of 307 compensation cases appealed to the court):
10. Court dismisses 74 cases on procedural grounds.
11. Court affirms Board decisions in whole in 77 cases (all requested benefits denied).
12. In whole or in part, Court reverses Board decisions (grants requested benefits) or remands Board decisions in 156 cases.
13. Veterans appeal a number of these decisions to the U.S. Court of Appeals for the Federal Circuit, and the Board reworks the cases remanded by the court.

The estimated disposition by VA's regional offices of the 100,000 claims (in boxes 1 and 2) is based on data for claims involving disability ratings for fiscal years 1997 to 2000.
During those years, veterans submitted Notices of Disagreement in about 9 percent of the regional office decisions and went on to file appeals with the Board in about 40 percent of the cases in which they had submitted such notices. On the basis of Board data for fiscal years 1999 and 2000, in its initial decisions on appealed compensation cases, the Board: (1) granted at least one of the requested benefits in about 24 percent of the cases, (2) denied all requested benefits in about 40 percent of the cases, and (3) remanded about 36 percent of the cases to regional offices for rework. After obtaining the additional evidence required by the Board for remanded cases, the regional offices granted requested benefits in about 22 percent of the remanded cases and denied requested benefits in 64 percent of the cases. After regional offices resubmitted denied cases to the Board for a final decision, the Board granted at least one of the requested benefits in about 26 percent of the cases, denied all benefits in about 49 percent, and remanded about 25 percent once again to regional offices for further rework. For this illustration, we assumed that the Board did not remand a case more than two times. The estimate of 307 cases appealed to the U.S. Court of Appeals for Veterans Claims (in box 9), the court's estimated disposition of these 307 cases (in boxes 10, 11, and 12), and the estimated number of decisions appealed to the U.S. Court of Appeals for the Federal Circuit (in box 13) are based on fiscal years 1999 and 2000 data from the court's annual reports.

Appendix II: Board of Veterans' Appeals Illustrations of Difficult Judgments Resulting in Decision-Making Variations

Examples of difficult judgments: To be granted benefits for post-traumatic stress disorder, a veteran's claim must have credible evidence that a stressor occurred during military service.
Assume the record shows a claimant served in Vietnam as a supply specialist, and he identified mortar attacks as a stressor. Reports prepared by his military unit in Vietnam indicate a single enemy mortar attack occurred where the claimant was stationed. The claimant's testimony was vague about the number and the time of the attacks. One adjudicator may rely on the unit's reports and conclude the claimant engaged in combat and is entitled to have his lay statements accepted without further corroboration as satisfactory evidence of the in-service stressor. Another adjudicator may conclude that the claimant is not credible as to exposure to enemy fire and require other credible supporting evidence that the in-service stressor actually occurred. Assume an appeal for either service connection or a higher disability rating has two conflicting medical opinions, one provided by a medical specialist who reviewed the claim file but did not actually examine the veteran and a second opinion provided by a medical generalist who reviewed the file and examined the veteran. One adjudicator could assign more weight to the specialist's opinion, while another could find the generalist's opinion to be more persuasive. Thus, depending on which medical opinion is given more weight, one adjudicator could grant the claim and the other deny it. Yet, a third adjudicator could find both opinions to be equally probative and conclude that VA's "benefit-of-the-doubt" rule requires that he decide in favor of the veteran's request for either service-connection or a higher disability rating. Under the benefit-of-the-doubt rule, if an adjudicator concludes that there is an approximate balance between the evidence for and the evidence against a veteran's claim, the adjudicator must decide in favor of the veteran. The Rating Schedule does not provide objective criteria for rating the degree to which certain spinal impairments limit a claimant's motion. 
The adjudicator must assess the evidence and draw a conclusion as to whether the limitation of motion falls into one of three severity categories: "slight, moderate, or severe." Similarly, in assessing the severity of incomplete paralysis, the adjudicator must draw a conclusion as to whether the veteran's incomplete paralysis falls into one of three severity categories: "mild, moderate, or severe." In each case, each severity category in itself encompasses a range of severity, and the judgment as to whether a claimant's condition is severe enough to cross over from one severity range into the next could vary in the minds of different adjudicators. The Rating Schedule provides a formula for rating the severity of a veteran's occupational and social impairment due to a variety of mental disorders. However, the formula actually is a nonquantitative, behaviorally oriented framework for guiding adjudicators in making judgments and drawing conclusions as to which of the following characterizations best describes the degree to which a claimant is occupationally and socially impaired: (1) totally impaired; (2) deficient in most areas, such as work, school, family relations, judgment, thinking, or mood; (3) reduced reliability and productivity; (4) occasional decrease in work efficiency and intermittent periods of inability to perform occupational tasks; (5) mild or transient symptoms that decrease work efficiency and ability to perform occupational tasks only during periods of significant stress, or symptoms that can be controlled by continuous medication; and (6) not severe enough to interfere with occupational or social functioning or to require continuous medication.

For fiscal year 2002, the Department of Veterans Affairs (VA) will pay $25 billion in cash disability benefits to 3.3 million disabled veterans and their families. Veterans who are dissatisfied with the decisions of VA's 57 regional offices may file appeals with VA's Board of Veterans' Appeals.
In about half of such appeals, the Board has either granted the benefits denied or returned the cases to regional offices for rework. Additionally, VA reported an accuracy rate of less than 70 percent for regional office disability decisions when it tested a new quality assurance program in fiscal year 1998. When the Board itself denies benefits, veterans may appeal to the U.S. Court of Appeals for Veterans Claims. In over half of these appeals, the court has either granted the benefits denied by the Board or returned the decisions to the Board for rework. In fiscal year 1998, the Board of Veterans' Appeals established a quantitative evaluation program to score its decision-making accuracy and collect data to improve decision making. The accuracy measure used by the Board understates its true accuracy rate because the calculations include certain deficiencies, such as errors in a written decision's format, which would not result in either a reversal or a remand by the court. VA does not assess the consistency of decision making across regional office and Board disability adjudicators even though VA acknowledges that in many cases adjudicators of equal competence could review the same evidence but render different decisions. Although available evidence indicates that variations in decision making occur across all levels of VA adjudication, VA does not conduct systematic assessments to determine the degree of variation that occurs for specific impairments and to provide a basis for determining ways to reduce such variations.
The Rocky Mountain Arsenal, established in 1942, occupies 17,000 acres northeast of Denver, Colorado, and is contaminated from years of chemical and weapons activities. The Army manufactured chemical weapons, such as napalm bombs and mustard gas, and conventional munitions until the 1960s and destroyed weapons at the Arsenal through the early 1980s. In addition, it leased a portion of the Arsenal to Shell Oil Company from 1952 to 1987 to produce herbicides and pesticides. The Arsenal was placed on the Environmental Protection Agency's (EPA) National Priorities List, the list of the nation's most heavily contaminated sites, in July 1987. More than 300 species of birds, mammals, amphibians, reptiles, and fish can be found on the installation. Once the EPA certifies the cleanup is complete, the Arsenal is to become a national wildlife refuge managed by the Fish and Wildlife Service. Refuge management activities are already underway. (App. I shows the key physical features of the Arsenal.) Waste disposal practices used by the Army and Shell in the past have resulted in extensive soil and groundwater contamination. Some of the common contaminants include nerve agents, diisopropyl methyl phosphorate (DIMP), and the pesticides dieldrin and aldrin. Other contaminants include heavy metals, such as arsenic, lead, chromium, and mercury, and volatile organic compounds, such as benzene, toluene, and xylene. The 209 contaminated sites on the Arsenal are divided into on-post and off-post segments. The on-post sites include all contaminated structures, water, and soil within the boundaries of the Arsenal. The off-post sites include a region north of the Arsenal requiring cleanup because of migrating groundwater contamination. Cleanup at the Arsenal is subject to the legal requirements of the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) of 1980, as amended (42 U.S.C. 9601); the Resource Conservation and Recovery Act of 1976, as amended (42 U.S.C. 
6901); and state laws. (See app. II for a description of the CERCLA process.) The Army is in charge of the cleanup under a Federal Facility Agreement, which was signed in 1989. The signatories include the Army; Shell Oil Company; the EPA; and the Departments of Justice, the Interior, and Health and Human Services. The agreement established a framework for cleanup and a process to resolve formal disputes among the parties. However, the state of Colorado was not a party to the Federal Facility Agreement because of litigation with the Army and Shell. A court-appointed mediator facilitated negotiations between the parties over several years. The recent conceptual agreement for cleaning up Rocky Mountain Arsenal may mark a turning point in years of conflict that has slowed the implementation of permanent cleanup remedies and increased costs. According to Army, EPA, state of Colorado, and Shell officials, long-standing disagreements and extensive studies have diverted key staff and contractors away from the cleanup program and driven costs up. In the 20 years since the installation restoration program began, the Army and Shell have spent about $1 billion to study and control the environmental damage. The majority of the cost has been for studying the site and resolving disagreements. Totaling $354 million as of December 1994, the Arsenal's study phase is the costliest in the history of DOD's cleanup program. However, about $316 million was spent on interim remediation projects to cut off contamination pathways. These actions may contribute significantly to permanent solutions. (App. III contains a time line of the Arsenal's installation restoration program.) The most recent delay in adopting a cleanup plan for the Arsenal was caused by disagreements over cost-effectiveness and alternative cleanup remedies. 
EPA's and the state of Colorado's initial cleanup proposals were estimated to cost about $2.7 billion; Shell Oil Company's was $1.6 billion; and the Army's was in the middle, at about $2.1 billion. According to officials from the Army, EPA, and the state of Colorado, the 2-year debate involved how to clean up contaminated soils on the Arsenal and contaminated water off the Arsenal. All parties agreed that soils should remain on-site, because moving them off-site would be prohibitively expensive. However, while the Army and Shell suggested that untreated soils be capped in place to prevent the spread of contaminants, EPA and the state suggested that contaminated soils should be treated to neutralize them, before they are capped or placed in a landfill. The key off-post issue involved groundwater quality standards for water contaminated with DIMP, a by-product of nerve agent production. In 1993, the state promulgated a drinking water standard of 8 parts per billion. The Army and Shell wanted to continue to pump and treat the water to meet EPA's health advisory of 600 parts per billion, while the state wanted the Army to provide the residents with an alternative water supply. Largely due to the volume of lawsuits, formal disputes, and other disagreements, the Rocky Mountain Arsenal has experienced the costliest study phase in DOD's history. According to DOD reports, the Arsenal's study costs represent at least 16 percent of the Army's total study costs for about 1,200 installations. The Arsenal's study phase began more than 20 years ago and was completed recently, in October 1995, when the Army requested public comment on its preferred remedy. As of December 1994, Shell and the Army had spent approximately $354 million on studies, which represents about 37 percent of the total costs incurred by Shell and the Army at the Arsenal. Figure 1 shows shared cleanup costs by category. 
(Figure 1 categories: historical cost (1975-87); Shell contribution remaining in special account; total: $961 million.) Over 400 studies have been conducted at the Arsenal since 1983. Approximately 14,000 samples were taken and 230 reports were produced during the study phase. Although the complexity of the site warranted study, according to Army, EPA, and state officials, the litigation and other disputes encouraged excessive and duplicative studies. For example, had the parties come to an earlier agreement on the installation's future use and on levels of ecological standards, some of the studies might have been avoided. Relationships among the key parties have been strained by differences throughout the history of the cleanup program, but particularly since 1983 when two major lawsuits were filed. The Army sued Shell, and the state of Colorado sued the Army and Shell to recover compensation for natural resource damages and cleanup costs. The state sued the Army again in 1986 to enforce regulatory authority over parts of the cleanup. Although the Army and Shell settled their suit in 1988, the first Colorado case has not yet been resolved and the second case went to the U.S. Supreme Court. In January 1994, the Supreme Court refused to hear the case, letting stand the lower court's decision in favor of Colorado's jurisdiction. The key parties' exhaustive efforts to resolve their legal disputes involved 7 years of assistance from a court-appointed mediator. (See app. IV for a detailed chronology of major legal actions involving Rocky Mountain Arsenal.) In addition to the lawsuits, more than 140 issues have been taken to formal dispute since 1987 under the Federal Facility Agreement, which allows the parties to dispute Army decisions. Disputes have been triggered by a variety of technical issues, often requiring further studies to resolve the controversy. For example, the parties disagreed about what level of dieldrin is considered safe in soil.
The Army, EPA, and Shell have all conducted and evaluated studies on this issue, yielding different results and reaching different conclusions. This dispute was invoked in December 1987 and is still not resolved. According to Army, EPA, and state officials, study results are particularly sensitive because precedents set at the Arsenal could potentially have ramifications for Shell Oil Company at its other locations. Although final cleanup has not begun, the Army and Shell have made efforts to mitigate the most critical threats at the Arsenal. As of December 1994, they had spent about $316 million on source control and interim actions designed to provide immediate containment or treatment of some of the more highly contaminated areas. Early assessments, conducted between 1975 and 1985, identified ways to minimize the potential for exposure to and migration of contaminants. Resulting projects included the installation of three groundwater treatment systems at the Arsenal's boundary, the closure of an abandoned well, and the removal of sewer lines known to be a source of soil and groundwater contamination. Building on earlier source control efforts, the Army began its interim actions in 1986 to control immediate problems while the final cleanup solutions were being determined. The resulting 14 interim actions were designed to be consistent with long-term comprehensive cleanup on and off the Arsenal. Two of these, the incineration of liquid waste from the Arsenal's major disposal basin and the removal of asbestos, have permanently removed the hazardous materials. Table 1 shows, for each of the 14 actions, the start date, actual or estimated completion date, and the actual or estimated cost as of December 1994. If the parties are successful in adopting the on- and off-post cleanup plans as expected in 1996, the final cleanup can begin. The conceptual agreement reached in June 1995 resolved the major disputes and outlined a $2.1-billion cleanup to be completed in 2012. 
However, the current cost and completion targets may be overly optimistic given remaining uncertainties about the final details. In addition, costs have significantly increased over time at the Arsenal. According to the conceptual agreement, the parties are expected in 1996 to adopt a final cleanup plan or record of decision for a $2.1-billion cleanup effort. Although most of the cleanup is expected to be accomplished by 2012, groundwater treatment and monitoring will continue for at least 30 years. The conceptual agreement resolves the two most significant disputes among the parties, regarding contaminated soils on site and contaminated groundwater off site. The parties agreed that a portion of basin F, the most contaminated of the basins, will be solidified in place through a technique that binds the soil together to minimize the release of contaminants but does not destroy them. Contaminated soil excavated from the basin in 1988 will be removed from the basin area and contained, along with other highly contaminated portions of the Arsenal, in a hazardous waste landfill. The basin will then be capped. The parties also agreed on demolition and on-site disposal for buildings in the manufacturing areas. Structures with high levels of contamination, such as agent residues, may be treated to reduce the contamination before they are placed in the landfill. Structural debris that is uncontaminated or has low levels of contamination will not be disposed of in the landfill; it will be consolidated in the other major basin, basin A, and capped. Regarding off-site contaminated groundwater, the parties agreed to continue operating existing groundwater treatment systems at the Arsenal's boundary, where the water will be treated to meet Colorado's groundwater standard of 8 parts per billion of DIMP. The Army and Shell will also supply clean water to residents living near the Arsenal's boundaries. 
The parties agreed in concept on a $2.1-billion cleanup, but until the record of decision is finalized, the cost and time frame estimates remain uncertain. The cleanup estimate reported to Congress just prior to the June settlement called for $2.3 billion in appropriated funds, in addition to Shell's $500-million share, for a total of $2.8 billion. According to Army officials, the $2.8 billion represented a reduction from a $3.6-billion estimate prepared just 2 months earlier. The Army did not have a detailed analysis at the time of our fieldwork that explained how the conceptual agreement reduced the estimate to $2.1 billion. The Army expects to complete its analysis for the May 1996 record of decision. The Army's projected cost estimates and cleanup dates have changed significantly since 1984. The $2.1 billion estimated for the conceptual agreement is 10 times greater than the best case estimate released a decade ago. The 1984 projections of a record of decision by 1990 and cleanup by 2000 are now estimated for 1996 and 2012, respectively. The cost and completion schedules recently established could be affected by numerous uncertainties. Budget limitations that reduce the scope or extend the life of the cleanup, cleanup complications, and evolving standards could drive up costs and extend time frames. In July 1994, we reported Army officials' concern that stricter state standards could increase cleanup costs at the Rocky Mountain Arsenal by at least $1 billion. Although the conceptual agreement should make this less likely, Army officials noted continuing uncertainties regarding the scope of the state's regulatory authority. In addition, the Army's $2.1-billion cleanup estimate does not include an estimated $200 million for inflation, or costs of long-term operations and maintenance for the off-post treatment facility. 
Under the cost-sharing agreement between the Army and Shell, Shell's share of cleanup costs decreases on a sliding scale from 50 percent to 20 percent as total costs increase. The agreement was reached in 1989, when the cost estimates were lower than now. According to officials from the Army, EPA, and the Department of Justice, the formula was based on the best available knowledge of risk and damages at the time. However, Shell's share of total costs has dropped significantly as cleanup costs exceeded the early estimates; the current estimate is more than 3 times higher than estimated at the time of the settlement. According to Arsenal and Shell officials, the Army will pay about $1.6 billion, and Shell about $500 million toward the $2.1 billion cleanup. When the permanent cleanup begins, Shell's 20 percent share of the costs will be significantly less than its share of remaining contaminants. Because its operations contributed to the contamination problem, Shell agreed to pay a portion of the cleanup costs. The cost-sharing formula divides cleanup costs equally between the Army and Shell for the first $500 million of allocable or shared costs, but then reduces Shell's share to 35 percent of the next $200 million of these costs, and 20 percent of all allocable costs exceeding $700 million. Each party agreed to absorb its own program management costs. "Army-only" and "Shell-only" costs, for contamination solely attributed to each party, are also excluded from the allocable formula. When the Army and Shell adopted the cost-sharing formula, cleanup costs were expected to be less than $700 million, not the currently estimated $2.1 billion. Even though the permanent cleanup is not yet underway, the parties have already arrived at the second level of the cost-sharing formula; allocable costs reached $500 million in 1994. 
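The tiered cost-sharing formula described above can be expressed as a short function. This is a sketch for illustration only; the function name and tier representation are mine, and figures are in millions of dollars.

```python
def shell_share(allocable_millions):
    """Shell's share under the tiered formula described in the report:
    50 percent of the first $500 million of allocable costs, 35 percent
    of the next $200 million, and 20 percent of anything above $700
    million. 'Army-only' and 'Shell-only' costs are excluded, and each
    party absorbs its own program management costs."""
    tiers = [(500.0, 0.50), (200.0, 0.35), (float("inf"), 0.20)]
    share = 0.0
    remaining = allocable_millions
    for width, rate in tiers:
        portion = min(remaining, width)  # amount falling in this tier
        share += portion * rate
        remaining -= portion
        if remaining <= 0:
            break
    return share
```

At the $500 million in allocable costs reached in 1994, Shell's cumulative share is $250 million; beyond $700 million, each additional allocable dollar costs Shell only 20 cents, which is why Shell's overall percentage falls as total costs rise.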
According to Army, EPA, and state officials, Shell's 20-percent share of the final costs has an inverse relationship to its share of remaining contaminants that are to be cleaned up. They stated that from a risk management perspective, the contaminants driving the majority of the final cleanup costs will be those related to Shell's production activities. According to Army and EPA officials, the cost-sharing formula was negotiated when much less was known about the extent of Arsenal contaminants and associated risks. In addition, an Army attorney said that the decision to reduce Shell's share as costs increased was an equitable way of recognizing that the Army owned the installation and the disposal systems that Shell used. In retrospect, these officials noted that a declining formula is probably not the best approach to use in allocating shares, particularly early in the study phase before the contaminants have been fully characterized. The Army and Shell have already spent nearly $1 billion of the current $2.1-billion estimate. As of December 1994, the Army had spent about $687 million of its estimated $1.6-billion share and Shell had contributed about $274 million of its expected $500-million share. The Army's $687-million share breaks down into about $431 million in shared or allocable costs and $256 million in Army-only costs. Total allocable costs paid by both parties represent about $589 million of the total. Although Shell contributed about $274 million toward the allocable costs, the Army has not yet spent $80 million of this amount. Figure 2 shows Army and Shell expenditures as of December 1994 (Army, $687 million; Shell, $274 million). Shell pays its share of cleanup costs directly to a government account. As of December 1994, Shell had contributed about $274 million of the $500 million it is expected to pay.
About $116 million of the $274 million was deposited into the Shell account, and the other $158 million represented costs Shell incurred directly at the Arsenal. Shell was credited, for example, for conducting one of the Arsenal's costliest projects--the incineration of liquid waste. Legislation restricts use of Shell's reimbursements to cleanup projects at the Arsenal. As of December 1994, the Army had spent approximately $36 million from the $116 million that Shell had deposited into the account, leaving about $80 million for future obligations. The funds are retained by the U.S. Treasury until they are requested. According to Army officials, the funds in the Shell account are generally not used to offset budget requirements. Rather, the funds are used to supplement appropriations from the Defense Environmental Restoration Account. The Arsenal's annual work plans outline requirements for appropriated funds, and those requirements are rolled up and consolidated into a DOD budget request. Therefore, according to these officials, the Shell funds are not visible in the budgeting process as requests proceed from the Army to DOD and Congress and do not influence funding decisions. Officials said it is not feasible to use the Shell funds to offset budget requirements in most instances because they do not represent a steady fixed flow and they are not fiscal year specific. The Arsenal's allocation for fiscal year 1995 was about $70 million, which is less than the balance available in the Shell account. In discussing a draft of our report, DOD officials agreed with the report's findings and conclusions. Their comments have been incorporated where appropriate. We performed our work at the Rocky Mountain Arsenal, Commerce City, Colorado; EPA's Region VIII headquarters; and the Colorado Department of Health, Denver. 
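The December 1994 expenditure figures reported above reconcile arithmetically, which can be checked with a few lines. Variable names are mine; the dollar figures (in millions) come from the report text, and the reading of the $589 million allocable total is an inference flagged in the comments.

```python
# Cross-check of the December 1994 expenditure figures (in $ millions).
army_allocable, army_only = 431, 256        # components of Army's $687 million
shell_account, shell_direct = 116, 158      # components of Shell's $274 million

army_total = army_allocable + army_only     # $687 million spent by the Army
shell_total = shell_account + shell_direct  # $274 million contributed by Shell
combined = army_total + shell_total         # $961 million spent overall

# One reading consistent with the reported "about $589 million" in total
# allocable costs paid: Army's allocable spending plus the costs Shell
# incurred directly at the Arsenal (the account deposits are largely unspent).
allocable_paid = army_allocable + shell_direct  # $589 million
```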
To determine the status of the cleanup work at the Rocky Mountain Arsenal, we attended public hearings and reviewed applicable documents and records maintained by DOD and EPA. We also interviewed officials from the Departments of the Army, the Interior, and Justice; EPA; and the state of Colorado. To assess plans for future cleanup at the Arsenal, we interviewed officials from the Army, EPA, the Fish and Wildlife Service, and the state of Colorado. We also reviewed the Federal Facility Agreement and the conceptual agreement for Arsenal cleanup. To understand the cost-sharing arrangement between the Army and Shell, we reviewed the settlement agreement, financial manual, and other pertinent documents. We also interviewed officials from the Army, EPA, and the Department of Justice. We conducted our review from October 1994 to January 1996 in accordance with generally accepted government auditing standards. Unless you publicly announce its contents earlier, we plan no further distribution of the report until 30 days after its issue date. At that time, we will send copies to appropriate congressional committees; the Secretaries of Defense and the Army; the Administrator, EPA; and the Director of the Office of Management and Budget. We will also make copies available to others upon request. Please contact me on (202) 512-8412 if you or your staff have any questions concerning this report. Major contributors to this report are listed in appendix V. Located 9 miles northeast of downtown Denver, Rocky Mountain Arsenal is adjacent to the communities of Commerce City, Montbello, and rural Adams County. Key physical features of the Arsenal include the north and south chemical manufacturing complexes, numerous pits and trenches, and a series of man-made lakes and basins A through F. Liquid waste from the two manufacturing complexes was discharged into basins A, B, C, D, and E, a series of unlined waste evaporation ponds. 
In the mid-1950s, the Army discharged all liquid waste to basin F, a newly constructed asphalt-lined waste basin. Solid waste was disposed of in the trenches and pits. The man-made lakes were used to provide process and cooling water to facilities within the south plants area. (See fig. I.1.) The initial stage of the cleanup program is an installationwide study to determine if sites are present that pose hazards to public health or the environment. Available information is collected on the source, nature, extent, and magnitude of actual and potential hazardous substance releases at sites on the installation. The next step consists of sampling and analysis to determine the existence of actual site contamination. Information gathered is used to evaluate the site and determine the response action needed. Uncontaminated sites do not proceed to later stages of the process. Remedial investigation may include a variety of site investigative, sampling, and analytical activities to determine the nature, extent, and significance of the contamination. The focus of the evaluation is determining the risk to the general population posed by the contamination. Concurrent with the remedial investigations, feasibility studies are conducted to evaluate remedial action alternatives for the site to determine which would provide the protection required. Detailed design plans for the remedial action alternative chosen are prepared. The chosen remedial alternative is implemented. Remedial actions can be taken at any time during the cleanup process to protect public health or to control contaminant releases to the environment. Memorandum of Agreement signed by state of Colorado, the Army, Shell Oil Company, and the Environmental Protection Agency. U.S. Army litigation against Shell Oil Company for natural resource damages and cleanup costs. State of Colorado filed suit for damages to natural resources and state money spent responding to contamination. 
Memorandum of Agreement considered invalid. Colorado filed suit to enforce Army compliance with the Resource Conservation and Recovery Act on basin F. Army and Shell Oil Company settled 1983 suit by signing consent decree. State of Colorado won the 1986 suit and issued an administrative order requiring the Army to follow its closure plan at basin F; Army filed suit disputing administrative order. Court granted Army's motion and affirmed EPA's role as final authority at Rocky Mountain Arsenal; state appealed. 10th Circuit Court of Appeals ruled in favor of Colorado. Army appealed to U.S. Supreme Court. Certiorari denied. Major contributors to this report: Patricia Foley Hinnen, Maria Durant, Mark McClarie, and Stephen Gaty.

Pursuant to a congressional request, GAO reviewed the cleanup program at the Rocky Mountain Arsenal, focusing on the: (1) status of cleanup efforts; (2) completion plans for the cleanup; and (3) cost-sharing plans between the Army and Shell Oil Company, which leased a portion of the Arsenal.
GAO found that: (1) permanent cleanup at Rocky Mountain Arsenal has been delayed for years due to lawsuits and numerous other disputes between the parties involved; (2) in June 1995, Colorado and five other key parties signed an agreement for a conceptual remedy to address the lawsuits and disputes; (3) although about $300 million of the nearly $1 billion spent to date has been for interim actions to mitigate the most urgent environmental threats, the majority has been spent on studies and other management activities; (4) the June 1995 conceptual agreement resolves the most significant issues and paves the way for a final settlement, or record of decision, in 1996; (5) based on the agreement, the Army currently estimates the cleanup will cost $2.1 billion and take until 2012; (6) prior to the agreement, the Army had estimated a $2.8-billion to $3.6-billion cleanup effort to be complete in about 2010; (7) although the agreement addresses many of the disputed issues, the final details are yet to be negotiated; (8) until the cleanup plan is detailed and finalized in the record of decision, the cost and completion estimates will be subject to change; (9) under a 1989 settlement, the Army and Shell are sharing cleanup costs, and the costs to correct damages attributable solely to either the Army or to Shell are to be financed by the responsible party; (10) however, most contamination was commingled, and these cleanup costs will be shared under a formula requiring each party to pay 50 percent of the first $500 million in cleanup costs, with Shell's share decreasing as total costs increase; (11) although the agreement does not limit total contributions, Shell estimated its total costs will be about $500 million and so far has contributed $274 million; (12) by the time the final phase of cleanup begins in May 1996, under an expected record of decision, the Army will be responsible for 80 percent of the costs for commingled contamination; and (13) these costs represent most 
of the remaining cleanup.
In our 2005 report, we found that facilities-related problems at the Smithsonian had resulted in a few building closures and access restrictions and some cases of damage to the collections. A few facilities had deteriorated to the point where access must be denied or limited. For example, the 1881 Arts and Industries Building on the National Mall was closed to the public in 2004 for an indefinite period, pending repair of its weakened roof panels, renovation of its interior (which had been damaged by water intrusion), and replacement of aging systems such as heating and cooling. Currently, this building remains closed. Other facilities also faced problems. We found that water leaks caused by deteriorated piping and roofing elements, along with humidity and temperature problems in buildings with aging systems, posed perhaps the most pervasive threats to artifacts in the museums and storage facilities. For example, leaks have damaged two historic aircraft at the National Air and Space Museum. Additionally, Smithsonian Archives officials told us that they had had to address 19 "water emergencies" since June 2002. These problems were indicative of a broad decline in the Smithsonian's aging facilities and systems that posed a serious long-term threat to the collections. We also found that the Smithsonian had taken steps to maximize the effectiveness of its resources for facilities. These changes resulted from an internal review and a 2001 report by the National Academy of Public Administration, which recommended that the Smithsonian centralize its then highly decentralized approach to facilities management and budgeting in order to promote uniform policies and procedures, improve accountability, and avoid duplication. The Smithsonian created the Office of Facilities Engineering and Operations in 2003 to assume responsibility for all facilities-related programs and budgets. 
At the time of our 2005 review, this office was adopting a variety of recognized industry best practices for managing facilities projects, such as the use of benchmarking and metrics recommended by the Construction Industry Institute and leading capital decision-making practices. Preliminary results from our ongoing work show that as of March 30, 2007, the Smithsonian estimates it will need about $2.5 billion for revitalization, construction, and maintenance projects identified from fiscal year 2005 through fiscal year 2013, an increase of about $200 million from its 2005 estimate of about $2.3 billion for the same time period. Smithsonian officials stated that to update this estimate, they identified changes that had occurred to project cost figures used in the 2005 estimate and then subtracted from the new total the appropriations the Smithsonian had received for facilities revitalization, construction, and maintenance projects for fiscal years 2005-2007. According to Smithsonian officials, this estimate includes only costs for which the Smithsonian expects to receive federal funds. Projects that have been or are expected to be funded through the Smithsonian's private trust funds were not included as part of the estimate, although the Smithsonian has used these trust funds to support some facilities projects. For example, the Steven F. Udvar-Hazy Center was funded largely through trust funds. According to Smithsonian officials, maintenance and capital repair projects are not generally funded through trust funds. At the time of our 2005 report, Smithsonian officials told us that the Smithsonian's estimate of about $2.3 billion could increase for a variety of reasons. For example, the estimate was largely based on preliminary assessments. Moreover, in our previous report, we found that recent additions to the Smithsonian's building inventory--the National Museum of the American Indian and the Steven F. 
Udvar-Hazy Center--and the reopening of the revitalized Donald W. Reynolds Center for American Art and Portraiture on July 1, 2006, would add to the Smithsonian's annual maintenance costs. According to Smithsonian officials, the increase in its estimated revitalization, construction, and maintenance costs through fiscal year 2013 from about $2.3 billion in our 2005 report to about $2.5 billion as of March 30, 2007, was due to several factors. For example, Smithsonian officials said that major increases had occurred in projects for the National Zoo and the National Museum of American History because the two facilities had recently had master plans developed that identified additional requirements. In addition, according to Smithsonian officials, estimates for anti-terrorism projects had increased due to adjustments for higher costs experienced and expected for security-related projects at the National Air and Space Museum. According to Smithsonian officials, the increase also reflects the effect of delaying corrective work, both through additional damage and through escalation in construction costs. According to Smithsonian officials, the Smithsonian's March 30, 2007, estimate of about $2.5 billion could also increase: the earlier estimate of about $2.3 billion was largely based on preliminary assessments, so as the Smithsonian completes more master plans, additional needed work will be identified. Moreover, this estimate does not include the estimated cost of constructing the National Museum of African American History and Culture, which was authorized by Congress and which the Smithsonian notionally estimates may cost about $500 million, half of which is to be funded by Congressional appropriations. The Smithsonian's annual operating and capital program revenues come from its own private trust fund assets and its federal appropriation.
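The update method Smithsonian officials described -- start from the prior estimate, apply identified project-cost changes, then subtract appropriations already received -- is simple arithmetic, sketched below. The $0.5 billion and $0.3 billion inputs are invented placeholders chosen only so the result matches the reported move from about $2.3 billion to about $2.5 billion; the report does not give the actual component figures.

```python
def update_estimate(prior, net_cost_changes, appropriations_received):
    """Revised requirements estimate (in $ billions): start from the prior
    estimate, apply identified project-cost changes, then subtract
    appropriations already received for the covered fiscal years."""
    return prior + net_cost_changes - appropriations_received

# Placeholder inputs (invented): $0.5B in net cost changes, $0.3B in
# FY2005-2007 appropriations, reproducing the ~$2.3B -> ~$2.5B move.
revised = update_estimate(prior=2.3, net_cost_changes=0.5,
                          appropriations_received=0.3)
```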
According to Smithsonian officials, the Smithsonian's federal appropriation totaled nearly $635 million in fiscal year 2007, with about $99 million for facilities capital and about $536 million for salaries and expenses, which included about $51 million for facilities maintenance. In our previous work, we found that the facilities projects planned for the next 9 years exceeded funding at this level. As a result, we recommended that the Secretary of the Smithsonian establish a process for exploring options for funding its facilities needs and engaging the key stakeholders--the Smithsonian Board of Regents, the Administration, and Congress--in the development and implementation of a strategic funding plan to address the revitalization, construction, and maintenance projects identified by the Smithsonian. Smithsonian officials told us during our current review that the Smithsonian Board of Regents--the Smithsonian's governing body, which is composed of both private citizens and members of all three branches of the federal government--has taken some steps to address our recommendation. In June 2005, the Smithsonian Board of Regents established the ad-hoc Committee on Facilities Revitalization to explore options to address the approximately $2.3 billion the Smithsonian estimated it needed for facilities revitalization, construction, and maintenance projects through fiscal year 2013. In September 2005, the ad-hoc committee held its first meeting, at which it reviewed nine funding options that had been prepared by Smithsonian management for addressing the approximately $2.3 billion in revitalization, construction, and maintenance projects through fiscal year 2013. These options included the following: Federal income tax check-off contribution, in which federal income tax returns would include a check-off box to allow taxpayers to designate some of their tax liability to a special fund for the Smithsonian's facilities.
Heritage treasures excise tax, in which an excise tax would be created, and possibly levied on local hotel bills, to generate funds for the Smithsonian's facilities. National fundraising campaign, in which the Smithsonian would launch a national campaign to raise funds for its facilities. General admission fee program, in which the Smithsonian would institute a general admission charge to raise funds for critical but unfunded requirements. Special exhibition fee program, in which the Smithsonian would charge visitors to attend a select number of special exhibitions as a means to raise funds to meet critical but unfunded requirements. Smithsonian treasures pass program, in which the Smithsonian would design a program through which visitors could purchase a Smithsonian treasures pass with special benefits, such as no-wait entry into facilities or behind-the-scenes tours, to raise funds to meet critical but unfunded requirements. Facilities revitalization bond, in which the Smithsonian would borrow funds such as through a private or public debt bond for the Smithsonian's facilities. Closing Smithsonian museums, in which the Smithsonian would permanently or temporarily close museums to the public in order to generate savings to help fund its facilities. Increasing Smithsonian appropriations, in which the Board of Regents and other friends of the Smithsonian would approach the Administration about a dramatic appropriations increase to fund Smithsonian's facilities. According to Smithsonian officials, after considering these nine proposed options, the ad-hoc committee decided to request an increase in the Smithsonian's annual federal appropriations, specifically deciding to request an additional $100 million over the Smithsonian's current appropriation annually for 10 years, starting in fiscal year 2008, to reach a total of an additional $1 billion. 
In September 2006, according to Smithsonian officials, several members of the Board of Regents and the Secretary of the Smithsonian met with the President of the United States to discuss increased federal funding for the Smithsonian's facilities. During the meeting, these officials said, the Regents discussed, among other things, the problem of aging facilities and the need for an additional $100 million in federal funds annually for 10 years to address the institution's facilities revitalization, maintenance, and construction needs. According to the officials, the Smithsonian's representatives at the meeting told the President that they had no options for obtaining this $100 million other than the Smithsonian's federal appropriation, and said that although the Smithsonian had made considerable expense cuts and raised substantial private funds, donors are unwilling to donate money to repair and maintain facilities. The President's fiscal year 2008 budget proposal, published in February 2007, proposed an increase of about $44 million over the Smithsonian's fiscal year 2007 appropriation. The Smithsonian's appropriation is divided into two categories; the approximately $44 million increase in the President's budget proposal represented an increase of about $9 million for facilities capital and about $35 million for salaries and expenses, which includes facilities maintenance. However, funds in the salaries and expenses category also support many other activities, such as research, collections, and exhibitions, and it is not clear how much of the $35 million increase the Smithsonian would use for facilities maintenance. Moreover, Congress may choose to adopt or modify the President's budget proposal when funds are appropriated for the fiscal year.
As part of our ongoing work, we are reviewing the Smithsonian's analysis of each funding option, including its potential for addressing its revitalization, construction, and maintenance needs. We plan to report on these issues later in the year. The Smithsonian's estimate for revitalization, construction, and maintenance needs has increased at an average of about $100 million a year over the past 2 years. Therefore, the Smithsonian's request for an additional $100 million a year may not actually reduce the Smithsonian's estimated revitalization, construction, and maintenance needs but only offset the increase in this estimate. Absent significant changes in the Smithsonian's funding strategy or significant increases in funding from Congress, the Smithsonian faces greater risk to its facilities and collections over time. Since our work is still ongoing, it remains unclear why the Smithsonian has only pursued one of its nine options for increasing funds to support its significant facilities needs. At this time, we still believe our recommendation that the Smithsonian explore a variety of funding options is important to reducing risks to the Smithsonian's facilities and collections. Madam Chairman, this concludes my prepared statement. I would be happy to respond to any questions you or other Members of the Committee may have at this time. We conducted our work for this testimony in March 2007 in accordance with generally accepted government auditing standards. Our work is based on our past report on the Smithsonian's facilities management and funding, our review of Smithsonian documents, and interviews with Smithsonian officials. Specifically, we reviewed the Smithsonian's revised estimated costs for major revitalization projects from fiscal year 2005 through fiscal year 2013 and documents from the Board of Regents. We also reviewed the President's fiscal year 2008 proposed budget and the Smithsonian's federal appropriations from fiscal years 2005-2007. 
We are continuing to evaluate the Smithsonian's efforts to strategically manage, fund, and secure its real property. Our objectives include assessing (1) the extent to which the Smithsonian is strategically managing its real property portfolio, (2) the extent to which the Smithsonian has developed and implemented strategies to fund its revitalization, construction, and maintenance needs, and (3) the Smithsonian's security cost trends and challenges, including the extent to which the Smithsonian has followed key security practices to protect its assets. We are also examining how similar institutions, such as other museums and university systems, strategically manage, fund, and secure their real property. We expect to report on these issues later this year. In addition to those named above, Colin Fallon, Brandon Haller, Carol Henn, Susan Michal-Smith, Dave Sausville, Gary Stofko, Alwynne Wilbur, Carrie Wilks, and Adam Yu made key contributions to this report. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.

The Smithsonian Institution (Smithsonian) is the world's largest museum complex and research organization. The age of the Smithsonian's structures, past inattention to maintenance needs, and high visitation levels have left its facilities in need of revitalization and repair. This testimony discusses our prior work on some effects of the condition of the Smithsonian's facilities and whether the Smithsonian has taken steps to maximize facility resources. It also discusses the current estimated costs of the Smithsonian's needed facilities projects.
In addition, it describes preliminary results of GAO's ongoing work on the extent to which the Smithsonian developed and implemented strategies to fund these projects, as GAO recommended in prior work. The work for this testimony is based on GAO's 2005 report, Smithsonian Institution: Facilities Management Reorganization Is Progressing, but Funding Remains a Challenge; GAO's review of Smithsonian documents and other pertinent information; and interviews with Smithsonian officials. In 2005, GAO reported that facilities-related problems at the Smithsonian had resulted in a few building closures and posed a serious long-term threat to the collections. For example, the 1881 Arts and Industries Building on the National Mall was closed to the public in 2004 for an indefinite period over concern about its deteriorating roof structure. Currently, this building remains closed. GAO also found that the Smithsonian had taken steps to maximize the effectiveness of its existing resources for facilities. Preliminary results of GAO's ongoing work indicate that as of March 30, 2007, the Smithsonian estimated it would need about $2.5 billion for its revitalization, construction, and maintenance projects from fiscal year 2005 through fiscal year 2013, up from an estimate of $2.3 billion in 2005. In 2005, GAO recommended that the Smithsonian develop and implement a strategic funding plan to address its facilities needs. The Smithsonian Board of Regents--the Smithsonian's governing body--has taken some steps to address GAO's recommendation regarding a strategic funding plan. The board created an ad-hoc committee, which, after reviewing nine options, such as establishing a special exhibition fee, decided to request an additional $100 million annually in federal funds for facilities for the next 10 years, for a total of an additional $1 billion. 
The President's fiscal year 2008 budget proposal, however, proposes an increase of about $44 million over the Smithsonian's fiscal year 2007 appropriation. It is not clear how much of this proposed increase would be used to support facilities, and how Congress will respond to the President's budget request. Absent significant changes in the Smithsonian's funding strategy or significant increases in funding from Congress, the Smithsonian faces greater risk to its facilities and collections over time. GAO is continuing to evaluate the Smithsonian's efforts to strategically manage, fund, and secure its real property. We expect to publish a report on these issues later this year.
As a comprehensive health benefit program for vulnerable populations, each state Medicaid program, by law, must cover certain categories of individuals and provide a broad array of benefits. Within these requirements, however, the Medicaid program allows for significant flexibility for states to design and implement their programs, resulting in more than 50 distinct state-based programs. These variations in design have implications for program eligibility and services offered, as well as how expenditures are reported and services are delivered. Specifically, in administering their own programs, states make decisions regarding populations or health services to cover beyond what are mandated by law. States must cover certain groups of individuals, such as pregnant women with incomes at or below 133 percent of the federal poverty level (FPL), but may elect to cover them above this required minimum income level. For example, as of March 2011, some states covered pregnant women with incomes at or above 250 percent of the FPL. Similarly, while states' Medicaid programs generally must cover certain mandatory services--including inpatient and outpatient hospital services, physician services, laboratory and X-ray services, and nursing facility services for those age 21 and older--states may also elect to cover additional optional benefits and services. These optional benefits and services include prescription drugs, dental care, hospice care, home- and community-based services, and rehabilitative services. In addition, even among states that offer a particular benefit, the breadth of coverage (i.e., amount, duration, and scope) of that benefit can vary greatly. For example, most states cover some dental services, but some limit this benefit to trauma care and/or emergency treatment for pain relief and infection, while others also cover annual dental exams. 
States also have flexibility, within general federal requirements, to determine how the services they cover will be delivered to Medicaid enrollees--whether on a fee-for-service basis or through managed care arrangements. For example, under some managed care arrangements, the state pays managed care organizations a fixed amount, known as a capitation payment, to provide a package of services. States vary in terms of the types of managed care arrangements used and the eligibility groups enrolled. For example, while 12 states enrolled 50 percent or more of their disabled enrollees in comprehensive risk-based managed care in fiscal year 2011, 20 states enrolled fewer than 5 percent of disabled enrollees in such arrangements. States may also operate premium assistance programs to subsidize the purchase of private health insurance--such as employer-sponsored insurance--for Medicaid enrollees. In 2009, 35 states reported using Medicaid funds to provide premium assistance. These differences in covered services and delivery systems can affect the distribution of states' spending across categories of services. For example, states that rely heavily on managed care arrangements to provide hospital care and acute care services to their enrollees are likely to devote a greater proportion of their expenditures to managed care, and a lower proportion to those covered services, than states that do not have such arrangements. A small percentage of Medicaid-only enrollees consistently accounted for a large percentage of total Medicaid expenditures for Medicaid-only enrollees. As shown in figure 1, there was little variation across the years we examined.
In each fiscal year from 2009 through 2011:

- the most expensive 1 percent of Medicaid-only enrollees in the nation accounted for about one-quarter of the expenditures for Medicaid-only enrollees;
- the most expensive 5 percent accounted for almost half of the expenditures;
- the most expensive 25 percent accounted for more than three-quarters of the expenditures;
- in contrast, the least expensive 50 percent accounted for less than 8 percent of the expenditures; and
- about 12 percent of enrollees had no expenditures.

These findings regarding Medicaid-only enrollees are similar to those that others have reported for all Medicaid enrollees, as well as for Medicare and personal healthcare spending in the United States. A Kaiser Family Foundation report found that in fiscal year 2001, the most expensive 1.1 percent of all Medicaid enrollees--including those dually eligible for Medicare--accounted for more than one-quarter of Medicaid expenditures, and the most expensive 3.6 percent accounted for nearly half. The Congressional Budget Office reported that in 2001, the most expensive 5 percent of Medicare enrollees in fee-for-service plans accounted for 43 percent of Medicare expenditures, and the most expensive 25 percent accounted for 85 percent. The National Institute for Health Care Management reported that in 2009, the most expensive 1 percent of the overall civilian U.S. population living in the community accounted for more than 20 percent of personal health care spending, with the most expensive 5 percent accounting for nearly half. We also found that in each state, a similarly small percentage of high-expenditure Medicaid-only enrollees was responsible for a disproportionately large share of expenditures for Medicaid-only enrollees, although the magnitude of this effect varied widely across states. For example, the percentage of expenditures for the most expensive 5 percent of Medicaid-only enrollees ranged from 28.8 percent in Tennessee to 63.2 percent in California.
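The concentration pattern described above can be reproduced from a ranked list of per-enrollee spending. The following sketch (in Python, using made-up illustrative expenditures chosen to roughly mirror the reported pattern, not GAO's MSIS data) computes the share of total spending attributable to the most expensive fraction of enrollees:

```python
def top_share(expenditures, fraction):
    """Share of total spending attributable to the most expensive
    `fraction` of enrollees (e.g., 0.05 for the top 5 percent)."""
    ranked = sorted(expenditures, reverse=True)  # highest spenders first
    k = max(1, int(len(ranked) * fraction))      # size of the top slice
    return sum(ranked[:k]) / sum(ranked)

# Illustrative, highly skewed spending for 100 hypothetical enrollees
# (not GAO data): one very expensive enrollee, a few moderately
# expensive ones, many inexpensive ones, and many with no expenditures.
spending = [100_000] + [25_000] * 4 + [4_000] * 45 + [2_000] * 10 + [0] * 40

print(f"top 1%: {top_share(spending, 0.01):.1%}")  # -> top 1%: 25.0%
print(f"top 5%: {top_share(spending, 0.05):.1%}")  # -> top 5%: 50.0%
```

With these toy numbers, the single most expensive enrollee accounts for a quarter of all spending and the top five enrollees for half, illustrating the kind of skew behind the nationwide figures above.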
For additional state-by-state information about the distribution of expenditures among Medicaid-only enrollees in fiscal year 2011, see appendix II. The proportions of high-expenditure Medicaid-only enrollees in different eligibility groups were consistent from fiscal year 2009 through 2011, as shown in figure 2. Although less than 10 percent of Medicaid-only enrollees were disabled, disabled enrollees were disproportionately represented in the high-expenditure group, consistently constituting about 64 percent of those with the highest expenditures. Conversely, although children were the largest group of Medicaid-only enrollees (about 50 percent), they consistently constituted about 16 percent of the high-expenditure group. The distribution of high-expenditure Medicaid-only enrollees' expenditures among selected categories of service in fiscal year 2011 varied widely across states. As noted above, managed care arrangements can affect the distribution of expenditures for covered services. For some states, such as Tennessee and Hawaii, a high percentage of expenditures were for managed care or premium assistance, and correspondingly low percentages were for expenditures such as hospital care or acute care services. For other states, such as Idaho and Oklahoma, a low percentage of expenditures were for managed care or premium assistance, and correspondingly higher percentages were for hospital care or acute care services. States' reliance on managed care plans to provide certain services limits what can be learned from the MSIS summary data regarding the services received by enrollees, because the data show the per-enrollee payments made by state Medicaid programs to plans, not the payments the plans made to providers for the services for which the plans are responsible.
In a state such as Tennessee, for example, in which all Medicaid enrollees are in managed care plans that are responsible for providing hospital care and a broad array of acute care services, the state's low percentages of expenditures in those service categories reflect the delivery system structure of the state Medicaid program, not enrollees' utilization of services. The greatest variation among states in their expenditures for specific service categories was for managed care and premium assistance. As shown in figure 3, four states reported that 0 percent of their expenditures were for managed care or premium assistance. For states that did report expenditures in this category, the percentage ranged from less than 1 percent to 75 percent. Nationwide, about 15 percent of expenditures for high-expenditure Medicaid-only enrollees were in this category. The variation among states in the percentages of expenditures in this service category reflects the wide variation among states in their reliance on managed care arrangements to provide services to enrollees, and particularly disabled enrollees, who constituted almost two-thirds of high-expenditure Medicaid-only enrollees. In the five states with the highest percentage of expenditures for managed care and premium assistance, the percentage of disabled enrollees in comprehensive risk-based managed care plans ranged from 44 percent in New Mexico to more than 90 percent in Hawaii and Tennessee, compared with 0 percent in the five states with the lowest percentages of expenditures in this service category. States also varied widely--from 0 to about 45 percent--in the percentages of high-expenditure Medicaid-only enrollees' expenditures for hospital care (inpatient and outpatient). About 27 percent of nationwide expenditures for high-expenditure Medicaid-only enrollees were in this category. (See fig. 4.)
Similarly, states varied widely--from nearly 0 to about 45 percent--in the percentages of high-expenditure Medicaid-only enrollees' expenditures that were for non-institutional support services other than acute or long-term support services. These other support services include hospice benefits, private duty nursing, rehabilitative services, and targeted case management. About 17 percent of nationwide expenditures were for enrollees in this category. (See fig. 5.) States also varied in the percentages of high-expenditure Medicaid-only enrollees' expenditures in other categories, if not as widely. States varied least--from 0 to 11 percent--in the percentage of expenditures for high-expenditure Medicaid-only enrollees that were for psychiatric facility care, which accounted for about 2 percent of nationwide expenditures for high-expenditure Medicaid-only enrollees. The percentage of a state's expenditures for high-expenditure Medicaid-only enrollees varied in other categories:

- from 0 to 33 percent for acute care services, which accounted for 11 percent of nationwide expenditures;
- from 0 to 25 percent for prescription drugs, which accounted for 14 percent of nationwide expenditures;
- from 0 to about 23 percent for long-term non-institutional support services, which accounted for about 6 percent of nationwide expenditures; and
- from 0 to 22 percent for long-term institutional care, which accounted for 9 percent of nationwide expenditures.

Long-term institutional care includes nursing facilities and intermediate care facilities for individuals with intellectual disabilities. See GAO, Medicaid: Assessment of Variation among States in Per-Enrollee Spending, GAO-14-456 (Washington, D.C.: June 16, 2014), and GAO, Medicaid: Alternative Measures Could Be Used to Allocate Funding More Equitably, GAO-13-434 (Washington, D.C.: May 10, 2013).
The percentage of expenditures reported in the MSIS summary file that was attributable to prescription drugs was lower on average in states that included some or all drugs in the package of services provided by managed care plans than in states that paid for all drugs on a fee-for-service basis, and the three states in which the share of expenditures that went to drugs was lowest--Arizona, Hawaii, and New Mexico--included all drugs in their managed care packages. States vary widely in the distribution of their expenditures among service categories; for state-by-state information about the percentage of high-expenditure Medicaid-only enrollees' expenditures for selected categories of services in fiscal year 2011, see appendix V. HHS reviewed a draft of this report and provided technical comments, which we incorporated as appropriate. We are sending copies of this report to the Secretary of HHS and other interested parties. The report also will be available at no charge on the GAO website at http://www.gao.gov. If you or your staffs have any questions about this report, please contact me at (202) 512-7114 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made major contributions to this report are listed in appendix VI. This appendix describes the methodology for addressing our three objectives regarding high-expenditure Medicaid enrollees who are not also enrolled in Medicare, that is, Medicaid-only enrollees. These objectives were to: (1) examine the distribution of expenditures among Medicaid-only enrollees, (2) determine whether the proportions of high-expenditure Medicaid-only enrollees in selected categories changed or remained consistent from year to year, and (3) determine whether the distribution of high-expenditure Medicaid-only enrollees' expenditures among selected categories of service varied across states.
We analyzed data from the Medicaid Statistical Information System (MSIS) Annual Person Summary File. This summary file consolidates individual enrollees' claims for a single fiscal year, including data on their enrollment and expenditures. The file includes enrollee-specific information regarding enrollment categories, expenditures, dual eligibility status, age, gender, payment arrangements--including fee-for-service payments and capitated payments made to managed care organizations--and indicators for five chronic conditions and two service categories. The five chronic condition indicators are for asthma, diabetes, human immunodeficiency virus/acquired immunodeficiency syndrome (HIV/AIDS), mental health conditions, and substance abuse. The two service category indicators are for delivery or childbirth (which may include costs attributed to a mother during delivery or the child soon after birth) and long-term care residence. The summary file does not provide information on other conditions that may affect enrollees' expenditures. We used data from fiscal years 2009, 2010, and 2011--the most recent years for which data from almost all states were available. As of December 2014, the summary file did not include expenditure or enrollment data from Maine for fiscal year 2011. We made several changes to limit our analyses to Medicaid-only enrollees and ensure that the data were sufficiently reliable for our purposes. For example, because our objectives focused on Medicaid-only enrollees, we excluded those who were dually eligible for both Medicaid and Medicare. Specifically, we made the following adjustments to the data: If an individual's enrollment category was listed as child, adult, or aged, and the recorded age or other information was inconsistent with that category, we re-defined the enrollment category as unknown. We reset all negative expenditures (which can indicate adjustments to expenditures recorded in prior years) to 0. 
Generally, states may report adjustments to their Medicaid expenditures for up to two years. To the extent that negative expenditures reflect adjustments to prior year expenditures, retaining them would result in an underestimate of expenditures for any specific year. We also excluded some enrollees' records because we could not determine which expenditures for these enrollees were Medicaid expenditures. After making these changes, we retained about 85 percent of the original records in the summary file for each fiscal year, counting the records from all states and the District of Columbia (but not counting records from Maine in 2011, which were unavailable). These records represent just under 65 percent of total Medicaid expenditures in these years. (We previously reported that dual-eligible enrollees--whom we excluded from our analyses--accounted for about 35 percent of total Medicaid expenditures in fiscal year 2009.) As of December 2014, the summary file did not include fiscal year 2011 expenditure data from Florida, and so we excluded Florida from all further analyses of 2011 data. We assessed the reliability of these data by performing appropriate electronic data checks and reviewing relevant documentation, and determined that the data from Idaho for 2010 were not sufficiently reliable for our purposes. We determined that the remaining data were sufficiently reliable for our purposes. Our analyses were thus based on data from all states and the District of Columbia, but excluded Idaho in fiscal year 2010, and excluded Florida and Maine in fiscal year 2011. To determine the distribution of expenditures among Medicaid-only enrollees, we calculated the cumulative frequency distribution of expenditures for enrollees.
That is, we placed all Medicaid-only enrollees nationwide in rank order by their total Medicaid expenditures, from highest to lowest, and determined the cumulative percentage of nationwide expenditures for Medicaid-only enrollees attributable to enrollees as the percentage of ordered enrollees increased. We analyzed data from 3 years--fiscal years 2009, 2010, and 2011--separately to determine whether the relationship was similar or different across years. To facilitate interpretation of these frequency distributions, we also computed a mathematical coefficient that provides information about the relationship between the percentage of Medicaid-only enrollees and the percentage of total Medicaid expenditures for these enrollees--the Gini coefficient. This coefficient indicates the degree of inequality, that is, the extent to which the frequency distribution differs from one in which expenditures are equal for all enrollees. Figure 6 illustrates the difference between frequency distributions with differing Gini coefficients. To determine whether the proportions of high-expenditure Medicaid-only enrollees in selected categories changed or remained consistent from year to year, we conducted two separate analyses. For both, we defined high-expenditure Medicaid-only enrollees as the 5 percent with the highest expenditures within each state, as we had in our earlier work on high-expenditure Medicaid enrollees. For one analysis, we examined the percentage of high-expenditure Medicaid-only enrollees in five mutually exclusive eligibility groups (child, adult, aged, disabled, or unknown). For another analysis, we examined the percentage of high-expenditure Medicaid-only enrollees identified as having any one of the five chronic conditions recorded in the summary file (asthma, diabetes, HIV/AIDS, mental health conditions, or substance abuse) or either of the two services (delivery or childbirth, and long-term care residence) recorded in the summary file. 
Enrollees could have any of these seven conditions or services, any combination of them, or none of them. We compared the proportions of high-expenditure enrollees in each of these sets of categories in fiscal years 2009, 2010, and 2011. To determine whether the distribution of high-expenditure Medicaid-only enrollees' expenditures among selected categories of service varied across states, we again defined high-expenditure Medicaid-only enrollees as the 5 percent with the highest expenditures within each state and examined expenditures for fiscal year 2011 in eight categories of service. These categories were:

- three types of institutional care: hospital, long-term, and psychiatric facility;
- three types of non-institutional services: acute care, long-term support, and other support services, such as targeted case management or rehabilitative services;
- prescription drugs; and
- managed care and premium assistance.

We identified the distribution of expenditures for high-expenditure enrollees among these types of service within each state in fiscal year 2011 and compared the distributions across states. Table 4 provides information about the distribution of expenditures among Medicaid-only enrollees nationally and in each state and the District of Columbia in fiscal year 2011, including the percentages of expenditures for Medicaid-only enrollees that were attributable to the most expensive 1, 5, 10, and 25 percent of these enrollees; the percentage of expenditures for Medicaid-only enrollees that were attributable to the least expensive 50 percent of these enrollees (including those with 0 expenditures); and the Gini coefficient, which indicates the degree of inequality, that is, the extent to which the frequency distribution differs from one in which expenditures are equal for all enrollees.
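The Gini coefficient described above can be computed directly from enrollee-level expenditures. A minimal sketch (in Python, using a standard discrete formula rather than GAO's actual computation; sample values are illustrative) where 0 indicates all enrollees have equal expenditures and values approaching 1 indicate spending concentrated in very few enrollees:

```python
def gini(values):
    """Gini coefficient of a list of non-negative expenditures:
    0 = perfectly equal; near 1 = spending concentrated in few enrollees."""
    xs = sorted(values)              # ascending order
    n, total = len(xs), sum(xs)
    if total == 0:
        return 0.0
    # Discrete formula: G = 2 * sum(i * x_i) / (n * total) - (n + 1) / n,
    # where i is the 1-based rank of x_i in ascending order.
    weighted = sum(rank * x for rank, x in enumerate(xs, start=1))
    return 2 * weighted / (n * total) - (n + 1) / n

print(gini([1000] * 4))           # equal spending -> 0.0
print(gini([0, 0, 0, 100_000]))   # one enrollee spends everything -> 0.75
```

For n enrollees, the most unequal possible distribution (one enrollee accounting for all spending) yields (n - 1) / n, so highly skewed state distributions produce coefficients approaching 1.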
These state-by-state data illustrate that states differ widely in the degree to which their distribution of expenditures varied across enrollees, but in each state, a small percentage of high-expenditure Medicaid-only enrollees was responsible for a disproportionately large share of the expenditures for Medicaid-only enrollees. Table 5 provides information about the percentage of high-expenditure Medicaid-only enrollees in five mutually exclusive eligibility groups (child, adult, aged, disabled, or unknown) nationally and in each state and the District of Columbia in fiscal year 2011. These data indicate that while there was considerable variation across the states, in each state, the greatest percentage of high-expenditure Medicaid-only enrollees were disabled and the lowest percentage in a known eligibility group were aged. Table 6 provides information about the percentage of high-expenditure Medicaid-only enrollees with certain conditions or services nationally and in each state and the District of Columbia in fiscal year 2011. The conditions are five chronic conditions recorded in the Medicaid Statistical Information System Annual Person Summary File--asthma, diabetes, human immunodeficiency virus/acquired immunodeficiency syndrome (HIV/AIDS), mental health conditions, or substance abuse. The two services, also recorded in the summary file, are delivery or childbirth and long-term care residence. Enrollees could have any of these conditions or services, any combination of them, or none of them. These data indicate considerable variation across states, although the majority of these enrollees in each state except Pennsylvania had at least one of these conditions or services, and within each state, mental health conditions were the most common of these conditions and services.
Table 7 provides information about the percentage of high-expenditure Medicaid-only enrollees' expenditures in different categories of services nationally and in each state and the District of Columbia in fiscal year 2011, and illustrates that states vary widely in the distribution of their expenditures among service categories. These categories were three types of institutional care (hospital, long-term, and psychiatric facility); three types of non-institutional services (acute care, long-term support, and other support services, such as targeted case management or rehabilitative services); prescription drugs; and managed care and premium assistance. Expenditures for categories of service other than managed care and premium assistance do not include payments for those services that were made by managed care plans. As a result, the percentage of expenditures does not necessarily reflect enrollees' utilization of services. In addition to the contact named above, key contributors to this report were Robert Copeland, Assistant Director; Dee Abasute; Kristen Joan Anderson; Nancy Fasciano; Giselle Hicks; Drew Long; and Jennifer Whitworth.

Studies on healthcare spending generally find that a small percentage of individuals account for a large proportion of expenditures, and Medicaid--a federal-state health financing program for low-income and medically needy individuals--is no exception. Medicaid expenditures for fiscal year 2013 totaled about $460 billion, covering about 72 million enrollees, some of whom were also eligible for Medicare. More information about Medicaid enrollees who are not also eligible for Medicare (i.e., Medicaid-only enrollees) and who account for a high proportion of expenditures could enhance efforts to manage expenditures and facilitate improvements to care. GAO was asked to provide information about the characteristics of high-expenditure Medicaid-only enrollees and their expenditures.
GAO (1) examined the distribution of expenditures among Medicaid-only enrollees, (2) determined whether the proportions of high-expenditure Medicaid-only enrollees in selected categories changed or remained consistent from year to year, and (3) determined whether the distribution of high-expenditure Medicaid-only enrollees' expenditures among selected categories of service varied across states. GAO analyzed data from the Medicaid Statistical Information System Annual Person Summary File for fiscal years 2009, 2010, and 2011, the most recent years for which data from almost all states were available. A small percentage of Medicaid-only enrollees--that is, those who were not also eligible for Medicare--consistently accounted for a large percentage of total Medicaid expenditures for Medicaid-only enrollees. In each fiscal year from 2009 through 2011, the most expensive 5 percent of Medicaid-only enrollees accounted for almost half of the expenditures for all Medicaid-only enrollees. In contrast, the least expensive 50 percent of Medicaid-only enrollees accounted for less than 8 percent of the expenditures for these enrollees. Of the Medicaid-only enrollees who were among the 5 percent with the highest expenditures within each state, the nationwide proportions of these enrollees in different eligibility groups (such as the disabled or children) and with certain conditions (such as asthma) or services (such as childbirth or delivery) were also consistent from fiscal years 2009 through 2011. The distribution of high-expenditure Medicaid-only enrollees' expenditures among categories of service in fiscal year 2011 varied widely across states. Expenditures for managed care and premium assistance varied most widely (from 0 to 75 percent). The Department of Health and Human Services provided technical comments on a draft of this report, which were incorporated as appropriate. | 4,856 | 569 |
RUS, EDA, Reclamation, and the Corps each have distinct missions and fund rural water supply and wastewater projects under separate programs and congressional authorizations. Furthermore, each agency has its own definition of what constitutes a rural area and a unique organizational structure to implement its programs. Specifically, RUS administers the U.S. Department of Agriculture's rural utilities programs throughout the country, which are aimed at expanding electricity, telecommunications, and water and waste disposal services. RUS provides assistance for water supply and wastewater projects through its Water and Environmental Program and defines rural areas for this program as incorporated cities and towns with a population of 10,000 or fewer and unincorporated areas, regardless of population. RUS manages this program through its headquarters in Washington, D.C., and 47 state offices, each supported by area and local offices. EDA provides development assistance to areas experiencing substantial economic distress, regardless of whether they are rural or urban. EDA primarily provides assistance for water supply and wastewater projects in distressed areas through its Public Works and Development Facilities Program and uses a U.S. Census Bureau definition for rural areas that is based on metropolitan statistical areas. EDA manages this program through its headquarters in Washington, D.C., six regional offices, and multiple field personnel. Reclamation was established to implement the Reclamation Act of 1902, which authorized the construction of water projects to provide water for irrigation in the arid western states. Reclamation generally manages numerous municipal and industrial projects as part of larger, multipurpose projects that provide irrigation, flood control, power, and recreational opportunities in 17 western states, unless otherwise directed by the Congress.
Reclamation provides assistance for water supply projects through individual project authorizations and defines a rural area as a community, or group of communities, each of which has a population of not more than 50,000 inhabitants. Reclamation manages these projects through its headquarters in Washington, D.C., and Denver, Colorado, five regional offices, and multiple field offices in the western United States. The Corps' Civil Works programs investigate, develop, and maintain water and related environmental resources throughout the country to meet the agency's navigation, flood control, and ecosystem restoration missions. In addition, the Civil Works programs also provide disaster response, as well as engineering and technical services. The Corps provides assistance for water supply and wastewater projects through authorizations for either a project in a specific location, or for a program in a defined geographic area, and does not have a definition for rural areas. The Corps administers its programs and projects through its Headquarters in Washington, D.C., eight regional divisions, and 38 district offices. These agencies rely on several sources of funding--including annual appropriations from the general fund and from dedicated funding sources, such as trust funds--to provide financial support for these projects and programs. RUS, EDA, Reclamation, and the Corps obligated $4.7 billion to 3,104 rural water supply and wastewater projects from fiscal years 2004 through 2006. Of these obligations, RUS obligated nearly $4.2 billion (or about 90 percent) of the funding--about $1.5 billion in grants and about $2.7 billion in loans--to about 2,800 projects. EDA, Reclamation, and the Corps provided a combined $500 million in grants to rural communities for about 300 water supply and wastewater projects. Table 1 shows the number of projects and the amount of obligations for rural water supply and wastewater projects by agency for fiscal years 2004 through 2006. 
Figures 1 through 4 show the location of these rural water supply and wastewater projects by agency during fiscal years 2004 through 2006. RUS provided the majority of the funding to the largest number of projects, while Reclamation provided the largest amount of funding per project. As table 1 shows, the average RUS grant was approximately $680,000 per project, while the average Reclamation grant was about $22 million per project. EDA and Corps grants averaged about $1 million and $800,000 per project, respectively. The average Reclamation grant amount was significantly larger than the grant amounts provided by the other agencies because Reclamation provided funding to a relatively small number of large regional water supply projects that span multiple communities. For example, during fiscal years 2004 through 2006, Reclamation obligated nearly $87 million of the about $459 million estimated total cost for the Mni Wiconi project. This project will provide potable water to about 51,000 people in rural communities spanning seven counties and three Indian Reservations. The Mni Wiconi project covers approximately 12,500 square miles of the state of South Dakota or roughly 16 percent of the state's total land area. Figure 5 shows the location of the Mni Wiconi project area. In contrast, the other three agencies primarily provided funding to relatively smaller scale projects located in single communities. For example, Penns Grove, New Jersey, a community with a population of about 5,000, received an $800,000 EDA grant to upgrade a wastewater treatment plant with an estimated total project cost of $1.16 million. Similarly, according to Corps officials, Monticello, Kentucky, a community with a population of about 6,000, received about $312,500 from the Corps for two sewer line extensions with total project costs of about $435,000. 
This community also received about $1 million from RUS for water and sewer line upgrades with an estimated total project cost of about $1.4 million. While the types of projects RUS, EDA, Reclamation, and the Corps fund are similar, varying agency eligibility criteria can limit funding to certain communities based on their population size, economic need, or geographic location. Specifically, RUS and EDA have established nationwide programs with standardized eligibility criteria and processes under which communities compete for funding. In contrast, Reclamation and the Corps have historically provided funding to congressionally authorized projects in certain geographic locations, without standardized eligibility criteria. Table 2 shows the types of projects each agency funds, the funding mechanisms they use, and their eligibility criteria. The rural water projects that RUS, EDA, Reclamation, and the Corps fund are similar, and all four agencies use similar funding mechanisms. While Reclamation primarily provides funding for water supply projects, RUS, EDA, and the Corps fund both water supply and wastewater projects. These projects primarily include the construction or upgrading of water or wastewater distribution lines, treatment plants, and pumping stations. For example, all four agencies funded water line expansions or upgrades in either residential or commercial areas. RUS, EDA, and the Corps also funded sewer line extensions into either residential or commercial areas. RUS and EDA have established nationwide programs with standardized eligibility criteria and processes under which communities compete for funding. Specifically, RUS' eligibility criteria require projects to be located in a city or town with a population of less than 10,000 or an unincorporated rural area, regardless of the area's population. 
EDA's eligibility criteria require projects to be located in economically distressed communities, regardless of the size of the community served, and the project must also create or retain jobs. RUS' eligibility criteria require water supply or wastewater projects to serve rural areas. A project must be located in a city or town with a population of less than 10,000 or in an unincorporated rural area regardless of the population. For example, St. Gabriel, Louisiana, with a population of about 6,600, received RUS funding to expand sewer lines to connect residents to a wastewater treatment plant. Similarly, Laurel County Water District No. 2, which provides potable water to about 17,000 residents who live in unincorporated rural areas of southeastern Kentucky between the cities of London, Kentucky, and Corbin, Kentucky, received RUS funding to upgrade a water treatment plant to accommodate potential growth opportunities in the area. Table 3 provides the number of RUS funded rural water supply and wastewater projects by state for fiscal years 2004 through 2006. To apply for RUS funding for a water supply or wastewater project, a community must submit a formal application. Once the formal application is submitted, communities then compete for funding with other projects throughout the state. In general, RUS officials in the state office rank each proposed project according to the project's ability to alleviate a public health issue, the community's median household income, and other factors. As applications are reviewed and ranked on a rolling basis, RUS officials in the state office generally decide which projects will receive funding until all funds are obligated for the fiscal year. RUS provides both grants and loans for eligible projects, and communities must meet certain requirements depending upon the type of assistance they are requesting. 
For example, RUS grants can be used to finance up to 75 percent of a project's cost based on a number of factors, including a community's financial need and median household income. Alternatively, to receive a loan, the community must certify in writing, and RUS must determine, that the community is unable to finance the proposed project from its own resources or through commercial credit at reasonable rates and terms. For projects also funded through RUS loans, RUS requires the community to charge user fees that, at a minimum, cover the costs of operating and maintaining the water system while also meeting the required principal and interest payments on the loan. For example, RUS provided the Wood Creek Water District, located in Laurel County, Kentucky, a $1 million grant and a $7.98 million loan for a major water treatment plant expansion. A Wood Creek official told us that the water district had attempted to obtain a loan from a commercial lender; however, the loan would have had an interest rate of 7 percent and a term of 20 years, which would have rendered the project financially infeasible. According to RUS, Wood Creek was able to receive a loan with an interest rate of 4.3 percent and a term of 40 years, thereby significantly reducing the annual loan payments. RUS also required Wood Creek to slightly increase its user fees to support the operation and maintenance of the water system and cover the loan repayment. EDA's eligibility criteria require water supply or wastewater projects to be located in an economically distressed area, regardless of the area's population size. EDA defines an area as economically distressed if it meets one of the following three conditions: the area (1) has an unemployment rate that is at least 1 percent greater than the national average, (2) has a per capita income that is 80 percent or less of the national average, or (3) has experienced or is about to experience a special need arising from changes in economic conditions.
The project must also create or retain long-term private sector jobs and/or attract private capital investment. For example, Assumption Parish Waterworks District No. 1 in Napoleonville, Louisiana, received EDA funding to upgrade water service to two sugarcane mills. The community qualified for the funding because Assumption Parish met EDA's criteria for unemployment and per capita income. The water supply project allowed the sugarcane mills to maintain and expand their operations, saving 200 existing jobs, creating 17 new jobs, and attracting $12.5 million in private investment. Table 4 provides the number of EDA funded rural water supply and wastewater projects by state for fiscal years 2004 through 2006. To apply for EDA funding for a water supply or wastewater project, the community must submit a preapplication to an EDA Regional Office. If the proposed project is found eligible, the community must then submit a formal application to an EDA Regional Office. The Regional Office then prioritizes and makes funding decisions that are forwarded to EDA headquarters for approval. These decisions are based upon, among other things, how the project promotes innovative, entrepreneurial, or long-term economic development efforts. EDA applications are reviewed on a rolling basis, and funding decisions are made until all of the funds for the fiscal year are obligated. EDA provides grants for eligible projects that may finance 50 to 100 percent of a project's total costs based on a number of factors, including an area's level of economic distress. For example, the London-Laurel County Industrial Development Authority, located in Laurel County, Kentucky, qualified for an EDA grant because the county has a per capita income of $14,165, which is 66 percent of the national average. Because Laurel County's per capita income was between 60 and 70 percent of the national average, EDA's grant could fund no more than 60 percent of the project's total cost.
The project received a $950,000 grant, which covered 50 percent of the $1.9 million total project cost to construct water and sewer line extensions for an industrial park. The new occupants of this industrial park were expecting to create 425 new jobs and provide $20.9 million in private investment. Reclamation and the Corps have not historically had rural water supply and wastewater programs; rather, they have provided funding to specific projects or programs in certain geographic locations under explicit congressional authorizations. Although the Corps continues to provide assistance to projects under specific congressional authorizations, many of which are pilot programs, the Rural Water Supply Act of 2006 directed Reclamation to establish a rural water supply program with standardized eligibility criteria. Reclamation provides grants to individual rural water supply projects in eligible communities for which the Congress has specifically authorized and appropriated funds. These grants finance varying amounts of a project's total costs depending upon the specific authorization. According to a program assessment conducted by the Office of Management and Budget (OMB), the Congress has chosen Reclamation to fill a void for projects that are larger and more complex than other rural water projects and which do not meet the criteria of other rural water programs. For example, the Mni Wiconi Project Act of 1988, as amended, directs Reclamation to provide funding to three Indian tribes and seven counties for a rural water supply project in South Dakota that encompasses 16 percent of the state's total land area. For the Mni Wiconi project, Reclamation grants provide funding for 100 percent of the project costs on Indian lands and 80 percent of the project costs on non-Indian lands. Table 5 provides the number of Reclamation funded rural water supply projects by state for fiscal years 2004 through 2006.
While rural water supply projects are outside of Reclamation's traditional mission, according to Reclamation officials, the agency became involved in such projects because individual communities or groups of communities proposed projects directly to the Congress. In response, the Congress created specific authorizations for these rural water supply projects, and Reclamation was assigned responsibility for funding and overseeing the construction of the projects. Because Reclamation is responding to congressional direction in implementing these projects, it has not established eligibility criteria for communities or prioritized these projects for funding. In a May 11, 2005, testimony, the Commissioner of the Bureau of Reclamation indicated that the agency would like more authority to plan and oversee the development and construction of rural water supply projects. In 2006, the Congress passed the Rural Water Supply Act directing Reclamation to develop a rural water supply program. Within 1 year, Reclamation was required to develop standardized criteria to determine eligibility requirements for rural communities and prioritize funding requests under this program. Further, the act directed Reclamation to assess within 2 years how the rural water supply projects funded by Reclamation will complement those being funded by other federal agencies. Reclamation is now beginning to address these requirements, including (1) developing programmatic criteria to determine eligibility for participation and (2) assessing the status of authorized rural water supply projects and other federal programs that address rural water supply issues. According to a Reclamation official, the agency plans to complete these requirements by August 2008 and December 2008, respectively.
Reclamation officials also said the development of a rural water supply program will, among other things, allow Reclamation to be directly involved in the planning, design, and prioritization of rural water supply projects and provide recommendations to the Congress regarding which projects should be funded for construction. Projects recommended for funding by Reclamation must still receive a specific congressional authorization for design and construction. The Corps funds rural water supply and wastewater projects under specific congressional authorizations, many of which are pilot programs, and makes funding available to specific communities or programs in certain geographic areas. For example, a section of the Water Resources Development Act of 1999, as amended, authorized a pilot program that directed the Corps to provide funding to water supply and wastewater projects to communities in Idaho, Montana, rural Nevada, New Mexico, and rural Utah. When directed to fund these types of projects, the Corps provides either grants or reimbursements for project costs incurred by the community. To receive reimbursements, a community submits invoices received from its contractors to the Corps, and the Corps generally reimburses the community up to 75 percent of project costs. Table 6 provides the number of Corps funded rural water supply and wastewater projects by state for fiscal years 2004 through 2006. Even though the Corps provides congressionally directed funding to specific geographic areas through these pilot programs, eligibility criteria and the degree to which projects compete for funding can differ between programs. For example, the Corps' Southern and Eastern Kentucky Environmental Improvement Program is available only to communities located in 29 counties in southeastern Kentucky. The program requires these communities to submit formal applications, which are prioritized and ranked annually against all received applications. 
The Corps, in conjunction with a nonprofit organization, selects projects for funding based on certain factors such as economic need. For example, the Wood Creek Water District submitted a formal application and received approximately $500,000 in reimbursements--about 72 percent of the total project costs--to extend sewer service to a school and 154 households who live near the school. In contrast, the Corps' Rural Utah Program is available to communities in 24 counties and part of another county that the Congress designated as rural. This program requires communities in these counties to submit a request letter that includes, among other things, a brief project description and an estimate of total project costs. Request letters are considered for funding on a rolling basis by Corps officials, and no other formal eligibility criteria exist. For example, Park City, Utah, submitted a letter that provided a project description and the estimated total cost for the project. According to a Corps official, the Corps evaluated the letter and provided approximately $300,000 in reimbursements--or about 60 percent of the total project costs--for the replacement of water and sewer lines in Park City's Old Town area. While the Corps funds projects carried out under these pilot programs as directed by the Congress, it does not request funds for them as part of its annual budget process because, according to Corps officials, these types of projects fall outside the Corps' primary mission of navigation, flood control, and ecosystem restoration. This position was reiterated in a May 11, 2007, policy document released by OMB, which stated that funding of such local water supply and wastewater projects is outside of the Corps' mission, costs taxpayers hundreds of millions of dollars, and diverts funds from more meritorious Corps Civil Works projects. 
When the Congress authorized the Corps to fund these various pilot programs, it also required the agency to evaluate the effectiveness of several of them and recommend to the Congress whether these pilot programs should be implemented on a national basis. The Corps has completed 9 of the 12 required evaluations. Of the completed evaluations, only four made recommendations--all in favor of the establishment of a national program. The other five evaluations either did not make the required recommendation or stated that the agency had not yet funded enough projects to effectively evaluate the program. However, we found that between fiscal years 2004 and 2006, the Corps provided funding to over 100 rural water supply and wastewater projects under pilot programs, and it is unclear why the Corps has still not completed all of the evaluations required by the Congress. In the absence of the outstanding evaluations and recommendations, the Congress does not have information on whether, collectively, the projects carried out under the Corps' pilot programs merit continued funding, duplicate other agency efforts, or should be implemented on a national basis. The Congress has determined that RUS, EDA, and now Reclamation should provide funding for rural water projects as part of their overall missions and target federal assistance to certain communities based on their population size, economic need, or geographic location. However, for the Corps, the Congress has not yet determined whether funding of rural water supply projects should permanently be included within the agency's water portfolio. To help inform congressional decision making on this issue, the Corps was required to evaluate its various water supply and wastewater pilot programs and recommend to the Congress whether these programs should be continued. 
However, the Corps has not consistently provided the information required by the Congress even though it has completed over 100 rural water projects under various pilot programs. As a result, the Congress does not have the information it needs to determine whether the Corps' projects meet a previously unmet rural water need or duplicate the efforts of other agencies. Such information is important for making decisions on how to allocate limited federal resources in a time when the nation continues to face long-term fiscal challenges. To ensure that the Congress has the information it needs to determine whether the Corps should continue to fund rural water supply and wastewater projects, we recommend that the Secretary of Defense direct the Commanding General and the Chief of Engineers of the U.S. Army Corps of Engineers to provide a comprehensive report on the water supply and wastewater projects that the Corps has funded under its pilot programs and determine whether these pilot programs duplicate other agency efforts and should be discontinued, or whether these pilot programs address an unmet need and should be expanded and made permanent at a national level. We provided the Departments of Agriculture, Commerce, Defense, and the Interior with a draft of this report for review and comment. The Department of Defense concurred with GAO's findings and recommendation, and its written comments are included in appendix III. The Department of the Interior also agreed with GAO's findings, and its written comments are included in appendix IV. The Departments of Agriculture and Commerce provided us with technical comments, which we have incorporated throughout the report, as appropriate. We will send copies of this report to interested congressional committees; the Secretaries of Agriculture, Commerce, Defense, and the Interior; and other interested parties. We will also make copies available to others upon request. 
In addition, the report will be available at no charge on the GAO Web site at http://www.gao.gov. If you or your staff have any questions about this report, please contact me at (202) 512-3841 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. Key contributors to this report are listed in appendix V. To determine how much federal funding the U.S. Department of Agriculture's Rural Utilities Service (RUS), the Department of Commerce's Economic Development Administration (EDA), the Department of the Interior's Bureau of Reclamation (Reclamation), and the U.S. Army Corps of Engineers (Corps) obligated for rural water supply and wastewater projects for fiscal years 2004 through 2006, we collected and analyzed obligation and project location data submitted by each agency. We determined that the data were sufficiently reliable for the purposes of this report. To identify water supply and wastewater projects that were located in rural areas, we applied the definition of rural used by RUS, EDA, and Reclamation to the geographic location each agency provided for its water supply and wastewater projects. Because the Corps does not have a definition for rural areas, we asked the Corps to use the U.S. Census Bureau's density-based urban and rural classification system to identify projects that it funds in rural areas. This classification system divides geographical areas into urban areas, urban clusters, and nonurban areas and clusters. Using this information, we determined that Corps funded water supply and wastewater projects were in rural areas if they were located in (1) any nonurban areas or clusters, (2) urban clusters with a population of less than 20,000, and (3) areas of Nevada and Utah that the Congress specifically defined as rural in the Water Resources Development Act of 1999, as amended.
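The three-part test just described can be expressed as a small decision function. This is an illustrative sketch of the logic, not GAO's actual analysis code; the function name and the simplified state-level flag for the statutorily designated areas are assumptions (the real designations cover specific counties identified from Census Bureau geographic data).

```python
def is_rural_corps_project(area_type, population, state, in_designated_rural_area=False):
    """Classify a Corps-funded project as rural using the report's three-part test.

    area_type: 'urban_area', 'urban_cluster', or 'nonurban' (Census density-based classes)
    in_designated_rural_area: whether the location falls within an area the Congress
    defined as rural in the Water Resources Development Act of 1999, as amended
    (simplified here; the actual designations cover specific counties).
    """
    if area_type == 'nonurban':                                # (1) nonurban areas or clusters
        return True
    if area_type == 'urban_cluster' and population < 20000:    # (2) small urban clusters
        return True
    if state in ('NV', 'UT') and in_designated_rural_area:     # (3) statutory designations
        return True
    return False

print(is_rural_corps_project('urban_cluster', 6000, 'KY'))   # small cluster -> True
print(is_rural_corps_project('urban_area', 150000, 'NV'))    # large urban area -> False
```

Note that test (3) can classify a project as rural even when the Census density classes would not, which is why the statutory designations are checked separately.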
Table 7 provides the definition of rural area used by each agency for water supply and wastewater projects. To determine the extent to which the eligibility criteria of RUS, EDA, Reclamation, and the Corps and the projects they fund differed, we reviewed and analyzed applicable statutes, agency regulations, and policy guidance. In addition, we used a nonprobability sample to select 16 rural water supply and wastewater projects, including at least one project funded by each of the four agencies, and conducted site visits to each of the selected projects. These projects were selected based upon project type (water supply or wastewater), geographic location, type of assistance (loan, grant, or a combination of these), and the federal agency funding the project. During the site visits, we interviewed local officials from the communities receiving funding and federal agency officials responsible for managing the funding of those projects. We also collected and analyzed project-specific documentation such as applications and letters of intent. Table 8 lists the 16 projects we selected for site visits and the type of project, location, type of assistance, and funding agency(ies) for each project. To determine the overhead costs and number of personnel needed to support rural water supply and wastewater projects, we collected and analyzed agency policy guidance and interviewed agency officials to determine the extent to which RUS, EDA, Reclamation, and the Corps track these data for rural water supply and wastewater projects. We also requested these data from each agency to the extent they could provide them to us. We conducted our work from September 2006 through August 2007 in accordance with generally accepted auditing standards. The U.S. Department of Agriculture's Rural Utilities Service (RUS), the Department of Commerce's Economic Development Administration (EDA), the Department of the Interior's Bureau of Reclamation (Reclamation), and the U.S. 
Army Corps of Engineers (Corps) each calculate their overhead costs, commonly referred to as general and administrative (G&A) costs, and the number of personnel needed to manage rural water supply and wastewater projects, referred to as full- time equivalents (FTE), differently. This appendix describes how each agency calculates these costs for rural water supply and wastewater projects. RUS and EDA each receive separate appropriations to fund their agencywide G&A costs. These agencies do not track these costs or FTEs on a project-by-project basis. Therefore, we were unable to calculate each agencies total G&A costs and total FTEs by rural water supply and wastewater project. Reclamation divides water supply project costs into two categories, direct costs and indirect costs. According to Reclamation, if all activities are correctly and consistently charged, then all activities assigned to indirect costs can be considered overhead costs for a project. Although a standard formula is used to determine indirect cost rates, which are applied as a percentage of labor, Reclamation officials stated that the rates may vary by area office and region depending primarily on the amount of costs that can be charged directly to a project. Furthermore, according to documentation provided by Reclamation officials, these indirect cost rates were updated each fiscal year. As can be seen in table 9, Reclamation provided the following indirect costs and FTE estimates for the 11 rural water projects for which Reclamation obligated funds for fiscal years 2004 through 2006. The Corps' G&A costs for its headquarters and divisions are funded through a general expenses appropriation. 
G&A costs at the district level are distributed to projects and programs through the use of predetermined rates established by the district Commander at the beginning of each fiscal year and are automatically distributed to specific projects or programs based on the direct labor charged to the projects or programs. There are two types of overhead costs charged by the districts, general and administrative overhead and departmental overhead. General and administrative overhead includes administrative and support costs incurred in the day-to-day operations of a district. Departmental overhead includes costs incurred within technical divisions at the district headquarters that are not attributable to a specific project or program. While a standard formula is used to determine overhead rates, these rates may vary by district depending on a variety of factors including, geographic location--an office in a high cost area will cost more to operate than a similar office in a rural area, and composition of the workforce--an office staffed by senior-level employees will cost more to operate than an office staffed by junior-level employees. The Corps G&A costs and FTE data for its water supply and wastewater projects are calculated at the program level and cover projects in both rural and urban areas. The Corps could not readily provide these data for obligations on a rural water supply and wastewater project basis. In addition to the individual named above, Ed Zadjura, Assistant Director; Patrick Bernard; Diana Goody; John Mingus; Lynn Musser; Alison O'Neill; Matthew Reinhart; and Barbara R. Timmerman made significant contributions to this report. | funds for constructing and upgrading water supply and wastewater treatment facilities. As a result, they typically rely on federal grants and loans, primarily from the Rural Utilities Service (RUS), Economic Development Administration (EDA), Bureau of Reclamation (Reclamation), and the U.S. 
Army Corps of Engineers (Corps), to fund these projects. Concern has been raised about potential overlap between the projects these agencies fund. For fiscal years 2004 through 2006 GAO determined the (1) amount of funding these agencies obligated for rural water projects and (2) extent to which each agency's eligibility criteria and the projects they fund differed. GAO analyzed each agency's financial data and reviewed applicable statutes, regulations, and policies. From fiscal years 2004 through 2006, RUS, EDA, Reclamation, and the Corps obligated nearly $4.7 billion to about 3,100 rural water supply and wastewater projects. RUS obligated the majority of these funds--about $4.2 billion--to about 2,800 projects. Of this $4.2 billion, RUS loans accounted for about $2.7 billion, and RUS grants accounted for about $1.5 billion. EDA, Reclamation, and the Corps, combined, obligated a total of about $500 million in grants to rural communities for about 300 water projects. RUS, EDA, Reclamation, and the Corps fund similar rural water supply and wastewater projects, but they have varied eligibility criteria that limit funding to certain communities based on population size, economic need, or geographic location. RUS, EDA, and the Corps provide funding for both water supply and wastewater projects, while Reclamation provides funding only for water supply projects. Eligible water projects can include constructing or upgrading distribution lines, treatment plants, and pumping stations. RUS and EDA have formal nationwide programs with standardized eligibility criteria and processes under which communities compete for funding. In contrast, Reclamation and the Corps fund water projects in defined geographic locations under explicit congressional authorizations. In 2006 the Congress passed the Rural Water Supply Act, directing Reclamation to develop a rural water supply program with standard eligibility criteria. 
The Corps continues to fund rural water supply and wastewater projects under specific congressional authorizations, many of which are pilot programs. The Congress required the Corps to evaluate the effectiveness of these various pilot programs and recommend whether they should be implemented on a national basis. The Corps has only completed some of the required evaluations and, in most cases, has not made the recommendations that the Congress requested about whether or not the projects carried out under these pilot programs should be implemented on a national basis. | 6,190 | 556 |
Established in 1965, HUD is the principal federal agency responsible for the programs dealing with housing and community development and fair housing opportunities. Among other things, HUD's programs provide (1) mortgage insurance to help families become homeowners and to help provide affordable multifamily rental housing for low- and moderate-income families, (2) rental subsidies for lower-income families and individuals, and (3) grants and loans to states and communities for community development and neighborhood revitalization activities. HUD's fiscal year 1997 budget proposal requests about $22 billion in discretionary budget authority and plans about $33 billion in discretionary outlays. Compared with HUD's fiscal year 1996 appropriation, this request represents about a 7 percent increase in budget authority and a 10 percent increase in outlays. HUD believes that this increase in outlays between fiscal years 1996 and 1997 is somewhat misleading. For example, 1996 outlays were unusually low because HUD expended $1.2 billion--which normally would have been disbursed early in fiscal year 1996--in late fiscal year 1995 because of the government shutdown. In addition, reforms in the mortgage assignment program generated a significant one-time savings of over $1 billion in fiscal year 1996 (under credit reform as scored by the Congressional Budget Office). HUD's March 1995 blueprint, HUD Reinvention: From Blueprint to Action, proposed to merge 60 of its 240 separate programs into three performance-based funds that would be allocated directly to the states and localities. HUD's objectives were to provide communities with greater flexibility and instill a level of accountability in its programs through the use of performance measures and a series of rewards and incentives. As of March of this year, few of the proposals in this reinvention document have been adopted.
HUD's second reinvention proposal, Renewing America's Communities from the Ground Up: The Plan to Continue the Transformation of HUD, also known as Blueprint II, would supersede the first proposal but continue the move toward accountability by fiscal year 1998 by (1) consolidating over 20 community development programs into three performance funds where high-performing grant recipients would be awarded bonuses, (2) replacing 15 separate public housing programs with two performance funds, and (3) consolidating the 14 existing voucher and certificate funds. Appendix II summarizes HUD's plans to fund the proposals in Blueprint II through its fiscal year 1997 budget request. HUD's fiscal year 1997 budget request discusses how a planned, major restructuring of the multifamily housing program is likely to affect its budget over the next 6 years and beyond. The restructuring is aimed at addressing serious and longstanding problems affecting properties with HUD-insured mortgages that also receive rental subsidies tied to units in the properties (project-based assistance). HUD deserves credit for attempting to address these complex problems. However, HUD's assumptions about its ability to quickly restructure properties with high subsidy costs appear overly optimistic and may have led HUD to underestimate its request for rental assistance for low-income families. According to HUD's latest data, 8,636 properties with about 859,000 apartments would be subject to the restructuring proposal; the unpaid loan balances for these properties total about $17.8 billion. In many cases, HUD pays higher amounts to subsidize properties than are needed to provide the households living in them with decent, affordable housing. In other cases, rents set by HUD are lower than required to maintain the properties' physical condition, contributing to poor living conditions for families with low incomes.
Initially termed "mark to market" in last year's budget request, and now referred to as "multifamily portfolio reengineering," the goal and general framework of HUD's proposal remain the same: eliminate excess subsidy costs and improve the poor physical condition of some of the properties by relying primarily on market forces. Specifically, for properties with mortgages insured by FHA that also receive project-based assistance, HUD has been proposing to let the market set the property rents to market levels and reduce mortgage debt if necessary to permit a positive cash flow. In addition, HUD has proposed replacing project-based rental subsidies with portable tenant-based subsidies, thereby requiring the properties to compete in the marketplace for residents. While maintaining this general framework, HUD made several changes to its proposal this year. For example, under the initial proposal all rents would have been reset to market levels whether the market rents were above or below the subsidized rents. The current proposal gives priority attention initially to properties with subsidized rents above market. In addition, HUD plans to let state and local governments decide whether to continue with project-based rent subsidies after mortgages are restructured or to switch to tenant-based assistance. HUD has also indicated that it will allow owners to apply for FHA insurance on the new, restructured mortgage loans, whereas last year the proposal expressly disallowed FHA insurance on restructured loans. We are currently evaluating a study by Ernst & Young LLP released on May 2, 1996, that was designed to provide the Department with current information on HUD's multifamily portfolio. This information could form the basis for the improvement of key assumptions needed to estimate the net savings or costs associated with the reengineering proposal. 
In this regard, HUD's contract with Ernst & Young LLP requires that the firm update HUD's information on (1) market rents versus the project-based rents that the agency subsidizes and (2) the physical condition of the properties. These two variables strongly influence whether a property can operate at market rents without debt reduction or what amount of debt reduction is needed to cover the property's expenses. Having good data on these variables will allow FHA to better develop claims estimates, which will be based on the amount of debt write-down. In addition, the rent data are integral to estimating the change in subsidy costs if the project-based rents are replaced with market rents and the residents receive tenant-based assistance. HUD also tasked Ernst & Young with developing a financial model that would show the likely result of reengineering the portfolio and identify the related subsidy costs and claims costs. The results of the Ernst & Young study were not available when the fiscal year 1997 budget was being developed. Because HUD lacked the project-specific data contained in the Ernst & Young study, HUD used assumptions in some cases that represent the Department's "best guess" as to outcome. These assumptions can affect the budgetary savings HUD expects to result from reengineering the portfolio. Ernst & Young's May 2, 1996, report presents information on projects that are expected to be affected by this reengineering. While the report did not directly discuss subsidy and claims costs, we are currently reviewing the results of this study and its cost implications. We plan to issue our report on the Ernst & Young study this summer. On the basis of our ongoing work, we believe that some of the assumptions HUD used may overstate the projected savings associated with reengineering the portfolio. We cannot, however, determine the extent of that overstatement at this time.
One of HUD's assumptions is that a substantial number of mortgages with excess subsidy costs will be restructured well ahead of the dates that their rental assistance contracts expire. Although the extent to which HUD will be able to accomplish this remains unclear, this assumption appears optimistic and HUD's budget request may understate its need for funding to renew section 8 rental assistance contracts for fiscal year 1997 and beyond. In its fiscal year 1997 budget, HUD requested $845 million in bonus funding for high-performing grantees in four of its six new block grants. HUD calls the block grants "performance funds." HUD believes that these grants will provide communities with greater flexibility to design local solutions to local problems. HUD plans to competitively award bonuses to grantees who exceed the established performance measures and who submit project proposals. (App. III summarizes the details of the proposed bonus pools.) We generally support performance measurement as a method of building accountability into block grants because it would allow grantees to achieve objectives while also vesting them with responsibility for their choices. Moreover, HUD's development of block grants and performance measures would be consistent with the underlying principles of the Government Performance and Results Act and recommendations for program consolidation made by the National Performance Review. However, the characteristics of the block grants themselves--their program breadth and the flexibility allowed the grantees--will greatly complicate and add significant time to HUD's development of uniform performance measures. HUD is still in the early stages of developing such measures, however, and without them grantees will have difficulty understanding HUD's objectives and performance measurement process. 
Moreover, because of inadequate information systems to support performance measurement, we question whether HUD's request for bonus funding can be effectively used during fiscal year 1997. Some features inherent to block grants will complicate the implementation of a performance measurement system in fiscal year 1997 and will likely extend the time HUD needs to develop adequate measures beyond that year. We have reported in the past, for instance, that the flexibility and wide latitude allowed grantees make common and comparative measurement very difficult. HUD will need to collaborate with the states to develop performance measures and establish reporting requirements. These entities' interests could vary markedly because HUD would be looking to meet national objectives, while the states are trying to meet local needs. Not only do the federal and state interests differ, but it will take time for both to develop data collection systems and reporting capacities once the initial decisions are made. In addition, measurement is complicated because all observed outcomes cannot be assumed to result from the programs and activities under scrutiny. Some outcomes, such as job creation, will be affected by factors outside of the control of program participants, while other desired outcomes, such as enhanced quality of life for residents, may not be quantifiable. Moreover, our work on block grants at other federal agencies has shown that many of these agencies lack the ability to track progress, evaluate results, and use performance data to improve their effectiveness. For example, HUD's Inspector General (IG) recently found that HUD is just beginning to develop a Department-wide strategic plan, the key underpinning and starting point for the process of program goal-setting and performance measurement that the Government Performance and Results Act seeks to establish throughout the federal government.
Program performance information comes from sound, well-run information systems that accurately and reliably track actual performance against the standards or benchmarks. Our work has shown, however, that HUD's information systems may not be adequate to support the implementation of the four bonus pools. For example, HUD is proposing a $500 million bonus fund as part of its public housing capital fund. As a requirement for eligibility, housing authorities would have to have earned high scores in the Public Housing Management Assessment Program (PHMAP) and have undertaken substantive efforts to link residents with education and job training. However, HUD generally does not confirm the scores of high scoring housing authorities--many of the data to support the scores are self-reported--and generally accepts the scores as accurate. Our analysis, as well as that of the HUD IG and others, has cast doubt on the accuracy of PHMAP scores for some housing authorities. Three major public housing industry associations also share concerns about PHMAP's use as a tool for awarding bonuses. And finally, HUD itself recently acknowledged that PHMAP scores should not be considered the sole measure of a public housing authority's performance, noting that circumstances can exist in which the best decision a housing authority can make is not always the one that yields the highest PHMAP score in the short term. We believe, therefore, that PHMAP--as it is currently implemented--should not be used as a basis for awarding bonuses to public housing authorities. HUD has said that it intends to draw on its Empowerment Zone/Enterprise Community (EZ/EC) experience with benchmarking to move toward performance-based funding for all HUD programs. However, HUD officials said that developing benchmarks for the first round of EZ/EC grants was a difficult task and they recognize that HUD could have done a better job of explaining the process of developing benchmarks to communities. 
Given this difficulty and the complications mentioned earlier, we are concerned that HUD is still in the midst of developing its bonus program and measures for its performance funds. In its fiscal year 1997 budget, the Department is requesting $11 million for its Office of Policy Development and Research to continue developing quantifiable measures for each major program, a process for setting benchmarks with grantees, and improvements in how the Department uses information on program performance. Because this development is ongoing, the measures and the processes will not be in place and known to the grantees before HUD uses them to award bonuses with fiscal year 1997 funds. HUD officials believe that bonus funding needs to be offered during fiscal year 1997 to encourage the states and localities to seek higher performance and that the details will be worked out as the program is implemented. We believe that timing is critical in this matter. For the performance bonuses to have equity and merit, HUD needs to be able to specify prior to the year over which performance is measured what results and outcomes will be rewarded and how they will be measured. As we have reported, four long-standing, Department-wide management deficiencies led to our designation of HUD as a high-risk area in January 1994. These deficiencies were weak internal controls, an ineffective organizational structure, an insufficient mix of staff with the proper skills, and inadequate information and financial management systems. In February 1995, we reported that HUD's top management had begun to focus attention on overhauling the Department's operations to correct these management deficiencies. In that report, we outlined actions that the agency needed to take to reduce the risk of waste, fraud, and abuse. In reviewing the proposed 1997 budget, we found budgetary support for the implementation of several of these recommendations. 
First, we recommended consolidating programs to give the communities greater flexibility in applying for funds and reducing administrative burden. The 1997 budget proposes the consolidation of many individual programs, either now or in the near future, into block grant programs to increase participants' flexibility. HUD is beginning to develop performance measures for many programs to assess the participants' progress. Second, we recommended that HUD be authorized to use more innovative initiatives to leverage private investment in community development and affordable housing. Several HUD programs will now or in the future involve mechanisms such as grant proposals or loan programs that will require either participation or investment by private organizations. In addition, FHA proposes creating new mortgage products that would expand homeownership and that would share risk with other entities. Third, we recommended that HUD continue to strengthen and coordinate its long-range planning. The budget proposal describes new investments to upgrade and expand its computer systems to specifically support implementation of Blueprint II. HUD anticipates that the proposed investments will improve efficiency and reduce operating costs. However, HUD's budget proposes several new, specialized initiatives that seem to run counter to the agency's consolidation efforts to, as described in Blueprint II, "sweep away the clutter of separate application procedures, rules and regulations that has built up at HUD over the past 30 years." For example, HUD is requesting $290 million for its Housing Certificate Fund to assist several groups of people needing preferred housing. These programs include the Welfare-to-Work initiative and housing for homeless mothers with children. However, this funding request is inconsistent with Blueprint II, in which HUD urges the Congress to do away with the statutes that require such preferences. 
Although the Department deserves credit for its continuing resolve in addressing its long-standing management deficiencies, HUD's recently initiated actions are far from reaching fruition, and the agency's problems continue. In addition, specialized programs are beginning to reappear, and they may undermine the major restructuring of the agency, reduce efficiency, and increase administrative burdens. Therefore, we believe that both now and for the foreseeable future, the agency's programs will continue to be high-risk in terms of their vulnerability to waste. Our statement today discussed several issues that will affect HUD's programs and their need for appropriations. We identified new issues and highlighted changes in other issues on which we have previously testified. By continuing to focus on improving its internal management and coming to closure on how and when it will use the market to eliminate excess subsidy costs and improve the poor physical conditions of its assisted multifamily housing, HUD will be better able to use additional appropriations and implement new policy. Although HUD has recognized many of its management deficiencies and has budgeted funds to address them, we see this as a long-term effort that will continue into the foreseeable future. In connection with the proposed bonus pools, the lack of adequate performance measures and associated information systems leads us to question the basis for awarding additional funding at this time. While HUD officials believe that the details of awarding bonuses will be worked out as the program is implemented, we believe that they are overly optimistic, given the magnitude of the bonus pools and the complexity of developing appropriate performance measures. 
We recommend that the Congress consider not appropriating the $845 million for HUD's proposed bonus pool funding until the Department develops adequate performance measures and supporting information systems to ensure that these funds are used effectively.

Housing and Urban Development: Limited Progress Made on HUD Reforms (GAO/T-RCED-96-112, Mar. 27, 1996). FHA Hospital Mortgage Insurance Program: Health Care Trends and Portfolio Concentration Could Affect Program Stability (GAO/HEHS-96-29, Feb. 27, 1996). GPRA Performance Reports (GAO/GGD-96-66R, Feb. 14, 1996). Homeownership: Mixed Results and High Costs Raise Concerns About HUD's Mortgage Assignment Program (GAO/RCED-96-2, Oct. 18, 1995). Multifamily Housing: Issues and Options to Consider in Revising HUD's Low-Income Housing Preservation Program (GAO/T-RCED-96-29, Oct. 17, 1995). Housing and Urban Development: Public and Assisted Housing Reform (GAO/T-RCED-96-25, Oct. 13, 1995). Block Grants: Issues in Designing Accountability Provisions (GAO/AIMD-95-226, Sept. 1, 1995). Property Disposition: Information on HUD's Acquisition and Disposition of Single-Family Properties (GAO/RCED-95-144FS, July 24, 1995). Housing and Urban Development: HUD's Reinvention Blueprint Raises Budget Issues and Opportunities (GAO/T-RCED-95-196, July 13, 1995). Public Housing: Converting to Housing Certificates Raises Major Questions About Cost (GAO/RCED-95-195, June 20, 1995). Government Restructuring: Identifying Potential Duplication in Federal Missions and Approaches (GAO/T-AIMD-95-161, June 7, 1995). HUD Management: FHA's Multifamily Loan Loss Reserves and Default Prevention Efforts (GAO/RCED/AIMD-95-100, June 5, 1995). Program Consolidation: Budgetary Implications and Other Issues (GAO/T-AIMD-95-145, May 23, 1995). Government Reorganization: Issues and Principles (GAO/T-GGD/AIMD-95-166, May 17, 1995). Managing for Results: Steps for Strengthening Federal Management (GAO/T-GGD/AIMD-95-158, May 9, 1995).
Multiple Employment Training Programs: Most Federal Agencies Do Not Know If Their Programs Are Working Effectively (GAO/HEHS-94-88, Mar. 2, 1994). Multifamily Housing: Better Direction and Oversight by HUD Needed for Properties Sold With Rent Restrictions (GAO/RCED-95-72, Mar. 22, 1995). Block Grants: Characteristics, Experience, and Lessons Learned (GAO/HEHS-95-74, Feb. 9, 1995). High-Risk Series: Department of Housing and Urban Development (GAO/HR-95-11, Feb. 1995). Program Evaluation: Improving the Flow of Information to the Congress (GAO/PEMD-95-1, Jan. 30, 1995). Housing and Urban Development: Major Management and Budget Issues (GAO/T-RCED-95-86, Jan. 19, 1995, and GAO/T-RCED-95-89, Jan. 24, 1995). Federally Assisted Housing: Expanding HUD's Options for Dealing With Physically Distressed Properties (GAO/T-RCED-95-38, Oct. 6, 1994). Rural Development: Patchwork of Federal Programs Needs to Be Reappraised (GAO/RCED-94-165, July 28, 1994). Federally Assisted Housing: Condition of Some Properties Receiving Section 8 Project-Based Assistance Is Below Housing Quality Standards (GAO/T-RCED-94-273, July 26, 1994, and Video, GAO/RCED-94-01VR). Public Housing: Information on Backlogged Modernization Funds (GAO/RCED-94-217FS, July 15, 1994). Homelessness: McKinney Act Programs Provide Assistance but Are Not Designed to Be the Solution (GAO/RCED-94-37, May 31, 1994).

Grantees will use their formula funds for the present wide range of activities eligible under CDBG, but two new features are added--performance measures and benchmarks, and a bonus pool. The bonus pool will be devoted exclusively to job creation and economic revitalization efforts. The budget proposes $4.6 billion for the CDBG fund in 1997. In addition, $300 million is requested for a second round of Empowerment Zone/Enterprise Communities grants ($200 million) and a competitive Economic Development Challenge Grant ($100 million) for high-performing jurisdictions.
Grantees will use their formula funds to expand the supply of affordable housing. The fund will require grant recipients to set their own performance measures and benchmarks. Ten percent of the fund will be set aside as a bonus pool to create large tracts of homeownership in communities. The budget proposes a total of $1.55 billion for HOME in 1997, including $1.4 billion for the HOME Fund and $135 million for the HOME Fund Challenge Grant for Homeownership Zones. The budget also proposes to use $15 million of funds provided for the HOME Fund for Housing Counseling. The HAF will allow grantees to shape a comprehensive, flexible, coordinated "continuum of care" approach to solving rather than institutionalizing homelessness. Ten percent of the fund will be set aside as a bonus pool. The budget proposes $1.12 billion for the HAF in 1997. Of this total, $1.01 billion will be for a consolidated needs-based homeless assistance program, and the remaining $110 million will be for the Homeless/Innovations Challenge Grant. HUD will re-propose consolidating several programs (i.e., drug elimination grant, service coordinators) into one Operating Fund by FY 1998. All existing eligible uses under these funds, plus expanded anti-crime activities, will be permitted under the Operating Fund. The budget proposes $2.9 billion for the Operating Fund, an increase of $100 million over the anticipated $2.8 billion for fiscal year 1996.

Public Housing Capital Fund: HUD will re-propose consolidating a series of separate programs into one Capital Fund by FY 1998. This new Fund will largely be modeled after the current modernization program. Eligible activities will include those currently eligible under modernization programs, under programs for distressed public housing developments, and under the development and Family Investment Center Programs. HUD will set aside 10 percent of the Capital Fund as a bonus pool.
HUD plans to jump start the Campus of Learners initiatives in fiscal year 1996 by requiring all applications for redevelopment under the public housing capital programs to build in educational, technological, and job linkages. PHAs will need to build viable partnerships with local educational and job placement institutions to be eligible for funding. The budget proposes an appropriation of $3.2 billion for the Capital Fund in 1997. Two hundred million dollars will be made available for Indian housing construction. The budget assumes that $500 million will be made available in a separate account for a Capital Bonus Fund. The budget does not allocate a specific dollar amount to be used for the Campus of Learners initiative. However, PHAs are encouraged to use capital funds to advance this endeavor. HUD will re-propose consolidating the existing voucher and certificate funds into one performance-based Certificate Fund. The Certificate Fund will be HUD's principal tool for addressing what HUD considers the primary source of severe housing problems in the nation: lagging household incomes and high housing costs. The budget is requesting an appropriation of $290 million for fiscal year 1997 for the Certificate Fund for 50,000 incremental units, of which 30,000 units will be used to help families make a transition to work (25,000 units) and help homeless mothers with children obtain housing (5,000 units). The additional 20,000 units will be used for tenant protection to support families in FHA-insured assisted housing projects directly affected by prepayment, disposition, or restructuring. The Community Development Block Grant Fund will comprise the CDBG and Economic Development Challenge Grant. The HOME Fund comprises the Home Investment Partnership Program (HOME) and the HOME Fund Challenge Grant.
The Homeless Assistance Fund will consolidate HUD's six McKinney homeless assistance programs--Shelter Plus Care, Supportive Housing, Emergency Shelter Grants, Section 8 Moderate Rehabilitation (Single Room Occupancy), Rural Homeless Grants, and Safe Havens--as well as the Innovative Homeless Initiatives Demonstration Program. It will also include the Homeless/Innovations Challenge Grant. The Public Housing Operating Fund will consolidate the Public and Indian Housing Operating Subsidies. The Housing Certificate Fund consolidates the Section 8 Certificates, Section 8 Vouchers, Section 8 Contract Renewals, Section 8 Family Unification, Section 8 for Persons with Disabilities, Section 8 for Persons with AIDS, Section 8 for Homeless, Section 8 Opt-Outs, Section 8 Counseling, Section 8 Pension Fund Certificates, Section 8 Veterans Affairs Supportive Housing, Section 8 Headquarters Reserve, Lease Adjustments, and Family Self-Sufficiency Coordinators programs. To qualify for the Capital Bonus Fund, Public Housing Authorities (PHAs) need to have scores of 90 or higher under the Public Housing Management Assessment Program (PHMAP) and to have undertaken substantive efforts to link residents with educational, self-sufficiency, or "Campus of Learners" initiatives. The bonus fund will be split among eligible PHAs based on the Capital Fund formula, and bonus funds may be used for any uses eligible under the Capital Fund. Any CDBG grantee that meets program requirements, meets or exceeds performance measures and benchmarks included in its Consolidated Plan, and demonstrates that it has expended grant funds on a timely basis is eligible for bonus funding. Funds are to address brownfields, generate economic revitalization in distressed communities, and link people in these communities to jobs. Awards are given on a competitive basis to high-performing jurisdictions that propose innovative economic revitalization and job creation strategies using a combination of their own resources, private capital, and federal program incentives.
Bonus funding is a "challenge grant" awarded on a competitive basis to high-performing jurisdictions that propose creative, cost-effective homeownership strategies using a combination of their own resources, private capital, and federal program incentives. Funds will be used to create Homeownership Zones to support state/local efforts to develop homeownership opportunities in targeted areas. Families earning up to 115 percent of the median income could be assisted. Bonus funding is to address the stated national priorities. Jurisdictions need to propose creative strategies using a combination of their own resources, private capital, and federal program incentives. Congressional Justification for 1997 Estimates, HUD, Part 1, April 1996.
GAO discussed the Department of Housing and Urban Development's (HUD) fiscal year 1997 budget request, focusing on: (1) HUD multifamily reengineering cost estimates; (2) proposed bonus pools for high-performing grantees who exceed established performance measures; and (3) HUD progress in addressing management deficiencies. GAO noted that: (1) HUD has requested about $22 billion in discretionary budget authority and plans about $33 billion in discretionary outlays; (2) overly optimistic cost control assumptions about the major restructuring of the multifamily housing program could affect the HUD budget request for rental assistance for low-income families; (3) HUD has requested $845 million in bonus funding for high-performing grantees in some of its new block grants; (4) implementing HUD performance funds will be complicated and time-consuming; and (5) HUD has proposed various internal controls to address management deficiencies.
The H-1B program enables companies in the United States to hire foreign workers for work in specialty occupations on a temporary basis. A specialty occupation is defined as one requiring theoretical and practical application of a body of highly specialized knowledge and the attainment of a bachelor's degree or higher (or its equivalent) in the field of specialty. The law originally capped the number of H-1B visas at 65,000 per year; the cap was raised twice pursuant to legislation, but in fiscal year 2004, the cap reverted to its original level of 65,000. Statutory changes also allowed for certain categories of individuals and companies to be exempt from or to receive special treatment under the cap. The American Competitiveness in the Twenty-First Century Act of 2000 exempted from the cap all individuals being hired by institutions of higher education and also nonprofit and government-research organizations. More recently, the H-1B Visa Reform Act of 2004 allowed for an additional 20,000 visas each year for foreign workers holding a master's degree or higher from an American institution of higher education to be exempted from the numerical cap limitation. In 2004, consistent with free trade agreements, up to 6,800 of the 65,000 H-1B visas may be set aside for workers from Chile and Singapore. While the H-1B visa is not considered a permanent visa, H-1B workers can apply for extensions and pursue permanent residence in the United States. Initial petitions are those filed for a foreign national's first-time employment as an H-1B worker and are valid for a period of up to 3 years. Generally, initial petitions are counted against the annual cap. Extensions--technically referred to as continuing employment petitions--may be filed to extend the initial petitions for up to an additional 3 years. Extensions do not count against the cap. While working under an H-1B visa, a worker may apply for legal permanent residence in the United States.
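The cap provisions described above (a 65,000 regular cap with up to 6,800 visas set aside under free trade agreements, a separate 20,000-visa advanced-degree pool, and exclusions for extensions and cap-exempt employers) can be sketched in a short Python snippet. The dictionary fields and simplified rules here are illustrative assumptions, not USCIS's actual adjudication logic:

```python
REGULAR_CAP = 65_000                 # statutory annual cap
CHILE_SINGAPORE_SET_ASIDE = 6_800    # carved out of the 65,000 under free trade agreements
ADVANCED_DEGREE_POOL = 20_000        # additional visas for U.S. master's degrees or higher

def counts_against_regular_cap(petition):
    """Return True if a petition consumes a regular-cap slot.

    `petition` is a hypothetical dict; real adjudication is more
    detailed (e.g., advanced-degree petitions count against the
    regular cap once the 20,000 pool is exhausted -- ignored here).
    """
    if petition["type"] == "extension":     # continuing employment: never counted
        return False
    if petition["employer_cap_exempt"]:     # universities, nonprofit/government research orgs
        return False
    if petition["us_advanced_degree"]:      # drawn from the separate 20,000 pool first
        return False
    return True

# Slots remaining in the regular cap outside the free-trade set-aside
print(REGULAR_CAP - CHILE_SINGAPORE_SET_ASIDE)  # 58200
```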
After filing an application for permanent residence, H-1B workers are generally eligible to obtain additional 1-year visa extensions until their U.S. Permanent Resident Cards, commonly referred to as "green cards," are issued. The Departments of Labor (Labor), Homeland Security (Homeland Security), and State (State) each play a role in administering the application process for an H-1B visa. Labor's Employment and Training Administration (Employment and Training) receives and approves an initial application, known as the Labor Condition Application (LCA), from employers. The LCA, which Labor reviews as part of the application process, requires employers to make various attestations designed to protect the jobs of domestic workers and the rights and working conditions of temporary workers. Homeland Security's U.S. Citizenship and Immigration Services (USCIS) reviews an additional employer application, known as the I-129 petition, and ultimately approves H-1B visa petitions. For prospective H-1B workers residing outside the United States, State interviews approved applicants and compares information obtained during the interview against each individual's visa application and supporting documents, and ultimately issues the visa. For prospective H-1B workers already residing in the United States, USCIS updates the workers' visa status without involvement from State. USCIS has primary responsibility for administering the H-1B cap. Generally, it accepts H-1B petitions in the order in which they are received. However, for those years in which USCIS anticipates that the number of I-129 petitions filed will exceed the cap, USCIS holds a "lottery" to determine which of the petitions will be accepted for review. For the lottery, USCIS uses a computer-generated random selection process to select the number of petitions necessary to reach the cap. With regard to enforcement, Labor, the Department of Justice (Justice), and Homeland Security each have specific responsibilities. 
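The "lottery" described above, in which USCIS uses a computer-generated random selection process to fill the cap in oversubscribed years, can be illustrated with a minimal sketch. The petition identifiers and cap size are hypothetical, and USCIS's actual selection system is more involved:

```python
import random

def run_h1b_lottery(petition_ids, cap):
    """Simulate the computer-generated random selection USCIS uses
    when filings exceed the cap (uniform draw, without replacement)."""
    if len(petition_ids) <= cap:
        return list(petition_ids)  # under the cap: every petition is accepted for review
    return random.sample(petition_ids, cap)

# Hypothetical oversubscribed year: 5,000 petitions for 3 illustrative slots
petitions = [f"PET-{i:05d}" for i in range(5000)]
selected = run_h1b_lottery(petitions, 3)
print(len(selected))  # 3
```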
Labor's Wage and Hour Division (Wage and Hour) is responsible for enforcing program rules by investigating complaints made against employers by H-1B workers or their representatives and assessing penalties when employers are not in compliance with the requirements of the program. Justice is responsible for investigating complaints made by U.S. workers who allege that they have been displaced or otherwise harmed by the H-1B visa program. Finally, USCIS's Directorate of Fraud Detection and National Security (FDNS) collaborates with its Immigration and Customs Enforcement Office to investigate fraud and abuse in the program. Over the past decade, demand for H-1B workers tended to exceed the cap, as measured by the number of initial petitions submitted by employers, one of several proxies used to measure demand since a precise measure does not exist. As shown in figure 1, from 2000 to 2009, initial petitions for new H-1B workers submitted by employers who are subject to the cap exceeded the cap in all but 3 fiscal years. However, the number of initial petitions subject to the cap is likely to be an underestimate of demand since, once the cap has been reached, employers subject to the cap may stop submitting petitions and Homeland Security stops accepting petitions. If initial petitions submitted by employers exempt from the cap are also included in this measure (also shown in figure 1), the demand for new H-1B workers is even higher, since over 14 percent of all initial petitions across the decade were submitted by employers who are not subject to the cap. In addition to initial requests for H-1B workers, employers requested an average of 148,000 visa extensions per year, for an average of over 280,000 annual requests for H-1B workers. Over the decade, the majority (over 68 percent) of employers were approved to hire only one H-1B worker, while fewer than 1 percent of employers were approved to hire almost 30 percent of all H-1B workers.
Among these latter employers are those that function as "staffing companies" that contract out H-1B workers to other companies. The prevalence of such companies participating in the H-1B visa program is difficult to determine. There are no disclosure requirements, and Homeland Security does not track such information. However, using publicly available data, we learned that at least 10 of the top 85 H-1B-hiring employers in fiscal year 2009 participated in staffing arrangements, of which at least 6 have headquarters or operations located in India. Together, in fiscal year 2009, these 10 employers garnered 11,456 approvals, or about 6 percent of all H-1B approvals. Further, 3 of these employers were among the top 5 H-1B-hiring companies, receiving 8,431 approvals among them. To better understand the impact of the H-1B program and cap on H-1B employers, GAO spoke with 34 companies across a range of industries about how the H-1B program affects their research and development (R&D) activities, their decisions about whether to locate work overseas, and their costs of doing business. Although several firms reported that their H-1B workers were essential to conducting R&D within the United States, most companies we interviewed said that the H-1B cap had little effect on their R&D or decisions to locate work offshore. Instead, they cited other reasons to expand overseas, including access to pools of skilled labor abroad, the pursuit of new markets, the cost of labor, access to a workforce in a variety of time zones, language and culture, and tax law. The exception to this came from executives at some information technology services companies, two of which rely heavily on the H-1B program. Some of these executives reported that they had either opened an offshore location to access labor from overseas or were considering doing so as a result of the H-1B cap or changes in the administration of the H-1B program.
Many employers we interviewed cited costs and burdens associated with the H-1B cap and program. The majority of the firms we spoke with had H-1B petitions denied due to the cap in years when the cap was reached early in the filing season. In these years, the firms did not know which, if any, of their H-1B candidates would obtain a visa, and several firms said that this created uncertainty that interfered with both project planning and candidate recruitment. In these instances, most large firms we interviewed reported finding other (sometimes more costly) ways to hire their preferred job candidates. For example, several large firms we spoke with were able to hire their preferred candidates in an overseas office temporarily, later bringing the candidate into the United States, sometimes on a different type of visa. On the other hand, small firms were sometimes unable to afford these options and were more likely to fill their positions with different candidates, which they said resulted in delays and sometimes economic losses, particularly for firms in rapidly changing technology fields. Interviewed employers also cited costs associated with the adjudication and lottery process and suggested a variety of reforms. The majority of the 34 firms we spoke with maintained that the review and adjudication process had become increasingly burdensome in recent years, citing large amounts of paperwork required as part of the adjudication process. Some experts we interviewed suggested that to minimize paperwork and costs, USCIS should create a risk-based adjudication process that would permit employers with a strong track record of regulatory compliance in the H-1B program to access a streamlined process for petition approval. In addition, several industry representatives told us that because the lottery process does not allow employers to rank their top choices, firms do not necessarily receive approval for the most desired H-1B candidates.
Some experts suggested revising the system to permit employers to rank their applications so that they are able to hire the best qualified worker for the job in highest need. Finally, entrepreneurs and venture capital firms we interviewed said that program rules can inhibit many emerging technology companies and other small firms from using the H-1B program to bring in the talent they need, constraining the ability of these companies to grow and innovate in the United States. Some suggested that, to promote the ability of entrepreneurs to start businesses in the United States, Congress should consider creating a visa category for entrepreneurs, available to persons with U.S. venture backing. In our report, we recommended that USCIS, to the extent permitted by its existing statutory authority, explore options for increasing the flexibility of the application process for H-1B employers. In commenting on our report, Homeland Security and Labor officials expressed reservations about the feasibility of our suggested options, but Homeland Security officials also noted efforts under way to streamline the application process for prospective H-1B employers. For example, Homeland Security is currently testing a system to obtain and update some company data directly from a private data vendor, which could reduce the filing burden on H-1B petitioners in the future. In addition, Homeland Security recently proposed a rule that would provide for employers to register and learn whether they will be eligible to file petitions with USCIS prior to filing an LCA, which could reduce workloads for Labor and reduce some filing burden for companies. The total number of H-1B workers in the United States at any one point in time--and information about the length of their stay--is unknown due to data and system limitations.
First, data systems among the various agencies that process H-1B applications are not easily linked, which makes it impossible to track individuals as they move through the application and entry process. Second, H-1B workers are not assigned a unique identifier that would allow agencies to track them over time or across agency databases--particularly if and when their visa status changes. Consequently, USCIS is not able to track the H-1B population with regard to: (1) how many approved H-1B workers living abroad have actually received an H-1B visa and/or ultimately entered the country; (2) whether and when H-1B workers have applied for or were granted legal permanent residency, leave the country, or remain in the country on an expired visa; and (3) the number of H-1B workers currently in the country or who have converted to legal permanent residency. Limitations in USCIS's ability to track H-1B applications also hinder it from knowing precisely when and whether the annual cap has been reached each year--although the Immigration and Nationality Act requires the department to do so. According to USCIS officials, its current processes do not allow them to determine precisely when the cap on initial petitions is reached. To deal with this problem, USCIS estimates when the number of approvals has reached the statutory limit and stops accepting new petitions. Although USCIS is taking steps to improve its tracking of approved petitions and of the H-1B workforce, progress has been slow to date. Through its "Transformation Program," USCIS is developing an electronic I-129 application system and is working with other agencies to create a cross-reference table of agency identifiers for individuals applying for visas that would serve as a unique person-centric identifier. When this occurs, it will be possible to identify who is in the United States at any one point in time under any and all visa programs. 
However, the agency faces challenges with finalizing and implementing the Transformation Program. We recommended that Homeland Security, through its Transformation Program, take steps to (1) ensure that linkages to State's tracking system will provide Homeland Security with timely access to data on visa issuances and (2) ensure that mechanisms for tracking petitions and visas against the cap are incorporated into business rules to be developed for USCIS's new electronic petition system. While a complete picture of the H-1B workforce is lacking, data on approved H-1B workers provide some information about the H-1B workforce. Between fiscal year 2000 and fiscal year 2009, the top four countries of birth for approved H-1B workers (i.e., approved initial and extension petitions from employers both subject to the cap and cap-exempt) were India, China, Canada, and the Philippines. Over 40 percent of all such workers were approved for positions in system analysis and programming. As compared to fiscal year 2000, in fiscal year 2009, approved H-1B workers were more likely to be living in the United States than abroad at the time of their initial application, to have an advanced degree, and to have obtained their graduate degrees in the United States. Finally, data on a cohort of approved H-1B workers whose petitions were submitted between January 2004 and September 2007 indicate that at least 18 percent of these workers subsequently applied for permanent residence in the United States, of which about half were approved, 45 percent were pending, and 3 percent were denied by 2010. The provisions of the H-1B program designed to protect U.S. workers--such as the requirement to pay prevailing wages, the visa's temporary status, and the cap on the number of visas issued--are weakened by several factors. First, H-1B program oversight is shared by four federal agencies, and their roles and abilities to coordinate are restricted by law.
As a result, there is only nominal sharing of the kind of information that would allow for better employer screening or more active and targeted pursuit of program abuses. For example, the review of employer applications for H-1B workers is divided between Labor and USCIS, and the thoroughness of both these reviews is constrained by law. In reviewing the employer's LCA, Labor is restricted to looking for missing information and obvious inaccuracies, such as an employer's failure to checkmark all required boxes on a form denoting compliance. USCIS's review of the visa petition, the I-129, is not informed by any information that Labor's Employment and Training Administration may possess on suspicious or problematic employers. With regard to enforcement of the H-1B worker protections, Wage and Hour investigations are constrained, first, by the fact that its investigators do not receive from USCIS any information regarding suspicious or problematic employers. They also do not have access to Employment and Training's database of employer LCAs. Second, in contrast to its authority with respect to other labor protection programs, Wage and Hour lacks subpoena authority to obtain employer records for H-1B cases. According to investigators, it can therefore take months to pursue time-sensitive investigations when an employer is not cooperative. To improve Labor's oversight over the H-1B program, we recommended that its Employment and Training Administration grant Wage and Hour searchable access to the LCA database. Further, we asked Congress to consider granting Labor subpoena power to obtain employer records during investigations under the H-1B program.
To reduce duplication and fragmentation in the administration and oversight of the application process, consistent with past GAO matters for Congressional consideration, we asked Congress to consider streamlining the H-1B approval process by eliminating the separate requirement that employers first submit an LCA to Labor for review and certification, since another agency (USCIS) subsequently conducts a similar review of the LCA. Another factor that weakens protection for U.S. workers is the fact that the H-1B program lacks a legal provision to hold employers accountable to program requirements when they obtain H-1B workers through staffing companies. As previously noted, staffing companies contract H-1B workers out to other employers. At times, those employers may contract the H-1B worker out again, creating multiple middlemen, according to Wage and Hour officials (see fig. 2). They explained that the contractual relationship, however, does not transfer the obligations of the contractor for worker protection to subsequent employers. Wage and Hour investigators reported that a large number of the complaints they receive about H-1B employers were related to the activities of staffing companies. Investigators from the Northeast region--the region that receives the highest number of H-1B complaints--said that nearly all of the complaints they receive involve staffing companies and that the number of complaints is growing. H-1B worker complaints about these companies frequently pertained to unpaid "benching"--when a staffing company does not have a job placement for the H-1B worker and does not pay them. In January 2010, Homeland Security issued a memo--commonly referred to as the "Neufeld Memo"--on determining when there is a valid employer-employee relationship between a staffing company and an H-1B worker for whom it has obtained a visa; however, officials indicated that it is too early to know if the memo has improved program compliance.
To help ensure the full protection of H-1B workers employed through staffing companies, in our report we asked that Congress consider holding the employer where an H-1B visa holder performs work accountable for meeting program requirements to the same extent as the employer that submitted the LCA form. Finally, changes to program legislation have diluted program provisions for protecting U.S. workers by allowing visa holders to seek permanent residency, broadening the job and skill categories for H-1B eligibility, and establishing exemptions to the cap. The Immigration Act of 1990 removed the requirement that H-1B visa applicants have a residence in a foreign country that they have no intention of abandoning. Consequently, H-1B workers are able to pursue permanent residency in the United States and remain in the country for an unlimited period of time while their residency application is pending. The same law also broadened the job and skill categories for which employers could seek H-1B visas. Labor's LCA data show that between June 2009 and July 2010, over 50 percent of the wage levels reported on approved LCAs were categorized as entry-level (i.e., paid the lowest prevailing wage levels). However, such data do not, by themselves, indicate whether these H-1B workers were generally less skilled than their U.S. counterparts, or whether they were younger or more likely to accept lower wages. Finally, exemptions to the H-1B cap have increased the number of H-1B workers beyond the cap. For example, 87,519 workers in 2009 were approved for visas (including both initial petitions and extensions) to work for 6,034 cap-exempt companies. Taken together, the multifaceted challenges identified in our work show that the H-1B program, as currently structured, may not be used to its full potential and may be detrimental in some cases.
Although we have recommended steps that executive agencies overseeing the program may take to improve tracking, administration, and enforcement, the data we present raise difficult policy questions about key program provisions that are beyond the jurisdiction of these agencies. The H-1B program presents a difficult challenge in balancing the need for high-skilled foreign labor with sufficient protections for U.S. workers. As Congress considers immigration reform in consultation with diverse stakeholders and experts--and while Homeland Security moves forward with its modernization efforts--this is an opportune time to re-examine the merits and shortcomings of key program provisions and make appropriate changes as needed. Such a review may include, but would not necessarily be limited to the qualifications required for workers eligible under the H-1B program, exemptions from the cap, the appropriateness of H-1B hiring by staffing companies, the level of the cap, and the role the program should play in the U.S. immigration system in relationship to permanent residency. If you or your staffs have any questions about this statement, please contact Andrew Sherrill at (202) 512-7215 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this statement. In addition to Andrew Sherrill (Director), Michele Grgich (Assistant Director) and Erin Godtland (Economist-in-Charge) led this engagement with writing and technical assistance from Nisha Hazra, Melissa Jaynes, Jennifer McDonald, Susan Bernstein (Education, Workforce and Income Security); and Rhiannon Patterson (Applied Research and Methods). 
Stakeholders included: Barbara Bovbjerg (Education, Workforce, and Income Security); Tom McCool (Applied Research and Methods); Ronald Fecso (Chief Statistician); Sheila McCoy and Craig Winslow (General Counsel); Hiwotte Amare and Shana Wallace (Applied Research and Methods); Richard Stana and Mike Dino (Homeland Security and Justice); Jess Ford (International Affairs and Trade). Barbara Steel-Lowney referenced the report. This is a work of the U.S. government and is not subject to copyright protection in the United States. The published product may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.

This testimony comments on the H-1B program. Congress created the current H-1B program in 1990 to enable U.S. employers to hire temporary, foreign workers in specialty occupations. The law capped the number of H-1B visas issued per fiscal year at 65,000, although the cap has fluctuated over time with legislative changes. The H-1B cap and the program itself have been a subject of continued controversy. Proponents of the program argue that it allows companies to fill important and growing gaps in the supply of U.S. workers, especially in the science and technology fields. Opponents of the program argue that there is no skill shortage and that the H-1B program displaces U.S. workers and undercuts their pay. Others argue that the eligibility criteria for the H-1B visa should be revised to better target foreign nationals whose skills are undersupplied in the domestic workforce. Our comments in this statement for the record are based on the results of our recent examination of the H-1B program, highlighting the key challenges it presents for H-1B employers, H-1B and U.S. workers, and federal agencies.
Specifically, this statement presents information on (1) employer demand for H-1B workers; (2) how the H-1B cap impacts employers' costs and whether they move operations overseas; (3) the government's ability to track the cap and H-1B workers over time; and (4) how well the provisions of the H-1B program protect U.S. workers. From 2000 to 2009, the demand for new H-1B workers tended to exceed the cap, as measured by the numbers of initial petitions submitted by employers who are subject to the cap. While the majority (68 percent) of employers were approved for one H-1B worker, demand was driven to a great extent by a small number (fewer than 1 percent) of H-1B employers garnering over one quarter of all H-1B approvals. Cap-exempt employers, such as universities and research institutions, submitted over 14 percent of the initial petitions filed during this period. Most of the 34 H-1B employers GAO interviewed reported that the H-1B program and cap created additional costs for them, such as delays in hiring and projects, but said the global marketplace and access to skilled labor--not the cap--drive their decisions on whether to move activities overseas. Limitations in agency data and systems hinder tracking the cap and H-1B workers over time. For example, data systems among the various agencies that process these individuals are not linked, so it is difficult to track H-1B workers as they move through the immigration system. System limitations also prevent the Department of Homeland Security from knowing precisely when and whether the annual cap has been reached each year. Provisions of the H-1B program that could serve to protect U.S. workers--such as the requirement to pay prevailing wages, the visa's temporary status, and the cap itself--are weakened by several factors. First, program oversight is fragmented among four agencies and restricted by law.
Second, the H-1B program lacks a legal provision for holding employers accountable to program requirements when they obtain H-1B workers through a staffing company--a company that contracts out H-1B workers to other companies. Third, statutory changes made to the H-1B program over time--changes that broadened job and skill categories for H-1B eligibility, increased exceptions to the cap, and allowed unlimited H-1B visa extensions while holders applied for permanent residency--have in effect increased the pool of H-1B workers beyond the cap and lowered the bar for eligibility.
Since the 1960s, the United States has operated two separate operational polar-orbiting meteorological satellite systems. These systems are known as the Polar-orbiting Operational Environmental Satellites (POES), managed by the National Oceanic and Atmospheric Administration's (NOAA) National Environmental Satellite, Data, and Information Service (NESDIS), and the Defense Meteorological Satellite Program (DMSP), managed by the Department of Defense (DOD). These satellites obtain environmental data that are processed to provide graphical weather images and specialized weather products, and that are the predominant input to numerical weather prediction models--all used by weather forecasters, the military, and the public. Polar satellites also provide data used to monitor environmental phenomena, such as ozone depletion and drought conditions, as well as data sets that are used by researchers for a variety of studies, such as climate monitoring. Unlike geostationary satellites, which maintain a fixed position above the earth, polar-orbiting satellites constantly circle the earth in an almost north-south orbit, providing global coverage of conditions that affect the weather and climate. Each satellite makes about 14 orbits a day. As the earth rotates beneath it, each satellite views the entire earth's surface twice a day. Today, there are two operational POES satellites and two operational DMSP satellites that are positioned so that they can observe the earth in early morning, mid-morning, and early afternoon polar orbits. Together, they ensure that for any region of the earth, the data provided to users are generally no more than 6 hours old. Figure 1 illustrates the current operational polar satellite configuration. Besides the four operational satellites, there are five older satellites in orbit that still collect some data and are available to provide some limited backup to the operational satellites should they degrade or fail. 
In the future, both NOAA and DOD plan to continue to launch additional POES and DMSP satellites every few years, with final launches scheduled for 2008 and 2010, respectively. Each of the polar satellites carries a suite of sensors designed to detect environmental data either reflected or emitted from the earth, the atmosphere, and space. The satellites store these data and then transmit the data to NOAA and Air Force ground stations when the satellites pass overhead. The ground stations then relay the data via communications satellites to the appropriate meteorological centers for processing. Under a shared processing agreement among the four processing centers--NESDIS, the Air Force Weather Agency, Navy's Fleet Numerical Meteorology and Oceanography Center, and the Naval Oceanographic Office--different centers are responsible for producing and distributing different environmental data sets, specialized weather and oceanographic products, and weather prediction model outputs via a shared network. Each of the four processing centers is also responsible for distributing the data to its respective users. For the DOD centers, the users include regional meteorology and oceanography centers as well as meteorology and oceanography staff on military bases. NESDIS forwards the data to NOAA's National Weather Service for distribution and use by forecasters. The processing centers also use the Internet to distribute data to the general public. NESDIS is responsible for the long-term archiving of data and derived products from POES and DMSP. In addition to the infrastructure supporting satellite data processing noted above, properly equipped field terminals that are within a direct line of sight of the satellites can receive real-time data directly from the polar- orbiting satellites. There are an estimated 150 such field terminals operated by the U.S. government, many by DOD. 
Field terminals can be taken into areas with little or no data communications infrastructure-- such as on a battlefield or ship--and enable the receipt of weather data directly from the polar-orbiting satellites. These terminals have their own software and processing capability to decode and display a subset of the satellite data to the user. Figure 2 depicts a generic data relay pattern from the polar-orbiting satellites to the data processing centers and field terminals. Polar satellites gather a broad range of data that are transformed into a variety of products for many different uses. When first received, satellite data are considered raw data. To make them usable, the processing centers format the data so that they are time-sequenced and include earth location and calibration information. After formatting, these data are called raw data records. The centers further process these raw data records into data sets, called sensor data records and temperature data records. These data records are then used to derive weather products called environmental data records (EDR). EDRs range from atmospheric products detailing cloud coverage, temperature, humidity, and ozone distribution; to land surface products showing snow cover, vegetation, and land use; to ocean products depicting sea surface temperatures, sea ice, and wave height; to characterizations of the space environment. Combinations of these data records (raw, sensor, temperature, and environmental data records) are also used to derive more sophisticated products, including outputs from numerical weather models and assessments of climate trends. Figure 3 is a simplified depiction of the various stages of data processing. EDRs can be either images or quantitative data products. 
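The progression from raw downlinked data to finished weather products described above can be sketched as a simple staged pipeline. The stage names (raw data records, sensor/temperature data records, environmental data records) follow the report; the functions and field names below are hypothetical placeholders, not the processing centers' actual software.

```python
# Hypothetical sketch of the data-record stages described above:
# raw data -> raw data records (RDRs) -> sensor/temperature data
# records (SDRs/TDRs) -> environmental data records (EDRs).

def format_raw(raw_bits):
    """Raw -> RDR: time-sequence the data and attach earth-location
    and calibration information."""
    return {"stage": "RDR", "payload": raw_bits,
            "time_sequenced": True, "geolocated": True, "calibrated": True}

def to_sensor_records(rdr):
    """RDR -> SDR/TDR: convert raw records into sensor and temperature
    data records."""
    return {"stage": "SDR/TDR", "payload": rdr["payload"]}

def derive_edr(sdr, product):
    """SDR/TDR -> EDR: derive a named weather product, e.g. sea surface
    temperature, cloud coverage, or ozone distribution."""
    return {"stage": "EDR", "product": product}

edr = derive_edr(to_sensor_records(format_raw(b"\x00\x01")),
                 "sea surface temperature")
print(edr["stage"], "-", edr["product"])
```

Combinations of records at any stage can then feed derived products such as numerical weather model inputs, mirroring the report's figure 3.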
Image EDRs provide graphical depictions of the weather and are used to observe meteorological and oceanographic phenomena to track operationally significant events (such as tropical storms, volcanic ash, and icebergs), and to provide quality assurance for weather prediction models. The following figures demonstrate polar-orbiting satellite images. Figure 4 is an image from a DMSP satellite showing an infrared picture taken over the west Atlantic Ocean. Figure 5 is a POES image of Hurricane Floyd, which struck the southern Atlantic coastline in 1999. Figure 6 is a polar- satellite image used to detect volcanic ash clouds, in particular the ash cloud resulting from the eruption of Mount Etna in 2001. Figure 7 shows the location of icebergs near Antarctica in February 2002. Quantitative EDRs are specialized weather products that can be used to assess the environment and climate or to derive other products. These EDRs can also be depicted graphically. Figures 8 and 9 are graphic depictions of quantitative data on sea surface temperature and ozone measurements, respectively. An example of a product that was derived from EDRs is provided in figure 10. This product shows how long a person could survive in the ocean--information used in military as well as search and rescue operations--and was based on sea surface temperature EDRs from polar-orbiting satellites. Another use of quantitative satellite data is in numerical weather prediction models. Based predominantly on observations from polar- orbiting satellites and supplemented by data from other sources such as geostationary satellites, radar, weather balloons, and surface observing systems, numerical weather prediction models are used in producing hourly, daily, weekly, and monthly forecasts of atmospheric, land, and ocean conditions. These models require quantitative satellite data to update their analysis of weather and to produce new forecasts. Table 1 provides examples of models run by the processing centers. 
Figure 11 depicts the output of one common model. All this information--satellite data, imagery, derived products, and model output--is used in mapping and monitoring changes in weather, climate, the ocean, and the environment. These data and products are provided to weather forecasters for use in issuing weather forecasts and warnings to the public and to support our nation's aviation, agriculture, and maritime communities. Also, weather data and products are used by climatologists and meteorologists to monitor the environment. Within the military, these data and products allow military planners and tactical users to focus on anticipating and exploiting atmospheric and space environmental conditions. For example, Air Force Weather Agency officials told us that accurate wind and temperature forecasts are critical to any decision to launch an aircraft that will need mid-flight refueling. In addition to these operational uses of satellite data, there is also a substantial need for polar satellite data for research. According to experts in climate research, the research community requires long-term, consistent sets of satellite data collected sequentially, usually at fixed intervals of time, in order to study many critical climate processes. Examples of research topics include long- term trends in temperature, precipitation, and snow cover. Given the expectation that merging the POES and DMSP programs would reduce duplication and result in sizable cost savings, a May 1994 Presidential Decision Directive required NOAA and DOD to converge the two satellite programs into a single satellite program capable of satisfying both civilian and military requirements. The converged program is called the National Polar-orbiting Operational Environmental Satellite System (NPOESS), and it is considered critical to the United States' ability to maintain the continuity of data required for weather forecasting and global climate monitoring. 
To manage this program, DOD, NOAA, and the National Aeronautics and Space Administration (NASA) have formed a tri- agency Integrated Program Office, located within NOAA. Within the program office, each agency has the lead on certain activities. NOAA has overall responsibility for the converged system, as well as satellite operations; DOD has the lead on the acquisition; and NASA has primary responsibility for facilitating the development and incorporation of new technologies into the converged system. NOAA and DOD share the costs of funding NPOESS, while NASA funds specific technology projects and studies. NPOESS is a major system acquisition estimated to cost almost $7 billion over the 24-year period from the inception of the program in 1995 through 2018. The program is to provide satellite development, satellite launch and operation, and integrated data processing. These deliverables are grouped into four main categories: (1) the launch segment, which includes the launch vehicle and supporting equipment, (2) the space segment, which includes the satellites and sensors, (3) the interface data processing segment, which includes the data processing system to be located at the four processing centers, and (4) the command, control, and communications segment, which includes the equipment and services needed to track and control satellites. Program acquisition plans call for the procurement and launch of six NPOESS satellites over the life of the program and the integration of 14 instruments, comprising 12 environmental sensors and 2 subsystems. Together, the sensors are to receive and transmit data on atmospheric, cloud cover, environmental, climate, oceanographic, and solar-geophysical observations. The subsystems are to support nonenvironmental search and rescue efforts and environmental data collection activities. 
According to the Integrated Program Office, 8 of the 14 planned NPOESS instruments involve new technology development, whereas 6 others are based on existing technologies. The planned instruments and the state of technology on each are listed in table 2. Unlike the current polar satellite program, in which the four centers use different approaches to process raw data into the environmental data records that they are responsible for, the NPOESS integrated data processing system--to be located at the four centers--is expected to provide a standard system to produce these data sets and products. The four processing centers will continue to use these data sets to produce other derived products, as well as for input to their numerical prediction models. NPOESS is planned to produce 55 EDRs, including atmospheric vertical temperature profile, sea surface temperature, cloud base height, ocean wave characteristics, and ozone profile. Some of these EDRs are comparable to existing products, whereas others are new. The user community designated six of these data products--supported by four sensors--as key EDRs, and noted that failure to provide them would cause the system to be reevaluated or the program to be terminated. The NPOESS acquisition program consists of three key phases: the concept and technology development phase, which lasted from roughly 1995 to early 1997; the program definition and risk reduction phase, which began in early 1997 and ended in August 2002; and the engineering and manufacturing development and production phase, which began in August 2002 and is expected to continue through the life of the program. The concept and technology development phase began with the decision to converge the POES and DMSP satellites and included early planning for the NPOESS acquisition. This phase included the successful convergence of the command and control of existing DMSP and POES satellites at NOAA's satellite operations center. 
The program definition and risk reduction phase involved both system- level and sensor-level initiatives. At the system level, the program office awarded contracts to two competing prime contractors to prepare for NPOESS system performance responsibility. These contractors developed unique approaches to meeting requirements, designing system architectures, and developing initiatives to reduce sensor development and integration risks. These contractors competed for the development and production contract. At the sensor level, the program office awarded contracts to develop five sensors. This phase ended when the development and production contract was awarded. At that point, the winning contractor was expected to assume overall responsibility for managing continued sensor development. The final phase, engineering and manufacturing development and production, began when the development and production contract was awarded to TRW in August 2002. At that time, TRW assumed system performance responsibility for the overall program. This responsibility includes all aspects of design, development, integration, assembly, test and evaluation, operations, and on-orbit support. Shortly after the contract was awarded, Northrop Grumman Space Technology purchased TRW and became the prime contractor on the NPOESS project. In May 1997, the Integrated Program Office assessed the technical, schedule, and cost risks of key elements of the NPOESS program, including (1) overall system integration, (2) the launch segment, (3) the space segment, (4) the interface data processing segment, and (5) the command, control, and communications segment. As a result of this assessment, the program office determined that three elements had high risk components: the interface data processing segment, the space segment, and the overall system integration. 
Specifically, the interface data processing segment and overall system integration were assessed as high risk in all three areas (technical, cost, and schedule), whereas the space segment was assessed to be high risk in the technical and cost areas, and moderate risk in the schedule area. The launch segment and the command, control, and communications segment were determined to present low or moderate risks. The program office expected to reduce its high risk components to low and moderate risks by the time the development and production contract was awarded, and to have all risk levels reduced to low before the first launch. Table 3 displays the results of the 1997 risk assessment as well as the program office's estimated risk levels by August 2002 and by first launch. In order to meet its goals of reducing program risks, the program office developed and implemented multiple risk reduction initiatives. One risk reduction initiative specifically targeted the space segment risks by initiating the development of key sensor technologies in advance of the satellite system itself. Because environmental sensors have historically taken 8 years to develop, the program office began developing six of the eight sensors with more advanced technologies early. In the late 1990s, the program office awarded contracts for the development, analysis, simulation, and prototype fabrication of five of these sensors. In addition, NASA awarded a contract for the early development of one other sensor. Responsibility for delivering these sensors was transferred from the program office to the prime contractor when the NPOESS contract was awarded in August 2002. Another major risk reduction initiative expected to address risks in three of the four segments with identified risks is called the NPOESS Preparatory Project (NPP). NPP is a planned demonstration satellite to be launched in 2006, several years before the first NPOESS satellite launch in 2009. 
It is scheduled to host three of the four critical NPOESS sensors (the visible/infrared imager radiometer suite, the cross-track infrared sounder, and the advanced technology microwave sounder), as well as two other noncritical sensors. Further, NPP will provide the program office and the processing centers an early opportunity to work with the sensors, ground control, and data processing systems. Specifically, this satellite is expected to demonstrate about half of the NPOESS EDRs and about 93 percent of its data processing load. Since our statement last year, the Integrated Program Office has made further progress on NPOESS. Specifically, it awarded the contract for the overall program and is monitoring and managing contract deliverables, including products that will be tested on NPP. The program office is also continuing to work on various other risk reduction activities, including learning from experiences with sensors on existing platforms, such as NASA research satellites, the WINDSAT/Coriolis weather satellite, and the NPOESS airborne sounding testbed. While the program office has made progress both on the acquisition and risk reduction activities, the NPOESS program faces key programmatic and technical risks that may affect the successful and timely deployment of the system. Specifically, changing funding streams and revised schedules have delayed the expected launch date of the first NPOESS satellite, and concerns with the development of key sensors and the data processing system may cause additional delays in the satellite launch date. These planned and potential schedule delays could affect the continuity of weather data. Addressing these risks may result in increased costs for the overall program. In attempting to address these risks, the program office is working to develop a new cost and schedule baseline for the NPOESS program, which it hopes to complete by August 2003. 
When the NPOESS development contract was awarded, program office officials identified an anticipated schedule and funding stream for the program. The schedule for launching the satellites was driven by a requirement that the satellites be available to back up the final POES and DMSP satellites should anything go wrong during these satellites' planned launches. In general, program officials anticipate that roughly 1 out of every 10 satellites will fail either during launch or during early operations after launch. Key program milestones included (1) launching NPP by May 2006 in order to allow time to learn from that risk reduction effort, (2) having the first NPOESS satellite available to back up the final POES satellite launch in March 2008, and (3) having the second NPOESS satellite available to back up the final DMSP satellite launch in October 2009. If the NPOESS satellites were not needed to back up the final predecessor satellites, their anticipated launch dates would have been April 2009 and June 2011, respectively. However, a DOD program official reported that between 2001 and 2002, the agency experienced delays in launching a DMSP satellite, which in turn delayed the expected launch date of another DMSP satellite. In late 2002, DOD shifted the expected launch date for the final DMSP satellite from 2009 to 2010. As a result, DOD reduced funding for NPOESS by about $65 million between fiscal years 2004 and 2007. According to NPOESS program officials, because NOAA is required to provide no more funding than DOD does, this change triggered a corresponding reduction in funding by NOAA for those years. As a result of the reduced funding, program office officials were forced to make difficult decisions about what to focus on first. The program office decided to keep NPP as close to its original schedule as possible because of its importance to the eventual NPOESS development, and to shift some of the NPOESS deliverables to later years. 
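The backup requirement above follows directly from the cited failure rate. The arithmetic below is purely illustrative: it treats the two final predecessor launches as independent events with the rough 1-in-10 failure rate program officials cited, which is an assumption for illustration rather than an official risk model.

```python
# Illustrative arithmetic only: program officials cite a rough 1-in-10
# chance that a satellite fails at launch or in early operations.
# Assuming (for illustration) the final POES and final DMSP launches are
# independent events with that failure rate:

p_fail = 0.10  # rough per-launch failure rate cited in the report

p_either_fails = 1 - (1 - p_fail) ** 2
print(f"chance at least one of the two final launches fails: "
      f"{p_either_fails:.0%}")
```

Even at a modest per-launch failure rate, the chance that at least one of the two final predecessor launches fails is nearly one in five, which is why milestone dates were tied to having NPOESS satellites ready as backups.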
This shift will affect the NPOESS deployment schedule. Table 4 compares the program office's current estimates for key milestones, given current funding levels. As a result of the changes in funding between 2003 and 2007, project office officials estimate that the first NPOESS satellite will be available for launch 21 months after it is needed to back up the final POES satellite. This means that should the final POES launch fail in March 2008, there would be no backup satellite ready for launch. Unless the existing operational satellite is able to continue operations beyond its expected lifespan, there could be a gap in satellite coverage. Figure 12 depicts the schedule delay. We have reported on concerns about gaps in satellite coverage in the past. In the early 1990s, the development of the second generation of NOAA's geostationary satellites experienced severe technical problems, cost overruns, and schedule delays, resulting in a 5-year schedule slip in the launch of the first satellite; this schedule slip left NOAA in danger of temporarily losing geostationary satellite data coverage--although no gap in coverage actually occurred. In 2000, we reported that geostationary satellite data coverage was again at risk because of a delay in a satellite launch due to a problem with the engine of its launch vehicle. At that time, existing satellites were able to maintain coverage until the new satellite was launched over a year later--although one satellite had exceeded its expected lifespan and was using several backup systems in cases where primary systems had failed. DOD experienced the loss of DMSP satellite coverage in the 1970s, which led to increased recognition of the importance of polar-orbiting satellites and of the impact of the loss of satellite data. In addition to the schedule issues facing the NPOESS program, concerns have arisen regarding key components. 
Although the program office reduced some of the risks inherent in developing new technologies by initiating the development of these sensors early, individual sensor development efforts have experienced cost increases, schedule delays, and performance shortfalls. The cost estimates for all four critical sensors (the ones that are to support the most critical NPOESS EDRs) have increased, due in part to including items that were not included in the original estimates, and in part to addressing technical issues. These increases range from approximately $60 million to $200 million. Further, while all the sensors are still expected to be completed within schedule, many have slipped to the end of their schedule buffers--meaning that no additional time is available should other problems arise. Details on the status and changes in cost and schedule of four critical sensors are provided in table 5. The timely development of three of these sensors (the visible/infrared imager radiometer suite, the cross-track infrared sounder, and the advanced technology microwave sounder) is especially critical, because these sensors are to be demonstrated on the NPP satellite, currently scheduled for launch in October 2006. Critical sensors are also falling short of achieving the required levels of performance. As part of a review in early 2003, the program officials determined that all four critical sensors were at medium to high risk of shortfalls in performance. Program officials recently reported that since the time of that review, the concerns that led to those risk designations have been addressed, which contributed to the schedule delays and cost increases noted above. We have not evaluated the closure of these risk items. However, program officials acknowledge that there are still performance issues on two critical sensors which they are working to address. 
Specifically, officials reported that they are working to fix a problem with radio frequency interference on the conical microwave imager/sounder. Also, the program office is working with NASA to fix problems with electrostatic discharge procedures and misalignment of key components on the advanced technology microwave sounder. Further, the program office will likely continue to identify additional performance issues as the sensors are developed and tested. Officials anticipate that there could be cost increases and schedule delays associated with addressing performance issues. Program officials reported that these and other sensor problems are not unexpected; previous experience with such problems was what motivated them to begin developing the sensors early. However, officials acknowledge that continued problems could affect the sensors' delivery dates and potentially delay the NPP launch. Any delay in that launch date could affect the overall NPOESS program because the success of the program depends on learning lessons in data processing and system integration from the NPP satellite. The interface data processing system is a ground-based system that is to process the sensors' data so that they are usable by the data processing centers and the broader community of environmental data users. The development of this system is critical for both NPP and NPOESS. When used with NPP, the data processing system is expected to produce 26 of the 55 EDRs that NPOESS will provide, processing approximately 93 percent of the planned volume of NPOESS data. Further, the central processing centers will be able to work with these EDRs to begin developing their own specialized products with NPP data. These activities will allow system users to work through any problems well in advance of when the NPOESS data are needed. 
We reported last year that the volumes of data that NPOESS will provide present immense challenges to the centers' infrastructures and to their scientific capability to use these additional data effectively in weather products and models. We also noted that the centers need time to incorporate these new data into their products and models. Using the data processing system in conjunction with NPP will allow them to begin to do so. While the data processing segment is currently on schedule, program officials acknowledge the potential for future schedule delays. Specifically, an initial version of the data processing system is on track to be delivered at the end of July, and a later version is being planned. However, the data processing system faces potential risks that could affect the availability of NPP and in turn NPOESS. Specifically, program officials reported that there is a risk that the roughly 32 months allocated for developing the remaining software and delivering, installing, and verifying the system at two central processing centers will not be sufficient. A significant portion of the data processing system software involves converting scientific algorithms for operational use, but program officials noted that there is still uncertainty in how much time and effort it will take to complete this conversion. Any significant delays could cause the potential coverage gap between the launches of the final POES and first NPOESS satellites to grow even larger. Program officials are working to address the changes in funding levels and schedule, and to make plans for addressing specific sensor and data processing system risks. They acknowledge that delays in the program and efforts to address risks on key components could increase the overall cost of the program, which could result in the loss of some or all of the promised cost savings from converging the two separate satellite systems. However, estimates on these cost increases are still being determined. 
The program office is working to develop a new cost and schedule baseline based on the fiscal year 2004 President's budget for the NPOESS program. Officials noted that this rebaselining effort will involve a major contract renegotiation. Program officials reported that they hope to complete the new program baseline by August 2003. In summary, today's polar-orbiting weather satellite program is essential to a variety of civilian and military operations, ranging from weather warnings and forecasts to specialized weather products. NPOESS is expected to merge today's two separate satellite systems into a single state-of-the-art weather and environmental monitoring satellite system to support all military and civilian users, as well as the public. This new satellite system is considered critical to the United States' ability to maintain the continuity of data required for weather forecasting and global climate monitoring through the year 2018, and the first satellite was expected to be ready to act as a backup should the launch of the final satellites in the predecessor POES and DMSP programs fail. The NPOESS program office has made progress over the last several years in trying to reduce project risks by developing critical sensors early and by planning the NPOESS Preparatory Project to demonstrate key sensors and the data processing system well before the first NPOESS launch. However, the NPOESS program faces key programmatic and technical risks that may affect the successful and timely deployment of the system. Specifically, changing funding streams and revised schedules have delayed the expected launch date of the first NPOESS satellite, and concerns with the development of key sensors and the data processing system may cause additional delays in the satellite launch date. These factors could affect the continuity of weather data needed for weather forecasts and climate monitoring. This concludes my statement. 
I would be pleased to respond to any questions that you or other members of the Subcommittee may have at this time. If you have any questions regarding this testimony, please contact David Powner at (202) 512-9286 or by E-mail at [email protected]. Individuals making key contributions to this testimony include Barbara Collier, John Dale, Ramnik Dhaliwal, Colleen Phillips, and Cynthia Scott. Our objectives were to provide an overview of our nation's current polar- orbiting weather satellite program and the planned National Polar-orbiting Operational Environmental Satellite System (NPOESS) program and to identify key risks to the successful and timely deployment of NPOESS. To provide an overview of the nation's current and future polar-orbiting weather satellite system programs, we relied on prior GAO reviews of the satellite programs of the National Oceanic and Atmospheric Administration (NOAA) and the Department of Defense (DOD). We reviewed documents from NOAA, DOD, and the National Aeronautics and Space Administration (NASA) that describe the purpose and origin of the polar satellite program and the status of the NPOESS program. We also interviewed Integrated Program Office and NASA officials to determine the program's background, status, and plans. To identify key risks to the successful and timely deployment of NPOESS, we assessed the NPOESS acquisition status and program risk reduction efforts to understand how the program office plans to manage the acquisition and mitigate the risks to successful NPOESS implementation. We reviewed descriptions of the NPOESS sensors and interviewed officials at the Integrated Program Office, NASA, and DOD to determine the status of key sensors, program segments, and risk reduction activities. We also reviewed documents and interviewed program office officials on plans to address NPOESS challenges. 
NOAA, DOD, and NASA officials generally agreed with the facts as presented in this statement and provided some technical corrections, which we have incorporated. We performed our work at the NPOESS Integrated Program Office, NASA headquarters, and DOD offices, all located in the Washington, D.C., metropolitan area. Our work was performed between April and July 2003 in accordance with generally accepted government auditing standards. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. | Polar-orbiting environmental satellites provide data and imagery that are used by weather forecasters, climatologists, and the military to map and monitor changes in weather, climate, the ocean, and the environment. The current polar satellite program is a complex infrastructure that includes two satellite systems, supporting ground stations, and four central data processing centers. In the future, the National Polar-orbiting Operational Environmental Satellite System (NPOESS) is to merge the two current satellite systems into a single state-of-the-art environment monitoring satellite system. This new $7 billion satellite system is considered critical to the United States' ability to maintain the continuity of data required for weather forecasting and global climate monitoring through the year 2018. In its testimony GAO was asked, among other topics, to discuss risks to the success of the NPOESS deployment. The NPOESS program faces key programmatic and technical risks that may affect the successful and timely deployment of the system. 
The original plan for NPOESS was that it would be available to serve as a backup to the March 2008 launch of the final satellite in one of the two current satellite programs--the Polar-orbiting Operational Environmental Satellite (POES) system. However, changing funding streams and revised schedules have delayed the expected launch date of the first NPOESS satellite by 21 months. Thus, the first NPOESS satellite will not be ready in time to back up the final POES satellite, resulting in a potential gap in satellite coverage should that satellite fail. Specifically, if the final POES launch fails and if existing satellites are unable to continue operations beyond their expected lifespans, the continuity of weather data needed for weather forecasts and climate monitoring will be put at risk. Moreover, concerns with the development of key NPOESS components, including critical sensors and the data processing system, may cause additional delays in the satellite launch date. The program office is working to address the changes in funding levels and schedule, and to make plans for addressing specific risks. Further, it is working to develop a new cost and schedule baseline for the NPOESS program by August 2003.
Historically, the U.S. government has granted federal recognition through treaties, congressional acts, or administrative decisions within the executive branch--principally by the Department of the Interior. In a 1977 report to the Congress, the American Indian Policy Review Commission criticized the department's tribal recognition policy. Specifically, the report stated that the department's criteria to assess whether a group should be recognized as a tribe were not clear and concluded that a large part of the department's policy depended on which official responded to the group's inquiries. Until the 1960s, the limited number of requests for federal recognition had given the department the flexibility to assess a group's status on a case-by-case basis without formal guidelines. However, in response to an increase in the number of requests for federal recognition, the department determined that it needed a uniform and objective approach to evaluate these requests. In 1978, it established a regulatory process for recognizing tribes whose relationship with the United States had either lapsed or never been established--although tribes may seek recognition through other avenues, such as legislation or Department of the Interior administrative decisions unconnected to the regulatory process. In addition, not all tribes are eligible for the regulatory process. For example, tribes whose political relationship with the United States has been terminated by Congress, or tribes whose members are officially part of an already recognized tribe, are ineligible to be recognized through the regulatory process and must seek recognition through other avenues. The regulations lay out seven criteria that a group must meet before it can become a federally recognized tribe.
Essentially, these criteria require the petitioner to show that it is descended from a historic tribe and is a distinct community that has continuously existed as a political entity since a time when the federal government broadly acknowledged a political relationship with all Indian tribes. The following are the seven criteria for recognition under the regulatory process:

(a) The petitioner has been identified as an American Indian entity on a substantially continuous basis since 1900;

(b) A predominant portion of the petitioning group comprises a distinct community and has existed as a community from historical times until the present;

(c) The petitioner has maintained political influence or authority over its members as an autonomous entity from historical times until the present;

(d) The group must provide a copy of its present governing documents and membership criteria;

(e) The petitioner's membership consists of individuals who descend from a historical Indian tribe or tribes, which combined and functioned as a single autonomous political entity;

(f) The membership of the petitioning group is composed principally of persons who are not members of any acknowledged North American Indian tribe; and

(g) Neither the petitioner nor its members are the subject of congressional legislation that has expressly terminated or forbidden recognition.

The burden of proof is on petitioners to provide documentation to satisfy the seven criteria. A technical staff within BIA, consisting of historians, anthropologists, and genealogists, reviews the submitted documentation and makes its recommendations on a proposed finding either for or against recognition. Staff recommendations are subject to review by the department's Office of the Solicitor and senior BIA officials.
The Assistant Secretary-Indian Affairs makes the final decision regarding the proposed finding, which is then published in the Federal Register, and a period of public comment, document submission, and response is allowed. The technical staff reviews the comments, documentation, and responses and makes recommendations on a final determination that are subject to the same levels of review as a proposed finding. The process culminates in a final determination by the Assistant Secretary, who, depending on the nature of further evidence submitted, may or may not rule the same way as in the proposed finding. Petitioners and others may file requests for reconsideration with the Interior Board of Indian Appeals. While we found general agreement on the seven criteria that groups must meet to be granted recognition, there is great potential for disagreement when the question before BIA is whether the level of available evidence is high enough to demonstrate that a petitioner meets the criteria. The need for clearer guidance on criteria and evidence used in recognition decisions became evident in a number of recent cases when the previous Assistant Secretary approved either proposed or final decisions to recognize tribes when the technical staff had recommended against recognition. Most recently, the current Assistant Secretary has reversed a decision made by the previous Assistant Secretary. Much of the current controversy surrounding the regulatory process stems from these cases. At the heart of the uncertainties are different positions on what a petitioner must present to support two key aspects of the criteria. In particular, there are differences over (1) what is needed to demonstrate continuous existence and (2) what proportion of members of the petitioning group must demonstrate descent from a historic tribe.
Concerns over what constitutes continuous existence have centered on the allowable gap in time during which there is limited or no evidence that a petitioner has met one or more of the criteria. In one case, the technical staff recommended that a petitioner not be recognized because there was a 70-year period for which there was no evidence that the petitioner satisfied the criteria for continuous existence as a distinct community exhibiting political authority. The technical staff concluded that a 70-year evidentiary gap was too long to support a finding of continuous existence. The staff based its conclusion on precedent established through previous decisions in which the absence of evidence for shorter periods of time had served as grounds for finding that petitioners did not meet these criteria. However, in this case, the previous Assistant Secretary determined that the gap was not critical and issued a proposed finding to recognize the petitioner, concluding that continuous existence could be presumed despite the lack of specific evidence for a 70-year period. The regulations state that lack of evidence is cause for denial but note that historical situations and inherent limitations in the availability of evidence must be considered. The regulations specifically decline to define a permissible interval during which a group could be presumed to have continued to exist if the group could demonstrate its existence before and after the interval. They further state that establishing a specific interval would be inappropriate because the significance of the interval must be considered in light of the character of the group, its history, and the nature of the available evidence. 
Finally, the regulations note that experience has shown that historical evidence of tribal existence is often not available in clear, unambiguous packets relating to particular points in time. Controversy and uncertainty also surround the proportion of a petitioner's membership that must demonstrate that it meets the criterion of descent from a historic Indian tribe. In one case, the technical staff recommended that a petitioner not be recognized because the petitioner could only demonstrate that 48 percent of its members were descendants. The technical staff concluded that finding that the petitioner had satisfied this criterion would have been a departure from precedent established through previous decisions in which petitioners found to meet this criterion had demonstrated a higher percentage of membership descent from a historic tribe. However, in the proposed finding, the Assistant Secretary found that the petitioner satisfied the criterion. The Assistant Secretary told us that although this decision was not consistent with previous decisions by other Assistant Secretaries, he believed the decision to be fair because the standard used for previous decisions was unfairly high. Again, the regulations intentionally left open key aspects of the criteria to interpretation. In this case they avoid establishing a specific percentage of members required to demonstrate descent because the significance of the percentage varies with the history and nature of the petitioner and the particular reasons why a portion of the membership may not meet the requirements of the criterion. The regulations state only that a petitioner's membership must consist of individuals who descend from historic tribes--no minimum percentage or quantifying term such as "most" or "some" is used.
The only additional direction is found in 1997 guidelines, which note that petitioners need not demonstrate that 100 percent of their membership satisfies the criterion. In updating its regulations in 1994, the department grappled with both these issues and ultimately determined that key aspects of the criteria should be left open to interpretation to accommodate the unique characteristics of individual petitions. Leaving key aspects open to interpretation increases the risk that the criteria may be applied inconsistently to different petitioners. To mitigate this risk, BIA uses precedents established in past decisions to provide guidance in interpreting key aspects of the criteria. However, the regulations and accompanying guidelines are silent regarding the role of precedent in making decisions or the circumstances that may cause deviation from precedent. Thus, petitioners, third parties, and future decisionmakers, who may want to consider precedents in past decisions, have difficulty understanding the basis for some decisions. Ultimately, BIA and the Assistant Secretary will still have to make difficult decisions about petitions when it is unclear whether a precedent applies or even exists. Because these circumstances require judgment on the part of the decisionmaker, public confidence in BIA and the Assistant Secretary as key decisionmakers is extremely important. A lack of clear and transparent explanations for their decisions could cast doubt on the objectivity of the decisionmakers, making it difficult for parties on all sides to understand and accept decisions, regardless of the merit or direction of the decisions reached. Accordingly, in our November 2001 report, we recommended that the Secretary of the Interior direct BIA to provide a clearer understanding of the basis used in recognition decisions by developing and using transparent guidelines that help interpret key aspects of the criteria and supporting evidence used in federal recognition decisions.
In commenting on a draft of this report, the department generally agreed with this recommendation. To implement the recommendation, the department pledged to formulate a strategic action plan by May 2002. To date, this plan is still in draft form. Officials told us that they anticipate completing the plan soon. In conclusion, BIA's recognition process was never intended to be the only way groups could receive federal recognition. Nevertheless, it was intended to provide the Department of the Interior with an objective and uniform approach by establishing specific criteria and a process for evaluating groups seeking federal recognition. It is also the only avenue to federal recognition that has established criteria and a public process for determining whether groups meet the criteria. However, weaknesses in the process have created uncertainty about the basis for recognition decisions, calling into question the objectivity of the process. Without improvements that focus on fixing these and other problems on which we have reported, parties involved in tribal recognition may increasingly look outside of the regulatory process to the Congress or courts to resolve recognition issues, preventing the process from achieving its potential to provide a more uniform approach to tribal recognition. The result could be that the resolution of tribal recognition cases will have less to do with the attributes and qualities of a group as an independent political entity deserving a government-to-government relationship with the United States, and more to do with the resources that petitioners and third parties can marshal to develop successful political and legal strategies. Mr. Chairman, this completes my prepared statement. I would be happy to respond to any questions you or other Members of the Committee may have at this time. 
Federal recognition of an Indian tribe can dramatically affect economic and social conditions for the tribe and the surrounding communities because these tribes are eligible to participate in federal assistance programs. There are currently 562 recognized tribes with a total membership of 1.7 million, and several hundred groups are currently seeking recognition. In fiscal year 2002, Congress appropriated $5 billion for programs and funding, almost exclusively for recognized tribes. Recognition also establishes a formal government-to-government relationship between the United States and a tribe. The Indian Gaming Regulatory Act of 1988, which regulated Indian gaming operations, permits a tribe to operate casinos on land in trust if the state in which it lies allows casino-like gaming and if the tribe has entered into a compact with the state regulating its gaming businesses. In 1999, federally recognized tribes reported $10 billion in gaming revenue, surpassing the amounts that the Nevada casinos collected that year. Owing to the rights and benefits that accrue with recognition and the controversy surrounding Indian gaming, the Bureau of Indian Affairs' (BIA) regulatory process has been subject to intense scrutiny by groups seeking recognition and other interested parties--including already recognized tribes and affected state and local governments. BIA's regulatory process for recognizing tribes was established in 1978 and requires that groups that are petitioning for recognition submit evidence that they meet certain criteria--basically that the petitioner has continuously existed as an Indian tribe since historic times. Critics of the process claim that it produces inconsistent decisions and takes too long. The basis for BIA's tribal recognition decisions is not always clear. Although there are set criteria that petitioning tribes must meet to be granted recognition, there is no guidance that clearly explains how to interpret key aspects of the criteria.
The lack of guidance over what level of evidence is sufficient to demonstrate that a tribe has continued to exist over time creates controversy and uncertainty for all parties about the basis for decisions reached.
Reset encompasses activities related to the repair, upgrade, or replacement of equipment used in contingency operations. Aviation and ground equipment are managed separately within the Marine Corps, and different definitions of reset are used for each. Marine Corps officials defined aviation equipment reset as an aircraft material condition and readiness sustainment effort that is required due to prolonged combat operations. Included are actions to maintain, preserve, and enhance the capability of aircraft. Ground equipment reset is defined by the Marine Corps as actions taken to restore units to a desired level of combat capability commensurate with the unit's future mission. It encompasses maintenance and supply activities that restore and enhance equipment that was destroyed, damaged, stressed, rendered obsolete, or worn out beyond economic repair due to combat operations by repairing, rebuilding, or procuring replacement equipment. Also included as part of ground equipment reset is recapitalization (rebuild or upgrade) that enhances existing equipment through the insertion of new technology or restores selected equipment to near-original condition. The Marine Corps' equipment reset budget totals more than $8 billion for fiscal years 2009 through 2012. Maintenance-related activities included as part of reset are funded from operations and maintenance appropriations, while most recapitalization and all acquisitions of new equipment as part of reset are funded from procurement appropriations. Reset funds are requested and budgeted separately for aviation and ground equipment. Aviation equipment: The Marine Corps' aviation equipment reset budget was approximately $66.7 million in fiscal year 2009 and approximately $57.8 million in fiscal year 2010. The Marine Corps requested approximately $56.1 million for fiscal year 2011 and has requested $45.3 million for fiscal year 2012 to reset aviation equipment.
As discussed later in this report, reset funding for aviation equipment covers only operations and maintenance appropriations and excludes procurement appropriations. Ground equipment: The Marine Corps' ground equipment reset budget was approximately $2.2 billion in fiscal year 2009 and approximately $1.3 billion in fiscal year 2010. The Marine Corps requested approximately $2.6 billion for fiscal year 2011 and has requested $1.8 billion for fiscal year 2012 to reset ground equipment. This funding includes funds requested as part of operations and maintenance appropriations and procurement appropriations. The fiscal year 2011 request included a $1.1 billion increase in procurement funding over fiscal year 2010, which the Marine Corps attributed to increased equipment combat losses and to the replacement of equipment that is beyond economic repair. Appendix II provides further detail on reset funding for aviation and ground equipment. Our prior work has shown that sound strategic management planning can enable organizations to identify and achieve long-range goals and objectives. We have identified six elements that should be incorporated into strategic plans to establish a comprehensive, results-oriented framework--an approach whereby program effectiveness is measured in terms of outcomes or impact. These elements follow:

(1) Mission statement: A statement that concisely summarizes what the organization does, presenting the main purposes for all its major functions and operations.

(2) Long-term goals: A specific set of policy, programmatic, and management goals for the programs and operations covered in the strategic plan. The long-term goals should correspond to the purposes set forth in the mission statement and develop with greater specificity how an organization will carry out its mission.

(3) Strategies to achieve the goals: A description of how the goals contained in the strategic plan and performance plan are to be achieved, including the operational processes, skills and technology, and other resources required to meet these goals.

(4) External factors that could affect goals: Key factors external to the organization and beyond its control that could significantly affect the achievement of the long-term goals contained in the strategic plan. These external factors can include economic, demographic, social, technological, or environmental factors, as well as conditions or events that would affect the organization's ability to achieve its strategic goals.

(5) Use of metrics to gauge progress: A set of metrics that will be applied to gauge progress toward attainment of the plan's long-term goals.

(6) Evaluations of the plan to monitor goals and objectives: Assessments, through objective measurement and systematic analysis, of the manner and extent to which programs associated with the strategic plan achieve their intended goals.

Over the past several years we have reported on equipment reset issues. In 2007, for example, we reported that the Marine Corps could not be certain that its reset strategies would sustain equipment availability for deployed units as well as units preparing for deployment, while meeting ongoing operational requirements. We have also made recommendations aimed at improving DOD's monthly cost reports for reset and defining the types of costs that should be included in the base defense budget rather than funded from supplemental appropriations for contingency operations. Specifically, we recommended DOD amend its Financial Management Regulation to require that monthly Supplemental and Cost of War Execution Reports identify expenditures within the procurement accounts for equipment reset at more detailed subcost category levels, similar to reporting of obligations and expenditures in the operation and maintenance accounts.
DOD initially disagreed with this recommendation but later revised its Financial Management Regulation, expanding the definition of acceptable maintenance and procurement costs and directing the military services to begin including "longer war on terror" costs in their overseas contingency operations funding requests. We subsequently recommended that DOD issue guidance defining what constitutes the "longer war on terror," to identify what costs are related to that longer war and to build these costs into the base defense budget. While the department concurred with this recommendation and stated that it has plans to revise its Financial Management Regulation accordingly, it has not yet done so. The Office of Management and Budget (OMB) has issued budget formulation guidance for DOD that addresses overseas contingency operations, including reset funding. Guidance issued in February 2009 provided new criteria for DOD to use when preparing its budget request to assess whether funding, including funding for reset, should be requested as part of the base budget or as part of the budget for overseas contingency operations. The criteria identified geographic areas where overseas contingency operations funding could be used; provided a list of specific categories of spending that should be included in the overseas contingency budget, such as major equipment repairs, ground equipment replacement, equipment modifications, and aircraft replacement; and identified certain spending that should be excluded from the overseas contingency operations budget (i.e., should be included in the base budget) such as funding to support family services at home stations. For example, funding is excluded for the replacement of equipment losses already programmed for replacement in the Future Years Defense Plan. In September 2010, OMB issued updated criteria to, among other things, clarify language and eliminate areas of confusion. 
DOD has also issued its own budget formulation guidance for overseas contingency operations. In December 2009, DOD issued Resource Management Decision 700 to regulate the funding of the military services' readiness accounts and to require that significant resources from the overseas contingency operations funding be moved into the base defense budget. Specifically, the services' 2012 Program Objective Memorandum submissions for overseas contingency operations funding are restricted to resource levels appropriate for planned and projected troop levels. To facilitate the implementation of this guidance within the department, Resource Management Decision 700 outlines several actions for organizations to take. For example, it directed the Office of the Under Secretary of Defense for Acquisition, Technology and Logistics, in coordination with the Director of Cost Assessment and Program Evaluation, the military services, the DOD Comptroller, and the Joint Staff, to conduct periodic reviews of the services' in-theater maintenance activities and reset maintenance actions that include an assessment of the relationship between maintenance-funded base programs and contingency operations. This assessment was provided to the Deputy Secretary of Defense in July 2010. The Director of Cost Assessment and Program Evaluation tracks estimated total reset costs across the department based on data provided by the services. The total reset costs are the amount of funding needed to reset all equipment used in contingency operations if the operations were to cease. Specifically, the total reset costs equal the sum of the annual unbudgeted reset liability and the annual budgeted reset. The annual unbudgeted reset liability is the amount of equipment eligible for reset that stays in theater and is not reset during the budget year, based on operational decisions. The annual budgeted reset is the amount of equipment planned to return from operations that requires funds budgeted for reset. 
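The department-level arithmetic just described is a simple sum; the sketch below illustrates it with hypothetical dollar figures (the function name and the amounts are assumptions for illustration, not data from this report).

```python
# Total reset cost, as tracked by the Director of Cost Assessment and
# Program Evaluation: the annual unbudgeted reset liability (equipment
# eligible for reset that stays in theater and is not reset during the
# budget year) plus the annual budgeted reset (equipment returning from
# operations that requires budgeted reset funds).
# The dollar amounts below are hypothetical, for illustration only.

def total_reset_cost(unbudgeted_reset_liability, annual_budgeted_reset):
    return unbudgeted_reset_liability + annual_budgeted_reset

print(total_reset_cost(4_000_000_000, 1_300_000_000))  # 5300000000
```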
As part of its ground equipment reset strategy for Iraq, the Marine Corps developed the Reset Cost Model to generate cost estimates for the service's supplemental budget requests. Additionally, the Reset Cost Model allows the Marine Corps to estimate reset costs for ground equipment, including budgeted and unbudgeted reset costs. Since the Reset Cost Model is focused on ground equipment employed in the U.S. Central Command area of responsibility, the Marine Corps continues to use the model to develop overseas contingency operations budget requests for ground equipment used in Afghanistan. The cost estimates generated by the Reset Cost Model are based on the four possible reset actions:

First, equipment returning from theater is inspected to determine if depot-level repairs are required. Depot maintenance actions are conducted if the estimated cost of repair for the equipment is 65 percent or less of the latest acquisition cost.

Second, ground equipment used in operations is evaluated at various locations throughout the logistics chain to determine if the equipment requires field-level maintenance. These maintenance actions are conducted by operating forces.

Third, upon return to the continental United States, equipment identified as obsolete or uneconomical to repair is replaced through procurement as its reset action.

Fourth, if equipment acquired for combat operations does not have a long-term requirement within the Marine Corps, no reset maintenance actions are taken unless there is an immediate requirement in another campaign or theater of operations.

Estimating aviation equipment reset costs follows a separate process. For aviation equipment reset, the Marine Corps has a process for requirements determination, budgeting, and execution, all of which are included in the annual budget process.
According to Navy and Marine Corps officials, a clearly defined process is used to determine reset costs for aviation equipment that includes requirements generated from the fleet while working closely with the Chief of Naval Operations Fleet Readiness Division and each of the program offices to determine current and future reset requirements. Overseas contingency costs--including reset costs--are generated using issue sheets that record information on each item such as the categorization of funding, the amount of funding requested for a specific item, the number of items requested, and the cost per unit. Once the issue sheets are generated, Headquarters, Marine Corps, and the Commander of Naval Air Forces prioritize the issue sheets and provide a finalized list of the funding priorities according to current needs for which future funding is allocated. The Marine Corps has developed an annual aviation plan and an aviation reset program policy that together constitute its reset strategy for aviation equipment used in Afghanistan. Although separate documents, the annual aviation plan and aviation reset program policy are linked through the aviation plan's reference to the aviation reset policy. Our evaluation of this reset strategy shows that it incorporates the six elements of a comprehensive, results-oriented strategic planning framework. For example, the reset strategy establishes goals and associated time frames for completing detailed reviews of aircraft and aircraft components and transitioning to future aircraft. It also provides strategies for accomplishing key tasks such as scheduling inspections, as well as performance measures and targets. (See table 1.) The Marine Corps is taking steps to develop a strategy addressing the reset of ground equipment used in Afghanistan; however, the timeline for completing and issuing this strategy is uncertain.
Although Marine Corps officials agreed that a reset strategy for ground equipment will be needed, they stated that they do not plan to issue a strategy until there is a better understanding of the dates for initial and final drawdown of forces from Afghanistan. While more specific and certain drawdown information is desirable and will be needed to firm up reset plans, the President stated that troops would begin to withdraw in July 2011, working towards a complete transfer of all security operations to Afghan National Security Forces by 2014. The current dates announced by the President are the best available for the purposes of contingency planning and provide a reasonable basis for developing a timeline for completing the reset strategy. In the meantime, Marine Corps officials are taking the following steps toward developing a reset strategy: First, the Marine Corps completed a force structure review in early 2011 that is aimed at ensuring the service is properly configured. The force structure review included a determination of equipment reset requirements to support the post-Afghanistan Marine Corps force structure. Second, the Marine Corps is currently developing an implementation plan based on the results of the force structure review. A goal of the force structure implementation plan is to ensure that the Marine Corps achieves a restructured force by the time the reset of equipment used in Afghanistan is complete. The focus of this implementation plan is the establishment of the mission-essential tasks and the development of refined tables of equipment in support of those tasks. These refined tables of equipment will determine what equipment the Marine Corps will reset and how the equipment will be reintegrated into nondeployed Marine Corps forces. The Marine Corps plans to issue this force structure implementation plan in summer 2011.
Third, following issuance of the force structure implementation plan, the Marine Corps plans to develop and issue formal reset planning guidance that informs operating force units and the Marine Corps Logistics Command what equipment they will receive and be responsible for resetting. Specifically, Marine Corps officials stated that the planning guidance is intended to allow Marine Forces Commands, Marine Expeditionary Forces, and Marine Corps Logistics Command to assess their reset maintenance capacity requirements and identify additional support requirements beyond the maintenance centers' capacity. The officials indicated that the planning guidance would serve as a precursor to a comprehensive reset strategy. Although the Marine Corps has laid out several steps toward developing its ground equipment reset strategy, it has not specified timelines for completing and issuing either the formal reset planning guidance or its reset strategy, nor has it indicated how it plans to take into consideration, in its reset strategy, the withdrawal dates announced by the President. The reset strategy is necessary to help ensure that life-cycle management governance is provided to key organizations responsible for executing reset, such as the Marine Corps Logistics Command. Until the reset strategy is issued and firm plans for reset are established, it may be difficult for the Marine Corps Logistics Command to effectively manage the rotation of equipment to units to sustain combat operations or to meet the equipment needs of a newly defined post-Afghanistan Marine Corps force structure. In the absence of a reset strategy, Marine Corps Logistics Command officials told us that the command cannot issue its supporting order, which enables its maintenance centers to effectively begin planning for and phasing in a new maintenance workload. It is also uncertain to what extent the Marine Corps plans to align its ground equipment reset strategy with its ground equipment modernization plan.
The ground equipment modernization plan is used annually to develop future warfighting capabilities to meet national security objectives. Following the plan guides the Marine Corps in the identification, development, and integration of warfighting and associated support and infrastructure capabilities. Marine Corps officials have stated that they plan to establish a link between the reset strategy for Afghanistan and the ground modernization plan. As a basis for evaluating current reset planning for ground equipment used in Afghanistan, we also reviewed both the aviation reset strategy for Afghanistan and the ground equipment reset strategy that the Marine Corps developed for Iraq. We found that the aviation reset strategy was directly linked to the aviation equipment modernization plan. For example, the aviation equipment modernization plan outlines the transition for the UH-1N Marine Light Attack Helicopter to the UH-1Y, which should be fully phased in by fiscal year 2015. As part of the reset strategy for the UH-1Y, reset requirements for the maintenance centers associated with this transition have been identified. In contrast, we found that the Iraq reset strategy for ground equipment contained no direct reference to the service's equipment modernization plans. Marine Corps officials stated that it was unnecessary to include a direct reference to the equipment modernization plan in its Iraq reset strategy because they are indirectly linked through the roles and responsibilities for the Deputy Commandant, Combat Development and Integration. Specifically, the officials noted that the Iraq reset strategy contains a section outlining these roles and responsibilities and that these same roles and responsibilities are included in the Expeditionary Force Development System instruction. However, this indirect linkage does not provide a clear relationship between reset and modernization. 
A clear alignment of the ground equipment reset strategy for Afghanistan and modernization plan would help to ensure that the identification, development, and integration of warfighting capabilities also factor in equipment reset strategies so that equipment planned for modernization is not unnecessarily repaired. Without a Marine Corps reset strategy for ground equipment used in operations in Afghanistan that includes clear linkages to the modernization plan, the Marine Corps may not be able to effectively plan and execute ground equipment reset in the most efficient manner. The total costs of reset estimated by the Marine Corps may not be accurate or consistent because of differing definitions of reset that have been used for aviation and ground equipment. These differing definitions exist because DOD has not established a single standard definition for use in DOD's budget process. Specifically, the Marine Corps does not include aviation equipment procurement costs when estimating total reset costs. According to Marine Corps officials, procurement costs are excluded because such costs are not consistent with its definition of aviation equipment reset. Additionally, Marine Corps officials stated that the definition of reset for aviation equipment is to maintain, preserve, and enhance the capability of aircraft through maintenance activities. This definition, according to Marine Corps officials, does not include procurement funding for the replacement of aviation equipment losses in theater. In contrast, the Marine Corps' definition of reset for ground equipment includes procurement costs to replace theater losses. Reset for all types of equipment as defined by other services (e.g., the Army) also includes procurement costs. 
Although the Marine Corps excludes procurement costs when estimating aviation equipment reset costs, we found that the Director of Cost Assessment and Program Evaluation had obtained a procurement cost estimate for Marine Corps aviation equipment as part of its efforts to track reset costs for the department. DOD's Resource Management Decision 700 tasks the Director of Cost Assessment and Program Evaluation with providing annual departmentwide reset updates that (1) outline current- year reset funding needs, (2) assess the multiyear reset liability based on plans for equipment redeployment, and (3) detail deferred reset funding actions. Based on this tasking, the Marine Corps provided total reset costs that included procurement costs for equipment replacement, as well as maintenance costs, for both ground and aviation equipment. The update showed that total reset costs for Marine Corps aviation equipment was approximately $1.8 billion for fiscal years 2010 through 2012, which includes $1.4 billion for procurement costs. These reported costs were included in the 2010 DOD Reset Planning Projections annual update prepared by the Director of Cost Assessment and Program Evaluation. We were not able to determine the reasons for this apparent inconsistency between what the Marine Corps considers to be valid aviation equipment reset costs (i.e., excludes procurement costs) and what was reported in the 2010 DOD Reset Planning Projections annual update (i.e., includes procurement costs). Navy and Marine Corps officials stated that they were unable to identify any official from the Navy or Marine Corps as the source for providing or producing this total reset cost data for Marine Corp aviation equipment. Therefore, we could not assess the basis for the reported aviation equipment reset costs to determine their accuracy. 
DOD's Resource Management Decision 700 also directed the DOD Comptroller to publish a DOD definition of reset for use in the DOD overseas contingency operations budgeting process. DOD's definition of reset was to be submitted by the Comptroller to the Deputy Secretary of Defense for approval by January 15, 2010, well ahead of the Marine Corps' initial submission of its total reset liability, which was due by June 1, 2010. However, a single standard definition of reset for budget purposes has not yet been issued to the services. We also found that the Marine Corps' definition of aviation reset differs from the definition of reset provided for use in congressional testimony in a January 2007 memorandum from the Deputy Under Secretary of Defense for Logistics and Materiel Readiness to the under secretaries of the military departments. That memorandum states that reset encompasses maintenance and supply activities that restore and enhance combat capability to units and prepositioned equipment that was destroyed, damaged, stressed, or worn-out beyond economic repair due to combat operations by repairing, rebuilding, or procuring replacement equipment. According to the memorandum, the Office of the Secretary of Defense and the services agreed to this definition of reset; the memorandum emphasizes that it is important that all DOD military departments are consistent in the definition of the terms during congressional testimony. Without a single standard definition for reset for the services to use, the Marine Corps may continue to report its total reset costs for aviation equipment inconsistently. Furthermore, data integrity issues will make it challenging to identify program funding trends within the Marine Corps and among the services for equipment reset. 
Without accurate reporting of total reset costs for aviation equipment, the level of reset funding the Marine Corps needs to sustain future operations may not be properly communicated to Congress beyond what has been requested for overseas contingency operations. Furthermore, the Office of the Under Secretary of Defense Comptroller, Director of Cost Assessment and Program Evaluation, and OMB may not have the most reliable aviation equipment reset data for their review and oversight of the Marine Corps' overseas contingency operations budget requests. With the increased demands current operations have placed on Marine Corps equipment, and at a time when the federal government is facing long-term fiscal challenges, it is important for the Marine Corps to have a reset strategy in place for both ground and aviation equipment used in operations in Afghanistan as well as a standard DOD definition for reset. Reset strategies provide a framework that allows Marine Corps officials to adequately plan, budget, and execute the reset of equipment used in operations in Afghanistan. The reset strategy, and the timing thereof, could be modified if U.S. drawdown plans subsequently change or should the Marine Corps receive more specific and certain drawdown information. However, without specified timelines for completing and issuing either formal reset planning guidance or its reset strategy that also take into consideration the current dates announced by the President for withdrawal--which are the best available for the purposes of contingency planning--the Marine Corps may be unable to effectively manage the rotation of equipment to units to sustain combat operations, or meet the equipment needs of a newly defined post-Afghanistan Marine Corps force structure. 
Additionally, without a Marine Corps reset strategy for ground equipment used in operations in Afghanistan that includes clear linkages to the modernization plan, the Marine Corps may not be able to effectively plan and execute ground equipment reset in the most efficient manner. Furthermore, the total reset costs provide information that allows the Marine Corps to more efficiently plan and make informed budget decisions and allows Office of the Under Secretary of Defense (Comptroller) and OMB to have oversight. Until DOD establishes a single standard definition for reset for the services to use, DOD and Congress may have limited visibility over the total reset costs for the services. Accurate reporting of total reset costs for aviation equipment would provide Congress with the level of funding the Marine Corps needs to reset all equipment used in operations in Afghanistan at the conclusion of operations. Furthermore, the Office of the Under Secretary of Defense for the Comptroller and for Cost Assessment and Program Evaluation and OMB may lack the visibility needed over the aviation reset funds in their review and oversight of the Marine Corps overseas contingency operations budget requests. To improve the Marine Corps' ability to plan, budget for, and execute the reset of ground equipment used in Afghanistan, we recommend that the Secretary of Defense direct the Commandant of the Marine Corps to take the following two actions: Establish a timeline for completing and issuing formal reset planning guidance and a ground equipment reset strategy for equipment used in Afghanistan that allows operating force units and the Marine Corps Logistics Command to effectively manage equipment reset. Provide linkages between the ground equipment reset strategy for equipment used in Afghanistan and equipment modernization plans, including the Expeditionary Force Development System and the annual Program Objective Memorandum Marine Air-Ground Task Force Requirements List. 
To improve oversight and ensure consistency in the reporting of total reset costs, we recommend that the Secretary of Defense direct the Office of the Under Secretary of Defense (Comptroller), in coordination with the Office of the Under Secretary of Defense for Cost Assessment and Program Evaluation, the Office of the Under Secretary of Defense for Acquisitions, Technology and Logistics, the services, and the Joint Staff to act on the tasking in the Resource Management Decision 700 to develop and publish a DOD definition of reset for use in the DOD overseas contingency operations budgeting process. In written comments on a draft of this report, DOD concurred with one of our recommendations and partially concurred with the other two recommendations and provided information on the steps it is taking or plans to take to address them. DOD partially concurred with our recommendation that the Secretary of Defense direct the Commandant of the Marine Corps to establish a timeline for completing and issuing formal reset planning guidance and a ground equipment reset strategy for equipment used in Afghanistan that allows operating force units and the Marine Corps Logistics Command to effectively manage equipment reset. DOD commented that guidance for resetting the force is being developed in its Operation Enduring Freedom Reset Plan, the Operation Enduring Freedom Reset Playbook, and the Marine Air Ground Task Force Integration Plan. However, during the course of our review, the development of a strategy for ground equipment in Afghanistan was in the beginning stages and the Marine Corps did not discuss or provide details regarding the three documents now cited as its guidance for resetting the force. DOD added that the Marine Corps has established a timeline/estimated date of April 30, 2012, for completing and issuing format reset planning guidance and a ground equipment reset strategy for equipment used in Afghanistan. 
While the Marine Corps has provided DOD with a date for completing and issuing this guidance, the Marine Corps does not appear to have established a sequenced timeline, as we recommended. Specifically, DOD's response has both the formal reset planning guidance and the ground equipment reset strategy being issued at the same time. Marine Corps officials stated that the formal reset planning guidance is intended to serve as a precursor to a comprehensive reset strategy that will allow Marine Forces Commands, Marine Expeditionary Forces, and Marine Corps Logistics Command to assess their reset maintenance capacity requirements and identify additional support requirements beyond the maintenance centers' capacity. We believe this guidance will not be useful if it is not issued sufficiently ahead of time to guide the development of the ground equipment reset strategy. Consequently, we disagree with DOD's statement that the Marine Corps does not need further direction to establish a timeline for completing and issuing formal reset planning guidance and a ground equipment reset strategy for equipment used in Afghanistan. DOD partially concurred with our recommendation that the Secretary of Defense direct the Commandant of the Marine Corps to provide linkages between the ground equipment reset strategy for equipment used in Afghanistan and equipment modernization plans, including the Expeditionary Force Development System and the annual Program Objective Memorandum Marine Air-Ground Task Force Requirements List. DOD commented that it recognizes the importance of providing a linkage between ground equipment reset strategies and equipment modernization plans. Specifically, DOD commented that the Marine Corps plans to outline these linkages in their Operation Enduring Freedom Reset Plan, the Operation Enduring Freedom Reset Playbook, and the Marine Air Ground Task Force Integration Plan, which are currently being developed. 
While, as previously mentioned, the Marine Corps did not provide specific details regarding the three documents cited above during the course of our review, we believe that including this linkage in these documents would be responsive to our recommendation and will allow the Marine Corps to more effectively and efficiently plan and execute ground equipment reset. DOD concurred with our recommendation that the Secretary of Defense direct the Office of the Under Secretary of Defense (Comptroller), in coordination with the Office of the Under Secretary of Defense for Cost Assessment and Program Evaluation, the Office of the Under Secretary of Defense for Acquisitions, Technology and Logistics, the services, and the Joint Staff to act on the tasking in the Resource Management Decision 700 to develop and publish a DOD definition of reset for use in the DOD overseas contingency operations budgeting process. DOD commented that it is developing a definition of reset for use in the overseas contingencies operations budgeting process that will be incorporated into the DOD Financial Management Regulation. However, during the course of our review DOD had not yet taken action to develop a reset definition, which was to have been submitted by the Comptroller to the Deputy Secretary of Defense for approval by January 15, 2010. In addition, DOD commented that in the interim the department is using specific criteria provided by OMB guidance for determining the reset requirements that are overseas contingency operations or base. While OMB has provided guidance for overseas contingency operations budget requests, this guidance does not provide specific direction concerning what constitutes reset. Consequently, DOD recognizes the need for a common definition of equipment reset for budget purposes, but has not met its goal of establishing one. 
Resource Management Decision 700 established a January 2010 date for approving a common reset definition, and that definition could have been used in developing the department's fiscal year 2012 budget submission. DOD is now developing its fiscal year 2013 budget submission without the benefit of a common definition. Therefore, we disagree with DOD's statement that additional and separate guidance from the Secretary of Defense is not necessary, and believe that additional direction is needed to emphasize that the Under Secretary of Defense (Comptroller), in coordination with the Office of the Under Secretary of Defense for Cost Assessment and Program Evaluation, the Office of the Under Secretary of Defense for Acquisitions, Technology and Logistics, the services, and the Joint Staff should expedite the development and publication of a DOD definition of reset for use in the DOD overseas contingency operations budgeting process. The department's comments are reprinted in appendix III. We are sending copies of this report to appropriate congressional committees, the Secretary of Defense, and appropriate DOD organizations. In addition, this report will be available at no charge on our website at http://www.gao.gov. If you or your staff have any questions about this report, please contact me at (202) 512-8365 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made major contributions to this report are listed in appendix IV. To determine the extent to which the Marine Corps has a strategy in place to manage the reset of ground and aviation equipment used in operations in Afghanistan, we obtained and reviewed the Marine Corps reset strategies for ground and aviation equipment used in operations in Afghanistan. 
Where strategies had not yet been developed, we collected information regarding ongoing reset planning efforts from Marine Corps officials and discussed with them the process used and the factors considered when developing a reset strategy. As a basis for assessing current reset planning efforts for Afghanistan, we also reviewed the reset strategy that the Marine Corps prepared for equipment used in Iraq. We collected written responses and supporting documentation to our inquiries and data requests from Marine Corps officials related to ground and aviation equipment reset strategies. We also discussed with Marine Corps officials the process used and the factors considered when developing these reset strategies. Additionally, we discussed the reset strategies with Marine Corps officials to determine the roles and responsibilities of the maintenance and fleet readiness centers in preparing for equipment requiring reset and determining the appropriate reset strategy. To determine the extent to which the Marine Corps has developed effective reset strategies for the reset of equipment used in operations in Afghanistan that address the key elements of a comprehensive, results- oriented strategic planning framework, we reviewed and analyzed the ground and aviation equipment reset strategies and supporting guidance documents. Specifically, we analyzed the reset strategies and supporting guidance documents to determine if they included the six key elements of a strategic planning framework. In performing our analysis, we reviewed the strategies to determine if they included, partially included, or did not include each of the six key elements. Through our assessment we determined the guidance documents in addition to the aviation equipment reset strategy that comprises the Marine Corps strategic plan for reset. 
In addition, to understand the extent to which the Marine Corps aligns its modernization plans with its reset strategies, we interviewed Marine Corps officials to discuss the plans used for modernization and discussed the process for how these plans are incorporated with the strategies for equipment reset. To assess the Marine Corps' estimates of total reset costs, we obtained and reviewed the Department of Defense's (DOD) Resource Management Decision 700--separate from the budget formulation guidance--tasking the services to provide annual reset cost updates, and the Marine Corps processes for determining total reset costs for ground and aviation equipment. We collected written responses to our inquiries and data requests from Marine Corps officials about the system they use to determine total reset costs for ground and aviation equipment used in operations in Afghanistan. In addition, we interviewed Marine Corps officials to obtain any information relevant to the system they use to determine total reset costs for equipment used in operations in Afghanistan. To better understand the Marine Corps reset funding needs for ground and aviation equipment, we requested reset budget data for fiscal year 2009 through fiscal year 2012. We reviewed the budget data obtained and met with Marine Corps officials to discuss the data to ensure that we had a correct understanding of the different budget categories, such as procurement and operations and maintenance. We then analyzed the Marine Corps' reset budgets from fiscal year 2009 through fiscal year 2010 for the reset of ground and aviation equipment to identify any trends in the operations and maintenance and procurement funding categories. We discussed the results of our analysis with Marine Corps officials to determine the rationale for any trends in the funding. 
We interviewed Office of the Secretary of Defense, Department of the Navy, and Marine Corps officials to obtain information and any guidance documents pertaining to the process used for budget development and budget review and approval. To gain a better understanding of how the Marine Corps is using procurement funding, we reviewed the Marine Corps procurement reset funding appropriated for ground equipment in fiscal year 2010 for the 10 items that had the highest amount of funding. To determine the reliability of the reset budget data provided for ground equipment from the Global War on Terror Resources Information Database by Marine Corps officials, we assessed the data reliability of the budget data by obtaining and reviewing agency officials' responses on the data reliability questionnaires provided. Based on our review of the Office of the Secretary of Defense and Marine Corps officials' responses to our data reliability questionnaire, we identified any possible limitations and determined the effect, if any, those limitations would have on our findings. We also spoke with agency officials to clarify how the budget data were used and to ensure that we had a good understanding of how to interpret the data for our purposes. We also reviewed the fiscal year 2009 through fiscal year 2012 reset budget data provided to make sure that the formulas in the database were accurate for the data we planned to use. Based on all of these actions, we did not find any areas of concern with the data and we determined that the data used from the Global War on Terror Resources Information Database were sufficiently reliable for our purposes. 
To determine the reliability of the reset budget data provided for aviation equipment from the Program Budget Information System, Navy Enterprise Resource Planning system, and the Justification Management System by Navy and Marine Corps officials, we assessed the data reliability of the budget data by obtaining and reviewing agency officials' responses on the data reliability questionnaires provided. Based on our review of Navy and Marine Corps officials' responses to our data reliability questionnaire, we identified any possible limitations and determined the effect, if any, those limitations would have on our findings. We also spoke with agency officials to clarify how the budget data were used and to ensure that we had a good understanding of how to interpret the data for our purposes. Based on all of these actions, we did not find any areas of concern with the data and we determined that the data used from the Program Budget Information System, Navy Enterprise Resource Planning system, and the Justification Management System were sufficiently reliable for our purposes. 
To address each of our objectives, we also spoke with officials, and obtained documentation when applicable, at the following locations: Office of the Under Secretary of Defense for Acquisitions, Technology and Logistics, Assistant Director of Defense for Material Readiness Office of the Secretary of Defense for Cost Assessment and Program Office of the Under Secretary of Defense (Comptroller) Assistant Secretary of the Navy, Financial Management and Comptroller; Navy Financial Management Branch Naval Air Systems Command Reset Project Office Naval Air Systems Command Comptroller Office Naval Air Systems Command Naval Aviation Enterprise War Council Headquarters Marine Corps Deputy Commandant for Installations and Headquarters Marine Corps Deputy Commandant for Plans, Policies, Headquarters Marine Corps Deputy Commandant for Marine Corps Headquarters Marine Corps Deputy Commandant for Programs and Headquarters Marine Corps Deputy Commandant, Aviation Marine Corps Systems Command Marine Corps Logistics Command We conducted this performance audit from November 2010 through August 2011 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. This appendix provides further details on funding for Marine Corps equipment reset for fiscal years (FY) 2009 to 2012. Tables 2 and 3 provide a summary of funds that were budgeted or requested to reset ground and aviation equipment. The Marine Corps' top 10 ground equipment reset procurement items totaled approximately $365 million and accounted for approximately 90 percent of their total reset procurement funding in fiscal year 2010. 
Table 4 provides a summary of the procurement reset funding budgeted for these ground equipment items. In addition to the contact named above, Larry Junek, Assistant Director; Tamiya Lunsford; Stephanie Moriarty; Cynthia Saunders; John Van Schaik; Michael Willems; Monique Williams; and Erik Wilkins-McKee; Tom Gosling; William Graveline; Asif Khan; Thomas McCool; Charles Perdue; Gregory Pugnetti; and William Woods made key contributions to this report. | The U.S. Marine Corps received approximately $16 billion in appropriated funds between fiscal years 2006 and 2010 for reset of aviation and ground equipment that has been degraded, damaged, and destroyed during oversees contingency operations. Reset encompasses activities for repairing, upgrading, or replacing equipment used in contingency operations. The Marine Corps continues to request funding to reset equipment used in Afghanistan. GAO initiated this review under its authority to address significant issues of broad interest to the Congress. GAO's objectives were to evaluate the extent to which the Marine Corps has made progress toward (1) developing effective reset strategies for both aviation and ground equipment used in Afghanistan and (2) providing accurate estimates of total reset costs. The Marine Corps has developed a strategic plan that addresses the reset of aviation equipment used in operations in Afghanistan and includes the elements of a comprehensive, results-oriented strategic planning framework. However, a reset strategy for ground equipment has not yet been developed. The Marine Corps is taking steps to develop such a strategy; however, the timeline for completing and issuing this strategy is uncertain. Although Marine Corps officials agreed that a reset strategy for ground equipment will be needed, they stated that they do not plan to issue a strategy until there is a better understanding of the dates for drawdown of forces from Afghanistan. 
While more specific drawdown information is desirable and will be needed to firm up reset plans, the President stated that troops would begin to withdraw in July 2011, working towards a transfer of all security operations to Afghan National Security Forces by 2014. Until the ground equipment reset strategy is issued, establishing firm plans for reset may be difficult for the Marine Corps Logistics Command to effectively manage the rotation of equipment to units to sustain combat operations. It is also uncertain to what extent the Marine Corps plans to align its ground equipment reset strategy with its ground equipment modernization plan. GAO found that the Iraq reset strategy for ground equipment contained no direct reference to the service's equipment modernization plans, leaving unclear the relationship between reset and modernization. A clear alignment of the ground equipment reset strategy for Afghanistan and modernization plans would help to ensure that the identification, development, and integration of warfighting capabilities also factor in equipment reset strategies so that equipment planned for modernization is not unnecessarily repaired. The total costs of reset estimated by the Marine Corps may not be accurate or consistent because of differing definitions of reset that have been used for aviation and ground equipment. These differing definitions exist because Department of Defense (DOD) has not established a single standard definition for use in DOD's budget process. Specifically, the Marine Corps does not include aviation equipment procurement costs when estimating total reset costs. According to Marine Corps officials, procurement costs are excluded because such costs are not consistent with its definition of aviation equipment reset. In contrast, the Marine Corps' definition of reset for ground equipment includes procurement costs to replace theater losses. 
However, GAO found that the Office of the Secretary of Defense Director of Cost Assessment and Program Evaluation had obtained a procurement cost estimate for Marine Corps aviation equipment as part of its efforts to track reset costs for the department. DOD's Resource Management Decision 700 tasks the Office of the Secretary of Defense Director of Cost Assessment and Program Evaluation to provide annual departmentwide reset updates. GAO recommends that the Secretary of Defense (1) establish a timeline for issuing formal reset planning guidance and a ground equipment reset strategy for equipment used in operations in Afghanistan, (2) provide linkages between the ground equipment reset strategy and the modernization plan, and (3) develop and publish a DOD definition of reset for use in the DOD overseas contingency operations budgeting process. DOD concurred with one and partially concurred with two of the recommendations. | 8,133 | 779 |
The Army has 10 active-duty divisions, as listed in appendix II. Six of these divisions are called heavy divisions because they are equipped with large numbers of tanks, called armor. Two other divisions are called light divisions because they have no armor. The remaining two divisions are an airborne division and an air assault division. Heavy divisions accounted for the majority of the Army's division training funds, about 70 percent ($808 million) in fiscal year 2000, and these divisions are the focus of this report.

The Army uses a building block approach to train its armor forces--beginning with individual training and building up to brigade-sized unit training, as shown in figure 1. This training approach is documented in the Army's Combined Arms Training Strategy (CATS). The strategy identifies the critical tasks, called mission essential tasks, that units need to be capable of performing in time of war and the type of events or exercises and the frequency with which the units train to the task to produce a combat ready force. The strategy, in turn, guides the development of unit training plans.

The Army uses CATS as the basis for determining its training budget. To do this, it uses models to convert training events into budgetary resources, as shown in figure 2. For armor units, the Battalion Level Training Model translates the type of training events identified in CATS and the frequency with which they should be conducted into the number of tank miles to be driven in conducting those training events. The Army then uses another model, the Training Resource Model, to compute the estimated training cost for units based on the previous 3 years' cost experience. The output from these models is the basis for the Army's training budget.

CATS, in combination with the Battalion Level Training Model, has established that the tanks in armor units will be driven, on average, about 800 miles each year for home station training.
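The two-model chain described above can be sketched in a few lines. This is a purely illustrative example, not the Army's actual Battalion Level Training Model or Training Resource Model; every event name, mileage figure, and cost rate below is hypothetical.

```python
# Step 1 (Battalion Level Training Model, sketched): each planned training
# event contributes tank miles; annual miles are the frequency-weighted sum.
# Event names, frequencies, and per-event mileages are hypothetical.
planned_events = {            # event -> (times per year, tank miles per event)
    "crew_gunnery":  (2, 120),
    "platoon_lanes": (2, 150),
    "company_stx":   (1, 160),
    "battalion_ftx": (1, 100),
}
annual_tank_miles = sum(freq * miles for freq, miles in planned_events.values())

# Step 2 (Training Resource Model, sketched): price the miles using an
# average cost per mile derived from the previous 3 years' cost experience.
prior_year_cost_per_mile = [155.0, 160.0, 165.0]   # hypothetical dollars/mile
avg_cost_per_mile = sum(prior_year_cost_per_mile) / len(prior_year_cost_per_mile)

budget_estimate_per_tank = annual_tank_miles * avg_cost_per_mile
print(annual_tank_miles)          # 800 in this hypothetical event mix
print(budget_estimate_per_tank)   # 800 * 160.0 = 128000.0
```

The point of the sketch is the structure, not the numbers: the event mix drives the mileage figure, and historical cost experience converts mileage into the budget request.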
This is the level of training the Army has identified as needed to have a combat ready force, and its budget request states that it includes funds necessary to support that training. While the Army uses the 800-tank mile goal as a tool to develop its divisions' home station budgets, it does not identify the number of tank miles to be driven in its training guidance and training field manuals as a training requirement nor does it mention the miles in unit training plans. To measure the readiness of its units, the Army uses the Global Status of Resources and Training System. Unit commanders use this readiness system to report their units' overall readiness level. Under this readiness system, each reporting unit provides information monthly on the current level of personnel, equipment on hand, equipment serviceability, and training, and the commander's overall assessment of the unit's readiness to undertake its wartime mission. Units can be rated on a scale of C-1 to C-5. A C-1 unit can undertake the full wartime mission for which it is organized and designed; a C-2 unit can undertake most of its wartime mission; a C-3 unit can undertake many but not all elements of its wartime mission; a C-4 unit requires additional resources or training to undertake its wartime mission; and a C-5 unit is not prepared to undertake its wartime mission. Currently, the training readiness portion of the readiness report reflects the commander's assessment of the number of training days that are needed for the unit to be fully combat ready. In addition to the Army setting a training goal of 800 miles for tanks located at unit home stations, in its performance report for fiscal year 1999, DOD began to use 800 tank training miles, including miles driven at units' home station and the National Training Center, as a performance benchmark for measuring near-term readiness in responding to the Government Performance and Results Act. 
This act is a key component of a statutory framework that Congress put in place during the 1990s to promote a new focus on results. The Army is continuing to move training funds planned for its tank divisions to other purposes. Budget requests should reflect the funds needed to conduct an organization's activities and its spending priorities. The Army's budget request for tank division training includes funding needed to conduct 800 miles of unit home station tank training. However, each year since at least the mid-1990s, the Army has obligated millions of dollars less than it budgets to conduct training, and tanks have not trained to the 800-mile level. For the 4-year period fiscal years 1997 through 2000, the Army obligated a total of almost $1 billion less than Congress provided for training all its divisions. At the same time, the Army trained on its tanks an annual average of 591 miles at home station. Beginning with fiscal year 2001, the Army is taking action to restrict moving tank training funds. Each fiscal year the Army develops a budget request to fund, among other activities and programs, the operation of its land forces. The largest component of the land forces budget is for training the Army's 10 active-duty divisions. The Army, through the President's budget submission, requests more than $1 billion annually in O&M funds to conduct home station division training. The majority of this budget request is for the Army's six heavy divisions to use for unit training purposes. Over the last 4 years, Congress has provided the Army with the training funds it has requested. For much of the past decade, the Army has moved some of these funds from its division training to other purposes, such as base operations and real property maintenance. We previously reported that this occurred in fiscal years 1993 and 1994, and our current work shows that the Army continues to move training funds to other purposes.
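These dollar figures can be checked against each other with simple arithmetic. This is a rough consistency sketch, not a calculation from the report; the "about 21 percent" share comes from the agency-comments discussion later in this report.

```python
# Rough consistency check: almost $1 billion moved out of division training
# over FY 1997-2000 is elsewhere described as about 21 percent of those funds.
moved = 1.0e9   # approx. dollars moved over the 4-year period
share = 0.21    # stated share of division training funds this represents

implied_total_4yr = moved / share       # implied 4-year division training funds
implied_annual = implied_total_4yr / 4

# The implied annual figure lines up with the "more than $1 billion annually"
# the Army requests for home station division training.
print(round(implied_total_4yr / 1e9, 2), round(implied_annual / 1e9, 2))
```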
Although the Army has moved funds from all of its land forces subactivities, as shown in table 1, for the 4-year period fiscal years 1997 through 2000, it moved the most funds from its subactivity planned for division training. Although the Army has moved the most funds out of its division training subactivity, the amount moved has decreased over the past 2 years, as shown in figure 3. Despite the recent decrease in training funds moved from the divisions, the Army moved almost $190 million in fiscal year 2000. Most of the movement of training funds occurred within the Army's six heavy divisions. As shown in table 2, $117.7 million of the $189.7 million in division funds that were moved in fiscal year 2000 occurred in the heavy divisions. Although O&M funds cannot generally be traced dollar for dollar to their ultimate disposition, an analysis of funds obligated compared to the funds conferees initially designated shows which subactivities within the Army's O&M account had their funding increased or decreased during the budget year. Generally, the Army obligated funds planned for training its divisions for other purposes such as base operations, real property maintenance, and operational readiness (such as maintaining its training ranges). Although the Army budgets to achieve 800 tank miles for home station training, it has consistently achieved less than the 800 training miles for the last 4 years (see fig. 4). During this period, armor units missed the 800-tank mile goal annually by about an average of 26 percent. Recently, however, the number of home station tank miles achieved increased, from 568 miles in fiscal year 1999 to 655 miles in fiscal year 2000. There are some valid reasons for not achieving the 800-tank mile goal at home station, which are described below. The Army, however, does not adjust its tank mile goal to reflect these reasons. The Army develops its data on tank mile achievement from each unit's tank odometer readings.
Some home station training, however, does not involve driving tanks. Specifically, the 800-tank mile goal for home station training includes a 60 tank mile increment that some units can conduct through the use of training simulators. These 60 miles are included in the funding for the 800-tank miles, but they are not reflected in tank mile reporting because they are not driven on real tanks. In addition, deployment to contingency operations, such as the ones in the Balkans (Bosnia and Kosovo), affects both the available funding and the amount of training that can be conducted at home station. For example, when armor units are deployed to the Balkans they are not able to conduct their normal home station training. During fiscal year 1999, for example, the 1st Cavalry Division deployed to the Balkans for 11 months. Consequently, the division did very little home station training, which affected the Army-wide average home station tank training miles achieved for that year--specifically, an average of 568 tank training miles. However, if the Army had excluded the 1st Cavalry Division because it was deployed to the Balkans for most of that fiscal year, the Army-wide average home station tank mile training would have increased to 655 miles, nearly 90 miles more. In addition, the Army moved and used the funds associated with this missed training to offset the cost of Balkan operations. Although the magnitude of funding shifted to support contingency operations varies annually, the Army does not adjust its methodology and reporting to reflect the tank training miles associated with these cost offsets. Even though the Army is not conducting 800 tank miles of home station training, its armor units are still able to execute their unit training events. 
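The mileage figures above can be verified with simple arithmetic. In this sketch, the per-unit mileage values are invented (the report does not break out unit-level data) but are chosen so that the computed averages reproduce the 568- and 655-mile fiscal year 1999 figures; the 800-mile goal and the 591-mile 4-year average are from the report.

```python
# Hypothetical unit-level home station miles for FY 1999, constructed so the
# averages match the report's 568 (all units) and 655 (excluding the deployed
# 1st Cavalry Division) figures.
home_station_miles = {
    "1st Cavalry": 46,   # deployed to the Balkans ~11 months of FY 1999
    "unit_b": 700, "unit_c": 650, "unit_d": 620,
    "unit_e": 680, "unit_f": 640, "unit_g": 640,
}

overall = sum(home_station_miles.values()) / len(home_station_miles)
non_deployed = [m for u, m in home_station_miles.items() if u != "1st Cavalry"]
excluding = sum(non_deployed) / len(non_deployed)

goal = 800
shortfall_pct = (goal - 591) / goal * 100   # 591 = FY 1997-2000 average

print(overall, excluding, round(shortfall_pct, 1))
```

Excluding a single near-idle deployed unit raises the average by nearly 90 miles, which is why the Army-wide average understates the training conducted by non-deployed units.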
During our work at five of the Army's six heavy divisions, we found no evidence to demonstrate that scheduled training events had been delayed or canceled in recent years because training funds were moved out of the division subactivity to other purposes. Training events included those at a unit's home station and at the Army's National Training Center and its Combat Maneuver Training Center. Unit trainers told us that if scheduled training had to be canceled or delayed, it likely would be for reasons such as deployments or bad weather. In addition, when unit trainers establish their training plans for certain training events, they focus on achieving the unit's mission essential tasks and not on how many miles will be driven on the tanks. According to the Army, units can execute their training plans despite funds being moved for several reasons: (1) most of the movement in funds occurs before the divisions receive the funds; (2) division trainers, using past experience, anticipate the amount of training funds they will likely receive from higher commands and adjust their training plans accordingly; and (3) the intensity of a training event can be modified to fit within available funding by taking steps such as driving fewer miles and transporting--rather than driving--tanks to training ranges. In fiscal year 2001, the Army implemented an initiative to protect training funds from being moved that should result in the Army's using these training dollars for the purposes originally planned. Senior Army leadership directed that for fiscal year 2001, Army land forces would be exempt from any budget adjustments within the discretion of Army headquarters. The senior leadership also required that Army commands obtain prior approval from Army headquarters before reducing training funds.
However, subactivities within the Army's O&M account that have received these funds in the past--such as real property maintenance, base operations, and operational readiness--may be affected by less funding unless the Army requests more funds for these subactivities in the future. At the time of our work, this initiative had been in effect for only a few months; thus, we believe it is too early to assess its success in restricting the movement of training funds. Army readiness assessments reported in the Global Status of Resources and Training System show that for the last 4 fiscal years, armor units have consistently reported high levels of readiness, despite reduced training funding and not achieving their tank mile goals. This readiness assessment system does not require considering tank miles driven as an explicit factor when a unit commander determines the unit's training or overall readiness posture. In fact, the number of tank miles driven is not mentioned in readiness reporting regulations. We analyzed monthly Global Status of Resources and Training System data to see how often active-duty Army armor units were reporting readiness at high levels and lower levels. Our analysis showed that most armor units reported high overall readiness for fiscal years 1997 through 2000. In our analysis of monthly readiness reports for fiscal years 1997 through 2000, we found that when armor units reported lower overall readiness the reason was usually personnel readiness. In reviewing comments of commanders who reported degraded readiness for the same period, we found that insufficient funding was rarely cited as a cause of degraded readiness. Only a handful of unit reports filed in the 4-year period covering fiscal years 1997 through 2000 identified instances in which a shortage of funds was cited as a factor in reporting lower readiness levels.
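The kind of tally we applied to the monthly readiness data can be illustrated as follows. The sample records are invented; the actual analysis covered all active-duty armor unit reports for fiscal years 1997 through 2000.

```python
from collections import Counter

# Each record: (overall C-rating, reason cited when readiness was degraded).
# These sample entries are hypothetical.
monthly_reports = [
    ("C-1", None), ("C-1", None), ("C-2", None),
    ("C-3", "personnel"), ("C-1", None), ("C-4", "personnel"),
    ("C-2", None), ("C-3", "training"),
]

# C-1 and C-2 are treated here as "high" readiness.
high = sum(1 for rating, _ in monthly_reports if rating in ("C-1", "C-2"))
reasons = Counter(reason for _, reason in monthly_reports if reason)

print(f"{high}/{len(monthly_reports)} months at high readiness")
print(reasons.most_common())  # personnel, not funding, dominates the sample
```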
During the same period, when commanders cited training as the reason for reporting lower overall readiness, they never cited insufficient funding as a cause. Not only do unit commanders report on their overall readiness levels, but they also are required to report on the four subareas that comprise overall readiness. These subareas are current readiness levels of personnel, equipment on hand, equipment serviceability, and training. For the training readiness component, a unit's training status rating is based upon a commander's estimate of the number of training days required for the unit to become proficient in its wartime mission. Our analysis of these readiness reports showed that most armor units reported that their training status was high throughout fiscal years 1997 through 2000. There seems to be no direct relationship between average tank miles achieved and reported training readiness. There were times when tank miles achieved (1) increased while the proportion of time units reporting high readiness levels declined and (2) declined while the proportion of units reporting high readiness levels increased. For example, tank miles achieved rose more than 25 percent between the second and third quarter of fiscal year 2000 while the proportion of time units were reporting high readiness levels declined. Conversely, tank miles achieved fell by more than 20 percent between the third and fourth quarter of fiscal year 1999 while the proportion of time units were reporting high readiness levels increased. Both the Army and DOD provide Congress with information on tank miles achieved, but reporting is incomplete and inconsistent. The Army reports tank miles achieved to Congress as part of DOD's annual budget documentation. DOD reports tank miles achieved as part of its reporting under the Government Performance and Results Act. Army units train on their tanks at their home stations, at major training centers, and in Kuwait in concert with Kuwait's military forces. 
All armor training contributes to the Army's goal of having a trained and ready combat force. However, we found that the categories of tank training the Army includes in its annual budget documentation vary from year to year and the categories of training the Army includes in its budget documents and DOD includes in its Results Act reporting vary. In addition to home station training, Army units conduct training away from home station. This additional training is paid from different budget subactivities within the Army's O&M account and thus is not included in the Army's budget request for funds to conduct 800 miles of home station training. One such subactivity funds training at the National Training Center. Armor units based in the United States train at the National Training Center on average once every 18 months. Based on congressional guidance, the Army includes funds for this training in a separate budget subactivity. This subactivity, in essence, pays for tank training miles in addition to the 800 miles for home station training that is funded in the divisions' training subactivity. During the period fiscal years 1997 through 2000, the National Training Center training added an annual average of 87 miles to overall Army tank training in addition to the average of 591 miles of home station training. Because, through fiscal year 2000, these miles have been conducted on prepositioned equipment rather than on a unit's own tanks, they appropriately have not been included in home station training activity. Beginning in fiscal year 2001, the Army plans to have an as yet undetermined number of units transport their own tanks for use at the National Training Center. As this occurs, these units will report National Training Center tank miles achieved as part of their home station training. The Army is examining how to adjust division and the National Training Center budget subactivities to reflect this shift. 
Similarly, some armor units conduct training in Kuwait in conjunction with Kuwait's military forces in a training exercise called Desert Spring (formerly called Intrinsic Action). Kuwait pays part of the cost of this training and the balance is paid from funds appropriated for contingency operations. The tanks used for this training are prepositioned in Kuwait. Over the last 4 fiscal years, this training added an annual average of about 40 miles to overall Army tank training and was also appropriately not included in the home station training activity. However, this training also contributed to the Army's goal of having a trained and ready combat force. As shown in figure 5, when the miles associated with additional training are included, for the period fiscal years 1997 through 2000, an average of about 127 miles were added to the annual overall tank-miles achieved. The Army has not been consistent about reporting these miles. We found that in some years the Army included these miles in its reporting on tank miles achieved and in some years it did not. For example, for fiscal year 1999, the latest year for which such data were available, the Army reported only home station tank miles in its budget submission, while for fiscal year 1998 it reported both home station and National Training Center miles. Further, the Army did not include tank miles driven in Kuwait in either year. In fiscal year 1999, DOD began to report on the Army's achievement of 800 tank miles of training as one of its performance goals under the Government Performance and Results Act. The Results Act seeks to strengthen federal decision-making and accountability by focusing on the results of federal activities and spending. A key expectation is that Congress will gain a clearer understanding of what is being achieved in relation to what is being spent. 
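The mileage categories discussed above reconcile as follows; the three averages are the fiscal year 1997 through 2000 figures cited in this report.

```python
# FY 1997-2000 annual averages cited in the report, by funding source.
avg_miles = {
    "home_station": 591,             # division training subactivity
    "national_training_center": 87,  # separate budget subactivity
    "kuwait_desert_spring": 40,      # paid in part by Kuwait/contingency funds
}

additional = (avg_miles["national_training_center"]
              + avg_miles["kuwait_desert_spring"])
total = sum(avg_miles.values())
print(additional, total)  # about 127 extra miles, roughly 718 overall
```

Which of these categories is counted against the 800-mile goal is exactly what varies between the Army's budget submissions and DOD's Results Act reporting.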
To accomplish this, the act requires that agencies prepare annual performance plans containing annual performance goals covering the program activities in agencies' budget requests. The act aims for a closer and clearer link between the process of allocating resources and the expected results to be achieved with those resources. Agency plans that meet these expectations can provide Congress with useful information on the performance consequences of budget decisions. In its Results Act reporting, DOD is using a different training goal than the Army and, depending on the year, is including different categories of training. In response to the Results Act, DOD stated in its fiscal year 1999 performance plan that it planned to use 800 tank miles of training as one of its performance goals for measuring short-term readiness. In DOD's performance report for 1999, DOD reported, among other performance measures, how well it achieved its training mile goal for tanks. In its reporting on progress toward the goal, DOD included mileage associated with training at the National Training Center in its tank mile reporting. As discussed previously, for the Army, the 800-tank mile goal relates exclusively to home station training, and tank miles achieved at the National Training Center are funded through a separate subactivity within the Army's O&M account and tank miles achieved in Kuwait are paid for in part by Kuwait and in part by funds appropriated for contingency operations. In addition, because the Army has varied the categories of training (home station and National Training Center) it includes in its budget submission reporting, depending on the year, the Army and DOD are sometimes using different bases for their tank mile achievement reporting. As a result, Congress is being provided confusing information about what the 800-tank mile goal represents. 
Because the Army (1) has consistently not obligated all its O&M unit training funds for the purposes for which it told Congress it needed them, (2) continues to conduct its required training events, and (3) reports that its heavy divisions remain trained and in a high state of readiness, questions are raised as to the Army's proposed use of funds within its O&M account. In addition, the different ways in which the Army and DOD report tank mile training result in Congress receiving conflicting information. By not providing Congress with clear and consistent information on Army tank training, the usefulness of the information is diminished. To better reflect Army funding needs and more fully portray all its tank training, we recommend that the Secretary of the Army reexamine the Army's proposed use of funds in its annual O&M budget submission, particularly with regard to the funds identified for division training and for other activities such as base operations and real property maintenance, and improve the information contained in the Army's budget documentation by identifying more clearly the elements discussed in this report, such as (1) all funds associated with tank mile training; (2) the type of training conducted (home station, simulator, and National Training Center); (3) the training that could not be undertaken due to Balkan and any future deployments; (4) the budget subactivities within its O&M account that fund the training; and (5) the training conducted in and paid for in part by Kuwait. To provide Congress with a clearer understanding of tank training, we also recommend that the Secretary of Defense, in concert with the Secretary of the Army, develop consistent tank training performance goals and tank mile reporting for use in Army budget submissions and under the Results Act. DOD provided written comments on a draft of this report, which are reprinted in appendix III.
DOD fully agreed with our two recommendations concerning improving the information provided to Congress and in part with our recommendation concerning reexamining its O&M funding request. DOD agreed that the Army should reexamine its funding request in all areas of its O&M budget submission. However, DOD objected to the implication that the Army was requesting too much funding for division training and noted that since we had not assessed the level of division training necessary to meet approved Army standards, any conclusion as to the adequacy of training funds is inappropriate. We did not directly examine whether the Army was training to its approved standards. We did examine whether the Army had delayed or canceled training due to the movement of funds. We found no evidence to demonstrate that scheduled training events had been delayed or canceled in recent years because training funds were moved. We also found that Army unit trainers plan their training events to focus on their mission essential tasks. These tasks form the basis of the Army's training strategy. While we believe that our findings, including the Army's movement of almost $1 billion--about 21 percent--of its division training funds to other O&M budget subactivities over the 4-year period fiscal years 1997 through 2000 suggest a need to reexamine the Army's proposed use of funds within that subactivity, we did not conclude that the Army was requesting too much funding in some areas and not enough in others. As noted above, DOD concurs that the Army should make such a reexamination. We have, however, clarified our recommendation to focus on the need to reexamine the Army's planned use of funds. We are sending copies of this report to the Secretary of Defense; the Under Secretary of Defense (Comptroller and Chief Financial Officer); the Secretary of the Army; and the Director, Office of Management and Budget. We will make copies available to others on request. 
If you or your staff have any questions concerning this report please call me on (757) 552-8100. This report was prepared under the direction of Steve Sternlieb, Assistant Director. Major contributors to this report were Howard Deshong, Brenda Farrell, Madelon Savaides, Frank Smith, Leo Sullivan, and Laura Talbott. To identify whether the Army is continuing to move training funds planned for its divisions, we examined Army budget submissions, the Secretary of Defense's high priority readiness reports to Congress, appropriations acts for the Department of Defense (DOD), and the conference reports on those acts. We focused our analysis on fiscal years 1997 through 2000. We began with fiscal year 1997 because the Army had revised its operation and maintenance (O&M) budget structure for operating forces beginning in that year. We extracted data from these documents to compare the amounts congressional conferees initially designated for the Army's operation of its land forces, including its divisions, to those the Army reported as obligated. We also obtained Army data on tank miles achieved for the Army overall and by armor battalion. To understand how the Army trains its armor forces to be combat ready as well as to ascertain how Army units adjust to reduced funding and if the Army had canceled or delayed any scheduled training due to the movement of training funds, we obtained briefings, reviewed training documents, and interviewed Army personnel at a variety of locations, including Army headquarters, the Army's Forces Command and U.S. Army Europe, five of the six heavy divisions both within the United States and Europe, and the Army's school for armor doctrine and training. We also analyzed tank mile data from the Army's Cost and Economic Analysis Center. To assess the reported readiness of Army tank units, we examined monthly readiness reporting data from DOD's Global Status of Resources and Training System for fiscal years 1997 through 2000. 
We examined both the reported overall readiness and the training component of the readiness reports. We reviewed this system's readiness status ratings to determine (1) what level of readiness units were reporting, (2) whether unit readiness had declined, (3) whether training readiness was determined to be the primary cause for any decline in readiness, and (4) whether unit commanders had cited training funding shortfalls as the cause for any decline in readiness levels. To assess whether DOD and the Army are providing Congress with complete and consistent information regarding armor training, we compared Army budget submissions with Army tank training data and DOD's report on its performance required by the Government Performance and Results Act. We also discussed overall training versus home station training and the differences between Army and Results Act reports with Army officials. Our review was conducted from March 2000 through January 2001 in accordance with generally accepted government auditing standards.

Congress has expressed concern about the extent to which the Department of Defense has moved funds that directly affect military readiness, such as those that finance training, to pay for other subactivities within its operation and maintenance (O&M) account, such as real property maintenance and base operations. This report reviews the (1) Army's obligation of O&M division training funds and (2) readiness of the Army's divisions. GAO found that the Army continued to use division training funds for purposes other than training during fiscal year 2000. However, the reduced funding did not interfere with the Army's planned training events or exercises. The Army's tank units also reported that, despite the reduced funding and their failure to meet their tank mileage performance goal, their readiness remained high. Specifically, many tank units reported that they could be fully trained for their wartime mission within a short time period.
Units that reported that they would need more time to become fully trained generally cited personnel issues rather than the lack of training funds as the reason. Even so, starting in fiscal year 2001, the Army has taken action to restrict moving training funds by exempting unit training funds from any Army headquarters' adjustments and requiring prior approval before Army commands move any training funds. | 5,471 | 258 |
In March 2014 and April 2015, we reported that CBP had made progress in deploying programs under the Arizona Border Surveillance Technology Plan, but that CBP could take additional action to strengthen its management of the Plan and the Plan's programs. As of May 2016, CBP has initiated or completed deployment of technology to Arizona for each of the programs under the Plan. Additionally, as discussed further below, CBP has reported taking steps to update program schedules and life-cycle cost estimates for the three highest-cost programs under the Plan. For example, in May 2016, CBP provided us with complete schedules for two of the programs, and we will be reviewing them to determine the extent to which they address our recommendation. In March 2014, we found that CBP had a schedule for deployment of each of the Plan's seven programs, and that four of the programs would not meet their originally planned completion dates. We also found that some of the programs had experienced delays relative to their baseline schedules, as of March 2013. Further, in our March 2016 assessment of DHS's major acquisitions programs, we reported on the status of the Plan's Integrated Fixed Tower (IFT) program, noting that from March 2012 to January 2016, the program's initial and full operational capability dates had slipped. Specifically, we reported that the initial operational capability date had slipped from the end of September 2013 to the end of September 2015, and the full operational capability date to the end of September 2020. We also reported that this slippage in initial operational capability dates had contributed to slippage in the IFT's full operational capability--primarily as a result of funding shortfalls--and that the IFT program continued to face significant funding shortfalls from fiscal year 2016 to fiscal year 2020.
Despite these delays, as of May 2016 CBP reported that it has initiated or completed deployment of technology to Arizona for each of the three highest-cost programs under the plan--IFT, the Remote Video Surveillance System (RVSS), and the Mobile Surveillance Capability (MSC). Specifically, CBP officials stated that MSC deployments in Arizona are complete and that in April 2016, requirements to transition sustainment from the contractor to CBP had been finalized. CBP also reported that the RVSS system has been deployed, and testing on these systems is ongoing in four out of five stations. Further, CBP reported it had initiated deployment of the IFT systems and as of May 2016 has deployed 7 out of 53 IFTs in one area of responsibility. CBP conditionally accepted the system in March 2016 and is working to deploy the remaining IFT unit systems to other areas in the Tucson sector. With regard to schedules, we previously reported that CBP had at least partially met the four characteristics of reliable schedules for the IFT and RVSS schedules and partially or minimally met the four characteristics for the MSC schedule. Scheduling best practices are summarized into four characteristics of reliable schedules--comprehensive, well constructed, credible, and controlled (i.e., schedules are periodically updated and progress is monitored). We assessed CBP's schedules as of March 2013 for the three highest-cost programs and reported in March 2014 that schedules for two of the programs at least partially met each characteristic (i.e., satisfied about half of the criterion), and the schedule for the other program at least minimally met each characteristic (i.e., satisfied a small portion of the criterion). 
For example, the schedule for the IFT program partially met the characteristic of being credible in that CBP had performed a schedule risk analysis for the program, but the risk analysis did not include the risks most likely to delay the project or how much contingency reserve was needed. For the MSC program, the schedule minimally met the characteristic of being controlled in that it did not have valid baseline dates for activities or milestones by which CBP could track progress. We recommended that CBP ensure that scheduling best practices are applied to the IFT, RVSS, and MSC schedules. DHS concurred with the recommendation and stated that CBP planned to ensure that scheduling best practices would be applied, as outlined in our schedule assessment guide, when updating the three programs' schedules. In May 2016, CBP provided us with complete schedules for the IFT and RVSS programs, and we will be reviewing them to determine the extent to which they address our recommendation. In March 2014, we also found that CBP had not developed an Integrated Master Schedule for the Plan in accordance with best practices. Rather, CBP had used separate schedules for each program to manage implementation of the Plan, as CBP officials stated that the Plan contains individual acquisition programs rather than integrated programs. However, collectively these programs are intended to provide CBP with a combination of surveillance capabilities to be used along the Arizona border with Mexico, and resources are shared among the programs. According to scheduling best practices, an Integrated Master Schedule is a critical management tool for complex systems that involve a number of different projects, such as the Plan, to allow managers to monitor all work activities, how long activities will take, and how the activities are related to one another. 
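The four-characteristic schedule assessment can be represented as a simple rubric. The report states only that the IFT and RVSS schedules at least partially met each characteristic and that the MSC schedule partially or minimally met them; the exact per-characteristic ratings below, and the numeric scale, are illustrative assumptions.

```python
# Shorthand numeric scale for GAO's qualitative ratings (our own convention).
SCALE = {"minimally": 1, "partially": 2, "substantially": 3, "fully": 4}

# Illustrative ratings; only IFT-credible=partially and MSC-controlled=minimally
# are stated explicitly in the report.
ratings = {
    "IFT":  {"comprehensive": "partially", "well constructed": "partially",
             "credible": "partially", "controlled": "partially"},
    "RVSS": {"comprehensive": "partially", "well constructed": "partially",
             "credible": "partially", "controlled": "partially"},
    "MSC":  {"comprehensive": "minimally", "well constructed": "minimally",
             "credible": "partially", "controlled": "minimally"},
}

def weakest(program):
    # Characteristics most in need of attention for a given program schedule.
    chars = ratings[program]
    low = min(SCALE[v] for v in chars.values())
    return sorted(c for c, v in chars.items() if SCALE[v] == low)

print(weakest("MSC"))
```

A rubric like this makes it easy to see, across programs, where applying scheduling best practices would have the most effect.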
We concluded that developing and maintaining an Integrated Master Schedule for the Plan could help provide CBP a comprehensive view of the Plan and help CBP better understand how schedule changes in each individual program could affect implementation of the overall plan. We recommended that CBP develop an Integrated Master Schedule for the Plan. CBP did not concur with this recommendation and maintained that an Integrated Master Schedule for the Plan in one file undermines the DHS-approved implementation strategy for the individual programs making up the Plan, and that the implementation of this recommendation would essentially create a large, aggregated program, and effectively create an aggregated "system of systems." DHS further stated that a key element of the Plan has been the disaggregation of technology procurements. However, as we noted in the 2014 report, collectively these programs are intended to provide CBP with a combination of surveillance capabilities to be used along the Arizona border with Mexico. Moreover, while the programs themselves may be independent of one another, the Plan's resources are being shared among the programs. We continue to believe that developing an Integrated Master Schedule for the Plan is needed. Developing and maintaining an Integrated Master Schedule for the Plan could give CBP insight into the current or programmed allocation of resources across all programs, rather than attempting to resolve resource constraints for each program individually. In addition, in March 2014, we reported that the life-cycle cost estimates for the Plan reflected some, but not all, best practices. Cost-estimating best practices are summarized into four characteristics--well documented, comprehensive, accurate, and credible.
Our analysis of CBP's estimate for the Plan and estimates completed at the time of our review for the two highest-cost programs--the IFT and RVSS programs-- showed that these estimates at least partially met three of these characteristics: well documented, comprehensive, and accurate. In terms of being credible, these estimates had not been verified with independent cost estimates in accordance with best practices. We concluded that ensuring that scheduling best practices were applied to the programs' schedules and verifying life-cycle cost estimates with independent estimates could help better ensure the reliability of the schedules and estimates, and we recommended that CBP verify the life-cycle cost estimates for the IFT and RVSS programs with independent cost estimates and reconcile any differences. DHS concurred with this recommendation, but stated then that it did not believe that there would be a benefit in expending funds to obtain independent cost estimates and that if the costs realized to date continued to hold, there may be no requirement or value added in conducting full-blown updates with independent cost estimates. We recognize the need to balance the cost and time to verify the life-cycle cost estimates with the benefits to be gained from verification with independent cost estimates. CBP officials stated that in fiscal year 2016, DHS's Cost Analysis Division would begin piloting DHS's independent cost estimate capability on the RVSS program. According to CBP officials, this pilot is an opportunity to assist DHS in developing its independent cost estimate capability and that CBP selected the RVSS program for the pilot because the program is at a point in its planning and execution process where it can benefit most from having an independent cost estimate performed as these technologies are being deployed along the southwest border, beyond Arizona. 
CBP officials stated that details for an estimated independent cost estimate schedule and analysis plan for the RVSS program have not been finalized. CBP plans to provide an update on the schedule and analysis plan as additional details become available, and provide information on the final reconciliation of the independent cost estimate and the RVSS program cost estimate once the pilot has been completed at the end of fiscal year 2017. Further, CBP officials have not detailed similar plans for the IFT. We continue to believe that independently verifying the life-cycle cost estimates for the IFT and RVSS programs and reconciling any differences, consistent with best practices, could help CBP better ensure the reliability of the estimates. We reported in March 2014 that CBP had identified mission benefits of its surveillance technologies to be deployed under the Plan, such as improved situational awareness and agent safety. However, the agency had not developed key attributes for performance metrics for all surveillance technologies to be deployed as part of the Plan, as we recommended in November 2011. Further, in March 2014, we found that CBP did not capture complete data on the contributions of these technologies, which in combination with other relevant performance metrics or indicators, could be used to better determine the impact of CBP's surveillance technologies on CBP's border security efforts, and inform resource allocation decisions. Although CBP had a field within its Enforcement Integrated Database for data on whether technological assets, such as SBInet surveillance towers, and nontechnological assets, such as canine teams, assisted or contributed to the apprehension of illegal entrants and seizure of drugs and other contraband, according to CBP officials, Border Patrol agents were not required to record these data.
This limited CBP's ability to collect, track, and analyze available data on asset assists to help monitor the contribution of surveillance technologies, including its SBInet system, to Border Patrol apprehensions and seizures and inform resource allocation decisions. We recommended that CBP require data on asset assists to be recorded and tracked within its database, and once these data were required to be recorded and tracked, that it analyze available data on apprehensions and technological assists-- in combination with other relevant performance metrics or indicators, as appropriate-- to determine the contribution of surveillance technologies to CBP's border security efforts. CBP concurred with our recommendations and has implemented one of them. Specifically, in June 2014, CBP issued guidance informing Border Patrol agents that the asset assist data field within its database was now a mandatory data field. Agents are required to enter any assisting surveillance technology or other equipment before proceeding. Further, as of May 2015, CBP had identified a set of potential key attributes for performance metrics for all technologies to be deployed under the Plan. However, CBP officials stated that this set of performance metrics was under review as the agency continued to refine the key attributes for metrics to assess the contributions and impacts of surveillance technology on its border security mission. In our March 2016 update on the progress made by agencies to address our findings on duplication and cost savings across the federal government, we reported that CBP had modified its time frame for developing baselines for each performance measure and that additional time would be needed to implement and apply key attributes for metrics. 
According to CBP officials, CBP expected these performance measure baselines to be developed by the end of calendar year 2015, at which time the agency planned to begin using the data to evaluate the individual and collective contributions of specific technology assets deployed under the Plan. Moreover, CBP planned to use the baseline data to establish a tool that explains the qualitative and quantitative impacts of technology and tactical infrastructure on situational awareness in specific areas of the border environment by the end of fiscal year 2016. While CBP had expected to complete its development of baselines for each performance measure by the end of calendar year 2015, as of March 2016 the actual completion is being adjusted pending test and evaluation results for recently deployed technologies on the southwest border. Until CBP completes its efforts to fully develop and apply key attributes for performance metrics for all technologies to be deployed under the Plan, it will not be well positioned to fully assess its progress in implementing the Plan and determining when mission benefits have been fully realized. Our ongoing work shows that as of May 2016, CBP operates nine Predator B aircraft from four AMO National Air Security Operations Centers (NASOC) located in Sierra Vista, Arizona; Grand Forks, North Dakota; Corpus Christi, Texas; and Jacksonville, Florida. Three Predator B aircraft are assigned to each of the NASOCs in Arizona, North Dakota, and Texas, while the NASOC in Florida remotely operates Predator B aircraft launched from the other NASOCs. AMO began operation of Predator B aircraft in fiscal year 2006, and all four NASOCs became operational in fiscal year 2011. See figure 1 for a photograph of a CBP Predator B aircraft. CBP's Predator B aircraft may be equipped with video and radar sensors utilized primarily to support the operations of other CBP components, and federal, state, and local law enforcement agencies.
CBP's Predator B operations in support of its components and other law enforcement agencies include patrol missions to detect the illegal entry of goods and people at and between U.S. POEs and investigative missions to provide aerial support for law enforcement activities and investigations. For example, CBP's Predator B video and radar sensors support Border Patrol activities to identify and apprehend individuals entering the United States between POEs. CBP collects and tracks information on the number of assists provided for apprehensions of individuals and seizures of contraband, including narcotics, in support of law enforcement operations by Predator B aircraft. In addition, CBP's Predator B aircraft have been deployed to provide aerial support for monitoring natural disasters such as wildfires and floods. For example, CBP's Predator B aircraft were deployed in 2010 and 2011 to support federal, state, and local government agencies in response to flooding in the Red River Valley area of North Dakota. CBP's Predator B aircraft operate in the U.S. national airspace system in accordance with Federal Aviation Administration (FAA) requirements for authorizing all UAS operations in the National Airspace System. In accordance with FAA requirements, all Predator B flights must comply with a Certificate of Waiver or Authorization (COA). The COA-designated airspace establishes operational corridors for Predator B activity both along and within 100 miles of the border for the northern border, and along and within 25 to 60 miles of the border for the southern border, exclusive of urban areas. COAs issued by FAA to CBP also include airspace for training missions, which involve takeoffs and landings around a designated NASOC, and transit missions to move Predator B aircraft between NASOCs. As of May 2016, CBP has utilized the NASOC in North Dakota as a location to train new and existing CBP Predator B pilots.
For our ongoing work, we analyzed CBP data on reported Predator B COA-designated flight hours from fiscal years 2011 to 2015 and found that 81 percent of flight hours were associated with COA-designated airspace along border and coastal areas. For more information on Predator B flight hours in COA-designated airspace, see figure 2. Based on our ongoing work, we found that airspace access and weather can impact CBP's ability to utilize Predator B aircraft. According to CBP officials we spoke with in Arizona, Predator B flights may be excluded from restricted airspace managed by the Department of Defense along border areas, which can affect the ability of Predator B to support Border Patrol. CBP officials we spoke with in Arizona and Texas told us that hazardous weather conditions can affect their ability to operate the aircraft. According to CBP officials we spoke with in Texas, CBP took steps to mitigate the impact of hazardous weather in January and February 2016 by deploying one Predator B aircraft from Corpus Christi, Texas, to San Angelo, Texas, at San Angelo Regional Airport, which had favorable weather conditions. CBP's deployment of a Predator B at San Angelo Regional Airport was in accordance with a FAA-issued COA to conduct its border security mission in Texas and lasted approximately 3 weeks. We plan to evaluate how these factors affect CBP's utilization of Predator B aircraft as part of our ongoing work. Our ongoing work shows that as of May 2016, CBP has deployed six tactical aerostats along the U.S.-Mexico border in south Texas to support Border Patrol. Specifically, CBP deployed five tactical aerostats in Border Patrol's Rio Grande Valley sector and one tactical aerostat in the Laredo sector.
CBP utilizes three types of tactical aerostats equipped with cameras for capturing full-motion video: Persistent Threat Detection System (PTDS), Persistent Ground Surveillance System (PGSS), and Rapid Aerostat Initial Deployment (RAID). Each type of tactical aerostat varies in size and altitude of operation. See figure 3 for a photograph of a RAID aerostat. CBP owns the RAID aerostats and leases PTDS and PGSS aerostats through the Department of Defense. CBP operates its tactical aerostats in accordance with FAA regulations through the issuance of a COA. Tactical aerostats were first deployed and evaluated by CBP in August 2012 in south Texas. CBP's Office of Technology Innovation and Acquisition manages aerostat technology and the operation of each site through contracts, while Border Patrol agents operate tactical aerostat cameras and provide security at each site. As of May 2016, Border Patrol has taken actions to track the contribution of tactical aerostats to its mission activities. Specifically, agents track and record the number of assists aerostats provide for apprehensions of individuals and seizures of contraband and narcotics. Based on our ongoing work, we found that airspace access, weather, and real estate can impact CBP's ability to deploy and utilize tactical aerostats in south Texas. Airspace access: aerostat site placement is subject to FAA approval to ensure the aerostat does not converge on dedicated flight paths. Weather: aerostat flight is subject to weather restrictions, such as hazardous weather involving high winds or storms. Real estate: aerostat sites utilized by CBP involve access to private property and land owner acceptance, and right of entry is required prior to placement. In addition, CBP must take into consideration any relevant environmental and wildlife impacts prior to deployment of a tactical aerostat, such as flood zones, endangered species, and migratory animals, among other factors.
We plan to evaluate how these factors affect CBP's utilization of tactical aerostats as part of our ongoing work. Chairwoman McSally, Ranking Member Vela, and members of the subcommittee, this concludes my prepared statement. I will be happy to answer any questions you may have. For further information about this testimony, please contact Rebecca Gambler at (202) 512-8777 or [email protected]. In addition, contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this statement. Individuals making key contributions to this statement included Kirk Kiester (Assistant Director), as well as Jeanette Espinola, Yvette Gutierrez, Amanda Miller, Jon Najmi, and Carl Potenzieri. 2016 Annual Report: Additional Opportunities to Reduce Fragmentation, Overlap, and Duplication and Achieve Other Financial Benefits. GAO-16-375SP. Washington, D.C.: April 13, 2016. Homeland Security Acquisitions: DHS Has Strengthened Management, but Execution and Affordability Concerns Endure. GAO-16-338SP. Washington, D.C.: March 31, 2016. Southwest Border Security: Additional Actions Needed to Assess Resource Deployment and Progress. GAO-16-465T. Washington, D.C.: March 1, 2016. GAO Schedule Assessment Guide: Best Practices for Project Schedules. GAO-16-89G. Washington, D.C.: December 2015. Border Security: Progress and Challenges in DHS's Efforts to Implement and Assess Infrastructure and Technology. GAO-15-595T. Washington, D.C.: May 13, 2015. Homeland Security Acquisitions: Addressing Gaps in Oversight and Information is Key to Improving Program Outcomes. GAO-15-541T. Washington, D.C.: April 22, 2015. Homeland Security Acquisitions: Major Program Assessments Reveal Actions Needed to Improve Accountability. GAO-15-171SP. Washington, D.C.: April 22, 2015. 2015 Annual Report: Additional Opportunities to Reduce Fragmentation, Overlap, and Duplication and Achieve Other Financial Benefits. GAO-15-404SP. Washington, D.C.: April 14, 2015. 
Border Security: Additional Efforts Needed to Address Persistent Challenges in Achieving Radio Interoperability. GAO-15-201. Washington, D.C.: March 23, 2015. Unmanned Aerial Systems: Department of Homeland Security's Review of U.S. Customs and Border Protection's Use and Compliance with Privacy and Civil Liberty Laws and Standards. GAO-14-849R. Washington, D.C.: September 30, 2014. Arizona Border Surveillance Technology Plan: Additional Actions Needed to Strengthen Management and Assess Effectiveness. GAO-14-411T. Washington, D.C.: March 12, 2014. Arizona Border Surveillance Technology Plan: Additional Actions Needed to Strengthen Management and Assess Effectiveness. GAO-14-368. Washington, D.C.: March 3, 2014. Border Security: Progress and Challenges in DHS Implementation and Assessment Efforts. GAO-13-653T. Washington, D.C.: June 27, 2013. Border Security: DHS's Progress and Challenges in Securing U.S. Borders. GAO-13-414T. Washington, D.C.: March 14, 2013. Border Security: Opportunities Exist to Ensure More Effective Use of DHS's Air and Marine Assets. GAO-12-518. Washington, D.C.: March 30, 2012. U.S. Customs and Border Protection's Border Security Fencing, Infrastructure and Technology Fiscal Year 2011 Expenditure Plan. GAO-12-106R. Washington, D.C.: November 17, 2011. Arizona Border Surveillance Technology: More Information on Plans and Costs Is Needed before Proceeding. GAO-12-22. Washington, D.C.: November 4, 2011. GAO Cost Estimating and Assessment Guide: Best Practices for Developing and Managing Capital Program Costs. GAO-09-3SP. Washington, D.C.: March 2009. This is a work of the U.S. government and is not subject to copyright protection in the United States. The published product may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.
CBP employs surveillance technologies, UAS, and other assets to help secure the border. For example, in January 2011, CBP developed the Arizona Border Surveillance Technology Plan, which includes seven acquisition programs related to fixed and mobile surveillance systems, among other assets. CBP has also deployed UAS, including Predator B aircraft, as well as tactical aerostats to help secure the border. In recent years, GAO has reported on a variety of CBP border security programs and operations. This statement addresses (1) GAO findings on DHS's efforts to implement the Arizona Border Surveillance Technology Plan and (2) preliminary observations related to GAO's ongoing work on CBP's use of UAS and tactical aerostats for border security. This statement is based on GAO products issued from November 2011 through April 2016, along with selected updates conducted in May 2016. For ongoing work related to UAS, GAO reviewed CBP documents and analyzed Predator B flight hour data from fiscal years 2011 through 2015, the time period when all Predator B centers became operational. GAO also conducted site visits in Texas and Arizona to view operation of Predator B aircraft and tactical aerostats and interviewed CBP officials responsible for these operations. GAO reported in March 2014 and April 2015 that U.S. Customs and Border Protection (CBP), within the Department of Homeland Security (DHS), had made progress in deploying programs under the Arizona Border Surveillance Technology Plan (the Plan), but could take additional actions to strengthen its management of the Plan and its related programs. Specifically, in March 2014 GAO reported that CBP's schedules and life-cycle cost estimates for the Plan and its three highest-cost programs--which represented 97 percent of the Plan's total estimated cost--met some but not all best practices.
GAO recommended that CBP ensure that its schedules and cost estimates more fully address best practices, such as validating cost estimates with independent estimates, and DHS concurred. As of May 2016, CBP has initiated or completed deployment of technology for each of the three highest-cost programs under the Plan, and reported updating some program schedules and cost estimates. For example, in May 2016, CBP provided GAO with complete schedules for two of the programs, and GAO will be reviewing them to determine the extent to which they address GAO's recommendation. GAO also reported in March 2014 that CBP had identified mission benefits of technologies under the Plan, such as improved situational awareness, but had not developed key attributes for performance metrics for all technologies, as GAO recommended in November 2011. As of May 2015, CBP had identified a set of potential key attributes for performance metrics for deployed technologies and expected to complete its development of baselines for measures by the end of 2015. In March 2016, GAO reported that CBP was adjusting the completion date to incorporate pending test and evaluation results for recently deployed technologies under the Plan. GAO's ongoing work on CBP's use of unmanned aerial systems (UAS) for border security shows that CBP operates nine Predator B aircraft in U.S. airspace in accordance with Federal Aviation Administration (FAA) requirements. Specifically, CBP's Air and Marine Operations operates the aircraft in accordance with FAA certificates of waiver or authorization for a variety of activities, such as training flights and patrol missions to support the U.S. Border Patrol's (Border Patrol) efforts to detect and apprehend individuals illegally crossing into the United States between ports of entry. Predator B aircraft are currently equipped with a combination of video and radar sensors that provide information on cross-border illegal activities to supported agencies. 
CBP data show that over 80 percent of Predator B flight hours were in airspace encompassing border and coastal areas from fiscal years 2011 through 2015. CBP officials stated that airspace access and hazardous weather can affect CBP's ability to utilize Predator B aircraft for border security activities. GAO's ongoing work shows that CBP has deployed six tactical aerostats--relocatable unmanned buoyant craft tethered to the ground and equipped with cameras for capturing full-motion video--along the U.S.-Mexico border in south Texas to support Border Patrol. CBP operates three types of tactical aerostats, which vary in size and altitude of operation. CBP officials reported that airspace access, hazardous weather, and real estate (e.g., access to private property) can affect CBP's ability to deploy and utilize tactical aerostats. Border Patrol has taken actions to track the contribution of tactical aerostats to its mission activities. GAO has previously made recommendations to DHS to improve its management of plans and programs for surveillance technologies and DHS generally agreed. | 5,110 | 1,019 |
As noted earlier, before a rule can become effective, it must be filed in accordance with the statute. GAO conducted a review to determine whether all final rules covered by CRA and published in the Federal Register were filed with the Congress and GAO. We performed this review both to verify the accuracy of our database and to ascertain the degree of agency compliance with CRA. We were concerned that regulated entities may have been led to believe that rules published in the Federal Register were effective when, in fact, they were not unless filed in accordance with CRA. Our review covered the 10-month period from October 1, 1996, to July 31, 1997. In November 1997, we submitted to OIRA a computer listing of the rules that we found published in the Federal Register but not filed with our Office. This initial list included 498 rules from 50 agencies. OIRA distributed this list to the affected agencies and departments and instructed them to contact GAO if they had any questions regarding the list. Beginning in mid-February, because 321 rules remained unfiled, we followed up with each agency that still had rules that were unaccounted for. Our Office has experienced varying degrees of responsiveness from the agencies. Several agencies, notably the Environmental Protection Agency and the Department of Transportation, took immediate and extensive corrective action to submit rules that they had failed to submit and to establish fail-safe procedures for future rule promulgation. Other agencies responded by submitting some or all of the rules that they had failed to previously file. Several agencies are still working with us to assure 100 percent compliance with CRA. Some told us they were unaware of CRA or of the CRA filing requirement.
Overall, our review disclosed that: 279 rules should have been filed with us, and 264 of these have subsequently been filed; 182 were found not to be covered by CRA as rules of particular applicability or agency management and thus were not required to be filed; 37 rules had been submitted timely and our database was corrected; and 15 rules from six agencies have thus far not been filed. We do not know if OIRA ever followed up with the agencies to ensure compliance with the filing requirement; we do know that OIRA never contacted GAO to determine if all rules were submitted as required. As a result of GAO's compliance audit, however, 264 rules now have been filed with GAO and the Congress and are thus now effective under CRA. In our view, OIRA should have played a more proactive role in ensuring that agencies were both aware of the CRA filing requirements and were complying with them. One area of consistent difficulty in implementing CRA has been the failure of some agencies to delay the effective date of major rules for 60 days as required by section 801(a)(3)(A) of the act. Eight major rules did not provide for the required 60-day delay, including the Immigration and Naturalization Service's major rule regarding the expedited removal of aliens. Also, this appears to be a continuing problem since one of the eight rules was issued in January 1998. We find agencies are not budgeting enough time into their regulatory timetable to allow for the delay and are misinterpreting the "good cause" exception to the 60-day delay period found in section 808(2). Section 808(2) states that, notwithstanding section 801, "any rule which an agency for good cause finds (and incorporates the finding and a brief statement of reasons therefor in the rule issued) that notice and public procedure thereon are impracticable, unnecessary, or contrary to the public interest" shall take effect at such time as the federal agency promulgating the rule determines.
This language mirrors the exception in the Administrative Procedure Act (APA) to the requirement for notice and comment in rulemaking. 5 U.S.C. § 553(b)(3)(B). In our opinion, the "good cause" exception is only available if a notice of proposed rulemaking was not published and public comments were not received. Many agencies, following a notice of proposed rulemaking, have stated in the preamble to the final major rule that "good cause" existed for not providing the 60-day delay. Examples of reasons cited for the "good cause" exception include (1) that Congress was not in session and thus could not act on the rule, (2) that a delay would result in a loss of savings that the rule would produce, or (3) that there was a statutorily mandated effective date. The former administrator of OIRA disagreed with our interpretation of the "good cause" exception. She believed that our interpretation of the "good cause" exception would result in less public participation in rulemaking because agencies would forgo issuing a notice of proposed rulemaking and receipt of public comments to be able to invoke the CRA "good cause" exception. OIRA contends that the proper interpretation of "good cause" should be the standard employed for invoking section 553(d)(3) of the APA, "as otherwise provided by the agency for good cause found and published with the rule," for avoiding the 30-day delay in a rule's effective date required under the APA. Since CRA's section 808(2) mirrors the language in section 553(b)(B), not section 553(d)(3), it is clear that the drafters intended the "good cause" exception to be invoked only when there has not been a notice of proposed rulemaking and comments received. One early question about implementation of CRA was whether Executive agencies or OIRA would attempt to avoid designating rules as major and thereby avoid GAO's review and the 60-day delay in the effective date.
While we are unaware of any rule that OIRA misclassified to avoid the major rule designation, the failure of agencies to identify some issuances as "rules" at all has meant that some major rules have not been identified. CRA contains a broad definition of "rule," including more than the usual "notice and comment" rulemakings under the Administrative Procedure Act which are published in the Federal Register. "Rule" means the whole or part of an agency statement of general applicability and future effect designed to implement, interpret, or prescribe law or policy. "All too often, agencies have attempted to circumvent the notice and comment requirements of the Administrative Procedure Act by trying to give legal effect to general policy statements, guidelines, and agency policy and procedure manuals. Although agency interpretative rules, general statements of policy, guideline documents, and agency and procedure manuals may not be subject to the notice and comment provisions of section 553(c) of title 5, United States Code, these types of documents are covered under the congressional review provisions of the new chapter 8 of title 5." On occasion, our Office has been asked whether certain agency action, issuance, or policy constitutes a "rule" under CRA such that it would not take effect unless submitted to our Office and the Congress in accordance with CRA. For example, in response to a request from the Chairman of the Subcommittee on Forests and Public Land Management, Senate Committee on Energy and Natural Resources, we found that a memorandum issued by the Secretary of Agriculture in connection with the Emergency Salvage Timber Sale Program constituted a "rule" under CRA and should have been submitted to the Houses of Congress and GAO before it could become effective.
Likewise, we found that the Tongass National Forest Land and Resource Management Plan issued by the United States Forest Service was a "rule" under CRA and should have been submitted for congressional review. OIRA stated that, if the plan was a rule, it would be a major rule. The Forest Service has in excess of 100 such plans promulgated or revised which are not treated as rules under CRA. Many of these may actually be major rules that should be subject to CRA filing and, if major rules, subject to the 60-day delay for congressional review. In testimony before the Senate Committee on Energy and Natural Resources and the House Committee on Resources regarding the Tongass Plan, the Administrator of OIRA stated that, as was the practice under the APA, each agency made its own determination of what constituted a rule under CRA and by implication, OIRA was not involved in these determinations. We believe that for CRA to achieve what the Congress intended, OIRA must assume a more active role in guiding or overseeing these types of agency decisions. Other than an initial memorandum following the enactment of CRA, we are unaware of any further OIRA guidance. Because each agency or commission issues many manuals, documents, and directives which could be considered "rules" and these items are not collected in a single document or repository such as the Federal Register, for informal rulemakings, it is difficult for our Office to ascertain if agencies are fully complying with the intent of CRA. Having another set of eyes reviewing agency actions, especially one which has desk officers who work on a daily basis with certain agencies, would be most helpful. We have attempted to work with Executive agencies to get more substantive information about the rules and to get such information supplied in a manner that would enable quick assimilation into our database. 
An expansion of our database could make it more useful not only to GAO in supporting congressional oversight work, but also directly to the Congress and to the public. Attached to this testimony is a copy of a questionnaire designed to obtain basic information about each rule covered by CRA. This questionnaire asks the agencies to report on such items as (1) whether the agency provided an opportunity for public participation, (2) whether the agency prepared a cost-benefit analysis or a risk assessment, (3) whether the rule was reviewed under Executive orders for federalism or takings implications, and (4) whether the rule was economically significant. Such a questionnaire would be prepared in a manner that facilitates incorporation into our database by electronic filing or by scanning. In developing and attempting to implement the use of the questionnaire, we consulted with Executive branch officials to ensure that the requested information would not be unnecessarily burdensome. We circulated the questionnaire for comment to 20 agency officials with substantial involvement in the regulatory process, including officials from OIRA. The Administrator of OIRA submitted a response in her capacity as Chair of the Regulatory Working Group, consolidating comments from all the agencies represented in that group. It is the position of the group that completing this questionnaire for each of the 4,000 to 5,000 rules filed each year is too burdensome for the agencies concerned. The group points out that the majority of rules submitted each year are routine or administrative or are very narrowly focused regional, site-specific, or highly technical rules. We continue to believe that it would further the purpose of CRA for a database of all rules submitted to GAO to be available for review by Members of Congress and the public and to contain as much information as possible concerning the content and issuance of the rules.
We believe that further talks with the Executive branch, led by OIRA, can be productive and that there may be alternative approaches, such as submitting one questionnaire for repetitive or routine rules. For example, the Department of Transportation could submit one questionnaire covering the numerous airworthiness directives it issues yearly; if a certain action did not fit the overall questionnaire, a new questionnaire covering only that rule would be submitted. We note that almost all agencies have devised their own forms for the submission of rules, some of which are as long as, or almost as extensive as, the form we recommend. Additionally, some agencies prepare rather comprehensive narrative reports on nonmajor rules. With the resources now staffing this function, we are unable to easily capture the data contained in such narrative reports. The reports are simply filed away, and the information contained in them is essentially lost. Our staff could, however, incorporate an electronic submission or scan a standardized report into our database and enable the data contained therein to be used in a meaningful manner. CRA gives the Congress an important tool to use in monitoring the regulatory process, and we believe that the effectiveness of that tool can be enhanced. Executive Order 12866 requires that OIRA, among other things, provide meaningful guidance and oversight so that each agency's regulatory actions are consistent with applicable law. After almost 2 years' experience in carrying out our responsibilities under the act, we can suggest four areas in which OIRA should exercise more leadership within the Executive branch regulatory community, consistent with the intent of the Executive Order, to enhance CRA's effectiveness and its value to the Congress and the public.
We believe that OIRA should: require standardized reporting in a GAO-prescribed format that can readily be incorporated into GAO's database; establish a system to monitor compliance with the filing requirement on an ongoing basis; provide clarification on the "good cause" exception to the 60-day delay provision and oversee agency compliance during its Executive Order 12866 review; and provide clarifying guidance as to what is a rule that is subject to CRA and oversee the process of identifying such rules. Thank you, Mr. Chairman. This concludes my prepared remarks. I would be happy to answer any questions you may have. The first copy of each GAO report and testimony is free. Additional copies are $2 each. Orders should be sent to the following address, accompanied by a check or money order made out to the Superintendent of Documents, when necessary. VISA and MasterCard credit cards are accepted, also. Orders for 100 or more copies to be mailed to a single address are discounted 25 percent. U.S. General Accounting Office P.O. Box 37050 Washington, DC 20013 Room 1100 700 4th St. NW (corner of 4th and G Sts. NW) U.S. General Accounting Office Washington, DC Orders may also be placed by calling (202) 512-6000 or by using fax number (202) 512-6061, or TDD (202) 512-2537. Each day, GAO issues a list of newly available reports and testimony. To receive facsimile copies of the daily list or any list from the past 30 days, please call (202) 512-6000 using a touchtone phone. A recorded menu will provide information on how to obtain these lists.

GAO discussed its experience in fulfilling its responsibilities under the Congressional Review Act (CRA).
GAO noted that: (1) its primary role under the CRA is to provide Congress with a report on each major rule concerning GAO's assessment of the promulgating federal agency's compliance with the procedural steps required by various acts and Executive orders governing the regulatory process; (2) these include preparation of a cost-benefit analysis, when required, and compliance with the Regulatory Flexibility Act, the Unfunded Mandates Reform Act of 1995, the Administrative Procedure Act, the Paperwork Reduction Act, and Executive Order 12866; (3) GAO's report must be sent to the congressional committees of jurisdiction within 15 calendar days; (4) although the law is silent as to GAO's role relating to the nonmajor rules, GAO believes that basic information about the rules should be collected in a manner that can be of use to Congress and the public; (5) to do this, GAO has established a database that gathers basic information about the 15-20 rules GAO receives on average each day; (6) GAO's database captures the title, the agency, the Regulation Identification Number, the type of rule, the proposed effective date, the date published in the Federal Register, the congressional review trigger date, and any joint resolutions of disapproval that may be enacted; (7) GAO has recently made this database available, with limited research capabilities, on the Internet; (8) GAO conducted a review to determine whether all final rules covered by CRA and published in the Federal Register were filed with Congress and GAO; (9) as a result of GAO's compliance audit, 264 rules have been filed with GAO and Congress and are now effective under CRA; (10) one area of consistent difficulty in implementing CRA has been the failure of some agencies to delay the effective date of major rules for 60 days as required by the act; (11) one early question about implementation of CRA was whether executive agencies or the Office of Information and Regulatory Affairs (OIRA) would attempt to avoid
designating rules as major and thereby avoid GAO's review and the 60-day delay in the effective date; and (12) while GAO is unaware of any rule that OIRA misclassified to avoid the major rule designation, the failure of agencies to identify some issuances as rules at all has meant that some major rules have not been identified.
Mr. Chairman and Members of the Subcommittee: I am pleased to be here today to assist the Subcommittee in its review of the Commodity Futures Trading Commission's (CFTC) strategic plan. Hearings such as this one are an important part of assuring that the intent of the Government Performance and Results Act of 1993 (GPRA or Results Act) is met, and we commend you, Mr. Chairman, for holding this hearing. The consultative process provides an important opportunity for Congress and the executive branch to collectively ensure that agency missions are focused, goals are results-oriented, and strategies and funding expectations are appropriate. As you know, the Results Act required executive agencies to complete their initial strategic plans by September 30, 1997, and CFTC met this requirement. My testimony today discusses our review of CFTC's strategic plan. We specifically determined whether the plan contained each of the six components required by the Results Act and assessed each component's strengths and weaknesses. We also reviewed the extent to which CFTC consulted with stakeholders, including the other federal financial market regulators. Finally, we identified challenges that CFTC faces in addressing the requirements of the Results Act. CFTC's strategic plan reflects a concerted effort by the agency to address the requirements of the Results Act. Although the plan could be strengthened in some areas, it compares favorably with the plans of other federal financial regulators that we have reviewed. On the basis of our review, we found that CFTC's plan contained all of the components required by the Results Act but that some of the components could be strengthened. We also found that the plan could be improved by additional stakeholder input, including interagency coordination. 
Finally, due to the complex set of factors that determine regulatory outcomes, measuring program impacts presents challenges to CFTC in addressing the requirements of the Results Act, as it does for regulatory agencies in general. However, the use of program evaluations to derive results-oriented goals and to measure the extent to which those goals are achieved is one key to the success of the process. Notwithstanding the need for improvements, we recognize that CFTC's strategic plan is a dynamic document that the agency intends to refine. My comments apply to the strategic plan that CFTC formally submitted to Congress and the Office of Management and Budget (OMB) in September 1997. In general, our assessment of CFTC's plan was based on knowledge of the agency's operations and programs; past and ongoing reviews of CFTC; results of work on other agencies' strategic plans and the Results Act; discussions with CFTC, OMB, and Subcommittee staff; and other information available at the time of our assessment. The criteria we used to determine whether CFTC's plan complied with the requirements of the Results Act were the Results Act itself and OMB guidance on preparing strategic plans (OMB Circular A-11, Part 2). To assess CFTC's consultation with stakeholders and to identify challenges in implementing the Results Act, we relied on the results of our previous work and/or discussions with CFTC and OMB officials. CFTC is an independent agency that administers the Commodity Exchange Act, as amended, and was created by Congress in 1974. The principal purposes of the act are to protect the public interest in the proper functioning of the market's price discovery and risk-shifting functions. In administering the act, CFTC is responsible for fostering the economic utility of the futures market by encouraging its efficiency, monitoring its integrity, and protecting market participants from abusive trade practices and fraud.
Improving management in the federal sector will not be easy, but the Results Act can assist in accomplishing this task. The Results Act requires executive agencies to prepare multiyear strategic plans, annual performance plans, and annual performance reports. First, the Act requires agencies to develop a strategic plan covering the period of 1997 through 2002. As indicated in the Results Act and OMB guidance, each plan is to include six major components: (1) a comprehensive statement of the agency's mission, (2) the agency's long-term goals and objectives for all major functions and operations, (3) a description of the approaches (or strategies) for achieving the goals and the various resources needed, (4) an identification of key factors, external to the agency and beyond its control, that could significantly affect its achievement of the strategic goals, (5) a description of the relationship between the long-term strategic goals and annual performance goals, and (6) a description of how program evaluations were used to establish or revise strategic goals and a schedule for future evaluations. In developing their strategic plans, agencies are to consult with Congress and solicit the views of stakeholders. Second, the Results Act requires executive agencies to develop annual performance plans covering each program activity set forth in their budgets. The first annual performance plans, covering fiscal year 1999, are to be provided to Congress after the President's budget is submitted to Congress in early 1998. An annual performance plan is to contain the agency's annual goals, measures to gauge performance toward meeting its goals, and resources needed to meet its goals. And third, the Results Act requires executive agencies to prepare annual reports on program performance for the previous fiscal year. The performance reports are to be issued by March 31 each year, with the first (for fiscal year 1999) to be issued by March 31, 2000. 
In each report, the agency is to compare its performance against its goals, summarize the findings of program evaluations completed during the year, and describe the actions needed to address any unmet goals. Based on our review, we found that CFTC's strategic plan contains all of the six major components required by the Results Act. The plan defines the agency's mission, establishes goals, lists activities to be performed to achieve the goals, identifies key factors affecting the achievement of the goals, discusses the relation between the goals of the strategic and annual performance plans, and addresses methods for evaluating the agency's programs. However, we identified several areas in which CFTC could improve the plan as it is revised and updated. Consistent with the OMB guidance, the strategic plan contains a brief mission statement that broadly defines CFTC's basic purposes: to protect market users and the public from abusive practices and to foster open, competitive, and financially sound futures and option markets. In addition, the accompanying background of the mission statement defines the agency's core responsibilities and discusses the agency's enabling legislation. Consistent with the OMB guidance, the strategic plan describes CFTC's goals and general objectives, providing staff with direction for fulfilling the agency's mission. The agency's three goals are to (1) protect the economic functions of the commodity futures and options markets, (2) protect market users and the public, and (3) foster open, competitive, and financially sound markets. The plan further defines each goal in terms of a number of outcome objectives. For example, under goal two, the plan includes the outcome objectives of promoting compliance with and deterring violations of federal commodities laws as well as requiring commodities professionals to meet high standards. 
The OMB guidance notes that a strategic plan's general goals and objectives should be stated in a manner that allows for future assessment of whether the goals and objectives are being achieved. Although the general goals and outcome objectives support the agency's mission, most could benefit by being restated in a way that facilitates future assessment of whether they have been achieved. Examples of objectives that could be restated include overseeing markets used for price discovery and risk shifting as well as promoting markets free of trade practice abuse. Consistent with the OMB guidance, the strategic plan lists key activities that staff are to perform to accomplish the outcome objectives and, in turn, general goals. For example, an outcome objective of goal three is to facilitate the continued development of an effective, flexible regulatory environment. The specific activities to be performed for this objective include providing regulatory relief, as appropriate, to foster the development of innovative transactions and participating in the President's Working Group on Financial Markets to coordinate efforts among U.S. financial regulators. The plan also discusses actions for communicating accountability to CFTC managers and staff. These actions include instituting a performance management system to create a more effective communication tool for managers and staff and using the annual performance plan to better communicate specific goals and performance levels to staff. The OMB guidance notes that a strategic plan should briefly describe the resources needed to achieve its goals and objectives, for example, in terms of operational processes, staff skills, and technologies, as well as human, capital, and other resources.
The guidance further notes that a plan should include schedules for initiating and completing significant actions as well as outline the process for communicating goals and objectives throughout the agency and for assigning accountability to managers and staff for achieving objectives. Although CFTC's plan lists specific activities to be performed to achieve its goals and objectives, it could be made more informative by discussing the resources needed to perform the activities and by providing schedules for initiating and completing significant actions. Similarly, the plan's discussion of communicating accountability could be expanded to address how CFTC will assign accountability to managers and staff for achieving objectives. Finally, the strategic plan mentions that the annual plan establishes indicators and targets with the goal of ensuring that day-to-day activities are appropriately defined and measured. According to the OMB guidance, a strategic plan should briefly outline the type, nature, and scope of the annual performance goals and the relevance and use of these goals in helping determine whether the strategic plan's goals and objectives are being achieved. The linkage between the two plans is important because a strategic plan's goals and objectives establish the framework for developing the annual performance plan. Moreover, annual performance goals indicate the planned progress in that particular year toward achieving the strategic plan's goals and objectives. While CFTC's strategic plan discusses performance measures, it does not include performance goals that could be used to indicate the planned progress made each year toward achieving the general goals and objectives. Moreover, its measures focus on activities that are generally not stated in a manner that allows for future assessments and that may not always measure the intended outcomes.
Examples of such measures include "potential violators deterred," "informed market users," and "high level of compliance fostered." CFTC could strengthen its plan by discussing performance goals and developing more results-oriented performance measures against which actual performance can be compared. As discussed below, regulatory agencies, such as CFTC, face barriers in developing such measures. Consistent with the OMB guidance, the strategic plan discusses some external challenges that could alter CFTC's ability to meet its goals and objectives, and it also discusses the strategies for meeting such challenges. The external challenges include the growing use of over-the-counter derivatives; structural changes in the financial services industry, including the convergence of the securities, futures, insurance, and banking industries; and globalization of financial markets. Strategies to address such challenges include fostering strong relationships with foreign authorities and responding to structural changes to ensure a level playing field as the futures, insurance, securities, and banking industries become more integrated. The plan also discusses internal challenges, such as diminishing resources, recruiting and retaining qualified staff, and remaining abreast of technology. Strategies to address such challenges include reviewing resource requirements for operations and programs to ensure sound fiscal management, setting standards for staff recruitment, and implementing the agency's data processing plan. According to OMB guidance, a strategic plan should not only discuss key external factors but also indicate their link to particular goals and describe how the factors could affect the achievement of the goals. While the plan discusses external factors and strategies for addressing them, the link between particular factors and goals is not clear. CFTC could strengthen its plan by describing how the external factors are linked with particular goals and how a particular goal could be affected by the external factors.
Also, the plan might benefit from a discussion of external factors that represent significant challenges for the financial industry, such as those relating to the "year 2000" computer dating problem and those relating to proposals for revising the Commodity Exchange Act that could affect CFTC's jurisdiction and that of other federal financial market regulators. The strategic plan specifies that CFTC will use methods and processes that are already in place to evaluate how well it is implementing its strategic and annual performance plans for the first 3 years. According to the plan, these processes are to provide information on, among other things, program accomplishments, staff activities, and CFTC's financial condition and resource usage. However, the plan also explains that the reporting process related to program accomplishments will be evaluated to determine how it may be used for reporting on program progress toward meeting the goals, outcome objectives, and activities in the strategic plan as well as for setting overall priorities and allocating resources consistent with those priorities. Similarly, reviews and evaluations are described for the systems related to staff activities and resource usage. Thus, CFTC's evaluations are to be of its existing measurement and monitoring systems; CFTC does not appear to be planning any evaluations of the manner and extent to which its programs achieve their objectives. The OMB guidance notes that a strategic plan should briefly describe program evaluations used to prepare the plan and provide a schedule for future evaluations outlining the methodology, scope, and issues to be addressed. CFTC's plan does not mention whether any evaluations were used to prepare the plan; however, CFTC officials told us that no evaluations were used. As CFTC revises and updates its plan, the plan could be made more useful by including the results of program evaluations used to prepare the plan.
It could also be made more informative by discussing the timing and scope of future program evaluations as well as the particular issues to be addressed. In developing their strategic plans, agencies are to consult with Congress and solicit the views of stakeholders--those potentially affected by or interested in the plan. Agencies have discretion in determining how this consultation is conducted. The OMB guidance notes that some general goals and objectives will relate to cross-agency functions, programs, or activities. In such cases, it instructs agencies to ensure that appropriate and timely consultation occurs with other agencies during the development of strategic plans with cross-cutting goals and objectives. CFTC's strategic plan identifies numerous stakeholders, stating that they are valuable resources that must be tapped to provide critical feedback on the agency's goals and priorities. The stakeholders identified in the plan include futures exchanges, the National Futures Association, market users, and other federal departments and agencies. The plan also discusses CFTC's working relationships with other organizations and jurisdictions. For example, it notes that CFTC staff work through various intergovernmental partnerships to consult on issues of importance to CFTC and other federal financial regulators, including federal securities and bank regulators. CFTC officials told us that the draft plan had been provided to the other federal financial market regulators for comment. Nonetheless, CFTC officials told us that, as of October 16, 1997, only two parties other than Congress, OMB, and, at your request, GAO had provided the agency feedback on the plan. CFTC officials told us that they plan to use the same approach in developing future plans as they did in developing the current plan.
CFTC's lack of success with this approach suggests that the agency should consider alternative approaches. In enacting the Results Act, Congress realized that the transition to results-oriented management would not be easy. Moving to a results orientation could be especially difficult for CFTC and other regulatory agencies. In a June 1997 report, we analyzed a set of barriers facing certain regulatory agencies in their efforts to implement the Results Act. These barriers included the following: (1) problems collecting performance data, (2) complexity of interactions and lack of federal control over outcomes, and (3) results realized only over long time frames. To some extent, each of these barriers is applicable to CFTC. As implementation of the Results Act proceeds, CFTC, like other regulatory agencies, is likely to continue encountering barriers to establishing results-oriented goals and measures and, as a result, to evaluating program impact. Although developing performance measures and evaluating program impact are difficult, it is important that CFTC and other regulatory agencies continue their efforts toward that end. Any new methods or research approaches developed by one agency could also be useful to others. We look forward to continuing to work with the Congress and CFTC to ensure that the requirements of the Results Act are met. Mr. Chairman, this concludes my prepared statement. I will be pleased to respond to any questions that you or Members of the Subcommittee may have.

Pursuant to a congressional request, GAO assessed the Commodity Futures Trading Commission's (CFTC) strategic plan for compliance with the Government Performance and Results Act. GAO noted that: (1) CFTC's strategic plan contained all of the major components required by the Results Act; (2) there are several areas in which CFTC could improve its plan; (3) the plan defines goals and objectives that support CFTC's mission, but most of these could benefit by being restated in a way that would facilitate future assessment; (4) the plan identifies activities for achieving CFTC's goals and objectives, but could be more informative by including the resources needed for the activities, schedules for completing key actions, and ways for assigning accountability to managers and staff; (5) the plan's discussion of the relationship between goals in the annual and strategic plans could be strengthened by including more results-oriented performance measures that could be used to reflect progress made toward achieving its goals; (6) the plan identifies some key external factors that could affect the agency's ability to achieve its goals, but the plan could be improved by describing how such factors are linked to particular goals and how a particular goal can be affected by a specific factor; (7) the plan indicates that CFTC will use its existing processes to evaluate its programs, but the plan could be expanded to include information on the timing and scope of future evaluations; (8) the draft plan was made available to stakeholders late in the process
and reflects limited consultation with stakeholders during plan development; (9) the plan does not discuss how CFTC will incorporate stakeholders' views in the development of future plans; and (10) although developing performance measures and measuring program impacts present challenges to CFTC and to other regulatory agencies in addressing the requirements of the Results Act, it is important that CFTC and these agencies continue their efforts toward that end.
In the late 1980s, changes in the national security environment resulted in a defense infrastructure with more bases than the Department of Defense (DOD) needed. To enable DOD to close unneeded bases and realign others, Congress enacted base closure and realignment (BRAC) legislation that instituted base closure rounds in 1988, 1991, 1993, and 1995. For the 1991, 1993, and 1995 rounds, special BRAC Commissions were established to recommend specific base closures and realignments to the President, who, in turn, sent the Commissions' recommendations and his approval to Congress. A special commission was also established for the 1988 round that made recommendations to the Committees on Armed Services of the Senate and House of Representatives. For the 1988 round, legislation required DOD to complete its closure and realignment actions by September 30, 1995. For the 1991, 1993, and 1995 rounds, legislation required DOD to complete all closures and realignments within 6 years from the date the President forwarded the recommended actions to the Congress. BRAC has afforded DOD the opportunity to reduce its infrastructure and free funds for high-priority programs such as weapons modernization and force readiness. As the closure authority for the last round expires in fiscal year 2001, DOD has reported reducing its domestic infrastructure by about 20 percent and saving billions of dollars that would otherwise have been spent supporting unneeded infrastructure. In essence, reported savings include both distinct savings that actually occur during the budget year or years a BRAC decision is implemented and cost avoidances during future years--costs that DOD would have incurred if BRAC actions had not taken place. Some of the savings are one-time, such as canceled military construction projects. The vast majority of BRAC savings represent a permanent and recurring avoidance of spending that would otherwise occur, such as for personnel.
Over time, the value of the recurring savings is the largest and most important portion of overall BRAC savings. DOD reports its BRAC cost and savings estimates to the Congress on a routine basis as part of its annual budget requests. DOD guidance directs the military services and defense agencies to base the estimates on the best projection of what savings will actually accrue from approved realignments and closures. In this regard, prior-year estimated savings are required to be updated to reflect actual savings when available. The Congress recognized that an up-front investment was necessary to achieve BRAC savings and established two accounts to fund certain implementation costs. These costs included (1) relocating personnel and equipment from closing to gaining bases, (2) constructing new facilities at gaining bases to accommodate organizations transferred from closing bases, and (3) remedying environmental problems on closing bases. DOD, in its annual budget request, provides the Congress with estimated cost data relative to the implementation of each BRAC round. For the most part, these estimated costs are routinely updated as they are recorded on an ongoing basis in DOD's financial accounting systems. Since we last reported on this issue in December 1998, DOD has increased its net savings estimate for the four BRAC rounds. DOD now estimates a net savings of about $15.5 billion through fiscal year 2001, an increase of $1.3 billion from the previously reported $14.2 billion. DOD data suggest that cumulative savings began to surpass cumulative costs in fiscal year 1998. The increase in net savings is attributable to a combination of lower estimated costs and greater estimated savings, as reported in DOD's fiscal year 2001 budget request and documentation.
Overall, DOD has reduced its cost estimates from fiscal year 1999 to fiscal year 2001 for implementing BRAC by about $723 million and increased its savings estimates by about $610 million, resulting in a net savings increase of $1.3 billion. Table 1 summarizes the cumulative cost and savings estimates through fiscal year 2001 for the four BRAC rounds as reflected in DOD's fiscal years 1999 and 2001 BRAC budget requests and documentation, along with associated changes in the various costs and savings categories. In addition to the estimates shown in table 1, DOD now reports annual estimated recurring savings of $6.1 billion beyond fiscal year 2001, an increase from approximately $5.6 billion that DOD reported in fiscal year 1999.

As shown in table 1, the cost estimates for implementing the four BRAC rounds have decreased by about $723 million, from $22.9 billion to $22.2 billion, with the largest share of the decrease, about $359 million, attributable to lower reported environmental restoration costs through fiscal year 2001. Our analysis of the data shows that most of the environmental cost reduction, about $313 million, occurred in the Navy BRAC account. Some of this can be attributed to shifting planned actions to future years. Further, estimated revenues generated from actions--such as land sales, property leases, and other reimbursements--have increased by $180 million, to $300 million, thereby increasing the offset to BRAC program cost estimates. According to the Air Force, its increased revenues resulted from the reporting of reimbursements received from the city of Chicago, Illinois, for the cost of moving an Air National Guard unit from O'Hare International Airport to Scott Air Force Base, Illinois, and from increased proceeds from land sales and property leases. In addition to reductions in estimated costs, DOD is reporting over $610 million in additional estimated savings through 2001 in its closure accounts.
Our analysis shows that more than half, or $381 million, of the $610 million increase in savings shown in table 1 is attributable to Air Force operation and maintenance. Air Force officials told us that the savings increase was attributable to actions at two bases--McClellan Air Force Base, California, and Kelly Air Force Base, Texas. While the Air Force did not provide an estimate for savings at these two bases in its fiscal year 1999 budget request because of uncertainties regarding the performance of the bases' workloads, it reported a $381 million savings estimate in its fiscal year 2001 budget request. Further, an additional $101 million in increased savings is due primarily to inflationary adjustments in the estimated post-implementation savings for the 1988, 1991, and 1993 rounds through fiscal year 2001. Post-implementation savings for the 1995 round do not begin accruing until fiscal year 2002.

In addition to the revisions made to the cost and savings estimates through fiscal year 2001, DOD has also revised its annual recurring savings estimate for fiscal years 2002 and beyond. DOD is now projecting annual recurring savings of $6.1 billion for the four BRAC rounds, an increase of approximately $500 million from the $5.6 billion DOD reported in fiscal year 1999. Our analysis shows that the increase is attributable equally to an increase in the BRAC 1995 round savings estimate and to a reported increase in prior rounds' recurring savings caused by using an inflation factor to convert them into current year dollars.

Our prior work, along with work by others including the Congressional Budget Office, the DOD Inspector General, and the Army Audit Agency, has shown that BRAC savings are real and substantial, and are related to cost reductions in key operational areas as a result of BRAC actions. At the same time, limitations have existed in DOD's efforts to track actual costs and savings over time, which limits the precision of its net savings estimate.
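The revisions described above reduce to simple arithmetic. A minimal sketch, using the figures reported in DOD's budget documentation (dollars in billions; rounding follows DOD's):

```python
# Recompute the revised BRAC net savings estimate from the figures cited
# in this report (a sketch for illustration, not DOD's actual method).

cost_decrease = 0.723     # reduction in estimated implementation costs, FY1999 -> FY2001
savings_increase = 0.610  # increase in estimated savings over the same period
change = cost_decrease + savings_increase  # both movements raise net savings

prior = 14.2              # net savings reported in the FY 1999 request ($ billions)
revised = prior + change

print(f"change in net savings: ${change:.1f} billion")   # -> $1.3 billion
print(f"revised net savings:   ${revised:.1f} billion")  # -> $15.5 billion
```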
Audits of BRAC financial records have shown that BRAC has enabled DOD to save billions of dollars, primarily through the (1) overall elimination or reduction of base support costs at specific installations, (2) elimination or reduction of military and civilian personnel costs, and (3) cancellation of military construction and family housing projects at closed or realigned bases. Our prior work as well as work of others has shown that eliminating or reducing base support costs at closed or realigned bases is a major contributor to generating BRAC savings. Savings are realized through a number of actions, such as terminating physical security, fire protection, utilities, property maintenance, accounting, payroll, and a variety of other services that have associated costs linked specifically to base operations. For example, as stated in an April 1996 report, our analysis of the operation and maintenance costs at eight closing installations from the 1988 and 1991 rounds indicated that base support costs had been reduced and that annual recurring savings would be substantial--about $213 million--after initial costs were recouped. DOD Inspector General and Army Audit Agency reports have also shown base support reductions at closing and realigning facilities as real and substantial, although not precise. The DOD Inspector General, in affirming savings for a sample of bases in the 1993 BRAC round, consistently found that the services had significantly reduced their operating budgets because of the closure process.

The elimination or reduction of military and civilian personnel at closed or realigned bases is also a major contributor to generating savings. In an April 1998 report, DOD estimated that about 39,800 military personnel and about 71,000 civilian positions had been eliminated by BRAC, resulting in an overall recurring savings of about $5.8 billion annually.
While we were not able to precisely reconcile these estimated reductions with actual BRAC-related end strength reductions in the services, we reported that the large number of personnel reductions was a significant contributor to the substantial savings achieved through BRAC. DOD Inspector General and Army Audit Agency reports have validated personnel savings at various BRAC locations, although the savings estimates were not well documented in many cases. In other cases, the personnel reductions were greater than estimated. For example, in a review of nine 1995 BRAC bases, the Army Audit Agency found that, although no savings had been projected from eliminating civilian personnel authorizations at tenant activities supporting BRAC bases, over $13 million in net recurring savings had actually accrued. Additionally, the cancellation of planned military construction of facilities and family housing at closed or realigned bases contributes to the savings generated from BRAC. Prior DOD Inspector General and Army Audit Agency reports have affirmed savings attributable to such cancellations. For example, in a May 1998 report, the DOD Inspector General reported that, after a review of a Navy-reported savings of about $205 million from cancelled military construction projects in the 1993 round, the savings were actually $336 million, or $131 million more than reported. Finally, as we reported in 1998, DOD, as part of its budgeting process, has subtracted projected BRAC savings from the expected future cost of each service's funding plans in the Future Years Defense Program.

While our work has consistently shown that savings from BRAC actions are expected to be substantial, we have also noted that the cost and savings estimates are imprecise. This imprecision relates both to the development of initial estimates and to efforts to track changes in those estimates over time.
While cost estimates are routinely updated and tracked in financial accounting systems, they are based on DOD obligations and not actual outlays, thereby adding a degree of imprecision to the actual costs and the basis for savings projections. Also, as we have previously reported, a fundamental limitation in DOD's ability to identify and track savings from BRAC closures and realignments is that DOD's accounting systems, like all accounting systems, are not oriented to identifying and tracking savings. Savings estimates are developed by the services at the time they are developing their initial BRAC implementation budgets and are reported in DOD's BRAC budget justifications. Because the accounting systems do not track savings, updating these estimates would require a separate tracking method or system. Our prior work has shown that the savings estimates have been infrequently updated and that, unlike for estimated costs, no method or system has been established to track savings on a routine basis. Over time, this contributes to imprecision as the execution of closures or realignments may vary from the original plans. Further, because arguments can be made as to what costs or savings can be definitively attributed to BRAC, such as environmental restoration costs, the precision of the estimates comes into question. Nevertheless, we and others have consistently expressed the view that these factors are not significant enough to outweigh the fact that substantial savings are being generated from the closure process. In reports issued in November and December 1998, we concluded that, while closure and realignment savings for the four BRAC rounds would be substantial after initial costs were recouped, the estimates were imprecise.
In particular, we noted that savings estimates were not being routinely updated and that federal economic assistance costs of over $1 billion that had been provided to communities and individuals impacted by BRAC were not included in DOD's reported costs. Those economic assistance costs now exceed $1.2 billion. While the inclusion of these costs as attributable to BRAC has the effect of delaying the point at which savings surpass costs, it does not negate the fact that the savings are substantial.

A July 1998 Congressional Budget Office report also indicated substantial BRAC savings, even though there was imprecision in DOD's cost and savings estimates. In its comments on cost estimates, the Congressional Budget Office noted that not all BRAC-related costs are included in the estimates. As we had also pointed out, the Budget Office cited federal economic assistance costs as not being included in the estimates. Further, the Budget Office pointed out that operating units sometimes had borne unexpected costs when services at DOD facilities were temporarily impacted by BRAC actions. As to savings, the Congressional Budget Office stated its belief that DOD's estimate of $5.6 billion in annual recurring savings at that time was reasonable, given that the Budget Office's estimate was about $5 billion annually.

DOD Inspector General reports also pointed out substantial BRAC savings, despite imprecision in cost and savings estimates. In its May 1998 report on more than 70 bases closed or realigned during the 1993 BRAC round, the Inspector General found that, for the 6-year implementation period for carrying out the BRAC Commission's recommendations, the savings would overtake the costs sooner than expected.
While DOD's original budget estimate indicated costs of about $8.3 billion and annual recurring savings of $7.4 billion during the implementation period, the Inspector General concluded that costs potentially could be reduced to $6.8 billion and that savings could reach $9.2 billion, a net savings of $2.4 billion. The Inspector General's report indicated that the greater savings were due to such factors as reduced obligations that were not adjusted to reflect actual disbursements, canceled military construction projects, and a lower increase in overhead costs at bases receiving work from closing bases. On the other hand, an Inspector General's review of 23 bases closed during the 1995 BRAC round noted that savings during the implementation period were overstated by $33.2 million, or 1.4 percent, and costs were overstated by $28.8 million, or 4.5 percent of initial budget estimates. Also, the Army Audit Agency, in a July 1997 report on BRAC costs and savings, concluded that savings would be substantial after full implementation for ten 1995 BRAC round sites it had examined but that estimates were not exact. For example, the Agency reported that annual recurring savings beyond the implementation period, although substantial, were 16 percent less than the major commands' estimates.

The difficulty in precisely identifying savings is further complicated if one considers the specific actions being undertaken under the BRAC process. For example, while environmental restoration costs are a valid BRAC expenditure, DOD reported that the vast majority of its BRAC environmental restoration costs would have been incurred whether or not an installation is impacted by BRAC. DOD acknowledges, however, that environmental costs under the BRAC process may have been accelerated in the shorter term. Others suggest that in some instances BRAC-related environmental cleanup may be done more stringently than would have been the case had the installation remained open.
However, the marginal difference is not easily quantified and depends largely on the end use of the closed installation. To the extent that much of the environmental cost is not considered as an additional cost to DOD, this has the effect of increasing net savings, especially considering that DOD estimates $7 billion in BRAC-related environmental costs through fiscal year 2001. DOD also expects to spend $3.4 billion in environmental costs beyond fiscal year 2001. This is a $1 billion increase over the $2.4 billion environmental cost estimate DOD reported in fiscal year 1999. According to DOD officials, this increase is attributable primarily to the inclusion of cleanup costs for unexploded ordnance, the refinement of cleanup requirements and DOD's cost estimates, and the utilization of more stringent cleanup standards due to changes in the end use of closed installations. While the $3.4 billion in environmental costs is not reflected in DOD's $6.1 billion annual recurring savings estimate, these costs are spread over many years and should have limited impact on cumulative long-term savings.

A similar case can be made for new military construction at receiving bases under the BRAC process. While significant funds have been expended on new military construction (an estimated $6.7 billion through fiscal year 2001), the military did benefit from the improvement in its facilities infrastructure. While this is somewhat difficult to precisely quantify, it appears that some portion of the cost would have been incurred under DOD's facilities capital improvement initiatives. If so considered, this would also have the effect of increasing net BRAC savings.

In commenting on a draft of this report on July 25, 2001, the Deputy Under Secretary of Defense for Installations agreed with our findings. This official also provided technical clarifications, which we have incorporated as appropriate.
To determine the extent to which cost and savings estimates have changed over time, we compared the data contained in DOD's fiscal year 2001 BRAC budget request and documentation with similar data in the fiscal year 1999 budget request and documentation, which were the latest documents available since we last reported on this issue in December 1998. We noted revisions in the data and identified where major changes had occurred in the various costs and savings categories within the BRAC account. To the extent possible within time constraints, we discussed with officials of the Office of the Secretary of Defense and military services the rationale for those cases where the changes were significant, but we did not independently verify the validity of DOD's reported cost and savings data. We are continuing to examine the basis for the changes in DOD's cost and savings estimates and will discuss the issue in greater detail in an overall status report on BRAC that we expect to issue in early 2002.

To comment on the validity of the net savings estimates, we relied primarily on our prior BRAC reports and reviewed Congressional Budget Office, DOD, DOD Office of Inspector General, and service agency audit reports. As part of our ongoing broader review of BRAC issues, we are examining the extent to which the military services have updated their cost and savings estimates since we last reported on this issue in December 1998. We will discuss that issue in more detail in the status report that we expect to issue in early 2002.

In assessing the accuracy of the cost and savings data, we reviewed the component elements that DOD considered in formulating its overall BRAC savings estimates. Because DOD did not include in its estimates federal expenditures to provide economic assistance to communities and individuals affected by BRAC, we collected these expenditure data from DOD's Office of Economic Adjustment and considered them in our analysis of the estimated BRAC savings.
We conducted our review in June and July 2001 in accordance with generally accepted government auditing standards. We are sending copies of this report to the appropriate congressional committees; the Secretaries of Defense, the Army, the Navy, and the Air Force; and the Director, Office of Management and Budget. We also make copies available to others upon request. Please contact me at (202) 512-8412 if you or your staff have any questions concerning this report. Key contributors to this report were Mark Little, James Reifsnyder, Michael Kennedy, and Tom Mahalek.
DOE's responsibility for contractors' litigation costs has its roots in the early nuclear programs. Since the inception of these programs in the 1940s, the federal government has relied on contractors to operate its nuclear facilities. However, because of the high risk associated with operating these facilities, the agencies responsible for managing nuclear activities--from the Atomic Energy Commission to DOE--included litigation and claims clauses in their management and operating contracts. These clauses provide that litigation expenses are allowable costs under the contracts. In addition, judgments against the contractors arising from their performance of the contracts are reimbursable by DOE.

Over the past several years, class action lawsuits have been filed against many past and present contractors responsible for operating DOE's facilities. In general, these suits contend that the operation of the facilities released radioactive or toxic emissions and caused personal injury, emotional distress, economic injury, and/or property damage. These suits have been filed against the current and former operators of certain DOE facilities throughout the country, such as the Fernald Plant in Fernald, Ohio; the Hanford Site near Richland, Washington; the Los Alamos National Laboratory in Los Alamos, New Mexico; the Rocky Flats Plant in Golden, Colorado; and various other facilities. (App. I lists ongoing class action suits against DOE contractors during fiscal years 1991-93.)

DOE has the option of undertaking the defense against such class action litigation on its own; however, it has generally opted to have the contractors defend the case in good faith. As standard practice, DOE has authorized contractors to proceed with their defense and has limited its own involvement to approving the hiring of outside counsel, reviewing billings, and agreeing upon any settlement amounts.
The cognizant DOE field office is responsible for funding each contractor's litigation and overseeing the litigation effort.

DOE has not maintained complete information on the costs of litigation against present and former DOE contractors. According to officials from DOE's Office of General Counsel, costs for contractors' legal defense are budgeted and controlled by each responsible contractor and field office. These officials said that each DOE field office, through its Office of Chief Counsel, is responsible for managing the costs associated with its contractors' litigation. The officials added that DOE headquarters has not maintained overall cost data because it was not involved in the day-to-day management of these cases.

Nevertheless, DOE has collected some data indicating that it is incurring substantial costs for the services of outside law firms. In 1993, a subgroup of DOE's Contract Reform Team surveyed the Chief Counsels' offices to determine how much DOE was spending to reimburse its contractors for their legal expenses. According to the data the subgroup collected, DOE contractors paid over $31 million to outside law firms in fiscal year 1992 and almost $24 million during the first 8 months of fiscal year 1993. The subgroup attributed these large costs to "toxic tort" class action lawsuits filed against current and former contractors reporting to DOE's Albuquerque, Oak Ridge, and Richland operations offices. The costs associated with these class action suits are large, in part, because several of the suits involve multiple contractors and law firms. Many lawyers work on each case, and the monthly costs can exceed $500,000. The In Re: Hanford case, for example, has six former and present DOE contractors as codefendants, and 10 separate law firms are representing them. In just 1 month in 1992, DOE paid for the services of 62 outside attorneys, 25 of whom billed at least $200 per hour, and 44 legal assistants working on the case.
The cost of these services alone was over $455,000. (See app. II for detailed information on the billings for this particular month.)

DOE has incurred additional costs for contractors' litigation that were not reflected in the data collected by DOE. The most significant of these are costs for establishing data bases. For each of the major class action lawsuits we examined--In Re: Hanford, Cook et al. v. Rockwell/Dow, In Re: Los Alamos, and Day v. NLO--the contractors and the outside legal firms have established data bases of documents and other information. According to DOE officials in the field offices and representatives of the contractors, these data bases provide unique capabilities to identify and retrieve information needed for the contractors' legal defense. The costs for these data bases increase DOE's total outside litigation costs substantially. Data obtained from the cognizant Chief Counsels' offices show that from fiscal year 1991 through fiscal year 1993, over $25 million was spent for developing litigation data bases for these four cases. The data base for the Fernald litigation was the most costly--exceeding $14 million--but the other data bases cost over $2 million each. (App. III contains information on the costs of data bases.) When the fiscal year 1992 costs for data bases are added to the expenses paid to outside law firms during the same fiscal year, the total costs incurred by DOE for its contractors' legal defense during that fiscal year exceed $40 million.

Other costs that should be considered as litigation-related costs include all funds associated with the activities of NLO, Inc., and the in-house legal costs at current management and operating (M&O) contractors. NLO--a former operator of the Fernald Plant--is currently in existence only to manage its legal defense under a postoperations contract.
From fiscal year 1991 through fiscal year 1993, NLO received $15.7 million from DOE--$8 million for costs incurred by outside law firms, an estimated $2.5 million for developing the litigation data base, and much of the remaining $5.2 million for activities directly supporting the litigation. For example, consultants hired by NLO over this period conducted various projects for the outside law firm, NLO staff assisted in activities related to the litigation, and the firm earned almost $1 million in fees for managing the litigation. Similarly, current M&O contractors incurred in-house costs to monitor and manage ongoing legal activities; however, the portion of these costs related to litigation against the contractors is not known. Contractor officials at Oak Ridge, Sandia, and Hanford all stated that they have lawyers on staff who manage outside litigation activities and in some cases participate in litigation activities. The in-house costs related to these activities, however, were not available. The officials said that data are not maintained on the costs related to the internal efforts associated with such litigation.

Legal fees represent the largest and most visible cost associated with DOE contractors' litigation expenses. These costs include the hourly rates charged by the outside attorneys and other expenses incurred by the law firms in defending the contractors. However, DOE exercised little control over these costs. Specifically, DOE did not establish any criteria or guidelines for allowable costs, and it did not develop procedures requiring detailed reviews of law firms' bills. As a result, DOE paid for legal expenses that would not be allowed under criteria established by certain other federal organizations.

Cost guidelines are necessary for contractors and law firms to know what costs will or will not be reimbursed; however, DOE had not developed and implemented such cost criteria.
Two federal corporations--the Federal Deposit Insurance Corporation (FDIC) and the Resolution Trust Corporation (RTC)--have developed cost guidelines for outside counsel. These corporations' guidelines clearly specify what costs will be allowable and at what rates. These guidelines appear to be consistent with an opinion issued in December 1993 by the American Bar Association. The association's opinion--although nonbinding--suggests that law firms can recoup only reasonable and actual costs for services. Comparing DOE's reimbursements with the corporations' guidelines, we found that DOE had paid significantly more than these guidelines allow for professional fees, duplication and facsimile costs, travel costs, and office overhead expenses.

The corporations require that discounts on fees for legal services be sought in all cases. Their guidelines direct law firms seeking to represent the corporations to offer a discount on their rates. A corporation official stated that FDIC receives at least a 5-percent discount. Most of the law firms representing FDIC discount their rates by 10 percent--some firms, by as much as 20 percent. DOE, however, did not require its contractors to seek discounts on professional fees from outside law firms. Consequently, few discounts were obtained. Only 2 of the 16 law firms' bills we examined contained any discounts. If DOE were to adopt this guideline, it could obtain substantial cost savings, as the following example shows. One law firm is representing DOE contractors in three separate class action suits. Over a 3-year period, the firm received $8 million in professional fees for its work on these cases. If a 5-percent discount had been applied, DOE could have saved over $400,000. At a 10-percent discount rate, the savings could have been over $800,000. (See app. IV for further examples of the savings DOE could have obtained through discounts on fees.)
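The discount arithmetic above can be sketched in a few lines. The $8 million fee total is from the report; the discount rates mirror the FDIC/RTC practice cited (5 to 20 percent):

```python
# Illustrative sketch of forgone savings from fee discounts that were
# not sought; figures are those cited in this report.

def discount_savings(fees_paid: float, rate: float) -> float:
    """Savings had a percentage discount been applied to professional fees."""
    return fees_paid * rate

fees = 8_000_000.0  # professional fees paid to one firm over a 3-year period
for rate in (0.05, 0.10, 0.20):
    print(f"{rate:.0%} discount -> ${discount_savings(fees, rate):,.0f} saved")
# 5% -> $400,000; 10% -> $800,000; 20% -> $1,600,000
```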
Law firms charge for certain administrative tasks that they perform for their clients. One of these tasks is duplicating documents. The corporations' criteria state that charges for photocopying shall not exceed 8 cents per page. DOE was reimbursing its contractors at a much higher rate. The amounts charged for reproducing documents varied among the DOE contractors' law firms, ranging from 10 cents per page to 25 cents per page. Gibson, Dunn, and Crutcher charged almost $170,000 for duplicating documents over a 3-year period. For 13 months, the firm charged 25 cents per page, and for 23 months, it lowered the rate to 20 cents per page. Had the firm been allowed to charge only 8 cents per page, the total cost reimbursed by DOE would have been $58,750, a savings of nearly $109,000. Limiting all firms to this rate would have saved almost $425,000. (App. V contains further details on costs for duplicating.)

Another administrative task for which DOE was paying high rates is facsimile transmission. An FDIC official stated that this charge is to be billed at the actual cost--the cost of the telephone call. However, several firms representing DOE contractors charged as much as $1.75 per page plus the cost of the long-distance call. For example, the law firm of Gibson, Dunn, and Crutcher was reimbursed by DOE for more than $47,000 in telefax and telecopying charges--in addition to the related telephone charges--over a 3-year period.

Travel costs incurred by law firms representing DOE contractors exceeded guidelines set forth by RTC and FDIC. The corporations' criteria limit travel costs to coach airfare, moderate hotel prices, and federal per diem rates for meals. Travel costs reimbursed by DOE were significantly higher. For example, two firms--Hunton and Williams and Perkins Coie--billed first-class airfare for their senior partners. Additionally, attorneys often were reimbursed for the costs of high-priced hotel rooms.
Lawyers from Kirkland and Ellis billed for hotel rooms in Washington, D.C., that cost from $215 to as much as $250 per night. In contrast, the government's lodging allowance for that city is $113 per night. Additionally, some firms billed for meals costing far more than federal per diem rates. In many cases, the meals cost almost $100 per person. For example, the law firm of Perkins Coie billed for a four-person dinner in New York City costing $95 per person (the federal per diem allowance in this city is $38) and billed for a five-person dinner in Seattle costing $90 per person (the federal per diem allowance in this city is $34). This firm also billed for meal expenses that consisted only of drinks--an expense that is not allowable under federal per diem regulations. Furthermore, some of the meal expenses were incurred for attorneys and staff who were not on travel. One firm--Perkins Coie--billed over $9,000 for expenses labeled as "conference meals" over a 3-year period. Review of the supporting documentation indicates that these expenses were for meals purchased while many of the staff in attendance were not on travel and/or for activities associated with "client development." In another instance, Crowell and Moring billed not only for the meals of its local attorneys but for the meals of their spouses as well. According to a legal opinion from one DOE operations office, meal expenses for attorneys and staff who are not on travel are not reimbursable. Nevertheless, although such costs were not allowed by contractors within that particular region, they were allowed by other contractors and were reimbursed in full by DOE in other regions.

Other costs were incurred and charged to DOE that, under the two federal corporations' guidelines, are considered to be law firm overhead that should be subsumed within the professional fees. These include costs for word processing services, overtime, utilities and supplies, and legal publications.
In many instances, however, DOE allowed these charges. Although these costs could conceivably, in some cases, be appropriately charged and reimbursed, we found many instances in which the charges were inappropriate. For example, Shea and Gardner billed for purchasing American Bar Association publications, such as a guide to taking depositions. Crowell and Moring marked up its telephone charges 25 percent above the actual cost and its computer research 50 percent above the actual cost. Additionally, according to the federal corporations' guidelines, expenses for activities conducted by lawyers to develop subject matter expertise are not to be charged to the federal corporations. Instead, law firms must absorb the cost of developing an understanding of specialty issues. In contrast, some law firms--Shea and Gardner and Gibson, Dunn, and Crutcher--billed DOE contractors for staff to attend seminars on toxic/radiation litigation. DOE did not have requirements mandating and facilitating detailed reviews by contractors and/or DOE of the bills submitted by law firms. As a result, the quality of the reviews varied greatly, and some reviews were inadequate. For example, one contractor--Westinghouse Hanford Company--performed an internal audit 2 years into the In Re: Hanford litigation and found that it did not have adequate reviews of the legal bills submitted to it. The audit also revealed that several costs that were not allowable under the company's own in-house criteria had been paid, such as first class airfares. In another instance, UNC, Inc.--a former contractor at Hanford--never examined detailed billings of its principal law firm and instead approved all of its bills on the basis of a monthly two-page billing summary. These summaries lacked detailed information on the activities that each lawyer had performed; in fact, they did not even specify the number of hours that lawyers had worked on the case. DOE's review of bills was also inadequate. 
At only one DOE operations office--Oak Ridge--did Chief Counsel officials perform detailed reviews of legal costs before approving bills for payment. This office disallowed numerous costs--including costs for meals charged by lawyers who were not on travel and expenses for seminars--that were allowed by other operations offices. At Albuquerque, few detailed reviews of bills were performed, and when performed, such reviews took place after the bills had been paid. At Richland, bills were approved for payment by the Chief Counsel primarily on the basis of billing summaries, and any detailed reviews were conducted annually or semiannually. In our view, the summaries were not specific enough for a reviewer to determine what the costs were for and whether they were appropriate. Additionally, DOE did not require the bills to be presented in a format that included enough detail to allow a reviewer to understand the basis for the charges. Consequently, even when detailed reviews were performed, many of the charges in the bills could not be adequately assessed. For example, some charges were listed simply for "research" or "reviewing documents," while others were listed for meetings with specific individuals, but no mention was made of either the purpose of the meeting or the subject discussed. In other instances, activities were cumulated into a daily total and briefly described; this information did not indicate how much time was spent on each activity and whether the time spent was appropriate. Charges for activities performed by attorneys and their staffs might have been questioned if DOE had established adequate review procedures and sufficient criteria for reasonableness. For instance, several firms charged time for staff to prepare monthly bills, review and catalog newspaper articles, prepare security clearance forms, and rearrange or move file rooms. 
Additionally, General Electric hired a public relations firm to analyze trends in the case and passed these costs along to DOE for reimbursement. In our view, these activities were of such questionable benefit to DOE that a detailed review would have raised concerns about the appropriateness of DOE's paying for them. DOE has recognized that its controls over contractors' litigation costs are problematic and has taken some actions to improve them. In March 1994, DOE issued guidance on managing litigation, directing its field office Chief Counsels to ensure that the rates charged are reasonable. The guidance also requires that contractors develop for each case a formal understanding concerning, among other things, allowable expenses, billing procedures, and contractors' reviews of bills. In testimony before this Subcommittee on July 13, 1994, we stated that although these actions represented a step in the right direction, they did not go far enough. The guidance still gave contractors considerable discretion in controlling costs. Given our experience with the way contractors had applied cost controls in the past, we were not convinced that this guidance would ensure that consistent and effective cost controls were developed and applied to all legal bills. Since the hearing, however, DOE's Office of General Counsel has begun to develop and adopt additional measures to address the problems identified. On August 25, 1994, DOE issued an acquisition letter (No. 94-13) setting forth interim policies for contracting officers to consider in determining whether particular litigation costs are reasonable. The cost guidelines--which became effective for all ongoing class action suits on October 1, 1994--establish limits and terms for the costs that DOE will reimburse to contractors for outside litigation. 
For example, the guidelines specify that costs for duplication are not to exceed 10 cents per page; telephone charges, facsimile transmission costs, and computer-assisted research costs are not to exceed the actual costs; airfare is not to exceed the coach fare; and other travel expenses should be moderate, consistent with the rates set forth in the Federal Travel Regulations. The guidelines also set forth DOE's policy for reimbursing attorneys' fees, profit and overhead, and overtime expenses, and they designate specific nonreimbursable costs. Additionally, officials from the Office of General Counsel have met with RTC and FDIC officials to gain insight from their experience in developing systems for auditing bills to determine the reasonableness of both the professional activity and the related expenses. A staff has been assembled in headquarters to develop requirements and procedures for reviewing bills and to conduct detailed review of bills. Chief Counsel staff in regional offices are also developing review procedures that will be coordinated with the headquarters requirements. DOE is still in the initial stages of developing an audit function but plans to have one in place by early 1995. Furthermore, a cost-reporting system is being implemented that will provide monthly reports on all litigation. This reporting system will collect Department-wide cost data in a consistent format. According to DOE's General Counsel, this system will report all costs, including data base costs and contractors' in-house costs, within 10 days after the end of each month. DOE plans to compare the actual with the budgeted costs for each case to better ensure that the costs remain reasonable. This system is now operational, although Office of General Counsel officials acknowledge that the data are not yet complete. Finally, DOE is consolidating its legal defense in various cases--a measure with the greatest cost-saving potential. 
The In Re: Hanford case, for example, has six codefendants--each represented by at least one law firm and some by as many as three firms. DOE acknowledges that duplication of effort is likely and, with it, unnecessary costs. To prevent further duplication, DOE informed the codefendants that beginning in fiscal year 1995, it would not reimburse any contractor for the services of any outside counsel other than the law firm selected to serve as lead counsel for the litigation. At the time this report was being completed, a lead contractor had been designated and that contractor--with concurrence from DOE--had selected a lead counsel. DOE estimates that by consolidating, it will reduce its annual outside litigation expenses by nearly 60 percent, saving millions of dollars on this case alone. Office of General Counsel officials estimated that these efforts--establishing cost criteria, implementing an audit function, and consolidating class action cases--would save DOE $5 million to $7 million annually. During fiscal years 1991 through 1993, DOE incurred large litigation costs but, in many cases, did not have the internal controls needed to ensure that these costs were appropriate. At a recent hearing before this Subcommittee, we discussed these problems and, as a result, DOE began to improve its management of contractors' litigation costs. If DOE's recent efforts are fully implemented and successful, substantial cost savings could accrue to the government. Additionally, DOE should have cost controls and case management principles in place to ensure that any future lawsuits are handled efficiently. DOE is to be commended for its quick and thorough response to the problems we identified. However, it remains to be seen whether or not these new procedures will be universally implemented within DOE's field offices and whether or not all contractors will accept and abide by these new procedures. 
We discussed the facts in this report with DOE officials, including the General Counsel and other officials from the Office of General Counsel. They agreed with the facts presented; however, they expressed concern that the tone of the report might lead readers to believe that DOE was not addressing the problems we had identified. They provided comments and information on the actions they are taking to reduce litigation costs and improve cost controls. We have incorporated these comments into the report where appropriate. As requested, we did not obtain written agency comments on a draft of this report. We performed our work between November 1993 and August 1994 in accordance with generally accepted government auditing standards. Appendix VI contains details on the objectives, scope, and methodology of our review. As arranged with your office, unless you publicly announce its contents earlier, we plan no further distribution of this report until 30 days after the date of this letter. At that time, we will send copies of the report to the appropriate Senate and House committees; interested Members of Congress; the Secretary of Energy; and other interested parties. We will make copies available to others on request. Major contributors to this report are listed in appendix VII. If we can be of further assistance, please contact me at (202) 512-3841. Contractor(s) NLO, Inc. Attorney (2) Paralegal/ Litigation support (7 people) Gibson, Dunn, and Crutcher Williams, Kastner, Gibbs (continued) Contract Staff - Clerk(s) Contract Staff - D.E. Clerk (continued) Helsell, Fetterman, Martin, Todd, and Hokanson (continued) Stoel, Rives, Boley, Jones, and Grey Davis, Wright, Tremaine Data not available. On October 29, 1993, the Chairman of the Subcommittee on Oversight and Investigations, House Committee on Energy and Commerce, asked us to review the Department of Energy's (DOE) expenses for outside litigation. 
After discussions with the Chairman's office, we agreed to (1) determine how much DOE was spending for litigation to defend its contractors, (2) evaluate whether adequate controls are in place to ensure that all of these costs are appropriate, and (3) assess the efforts being made by DOE to improve its controls over these outside litigation costs. To respond to this request, we met with staff in DOE's Office of General Counsel in Washington, D.C., to obtain an overall perspective on the litigation activities of the Department's various contractors, the underlying issues associated with such litigation, and the rationale for DOE's paying the costs of the contractors' litigation. Additionally, we selected and visited three of DOE's operations offices--Albuquerque, Oak Ridge, and Richland--and examined records of the litigation activities and costs incurred in each office. We selected these offices because DOE data indicated that these offices had incurred about 75 percent of the Department's expenses for contractors' litigation. To address the first objective, we discussed litigation costs with DOE headquarters and operations office officials. We discussed the types of costs associated with the litigation and the records maintained on these costs. We also obtained and reviewed data covering the period from October 1991 through May 1993 compiled by an internal DOE litigation management task force assessing the costs of litigation. To verify the data on costs for outside legal firms' services developed by the task force and to attempt to obtain complete cost data for fiscal year 1993, we examined available records at the three operations offices, including the data that were submitted to the task force, supporting documentation, and various other records detailing expenditures for outside legal firms' services. 
However, we were not able to obtain sufficient data on costs to ensure that the amounts provided to the task force were accurate or to calculate the total costs for fiscal year 1993. In addition, we discussed other costs of litigation with these DOE officials and obtained data from them detailing the costs of developing litigation data bases. We also contacted contractors and law firms responsible for developing and managing the data bases and obtained data on the costs incurred. Furthermore, we discussed in-house costs with contractor officials at all three operations offices. To address the second objective, we (1) evaluated the charges and expenses of the outside law firms engaged by the contractors and (2) assessed the process used by the contractors and DOE to review these costs. We obtained and reviewed the billings of outside law firms involved in four major class action suits: In Re: Hanford, Cook et al. v. Rockwell/Dow, In Re: Los Alamos, and Day v. NLO. We examined the supporting documentation for the various charges, and when the available data were insufficient, we contacted the contractors and/or law firms to obtain information on the rates and charges for activities, or in some cases, we visited the law firms to review documentation supporting the charges. We did not, however, obtain and examine law firms' internal documents supporting the hourly charges of individual lawyers or legal assistants. To evaluate the reasonableness of the law firms' charges and expenses, we compared these costs to the guidelines developed and used by the Federal Deposit Insurance Corporation and the Resolution Trust Corporation. These federal corporations use outside law firms to conduct much of their legal work and have had cost guidelines in place for several years to ensure that the expenses they incur for litigation are reasonable. We judged the corporations' guidelines to be an appropriate benchmark for evaluating the costs incurred by DOE. 
Additionally, we used the American Bar Association's Formal Ethics Opinion 93-379 as another guide for judging the reasonableness of the law firms' charges. Finally, we met with a litigation management consultant to obtain further guidance on reasonable and prudent costs to be paid for legal services. To assess the adequacy of the review of the law firms' billings, we discussed review procedures with each DOE operations office we visited and obtained available documentation that showed evidence of review and comment on the law firms' charges. In addition, we met with representatives of the contractors--DuPont, Martin Marietta Energy Systems, NLO, UNC, the University of California, and Westinghouse Hanford Company. We discussed review procedures by telephone with Atlantic Richfield Hanford Corporation, Dow Chemical Company, General Electric, and Rockwell International. To keep apprised of DOE's efforts to develop and implement cost controls over litigation costs, we discussed actions proposed by the agency with officials from the Office of General Counsel at DOE headquarters and the Office of Chief Counsel at the Albuquerque, Oak Ridge, and Richland operations offices. We obtained documents detailing the actions DOE intends to take to better control litigation costs and ensure more effective litigation management. Furthermore, we discussed planned procedures for auditing law firms' bills with the official responsible for this activity in DOE's Office of Inspector General. Peter Fernandez Ernie V. Limon, Jr. John E. Cass The first copy of each GAO report and testimony is free. Additional copies are $2 each. Orders should be sent to the following address, accompanied by a check or money order made out to the Superintendent of Documents, when necessary. Orders for 100 or more copies to be mailed to a single address are discounted 25 percent. U.S. General Accounting Office P.O. Box 6015 Gaithersburg, MD 20884-6015 Room 1100 700 4th St. NW (corner of 4th and G Sts. 
NW) U.S. General Accounting Office Washington, DC Orders may also be placed by calling (202) 512-6000 or by using fax number (301) 258-4066. Each day, GAO issues a list of newly available reports and testimony. To receive facsimile copies of the daily list or any list from the past 30 days, please call (301) 258-4097 using a touchtone phone. A recorded menu will provide information on how to obtain these lists. | Pursuant to a congressional request, GAO provided information on the Department of Energy's (DOE) efforts to control its litigation costs, focusing on: (1) the amount DOE spends on litigation to defend its contractors; and (2) whether DOE controls are adequate to ensure that these legal costs are appropriate. GAO found that: (1) although DOE cannot accurately determine the total amount it reimburses contractors for their outside litigation costs, preliminary findings show that in 1992, DOE spent about $40 million on its contractor litigation costs; (2) most DOE contractor legal costs are incurred through the hiring of outside law firms; (3) DOE does not have effective cost controls for reimbursing outside legal services; (4) DOE has been billed at higher rates than other federal entities for professional legal fees, travel, word processing, document duplication, and other litigation expenses because it has not effectively overseen contractor payments or developed adequate criteria that define which costs are reimbursable; and (5) DOE efforts to improve its cost controls include issuing specific cost guidelines, instituting procedures for periodically reporting all litigation costs, establishing audit functions that enable it to conduct detailed reviews of legal bills, and consolidating cases involving multiple contractors and law firms to improve case management and reduce costs. | 6,283 | 263 |
IRS administers America's tax laws and collects the revenues that fund government operations and public services. In fiscal year 2006, IRS collected more than $2.5 trillion in revenue. IRS's Taxpayer Service and Enforcement programs generate more than 96 percent of the total federal revenue collected for the U.S. government. Total federal revenues have fluctuated from roughly 16 to 21 percent of gross domestic product between 1962 and 2004. Given the amount of federal revenue collected by IRS, a disruption of IRS operations could have great impact on the U.S. economy. The IRS headquarters building is located in Washington, D.C., and houses over 2,200 of the agency's estimated 104,000 employees. The headquarters building contains the offices of IRS executive leaders, such as the Commissioner and deputy commissioners, and headquarters personnel for 14 of the agency's 17 individual business units. On June 25, 2006, the IRS headquarters building suffered flooding during a period of record rainfall and sustained extensive damage to its infrastructure. The subbasement and basement were flooded, and critical parts of the facility's electrical and mechanical equipment were destroyed or heavily damaged. The subbasement--which contained equipment such as electrical transformers, electrical switchgears, and chillers--was submerged in more than 20 feet of water. In addition, the basement level-- which housed the building's fitness center, food service canteens, computer equipment, and the basement garage--was flooded with 5 feet of water. As a result of the flood damage, the building was closed until December 8, 2006. In response to the flood and the closure of the building, IRS headquarters officials reported activating several of the agency's emergency operations plans. Over 2,000 employees normally assigned to the headquarters building were relocated to other facilities throughout the Washington, D.C., metropolitan area. 
Although the flood severely damaged the building and necessitated the relocation of IRS employees to alternate office space, particular circumstances limited potential damage and made response and recovery activities easier:

- No employees were injured, killed, or missing as a result of the flood.
- Damage was limited to the basement and subbasement levels, and employees were able to enter the building to retrieve equipment and assets 5 days following the flood.
- IRS and the General Services Administration were able to identify and allocate alternate work space to accommodate all displaced employees, not just those considered critical or essential.

According to IRS status reports following the flood, facility space was provided for critical personnel within 10 days and for all headquarters employees within 29 days. Table 1 provides a time line of activities following the flood. The Treasury Inspector General for Tax Administration also reviewed the IRS response to the flooding. According to the Inspector General's reports, IRS adequately protected sensitive data and restored computer operations to all employees approximately 1 month following the flood. In addition, the Inspector General reported that the flood caused no measurable impact on tax administration because of the nature of the work performed at this building and the contingency plans that IRS had in place. Finally, he reported that IRS paid $4.2 million in salary costs for 101,000 hours of administrative leave granted to IRS personnel following the flooding. While $3 million of that amount was paid during the first week following the flooding, the amounts paid for administrative leave decreased in subsequent weeks. IRS headquarters has multiple emergency operations plans that if activated, are intended to work in conjunction with each other during emergencies.
These plans include a suite of business continuity plans comprising, among others, a business resumption plan for each IRS business unit and an Incident Management Plan. In addition, IRS has a COOP plan for emergency events affecting IRS executive leadership and essential functions. Table 2 summarizes the IRS emergency operations plans and their purposes. FEMA developed FPC 65 to provide guidance to federal executive branch departments and agencies in developing contingency plans and programs to ensure the continuity of essential agency operations. All federal executive branch agencies are required to have such a capability in place to maintain essential government services across a wide range of all-hazard emergencies. This guidance defines the elements of a viable continuity capability for agencies to address in developing their continuity plans. Table 3 summarizes eight general elements of federal continuity guidance that agency plans should address. IRS supplemented federal guidance with sections of its Internal Revenue Manual--a document outlining the agency's organization, policies, and procedures--related to business resumption plans. Similar to the federal continuity guidance, the Internal Revenue Manual outlined minimum requirements for business resumption plans, including the need to identify people and resources to perform critical functions. The IRS headquarters emergency operations plans we reviewed collectively addressed several of the general elements of guidance identified in FPC 65. For example, the plans adequately identified the people needed to continue performing essential functions and had established procedures for activation. However, other elements were not addressed or were addressed only in part. Specifically, IRS identified two separate lists of essential functions--critical business processes and essential functions for IRS leadership--within its plans but prioritized only one of the lists.
Furthermore, although the COOP plan outlined provisions for tests, training, and exercises, neither the business resumption plans we reviewed--from Criminal Investigation (CI), Wage and Investment (W&I), and Chief Counsel--nor the Incident Management Plan outlined the need to conduct such activities. While IRS's Office of Physical Security and Emergency Preparedness provided overall guidance to business units on their business resumption plans, the guidance was inconsistent with the federal guidance on several elements, including the preparation of resources and facilities needed to support essential functions and requirements for regular tests, training, and exercises. Until IRS requires all of the plans that contribute to its ability to quickly resume essential functions to fully address federal guidance, it will lack assurance that it is adequately prepared to respond to the full range of potential disruptions. Inconsistencies between IRS's business resumption plans and federal guidance can be attributed in part to gaps in IRS internal guidance. IRS provided its business units with guidance on developing business resumption plans, including general guidance within IRS's Internal Revenue Manual and a business resumption plan template disseminated to the business units. The Internal Revenue Manual provided IRS business units with minimum requirements of elements to include in their plans, such as identifying critical personnel and resources. In addition, the Office of Physical Security and Emergency Preparedness disseminated a business resumption plan template to business units that included, among other things, sections for identifying the critical business processes and personnel to support the resumption of critical activities. IRS's internal guidance addressed several of the elements of a viable continuity capability. 
For example, the Internal Revenue Manual stated that business resumption plans should include a list of critical personnel, and the business resumption plan template asked each business unit to list its critical team leaders and members and their contact information. Similarly, the IRS guidance adequately addressed execution and resumption. For other continuity planning elements, however, IRS guidance on developing business resumption plans was inconsistent with federal guidance. Specifically, IRS guidance on resources directed business units to identify their need for vital records, systems, and equipment. However, rather than procuring those resources before an event occurs, as outlined in federal guidelines, IRS guidance assumed that business units would work with teams outlined within the Incident Management Plan to acquire those resources following a disruption. Similarly, IRS directed business units to identify alternate work space requirements for personnel, but not to prepare or acquire them until after a disruption occurs. Finally, IRS guidance did not address the need for tests, training, or exercises involving the critical personnel identified within business resumption plans. Officials from the Office of Physical Security and Emergency Preparedness stated that it was the responsibility of business units to conduct adequate tests, training, and exercises of their business resumption plans. Officials further stated that the IRS response to the June 2006 flooding validated the use of its incident command structure outlined in its Incident Management Plan. Although the incident command structure can be effective at securing needed resources over time, IRS will be able to respond to a disruption more quickly if it prepares necessary resources and facilities before an event occurs. This is especially critical in the case of business processes that need to be restored within 24 to 36 hours.
Similarly, if personnel are unfamiliar with emergency procedures because of inadequate training and exercises, the agency's response to a disruption could be delayed. IRS officials largely relied upon the Incident Management Plan to direct their response to the emergency conditions created by the June 2006 flooding. This plan guided officials in establishing roles and responsibilities for command and control of the overall resumption effort and a capability for the procurement of alternate facility space and equipment. Business unit officials were initially guided by their business resumption plans, but later response activities differed from those plans because of the circumstances resulting from the event. According to IRS headquarters officials, the headquarters COOP plan was not activated because local space availability made moving the executive leadership to the alternate COOP facility unnecessary and the safety of the leadership was not at risk. We previously reported that in responding to emergencies, roles and responsibilities for leadership must be clearly defined and effectively communicated in order to facilitate rapid and effective decision making. The IRS Incident Management Plan provided agency officials with clear leadership roles and responsibilities for managing the response and recovery process, including the procurement of temporary facility space and equipment necessary to continue critical business processes. Consistent with the plan, the Incident Commander acted as the leader of IRS headquarters response and recovery activities immediately following the flood. To assist in managing the incident, the Incident Commander activated members of the IRS Incident Management Team and other supporting sections, whose roles and responsibilities were outlined in the plan. 
These individuals included business resumption team leaders from each of the IRS business units and personnel from the central service divisions, such as Real Estate and Facilities Management and Modernization and Information Technology Services. According to minutes from Incident Management Team meetings held in the days following the flood, the following Incident Management supporting teams were activated and provided the following contributions:

1. The Operations Section, responsible for conducting response and recovery activities, gathered information regarding the facility space and equipment requests from the IRS business units, as well as preferences on alternate work location assignments.
2. The Logistics Section, responsible for providing all nonfinancial logistical support, procured and allocated facility space and equipment to IRS business units.
3. The Planning Section, responsible for providing documentation of the emergency, documented decisions and conducted reporting. For example, the Planning Section prepared documents for hearings and maintained relocation schedules and information.
4. The Finance and Administrative Section, responsible for providing all financial support, provided assistance in monitoring agency costs and developing travel and leave policies.

According to IRS status reports following the flood, facility space was provided for critical personnel within 10 days and for all headquarters employees within 29 days. The Incident Commander reported that the Incident Management Team and its supporting units stepped down approximately 2 months after the flood. The three business units we reviewed reported that their business resumption plans guided their initial responses to the flood.
In later phases of their responses, the business units departed from their plans to account for conditions at the time, such as current work priorities and the availability of alternate office space for more staff than the minimum necessary to perform the most critical functions. The following sections outline how selected business units relied on their business resumption plans when responding to the flood. Criminal Investigation (CI) used its business resumption plan to (1) establish an internal command structure to coordinate emergency activities following the flood and (2) identify short-term facility space for selected employees. According to the CI business resumption executive, the business unit used alternate facilities previously identified within the CI business resumption plan to relocate personnel within the first 2 days. CI leadership determined which personnel would be placed first and at what locations, since the unit's resumption plan did not specify such information. According to the CI business resumption executive, after learning from the Incident Commander that relocation would be for a longer period and that alternate facility space was available to accommodate all displaced CI employees, CI officials submitted a request for facility space and equipment for all of their employees to the Incident Commander and Incident Management Team. In discussing lessons learned, the CI business resumption executive acknowledged that the unit's plan primarily addressed relocation to alternate facilities for short-term emergencies rather than longer-term events like the flood, and that CI should work with IRS's central organizations to better plan for relocation in such situations. Furthermore, the executive stated that better tests and exercises of the CI plan could help the unit prepare for a wider range of future emergencies. Wage and Investment (W&I) officials used their plan to identify and prioritize critical tasks.
W&I managers gathered at a previously scheduled off-site retreat the morning following the flood and conducted a review of the business unit's resumption plan, according to the new W&I business resumption executive. The executive stated that the activity was particularly useful in addressing identified knowledge gaps in the wake of the prior W&I business resumption leader's sudden death the day before the flood. Critical business processes and supporting tasks, initially prioritized within the plan, were adjusted to reflect the criticality of several tasks at that time of year. According to the business resumption executive, the revised list of critical business processes allowed W&I managers to identify critical personnel and resources, which were submitted to the Incident Management Team as facility space and resource requests. In addition, the executive stated that W&I managers established a system for placing employees in alternate work space based on their association with the prioritized tasks, although it was not reflected in the W&I business resumption plan. W&I created a document to capture lessons learned following the flood and established an internal business resumption working group to ensure a business resumption capability in all W&I field offices. As W&I officials did not anticipate the need to readjust tasks, one item discussed in the document addressed the need to create a rolling list of critical business processes and critical personnel, as processes and tasks will vary throughout the year. In addition, the W&I business resumption working group developed minimum requirements for all W&I plans and conducted a gap analysis of field office plans to identify areas for improvement. According to the W&I business resumption executive, the working group will conduct a training session for field office business resumption coordinators after the 2007 filing season. 
Although Chief Counsel's resumption efforts were led by people identified within its plan, the unit's business resumption officials reported that use of the plan was limited because of the high-level content of the document. According to the Chief Counsel's business resumption executive, the plan was written at a high level because it was expected that specific priorities would be determined by the active caseload at the time of the emergency. The executive stated that following the flood, Chief Counsel prioritized resumption activities based on the active caseload and the need to address emerging requirements, such as (1) ensuring that mail addressed to the business unit's processing division was rerouted and processed at another facility and (2) supporting a specific court case being conducted in New York City because of its level of criticality and time sensitivity. The executive further stated that officials identified alternate work space in Chief Counsel offices in the Washington, D.C., metropolitan area and placed approximately 180 employees, prioritized based on the organizational hierarchy. Chief Counsel submitted requests to the Incident Commander and Incident Management Team for facility space and resources for over 500 remaining employees. Although Chief Counsel was able to identify tasks, such as tax litigation, that were consistent with responsibilities outlined in its plan and procured facility space and resources for personnel, it established a task force that, in a report documenting lessons learned following the flood, identified recommendations to improve the business unit's plan. Recommendations included measures to improve the prioritization of critical functions and people and to outline provisions for mail processing.
In addition, because Chief Counsel experienced delays in recovering a computer server that had not been identified in the business resumption plan but proved to be important following the flood, the task force addressed the need to ensure redundancy of information technology equipment. Chief Counsel is currently drafting an action plan to carry out the recommendations of the task force. In addition, a Chief Counsel business resumption official stated that agencywide tests and exercises of business resumption plans could assist in better integration of emergency efforts for a wider range of future emergencies. According to IRS headquarters officials, the headquarters COOP plan was not activated because local space availability made movement of executive leadership to the alternate COOP facility unnecessary and the safety of the leadership was not at risk. When the June 2006 flood occurred at the IRS headquarters building, the agency had in place a suite of emergency plans that helped guide its response. The agency's Incident Management Plan was particularly useful in establishing clear lines of authority and communications, conditions that we have previously reported to be critical to an effective emergency response. Unit-level business resumption plans we reviewed contributed to a lesser extent and the headquarters COOP plan was not activated because of conditions particular to the 2006 flood. Specifically, damage to the building was limited to the basement and subbasement levels, and employees were able to enter the building to retrieve equipment and assets. In addition, alternate work space was available for all employees within a relatively short period, reducing the importance of identifying critical personnel. Such conditions, however, may not be present during future disruptions. The plans IRS had in place at the time of the flood did not address all of the elements outlined in federal continuity guidance. 
In particular, the IRS plans did not (1) prioritize all essential functions and set targets for recovery times; (2) outline the preparation of resources and alternate facilities necessary to perform those functions; and (3) develop provisions for tests, training, and exercises of all of its plans. In discussions on lessons learned from the flood response, IRS business unit officials recognized the need to incorporate many of these elements. Unless IRS addresses these gaps, it will have limited assurance that it will be prepared to continue essential functions following a disruption more severe than the 2006 flood. To strengthen the ability of IRS to respond to the full range of potential disruptions to essential operations, we are making two recommendations to the Commissioner of Internal Revenue:

1. Revise IRS internal emergency planning guidance to fully reflect federal guidance on the elements of a viable continuity capability, including the identification and prioritization of essential functions; the preparation of necessary resources and alternate facilities; and the regular completion of tests, training, and exercises of continuity capabilities.

2. Revise IRS emergency plans in accordance with the new internal guidance.

The Commissioner of Internal Revenue provided comments on a draft of this report in a March 26, 2007, letter, which is reprinted in appendix II. The Commissioner agreed with our recommendations. His letter notes that the agency is actively committed to improving its processes. Specifically, the agency will (1) conduct a thorough gap analysis between FPC 65 elements and business continuity planning guidance; (2) update the Internal Revenue Manual guidance and business resumption plan templates to reflect areas of improvement resulting from the gap analysis; and (3) formally direct annual tests, training, and exercises of business resumption plans through the agency's Emergency Management and Preparedness Steering Committee.
Finally, the Commissioner stated that the agency will revise and implement its emergency plans based on the results of the aforementioned activities. As agreed with your staff, unless you publicly announce its contents earlier, we plan no further distribution of this report until 30 days after its date. At that time, we will send copies of this report to the Secretary of the Treasury, the Commissioner of Internal Revenue, and other interested parties. This report will also be available at no charge on the GAO Web site at http://www.gao.gov. Should you or your staff have questions on matters discussed in this report, please contact Bernice Steinhardt at (202) 512-6543 or [email protected], or Linda Koontz at (202) 512-6240 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. Key contributions to this report were made by William Doherty, Assistant Director; James R. Sweetman, Jr., Assistant Director; Thomas Beall; Michaela Brown; Terrell Dorn; Nick Marinos; and Nhi Nguyen. The objectives of this report were to evaluate how the Internal Revenue Service's (IRS) emergency operations plans address federal guidance related to continuity planning and evaluate the extent to which IRS emergency operations plans contributed to the actions taken by IRS officials in response to the flood. To address how IRS emergency operations plans address federal guidance related to continuity planning, we obtained the IRS headquarters emergency operations plans that were available to agency officials at the time of the June 2006 flood. These included the Continuity of Operations (COOP) plan and a suite of business continuity plans, including the Incident Management Plan and 13 business resumption plans from business units affected by the flood. 
Although we also obtained the headquarters Occupant Emergency Plan, we did not evaluate its contributions to addressing the elements because its purpose is limited to outlining procedures for building occupants and emergency personnel in responding to threats that require building evacuations or shelter in place. We did not obtain the Disaster Recovery Plan, a contingency plan for the recovery of information technology equipment, because recovery of information technology equipment was addressed in a report from the Treasury Inspector General for Tax Administration. To evaluate IRS's emergency operations plans in relation to federal guidance on continuity planning, we analyzed Federal Preparedness Circular (FPC) 65 to identify the elements needed to ensure the continuity of essential functions and compared IRS emergency operations plans to the resulting generalized list. Because FPC 65 covers all-hazard emergencies but provides continuity guidance specifically for agency COOP plans, we developed the general elements of guidance to be able to collectively evaluate all IRS emergency operations plans we obtained. From our analysis of FPC 65, we identified eight general elements of guidance related to developing a viable continuity capability. See table 3 for a listing and description of the elements. We reviewed IRS's plans and analyzed how they collectively addressed or did not address these eight general elements of guidance. We also reviewed IRS-defined criteria and guidance for emergency operations plans, including sections of the Internal Revenue Manual--which provides guidance to IRS officials on developing several of the agency's emergency operations plans--and an internal template to guide plan development, provided by IRS's Office of Physical Security and Emergency Preparedness, which is responsible for agencywide emergency planning and policy.
Since each business unit within IRS headquarters has an individual plan for business resumption activities, we selected and examined 3 of 13 business resumption plans available for use during the flood from the 3 business units with the most employees affected by the flooding in the headquarters building. According to employee relocation lists from IRS following the flood, the 3 largest business units in the building are Criminal Investigation, Wage and Investment, and Chief Counsel, which collectively represent over 50 percent of the headquarters building employees. To address the extent to which IRS emergency operations plans contributed to the actions taken by IRS officials in response to the flood, we interviewed IRS officials responsible for the development, oversight, and implementation of the headquarters emergency operations plans. In our interviews, we asked IRS officials responsible for each emergency operations plan how the general elements identified in their respective plans guided their actions following the flood, if at all. To supplement the information gained from the interviews, we reviewed agency documentation related to emergency operations activities following the flood, including IRS status reports, employee relocation lists, and emergency operations team meeting minutes. In addition, we reviewed documentation regarding lessons learned from the flood, provided by various headquarters business units, and obtained any updates or changes to emergency operations plans following the flood. We conducted our review in accordance with generally accepted government auditing standards from July 2006 through March 2007.
Mr. Chairman and Members of the Subcommittee: We are pleased to be here today to participate in the Subcommittee's oversight hearing on the U.S. Postal Service. My testimony will (1) focus on the performance of the Postal Service and the need for improving internal controls and protecting revenue in an organization that takes in and spends billions of dollars each year and (2) highlight some of the key reform and oversight issues that continue to challenge the Postal Service and Congress as they consider how U.S. mail service will be provided in the future. I will also provide some observations from our ongoing work relating to labor-management relations at the Postal Service and other areas. My testimony is based on our ongoing work and work that we completed over the past year. First, I would like to discuss both the reported successes and some of the remaining areas of concern related to the Postal Service's performance. Last year, the Postal Service reported that it had achieved outstanding financial and operational performance. Financially, the Postal Service had the second most profitable year in its history. According to the Postal Service's 1996 annual report, its fiscal year 1996 net income was $1.6 billion. Similarly, with regard to mail delivery service, the Postal Service continued to meet or exceed its goals for on-time delivery of overnight mail. Most recently, the Postmaster General announced that, during 1996, the Postal Service delivered 91 percent of overnight mail on time or better. Additionally, during fiscal year 1996, the Postal Service's volume exceeded 182 billion pieces of mail and generated more than $56 billion in revenue. While these results are encouraging, other performance data suggest that some areas of concern warrant closer scrutiny. For example, last year's delivery of 2-day and 3-day mail--at 80 and 83 percent respectively--did not score as high as overnight delivery. 
Such performance has raised a concern among some customers that the Postal Service's emphasis on overnight delivery is at the expense of 2-day and 3-day mail. Additionally, although its mail volume continues to grow, the Postal Service is concerned that customers increasingly are turning to its competitors or alternative communications methods. In 1996, mail volume increased by only about one-half of the anticipated increase. The Postal Service's annual report also showed that its 1996 operating expenses increased 4.7 percent, compared to a 3.9 percent increase in operating revenues. Labor costs, which include pay and benefits, continued to account for almost 80 percent of the Postal Service's operating expenses, and the Postal Service expects that its costs for compensation and benefits will grow more than 6 percent in 1997. Moreover, controlling costs will be critical with regard to capital investments in 1997, as the Postal Service plans to commit $6 billion to capital improvements. Over the next 5 years, the Service plans to devote more than $14 billion in capital investments to technology, infrastructure improvements, and customer service and revenue initiatives. The Postal Service's continued success in both operational and financial performance will depend heavily on its ability to control operating costs, strengthen internal controls, and ensure the integrity of its services. However, we found several weaknesses in the Postal Service's internal controls that unnecessarily increased costs. We reported in October 1996 that internal controls over Express Mail Corporate Accounts (EMCA) were weak or nonexistent, which resulted in the potential for abuse and increasing revenue losses over the past 3 fiscal years. Specifically, we found that some mailers obtained express mail services using invalid EMCAs and that the Postal Service did not collect the postage due.
Consequently, in fiscal year 1995, the Postal Service lost express mail revenue of about $800,000, primarily because it had not verified EMCAs that were later determined to be invalid. Since our report was issued, the Postal Service has developed plans to address these deficiencies. The Postal Service is revising its regulations to require an initial deposit of $250, up from $100, to open an EMCA. It also plans to issue a memorandum requiring that district managers ensure that employees perform the necessary express mail acceptance checks so that the correct postage amounts can be collected. Finally, the Postal Service plans to install terminals in mail processing plants to allow Express Mail packages that are deposited in collection boxes or picked up at customers' locations to be checked for valid EMCA numbers before they are accepted into the mail system. Similarly, we reported in June 1996 that weaknesses in the Postal Service's controls for accepting bulk business mail prevented it from having reasonable assurance that all significant amounts of postage revenue due are received when mailers claim presort/barcode discounts. We reported that during fiscal year 1994, as much as 40 percent of required bulk mail verifications were not performed. Bulk mail totaled almost one-half of the Postal Service's total revenue of $47.7 billion in fiscal year 1994. At the same time, we found that less than 50 percent of the required follow-up verifications to determine the accuracy of the clerks' work were being performed by the supervisors. In response to our recommendations, the Postal Service is developing new internal controls and strengthening existing ones to help prevent revenue losses in bulk mailings. For example, the Postal Service plans to improve the processes used in verification of mail, including how units are staffed, how verifications are performed, and how results of acceptance work are reported and reviewed.
To avoid additional unwarranted costs, the Postal Service also needs to better ensure the overall integrity of its acquisitions and services. We concluded, in our January 1996 report, that the Postal Service did not follow required procedures for seven real estate or equipment purchases. We estimated that these seven purchases resulted in the Postal Service's expending about $89 million on penalties and on unusable or marginally usable property. Three of the seven purchases involved ethics violations arising from the contracting officers' failure to correct situations in which individuals had financial relationships with the Postal Service and with certain offerors. We also pointed out that the Office of Government Ethics was reviewing the Postal Service's ethics program and reported that all areas of the program required improvement. The Office of Government Ethics subsequently made a number of recommendations designed to ensure that improvement of the Postal Service's ethics program continues through more consistent oversight and management support. The Postal Service took actions to address these recommendations, and as a result, the Office of Government Ethics closed its remaining open recommendations. Additionally, strengthening program oversight is essential to effective mail delivery. We found that the Postal Service did not exercise adequate oversight of its National Change of Address (NCOA) program. We reported that the Postal Service took a positive step toward dealing with the inefficiencies of processing misaddressed mail. However, at the same time, we found that the NCOA program was operating without clear procedures and sufficient oversight to ensure that the program was operating in compliance with the privacy provisions of federal laws. Accordingly, we recommended that the Postal Service strengthen oversight of NCOA by developing and implementing written oversight procedures. In response to our recommendation, the Postal Service developed written oversight procedures for the NCOA program.
Most recently, we issued a report that describes how the Postal Service closes post offices and provides information on the number closed since 1970--over 3,900 post offices. We also provided information on the number of appeals and their dispositions, as well as some information about the communities where post offices were closed in fiscal years 1995 and 1996. Generally, the Postal Service initiated the closing process after a postmaster vacancy occurred through retirement, transfer or promotion, or after the termination of the post office building's lease. In each case, the Postal Service proposed less costly alternative postal services to the affected community, such as establishing a community post office operated by a contractor or providing postal deliveries through rural routes and cluster boxes. In our September 1996 report on changing restrictions on private letter delivery, we emphasized the importance of recognizing the Private Express Statutes' underlying purpose and determining how changes may affect universal mail service and uniform rates. Most important among the potential consequences is that relaxing the Statutes could open First-Class mail services to additional competition, thus possibly affecting postal revenues and rates and the Postal Service's ability to carry out its public service mandates. However, at the same time, the American public could benefit through improved service. It will be important to take into account the possible consequences for all stakeholders in deciding how mail services will be provided to the American public in the future. Another key reform issue is the future role of the Postal Service in the constantly changing and increasingly competitive communications market. For example, the use of alternative communications methods such as electronic mail, faxes, and the Internet continues to grow at phenomenal rates in the United States and is beginning to affect the Postal Service's markets.
At the same time, the Postal Service's competitors continue to challenge it for major shares of the communications market. According to the Postmaster General, the Postal Service has been losing market share in five of its six product lines. It seems reasonable to assume that these alternative communications methods are likely to be used more and more. In addition, international mail has become an increasingly vital market in which the Postal Service competes. In our March 1996 report, we pointed out that, although the Postal Service has more flexibility in setting international rates, it still lost business to competitors because rates were not competitive and delivery service was not reliable. We also identified several issues surrounding the Postal Service's role in the international mail arena that remain unresolved. Chief among them is the appropriateness of the Postal Service's pricing practices in setting rates for international mail services. Our March 1997 report on postal reform in Canada examined the universal service and rate making of Canada Post Corporation (CPC), including the frequency of mail delivery to some businesses, as well as in urban and rural areas. CPC uses a regulatory rate-making process that includes the opportunity for public comment and government approval for basic domestic and international single-piece letters. However, postage rates for other mail services can be approved by CPC without issuing regulations or obtaining government approval. Some of the key concerns that have been raised by CPC customers include CPC's closure of rural post offices and its conversion of others to private ownership. In addition, CPC's competitors have expressed concern about whether CPC is cross-subsidizing the prices of its courier services with monopoly revenues. The Canadian government has responded to these concerns by continuing its moratorium on post office closings and directing CPC to discontinue delivery of unaddressed advertising mail. The government is also considering a call for additional government oversight of CPC. Mr.
Chairman, as you are aware, we also have a number of ongoing reviews related to postal reform. For example, in concert with your focus on the future role of the Postal Service, we are currently reviewing the role and structure of the Postal Service's Board of Governors in order to determine its strengths and weaknesses. The Board of Governors is responsible for directing and controlling the expenditures of the Postal Service, reviewing its practices, participating in long-range planning, and setting policies on all postal matters. In addition to obtaining the views of current and former Board members, we will provide information on the role and structure of Boards in other types of government-created organizations. Another issue important to postal reform that we are reviewing involves access to mailboxes. More specifically, we plan to provide information on (1) public opinions on the issue of mailbox restrictions; (2) views of the Postal Service and other major stakeholders; and (3) this country's experience with mailbox security and enforcement of related laws, compared with the experiences in selected other countries. Our ongoing work also indicates that long-standing labor-management relations problems persist despite the initiatives that have been established to address them. For example, the number of grievances requiring formal arbitration has increased almost 76 percent, from about 51,000 in fiscal year 1993 to over 90,000 in fiscal year 1996. These difficulties continue to plague the Service primarily because the major postal stakeholders (the Postal Service, four major unions, and three management associations) cannot agree on common approaches for addressing their problems. We continue to believe that until the major postal stakeholders develop a framework agreement that would outline common objectives and strategies, efforts to improve labor-management relations will likely continue to be fragmented and difficult to sustain.
The Government Performance and Results Act (GPRA) provides a mechanism that may be useful in focusing a dialogue that could lead to a framework agreement. GPRA provides a legislatively based mechanism for the major stakeholders, including Congress, to jointly engage in discussions that focus on an agency's mission and on establishing goals, measuring performance, and reporting on mission-related accomplishments. GPRA can be instrumental to the Postal Service's efforts to better define its current and future role. As results-oriented goals are established to achieve that role, the related discussions can also provide a foundation for a framework agreement. Successful labor-management relations will be critical to achieving the Postal Service's goals. The Postal Service and Congress will need results-oriented goals and sound performance information to most effectively address some of the policy issues that surround the Postal Service's performance in a dynamic communications market. Recognizing that the changes envisioned by GPRA do not come quickly or easily, sustained oversight by the Postal Service and Congress will be necessary. the residential and business levels will continue to be a critical area as the Postal Service strives to improve customer service in order to remain competitive. The Postal Service has made considerable progress in improving its financial and operational performance. Sustaining this progress will be dependent upon ensuring that the key issues we identified, such as controlling costs, protecting revenues, and clarifying the role of the Postal Service in an increasingly competitive communications market, are effectively addressed by the Postal Service and Congress. Mr. Chairman, this concludes my prepared statement. I have attached a list of our Postal Service products issued since January 1996. I would be pleased to respond to any questions you or members of the Subcommittee may have. U.S. 
Postal Service: Information on Post Office Closures, Appeals, and Affected Communities (GAO/GGD-97-38BR, Mar. 11, 1997). Postal Reform in Canada: Canada Post Corporation's Universal Service and Ratemaking (GAO/GGD-97-45BR, Mar. 5, 1997). U.S. Postal Service: Revenue Losses From Express Mail Accounts Have Grown (GAO/GGD-97-3, Oct. 24, 1996). Postal Service: Controls Over Postage Meters (GAO/GGD-96-194R, Sept. 26, 1996). Inspector General: Comparison of Certain Activities of the Postal IG and Other IGs (GAO/AIMD-96-150, Sept. 20, 1996). Postal Service Reform: Issues Relevant to Changing Restrictions on Private Letter Delivery (GAO/GGD-96-129A/B, Sept. 12, 1996). U.S. Postal Service: Improved Oversight Needed to Protect Privacy of Address Changes (GAO/GGD-96-119, Aug. 13, 1996). U.S. Postal Service: Stronger Mail Acceptance Controls Could Help Prevent Revenue Losses (GAO/GGD-96-126, June 25, 1996). U.S. Postal Service: Unresolved Issues in the International Mail Market (GAO/GGD-96-51, Mar. 11, 1996). Postal Service: Conditions Leading to Problems in Some Major Purchases (GAO/GGD-96-59, Jan. 18, 1996). 
| GAO discussed the challenges that confront the Postal Service and Congress as they consider how to sustain the Postal Service's performance and maintain a competitive role in providing mail service to the American public in the future. GAO noted that: (1) the Postal Service reported that fiscal year (FY) 1996 represented the second year in a row that its financial performance was profitable and operational performance improved; (2) the Postal Service's 1996 net income was $1.6 billion and it delivered 91 percent of overnight mail on time; (3) additionally, for FY 1996, the Postal Service's volume exceeded 182 billion pieces of mail and generated more than $56 billion in revenue; (4) while these results are encouraging, other performance data suggest that some areas warrant closer scrutiny; (5) last year's delivery of 2-day and 3-day mail, at 80 and 83 percent respectively, did not score as high as overnight delivery; (6) the concern among customers is that the Postal Service's emphasis on overnight delivery is at the expense of 2-day and 3-day mail; (7) additionally, although its mail volume continues to grow, the Postal Service is concerned that customers increasingly are turning to its competitors or alternative communications methods; (8) in 1996, mail volume increased by only about one-half of the anticipated amount; (9) containing costs is another key challenge that GAO has reported on previously; (10) GAO has also found several weaknesses in the Postal Service's internal controls that contributed to increased costs; (11) the Postal Service's continued success in both financial and operational performance will depend heavily on controlling operating costs, strengthening internal controls, and ensuring the integrity of its services; (12) the prospect that pending postal legislation may place the Postal Service in a more competitive arena with its private sector counterparts has prompted 
congressional consideration of some key reform issues; (13) these issues include how proposed changes to the Private Express Statutes may affect universal mail service, postal revenues, and rates; (14) another reform issue is the future role of the Postal Service in an increasingly competitive, constantly changing communications market; (15) congressional oversight remains a key tool for improving the organizational performance of the Postal Service; (16) one of the most important areas for oversight is labor-management relations; (17) despite the initiatives that have been established to address them, the long-standing labor-management relations problems GAO identified in 1994 remain unresolved; and (18) also, the Postal Service's automation efforts will continue to require the attention of both the Postal Service and Congress to ensure that increased productivity and an adequate return on investments are realized. | 3,545 | 550 |
The Recovery Act appropriated $4 billion for the Clean Water SRF program. This funding represents a significant increase compared with federal funds awarded as annual appropriations to the SRF program in recent years. From fiscal years 2000 through 2009, annual appropriations averaged about $1.1 billion for the Clean Water SRF program. Established in 1987, EPA's Clean Water SRF program provides states and local communities with independent and permanent sources of subsidized financial assistance, such as low- or no-interest loans for projects that protect or improve water quality and that are needed to comply with federal water quality regulations. In addition to providing increased funds, the Recovery Act included some new requirements for the SRF programs. For example, states were required to have all Recovery Act funds awarded to projects under contract within 1 year of enactment--a deadline of February 17, 2010--and EPA was directed to reallocate any funds not under contract by that date. In addition, under the Recovery Act, states were to give priority to projects that were ready to proceed to construction within 12 months of enactment. States were also required to use at least 20 percent of funds as a "green reserve" to provide assistance for green infrastructure projects, water or energy efficiency improvements, or other environmentally innovative activities. Further, states were required to use at least 50 percent of Recovery Act funds to provide assistance in the form of, for example, principal forgiveness or grants. These types of assistance are referred to as additional subsidization and are more generous than the low- or no-interest loans that the Clean Water SRF programs generally provide. The 14 states we reviewed for the Clean Water SRF program met all Recovery Act requirements specific to the Clean Water SRF. 
Specifically, the states we reviewed had all projects under contract by the 1-year deadline and also took steps to give priority to projects that were ready to proceed to construction within 12 months of enactment of the Recovery Act. Eighty-seven percent of Clean Water SRF projects were under construction within 12 months of enactment. In addition, the 14 Clean Water SRFs we reviewed exceeded the 20 percent green reserve requirement, using 29 percent of Recovery Act SRF funds in these states to provide assistance for projects that met EPA criteria for the green reserve. These states also met or exceeded the 50 percent additional subsidization requirement; overall, the 14 states distributed a total of 79 percent of Recovery Act Clean Water SRF funds as additional subsidization. SRF officials in most of the states we reviewed said that they faced challenges in meeting Recovery Act requirements, especially the 1-year contracting deadline. Under the base program, it could take up to several years from the time funds are awarded until the loan agreement is signed, according to EPA officials. Some SRF officials told us that the compressed time frame imposed by the Recovery Act posed challenges and that their workloads increased significantly as a result of the 1-year deadline. Among the factors affecting workload are the following: Reviewing applications for Recovery Act funds was burdensome. Officials in some states said that the number of applications increased significantly, in some cases more than doubling compared with prior years, and that reviewing these applications was a challenge. For example, New Jersey received twice as many applications as in past years, according to SRF officials in that state. Explaining new Recovery Act requirements was time-consuming. 
Because projects that receive any Recovery Act funds must comply with Buy American requirements and Davis-Bacon wage requirements, state SRF officials had to take additional steps to ensure that both applicants for Recovery Act funds and those awarded Recovery Act funds understood these requirements. Applicants and subrecipients required additional support. Many states took steps to target Recovery Act funds to new recipients, including nontraditional recipients of Clean Water SRF funds, such as disadvantaged communities. According to SRF officials in some states, new applicants and subrecipients required additional support in complying with SRF program and Recovery Act requirements. In the states we reviewed, nearly half of Clean Water SRF subrecipients had not previously received assistance through that program. Project costs were difficult to predict. Officials in some states told us that actual costs were lower than estimated for many projects awarded Recovery Act funds and, as a result, some states had to scramble to ensure that all Recovery Act funds were under contract by the 1-year deadline. For example, in January 2010, officials from Florida's SRF programs told us that a few contracts for Recovery Act-funded projects in the state had come in below their original project cost estimates, and that this was likely to be the program staff's largest concern as the deadline approached. However, lower estimates also allowed some states to undertake additional projects that they would otherwise have been unable to fund with the Recovery Act funding. States used a variety of techniques to address these workload concerns and meet the 1-year contracting deadline, according to state SRF officials with whom we spoke. Some states hired additional staff to help administer the SRF programs, although SRF officials in other states told us that they were unable to do so because of resource constraints. 
For example, New Jersey hired contractors to help administer the state's base Clean Water SRF funds, allowing experienced staff to focus on meeting Recovery Act requirements, according to SRF officials in that state. Moreover, some states hired contractors to provide assistance to both applicants and subrecipients. For example, California hired contractors--including the Rural Community Assistance Corporation--to help communities apply for Recovery Act funds. Furthermore, states took steps to ensure that they would have all Recovery Act funds under contract even if projects dropped out because of Recovery Act requirements or time frames. For example, most of the states we reviewed awarded a combination of Recovery Act and base funds to projects to allow for more flexibility in shifting Recovery Act funds among projects. States also used a variety of techniques to ensure that they would meet the green reserve requirement. For example, some of the states we reviewed conducted outreach to communities and nonprofit organizations to solicit applications for green projects. Moreover, to make green projects more attractive to communities, some states offered additional subsidization to all green projects or relied on a small number of high-cost green projects to meet the requirement. For example, Mississippi officials told us that the state funded three large energy efficiency projects that helped the state's Clean Water SRF program meet the green reserve requirement. The 14 states we reviewed distributed nearly $2 billion in Recovery Act funds among 890 water projects through their Clean Water SRF program. These states took a variety of approaches to distributing funds. For example, four states distributed at least 95 percent of Recovery Act funds as additional subsidization, while three other states distributed only 50 percent as additional subsidization, the smallest amount permitted under the Recovery Act. 
Overall, these 14 states distributed approximately 79 percent of Clean Water SRF Recovery Act funds as additional subsidization, with most of the remaining funds provided as low- or no-interest loans that will recycle back into the programs as subrecipients repay their loans. As the funds are repaid, they can then be used to provide assistance to SRF recipients in the future. Furthermore, states varied in the number of projects they chose to fund. For example, Ohio distributed approximately $221 million among 274 Clean Water SRF projects, while Texas distributed more than $172 million among 21 projects. Some states funded more projects than originally anticipated because other projects were less costly than expected, according to officials. For example, Texas was able to provide funds for two additional clean water projects because costs--especially material costs--were lower than anticipated for other projects. States we reviewed used at least 40 percent of Recovery Act Clean Water SRF project funds ($787 million) to provide assistance for projects that serve disadvantaged communities. Most of the states we reviewed took steps to target some or all Recovery Act funds to these low-income communities, generally by considering a community's median household income when selecting projects and determining which projects would receive additional subsidization in the form of principal forgiveness, negative-interest loans, or grants. According to state officials from nine Clean Water SRF programs, 50 percent of all projects funded by those states' SRF programs serve disadvantaged communities, and all of these disadvantaged communities were provided with additional subsidization. SRF officials in some states told us that Recovery Act funds--especially in the form of additional subsidization--have provided significant benefits to disadvantaged communities in their states. 
For example, according to officials from California's Clean Water SRF program, that state used funds to provide assistance for 25 wastewater projects that serve disadvantaged communities, and approximately half of these projects would not have gone forward as quickly or at all without additional subsidization. Officials from the City of Fresno confirmed that one of these projects--which will replace septic systems with connections to the city's sewer systems in two disadvantaged communities--would not have gone forward without additional subsidization. Local officials told us that this project will decrease the amount of nitrates in the region's groundwater, which is the source of the city's drinking water. The Clean Water SRF programs from the 14 states we reviewed used Recovery Act funds to provide assistance for 890 projects that will meet a variety of local needs. Figure 1 shows how the 14 states distributed Recovery Act funds across various clean water categories. In the states we reviewed, the Clean Water SRF programs used more than 70 percent of Recovery Act project funds to provide assistance for projects in the following categories: Secondary treatment and advanced treatment. States we reviewed used nearly half of all Recovery Act project funds to support wastewater infrastructure intended to meet or exceed EPA's secondary treatment standards for wastewater treatment facilities. Projects intended to achieve compliance with these standards are referred to as secondary treatment projects, while projects intended to exceed compliance with these standards are referred to as advanced treatment projects. For example, Massachusetts' Clean Water SRF program awarded over $2 million in Recovery Act funds to provide upgrades intended to help the City of Leominster's secondary wastewater treatment facility achieve compliance with EPA's discharge limits for phosphorous. Sanitary sewer overflow and combined sewer overflow. 
States we reviewed used about 25 percent of Recovery Act project funds to support efforts to prevent or mitigate discharges of untreated wastewater into nearby water bodies. Such sewer overflows, which can occur as a result of inclement weather, can pose significant public health and pollution problems, according to EPA. For example, Pennsylvania used 56 percent of project funds to address sewer overflows from municipal sanitary sewer systems and combined sewer systems. In another example, Iowa's Clean Water SRF program used Recovery Act funds to help the City of Garwin implement sanitary sewer improvements. Officials from that city told us that during heavy rains, untreated water has bypassed the city's pump station and backed up into basements of homes and businesses, and that the city expects all backups to be eliminated as a result of planned improvements. In addition to funding conventional wastewater treatment projects, 9 of the 14 Clean Water SRF programs we reviewed used Recovery Act funds to provide assistance for projects intended to address nonpoint source pollution--projects intended to protect or improve water quality by, for example, controlling runoff from city streets and agricultural areas. The Clean Water SRF programs we reviewed used 8 percent of project funds to support these nonpoint source projects, but nonpoint source projects account for 20 percent (179 out of 890) of all projects. A large number of these projects--131 out of 179--were initiated by California or Ohio. For example, California used Recovery Act funds to provide assistance for the Tomales Bay Wetland Restoration and Monitoring Program, which restores wetlands that had been converted into a dairy farm. Figure 2 shows the number of projects that fall into various clean water categories. Of the 890 projects awarded Recovery Act funds by the Clean Water SRF programs in the states we reviewed, more than one-third (312) address the green reserve requirement. 
Of these green projects, 289 (93 percent) were awarded additional subsidization. Figure 3 shows the number of projects that fall into each of the four green reserve categories included in the Recovery Act. Many of these projects are intended to improve energy efficiency and are expected to result in long-term cost savings for some communities as a result of these improvements. For example, the Massachusetts Water Resources Authority is using Recovery Act funds provided through that state's Clean Water SRF program to help construct a wind turbine at the DeLauri Pump Station, and the Authority estimates that, as a result of this wind turbine, more than $350,000 each year in electricity purchases will be avoided. Furthermore, some projects provide green alternatives for infrastructure improvements. For example, New York's Clean Water SRF program provided Recovery Act funds to help construct a park designed to naturally filter stormwater runoff and reduce the amount of stormwater that enters New York City's sewers. More than half of the city's sewers are combined sewers, and during heavy rains, sewage sometimes discharges into Paerdagat Basin, which feeds into Jamaica Bay. EPA has modified its existing oversight of state SRF programs by planning additional performance reviews beyond the annual reviews it is already conducting, but these reviews do not include an examination of state subrecipient monitoring procedures. Specifically, EPA is conducting midyear and end-of-year Recovery Act reviews in fiscal year 2010 to assess how each state is meeting Recovery Act requirements. As part of these reviews, EPA has modified its annual review checklist to incorporate elements that address the Recovery Act requirements. Further, EPA officials will review four project files in each state for compliance with Recovery Act requirements and four federal disbursements to the state to help ensure erroneous payments are not occurring. 
According to EPA officials, through these added reviews, EPA is providing additional scrutiny over how states are using the Recovery Act funds and meeting Recovery Act requirements as compared with base program funds. As of May 14, 2010, EPA had completed field work for its midyear Recovery Act reviews in 13 of the states we reviewed and completed final reports for 3 of these states (Iowa, Ohio, and Pennsylvania). EPA has plans to begin field work in the final state at the end of May 2010. Although the frequency of reviews has increased, these reviews do not examine state subrecipient monitoring procedures. In 2008, the EPA Office of Inspector General (OIG) examined state SRF programs' compliance with subrecipient monitoring requirements of the Single Audit Act and found that states complied with the subrecipient monitoring requirements but that EPA's annual review process did not address state subrecipient monitoring procedures. The OIG suggested that EPA include a review of how states monitor borrowers as part of its annual review procedures. EPA officials told us that they agreed with the idea to include a review of subrecipient monitoring procedures as part of the annual review but have not had time to implement this suggestion because EPA's SRF program officials have focused most of their attention on the Recovery Act since the OIG published its report. EPA officials also told us that they believe the reviews of project files and federal disbursements could identify weaknesses in financial controls, such as weaknesses in subrecipient monitoring procedures. These reviews occur as part of the Recovery Act review and aim to assess a project's compliance with Recovery Act requirements and help ensure that no erroneous payments are occurring. In terms of state oversight of subrecipients, EPA has not established new subrecipient monitoring requirements for Recovery Act-funded projects, according to EPA officials. 
Under the base Clean Water SRF program, EPA gives states a high degree of flexibility to operate their SRF programs based on each state's unique needs and circumstances in accordance with federal and state laws and requirements. According to EPA officials, although EPA has established minimum requirements for subrecipient monitoring, such as requiring states to review reimbursement requests, states are allowed to determine their own subrecipient monitoring procedures, including the frequency of project site inspections. While EPA has not deviated from this approach with regard to monitoring Recovery Act-funded projects, it has provided states with voluntary tools and guidance to help with monitoring efforts. For example, EPA provided states with an optional inspection checklist to help states evaluate a subrecipient's compliance with Recovery Act requirements, such as the Buy American and job reporting requirements. EPA has also provided training for states on the Recovery Act requirements. For example, as of May 14, 2010, EPA has made available 11 on-line training sessions (i.e., webcasts) for state officials in all states to help them understand the Recovery Act requirements. EPA has also provided four workshops with on-site training on its inspection checklist for state officials in California, Louisiana, New Mexico, and Puerto Rico. Although EPA has not required that states change their subrecipient oversight approach, many states have expanded their existing monitoring procedures in a variety of ways. 
However, the oversight procedures may not be sufficient given that (1) federal funds awarded to each state under the Recovery Act have increased compared with average annual awards; (2) all Recovery Act projects had to be ready to proceed to construction more quickly than projects funded with base SRF funds; and (3) EPA and states had little previous experience with some of the Recovery Act's new requirements, such as Buy American provisions, according to EPA officials. The following are ways in which oversight procedures may not be sufficient: Review procedures for job data. According to OMB guidance on Recovery Act reporting, states should establish internal controls to ensure data quality, completeness, accuracy, and timely reporting of all amounts funded by the Recovery Act. We found that most states we reviewed had not developed review procedures to verify the accuracy of job figures reported by subrecipients using supporting documentation, such as certified payroll records. As a result, states may be unable to verify the accuracy of these figures. For example, Mississippi SRF officials told us that they do not have the resources to validate the job counts reported by comparing them against certified payroll records. In addition, during interviews with some subrecipients, we found inconsistencies among subrecipients on the types of hours that should be included and the extent to which they verified job data submitted to them by contractors. For example, in New Jersey one subrecipient told us that they included hours worked by the project engineer in the job counts, while another subrecipient did not. Review procedures for loan disbursements. According to EPA officials, the agency requires states to verify that all loan payments and construction reimbursements are for eligible program costs. 
In addition, according to EPA guidance, states often involve technical staff who are directly involved in construction inspections to help verify disbursement requests because they have additional information, such as the status of construction, that can help accurately approve these requests. However, we found that in two states we reviewed, technical or engineering staff did not review documentation supporting reimbursement requests from the subrecipient to ensure they were for legitimate project costs. For example, officials in Pennsylvania told us that technical staff from the state's Department of Environmental Protection--which provides technical assistance to SRF subrecipients--do not verify monthly payments to subrecipients that are made by the Pennsylvania Infrastructure Investment Authority, the state agency with funds management responsibility for the state's SRF programs. Instead, Department of Environmental Protection staff approve project cost estimates prior to loan settlement, when they review bid proposals submitted by contractors, and Pennsylvania Infrastructure Investment Authority officials verify monthly payments against the approved cost estimates. Inspection procedures. According to EPA officials, the agency requires that SRF programs have procedures to help ensure subrecipients are using Recovery Act SRF funding for eligible purposes. While EPA has not established required procedures for state project inspections, it has provided states its optional Recovery Act inspection checklist to help them evaluate a subrecipient's compliance with Recovery Act requirements, such as the Buy American and job reporting requirements. Some states we reviewed have adopted EPA's Recovery Act inspection checklist procedures and modified their procedures accordingly. For example, California and Arizona plan to implement all elements of EPA's checklist for conducting inspections of Recovery Act projects, according to officials in these states. 
Other states have modified their existing inspection procedures to account for the new Recovery Act requirements. For example, officials from Georgia said they added visual examination of purchased materials and file review steps to their monthly inspections to verify that subrecipients are complying with the Buy American provision. In contrast, the Pennsylvania Department of Environmental Protection's inspection procedures do not include a review of Recovery Act requirements. For example, we found that inspection reports for three Recovery Act projects we visited in Pennsylvania do not include inspection elements that covered Davis-Bacon or Buy American provisions. Instead, the Pennsylvania Infrastructure Investment Authority requires subrecipients to self-certify their compliance with these Recovery Act requirements when requesting payment from the state's funds disbursement system. Registered professional engineers who work for the subrecipients must sign off on these self-certifications and subrecipients could face loss of funds if a certification is subsequently found to be false, according to the Executive Director of the Authority. Frequency and timing of inspections. According to EPA officials, the agency does not have requirements on how often a state SRF program must complete project inspections, and the frequency and complexity of inspections vary by state for the base SRF program. Officials from several states told us they have increased the frequency of project site inspections. For example, Colorado SRF officials said the state is conducting quarterly project site inspections of each of the state's Recovery Act funded SRF projects, whereas under the state's base SRF programs, Colorado inspects project sites during construction only when the state has concerns. However, we found that two states either did not conduct site inspections of some projects that are complete or had not yet inspected projects that were near completion. 
For example, as of April 19, 2010, Ohio EPA had inspected about 41 percent of its Clean Water SRF projects, but our review of Ohio's inspection records showed that at least 6 completed projects had not been inspected and that a number of others nearing completion also had not been inspected. Monitoring compliance with Recovery Act requirements. We found issues in several states during interviews with SRF subrecipients that suggest uncertainty about subrecipients' compliance with Recovery Act requirements. For example, we interviewed one subrecipient in Ohio whose documentation of Buy American compliance raised questions as to whether all of the manufactured goods used in its project were produced domestically. In particular, the specificity and detail of the documentation provided about one of the products used left questions as to whether it was produced at one of the manufacturer's nondomestic locations. Further, another subrecipient in Ohio was almost 2 months late in conducting interviews of contractor employees to ensure payment of Davis-Bacon wages. In summary, EPA and the states successfully met the Recovery Act's 1-year deadline for having all projects under contract, and almost all Clean Water SRF projects were under construction by that date as well. Furthermore, Recovery Act funds were distributed to many new recipients and supported projects that serve disadvantaged communities. In addition, Recovery Act Clean Water SRF program funds have supported a variety of projects that are expected to provide tangible benefits for local water quality. However, as demonstrated in the above examples, the oversight mechanisms used by EPA and the states may not be sufficient to ensure compliance with all Recovery Act requirements. The combination of a large increase in program funding, compressed time frames, and new Recovery Act requirements presents a significant challenge to EPA's current oversight approach. 
As a result, we recommended that the EPA Administrator work with the states to implement specific oversight procedures to monitor and ensure subrecipients' compliance with the provisions of the Recovery Act-funded Clean Water and Drinking Water SRF programs. EPA neither agreed nor disagreed with this recommendation. Mr. Chairman, this concludes my prepared statement. I would be pleased to respond to any questions that you or other Members of the Committee might have. For further information regarding this statement, please contact David C. Trimble at (202) 512-3841 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this statement. Individuals who made key contributions to this statement include Nancy Crothers, Elizabeth Erdmann, Brian M. Friedman, Gary C. Guggolz, Emily Hanawalt, Carol Kolarik, and Jonathan Kucskar. This is a work of the U.S. government and is not subject to copyright protection in the United States. The published product may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. | The American Recovery and Reinvestment Act of 2009 (Recovery Act) included $4 billion for the Environmental Protection Agency's (EPA) Clean Water State Revolving Fund (SRF). This testimony--based on GAO's report GAO-10-604, issued on May 26, 2010, in response to a mandate under the Recovery Act--addresses (1) state efforts to meet requirements associated with the Recovery Act and SRF program, (2) the uses of Recovery Act funds, and (3) EPA's and states' efforts to oversee the use of these funds. GAO's review of the Clean Water SRF program focused on 14 states and selected localities--known as subrecipients--in each of these states. 
These 14 states received approximately 50 percent of the total appropriated under the Recovery Act for the Clean Water SRF. GAO obtained data from EPA and the 14 states, including the amounts and types of financial assistance each SRF program provided, which subrecipients were first-time recipients of Clean Water SRF funding, and which projects serve disadvantaged communities. The 14 states we reviewed for the Clean Water SRF program had all projects under contract by the 1-year, February 17, 2010, deadline and also took steps to give priority to projects that were ready to proceed to construction by that same date. Eighty-seven percent of Clean Water SRF projects were under construction within 12 months of enactment of the Recovery Act. In addition, the 14 Clean Water SRFs exceeded the 20 percent green reserve requirement, using 29 percent of SRF funds to provide assistance for projects that met EPA criteria for being "green," such as water or energy efficiency projects; these states also met or exceeded the requirement to use at least 50 percent of Recovery Act funds to provide additional subsidization in the form of, for example, principal forgiveness or grants. SRF officials in most of the states we reviewed said that they faced challenges in meeting Recovery Act requirements, including the increased number of applications needing review and the number of new subrecipients requiring additional support in complying with the SRF program and Recovery Act requirements. States used a variety of techniques to address these concerns to meet the 1-year deadline, such as hiring additional staff to help administer the SRF program. The 14 states we reviewed distributed nearly $2 billion in Recovery Act funds among 890 water projects through their Clean Water SRF program. 
Overall, these 14 states distributed about 79 percent of their funds as additional subsidization, with most of the remaining funds provided as low- or zero-interest loans that will recycle back into the programs as subrecipients repay their loans. In addition, states we reviewed used at least 40 percent of Clean Water SRF Recovery Act project funds ($787 million) to provide assistance for projects that serve disadvantaged communities, and almost all of this funding was provided in the form of additional subsidization. Almost half of the Clean Water SRF subrecipients had never previously received assistance through that program. Of the 890 projects awarded Recovery Act Clean Water SRF program funds in these states, more than one-third are for green projects, and almost all of these (93 percent) were awarded additional subsidization. EPA has modified its existing oversight of state SRF programs by planning additional performance reviews beyond the annual reviews it already conducts, but these reviews do not include an examination of state subrecipient monitoring procedures. According to EPA officials, EPA has not established new subrecipient monitoring requirements for Recovery Act-funded projects and has given states a high degree of flexibility to operate their SRF programs based on each state's unique needs. Although many states have expanded their existing monitoring procedures, the oversight procedures in some states may not be sufficient given that (1) federal funds awarded to each state under the Recovery Act have increased as compared with average annual awards; (2) all Recovery Act projects had to be under contract within 1 year; and (3) EPA and states had little experience with some new Recovery Act requirements, such as the Buy American requirements. For example, some projects have been completed before any site inspection has occurred.
Credit unions are nonprofit financial cooperatives organized to provide their members with low-cost financial services. According to NCUA, as of 1996, federally insured credit union assets totaled $326 billion. About one in four Americans belongs to a credit union, and credit unions accounted for about 2 percent of the total financial services in the United States. NCUA supervises and insures more than 7,200 federally chartered credit unions and insures member deposits in an additional 4,200 state-chartered credit unions through the National Credit Union Share Insurance Fund. As part of its goal of maintaining the safety and soundness of the credit unions, NCUA is responsible for ensuring credit unions are addressing the Year 2000 problem. The Year 2000 problem is rooted in the way dates are recorded and computed in automated information systems. For the past several decades, systems have typically used two digits to represent the year, such as "97" representing 1997, in order to conserve on electronic data storage and reduce operating costs. With this two-digit format, however, the year 2000 is indistinguishable from 1900, or 2001 from 1901. As a result of this ambiguity, system or application programs that use dates to perform calculations, comparisons, or sorting may generate incorrect results. According to NCUA, most credit unions rely on computers to provide for processing and updating of records and a variety of other functions. As such, the Year 2000 problem poses a serious dilemma for the industry. For example, the problem could lead to numerous problems when calculations requiring the use of dates are performed, such as calculating interest, calculating truth-in-lending or truth-in-savings disclosures, and determining amortization schedules. Moreover, automated teller machines may also assume that all bank cards are expired due to this problem. 
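The two-digit ambiguity can be made concrete with a short, hypothetical sketch (neither function is drawn from any real credit union system); both failure modes described above--bad date arithmetic and false card expirations--fall out of the same comparison:

```python
def years_elapsed(start_yy: int, end_yy: int) -> int:
    """Elapsed years computed from two-digit years, as many legacy
    systems did to save storage; the century is silently assumed."""
    return end_yy - start_yy

def card_expired(expiry_yy: int, today_yy: int) -> bool:
    """Two-digit expiry check of the kind embedded in older ATMs."""
    return expiry_yy < today_yy

# A loan opened in 1997 ("97") and amortized in 2001 ("01"):
print(years_elapsed(97, 1))       # -96 -- a nonsense loan age
print(years_elapsed(1997, 2001))  # 4  -- the four-digit result

# A card valid through 2000 ("00") checked in 1997 ("97"):
print(card_expired(0, 97))        # True -- the card wrongly appears expired
```

Interest, amortization, and disclosure calculations built on this kind of date arithmetic fail in the same way, which is why the guidance treats date handling as the core of the assessment phase.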
In addition, errors caused by Year 2000 miscalculations may expose institutions and data centers to financial liability and risk of damage to customer confidence. Other systems important to the day-to-day business of credit unions may be affected as well. For example, telephone systems could shut down, as can vaults, security and alarm systems, elevators, and fax machines. In addressing the Year 2000 problem, credit unions must also consider the computer systems that interface with, or connect to, their own systems. These systems may belong to payment system partners, such as wire transfer systems, automated clearing houses, check clearing providers, credit card merchant and issuing systems, automated teller machine networks, electronic data interchange systems, and electronic benefits transfer systems. Because these systems are also vulnerable to the Year 2000 problem, they can introduce and/or propagate errors into credit union systems. Accordingly, credit unions must develop comprehensive solutions to this problem and prevent unintentional consequences from affecting their systems and the systems of others. To address these Year 2000 challenges, GAO issued its Year 2000 Assessment Guide to help federal agencies plan, manage, and evaluate their efforts. The Office of Management and Budget (OMB), which is responsible for developing the Year 2000 strategy for federal agencies, also issued similar guidance. Both require a structured approach to planning and managing five delineated phases of an effective Year 2000 program. The phases include (1) raising awareness of the problem, (2) assessing the complexity and impact the problem can have on systems, (3) renovating, or correcting, systems, (4) validating, or testing, corrections, and (5) implementing corrected systems.
GAO has also identified other dimensions to solving the Year 2000 problem, such as identifying interfaces with outside organizations and their systems and establishing agreements with these organizations specifying how data will be exchanged in the year 2000 and beyond. In addition, GAO and OMB have established a timeline for completing each of the five phases and believe agencies should have completed assessment phase activities last summer and should be well into renovation with the goal of completing this phase by mid to late 1998. Our work at other federal agencies indicates that because the cost of systems failures can be very high, contingency plans must be prepared so that core business functions will continue to be performed even if systems have not been made Year 2000 compliant. NCUA has developed a three-pronged approach for ensuring that credit unions are aggressively addressing the Year 2000 problem, which encompasses (1) incorporating the Year 2000 issue into its examination and supervision program, (2) disseminating information about the problem to credit unions, and (3) assessing Year 2000 compliance on the part of credit union data processing vendors. The first aspect of NCUA's strategy, the examination and supervision program, involves assessing credit union Year 2000 efforts through regular annual examinations at the 7,200 federally chartered credit unions and 30 to 40 percent of the 4,200 federally insured, state chartered credit unions for which NCUA conducts an insurance review. These examinations seek to identify credit unions that are in danger of not renovating their systems on time and to reach "formal agreements" that specify corrective measures. In conducting these reviews, examiners are to follow NCUA guidelines, which provide step-by-step procedures for identifying problem areas. Once a formal agreement is reached, the examiner is expected to monitor the credit union's implementation of the agreed-upon corrective measures. 
Also as part of its examination effort, NCUA has contracted with a consulting firm to train selected examiners in Year 2000 efforts. Through this training, NCUA expects to have one in-house Year 2000 specialist available as a resource for every eight examiners. In addition, NCUA's board recently authorized the hiring of an electronic data processing (EDP) auditor to provide more in-depth technical assistance and education on Year 2000 problems. Another part of NCUA's examination and supervision strategy includes working with state regulators to ensure that federally insured, state chartered credit unions are also Year 2000 compliant. Officials from NCUA and the National Association of State Credit Union Supervisors told us that all but two state regulators are following the same Year 2000 examination strategy established by NCUA; the other two state regulators plan to perform additional steps beyond those included in NCUA's strategy. The second aspect of NCUA's strategy--information dissemination--seeks to heighten credit union awareness of the Year 2000 problem. In August 1996 and June 1997 letters to federally insured credit unions, NCUA formally alerted credit unions to the potential dangers of the Year 2000 problem, identified the specific impacts the problem could have on the industry, provided detailed explanations of the problem, and identified steps needed to correct the problem. It also related its plans to include Year 2000 evaluations in regular examinations and provided credit unions with copies of its examination guidance. In addition, NCUA has appointed a Year 2000 executive responsible for achieving Year 2000 compliance industrywide and assigned Year 2000 compliance officers to its central office and six regional offices. These staff will be responsible for serving as Year 2000 focal points to coordinate efforts across the agency.
Finally, NCUA is working with credit union trade groups, such as the Credit Union National Association, in raising awareness of Year 2000 issues. The third component of NCUA's program--vendor compliance--targets organizations that provide electronic data processing services to credit unions. According to NCUA, approximately 40 vendors provide data processing services to 76 percent of all federally insured credit unions, which account for 79 percent of federally insured credit union assets. Consequently, it is vital that these vendors correct their own systems and help ensure that information can be easily transferred after the Year 2000 deadline. NCUA has begun identifying and contacting major EDP vendors, and it plans to assess their efforts through questionnaires. Specifically, in May 1997 and again in August 1997, NCUA mailed a questionnaire to the 87 vendors, including the 40 vendors that support the bulk of credit unions, requesting information on Year 2000 readiness and, as of September 1997, had received 29 responses. While NCUA has initiated actions to build the Year 2000 issue into examinations and to raise awareness about the issue among credit unions and their vendors, our work to date has identified four issues that must be addressed to provide greater assurance that NCUA efforts will be successful. First and foremost of our concerns is that NCUA still does not have a complete picture of where credit unions and their vendors stand in resolving the Year 2000 problem, and current efforts to determine credit union compliance are behind the schedule established by OMB and GAO. 
To collect information from the credit unions on their Year 2000 status, NCUA examiners used a high-level questionnaire that inquired whether (1) credit union systems were capable and ready to handle Year 2000 processing, (2) plans were in place to resolve the problem, (3) enough funds were budgeted to correct systems, and (4) responsibility and reporting mechanisms were appropriately established to support the Year 2000 effort. NCUA issued a separate high-level questionnaire to credit union vendors. However, as of the time of our work, NCUA had not yet queried 20 percent of the credit unions and had only received 29 of the 87 vendor responses. In addition, of the credit union and vendor responses received, NCUA has not yet analyzed the information to determine which credit unions and vendors are at high risk of not correcting their systems on time. This problem is compounded by the fact that the NCUA questionnaires did not inquire about the status of efforts in completing each important phase of correction: (1) raising awareness of the problem, (2) assessing the complexity and impact the problem can have on systems, (3) renovating, or correcting, systems, (4) validating, or testing, corrections, and (5) implementing corrected systems. The questionnaires also did not include system interface issues. For example, they did not inquire about (1) identifying interfaces with outside organizations and their systems, such as payment, check clearing, credit card, and benefit transfer systems, and (2) establishing agreements with these organizations specifying how data will be exchanged in the year 2000 and beyond. As a result, even when NCUA assesses the results, it still will not have a complete understanding of how far along the industry is in addressing the problem. In addition, NCUA examinations are conducted only on an annual basis. This means that each credit union will be examined only two more times between the end of 1997 and the year 2000. 
Further, NCUA has not yet established a formal mechanism for credit unions to submit interim progress reports to provide an up-to-date picture of individual correction efforts between examinations. NCUA officials told us that examiners perform off-site supervision in between exams by tracking performance via credit union financial reports and by contacting credit union officials should a problem arise. However, this may not be enough given the seriousness of the problem and the fact that the Year 2000 deadline is just 2 years away. Further complicating NCUA's situation is the fact that it is still involved in assessment phase activities. According to OMB and GAO guidance, these activities should have been completed back in the summer. As it stands, NCUA does not plan to complete them until the end of this calendar year. Accordingly, we believe NCUA should accelerate agency efforts to complete the assessment of the state of the industry by no later than November 15, 1997, rather than waiting until the end of the year. NCUA should also collect the necessary information to determine the exact phase of each credit union and vendor in addressing the Year 2000 problem. Because NCUA currently does not have a process in place for interim reporting of this information between examinations, NCUA should require credit unions to report the precise status (phase) of their efforts on at least a quarterly basis. One option would be to use the financial reports, commonly referred to as call reports, that credit unions provide to NCUA quarterly. As part of this report, NCUA should also require credit unions to report on the status of identifying their interfaces to determine whether this issue is being adequately addressed and, if not, require credit unions to implement such agreements as soon as possible. A second concern we have with NCUA's efforts is that the agency does not yet have a formal contingency plan. 
Our Year 2000 Assessment Guide calls on agencies to initiate realistic contingency plans during the assessment phase for critical systems to ensure the continuity of their core business processes. Contingency planning is important because it identifies alternative activities, which may include manual and contract procedures, to be employed should systems fail to meet the Year 2000 deadline. NCUA guidance directs credit unions to conduct contingency planning, and NCUA officials told us that they have developed numerous contingency options and have discussed among the staff what steps to take should a credit union not be compliant by January 1, 2000. However, officials stated that the precise actions have not been documented in a formal plan. Not having this plan increases the risk of unnecessary problems in an already uncertain situation. Consequently, we recommend that NCUA formally document its contingency plans. A third concern that we have is that credit union auditors may not be addressing the Year 2000 problem as part of their work. NCUA requires each credit union to perform supervisory committee audits. These audits are to determine whether management practices and procedures are sufficient to safeguard members' assets and whether effective internal controls are in place to guard against error, carelessness, and fraud. They are conducted by the credit union's supervisory committee staff or by an outside accountant. However, NCUA officials noted that such reviews typically focus on general controls (e.g., ensuring accurate data is entered into the system, securing data from unauthorized use) and would not specifically include controls to prevent malfunctions due to the Year 2000 problem.
Audits are an integral management control, and expanding their scope to include important and high-risk Year 2000 issues is critical since it would provide credit union management with greater assurance and understanding about where their institution stands in addressing the problem. Accordingly, we are recommending to NCUA that it require credit unions to implement the necessary management controls to ensure that these financial institutions have adequately mitigated the risks associated with the Year 2000 problem. Specifically, NCUA should require credit union auditors to include Year 2000 issues within the scope of their management and internal control work and report serious problems and corrective actions to NCUA immediately. To aid credit union auditors in this effort, NCUA should provide the auditors with the procedures developed by NCUA for its examiners to use in assessing Year 2000 compliance and any other guidance that would be instructive. We also believe NCUA should require credit unions to establish processes whereby credit union management would be responsible for certifying Year 2000 readiness by a deadline well before the millennium. Such a certification process should include credit union compliance testing by an independent third party and should allow sufficient time for NCUA to review the results. Our fourth concern is that NCUA does not have enough staff qualified to conduct examination work in complex technical areas. At present, NCUA is in the process of hiring one EDP auditor to help examine thousands of credit unions. Recognizing this weakness, NCUA is considering hiring up to three EDP auditors. However, these personnel additions may still not suffice given the tremendous workload and the short time frame for getting it done.
To mitigate this concern, we recommend that before the end of the year, NCUA determine the level of technical capability needed to allow for thorough review of credit unions' Year 2000 efforts and hire or contract for this capability. | Pursuant to a congressional request, GAO reviewed the National Credit Union Administration's (NCUA) progress in making sure that the automated information systems belonging to the thousands of credit unions it oversees have adequately mitigated the risks associated with the year 2000 date change. GAO noted that: (1) NCUA has taken steps to address the Year 2000 problem; (2) these involve incorporating the Year 2000 issue into its examination and supervision program, disseminating information about the problem, and assessing Year 2000 compliance on the part of data processing vendors; (3) concerns exist that must be resolved if the NCUA is to achieve greater certainty that credit unions will meet their Year 2000 deadline; (4) NCUA still does not have a complete picture of where credit unions and their vendors stand in resolving the Year 2000 problem, and current efforts to determine credit union compliance are behind the schedule established by GAO and the Office of Management and Budget (OMB); (5) while NCUA sent questionnaires to credit unions and data processing vendors about the problem, it has not yet queried 20 percent of credit unions and has only received 29 of 87 vendor responses; (6) of the credit union and vendor responses received, NCUA has not yet analyzed this information to identify high-risk credit unions and vendors; (7) further, the surveys did not specifically ask about the status of corrective efforts and whether interface issues were appropriately being addressed; (8) NCUA has directed credit unions to conduct contingency planning and its staff have discussed what steps they should take should a credit union not be compliant by January 1, 2000; (9) however, the agency still lacks a formal contingency plan; (10) 
NCUA must take prompt action to ensure that these discussions are formally documented so that it will be well-positioned to handle unforeseen problems; (11) as potentially damaging as the Year 2000 problem is, NCUA has not yet ensured that the issue is addressed by credit union auditors; (12) doing so would provide credit union management with a greater assurance and understanding about where their institution stands in addressing the problem; (13) NCUA does not have enough staff qualified to conduct examination work in complex system areas; (14) at present, NCUA is in the process of hiring an electronic data processing (EDP) auditor and is requesting authority to hire 2 more; and (15) these personnel additions may not suffice given the tremendous workload and short time frame for getting it done.
Since the early 2000s, states have been building longitudinal data systems to better address data collection and reporting requirements in federal laws--such as the No Child Left Behind Act of 2001 and the America Creating Opportunities to Meaningfully Promote Excellence in Technology, Education, and Science Act (America COMPETES Act)--and to inform stakeholders about student achievement and school performance. Federal, state, and private entities have provided funding for these systems. For example, in addition to the SLDS and WDQI programs, other recent federal grant programs, including Race to the Top and the Race to the Top-Early Learning Challenge, may support states' efforts. The purpose of the SLDS grant program--administered by Education's Institute for Education Sciences, National Center for Education Statistics--is generally to enable state educational agencies to design, develop, implement, and expand statewide longitudinal data systems to manage, analyze, disaggregate, and use individual student data. From fiscal years 2006 to 2013, Education awarded approximately $613 million in SLDS grants (see table 1). For each grant competition, Education establishes the award period and the range of grant amounts to be awarded; SLDS award periods have ranged from 3 to 5 years, with a maximum award amount of $20 million per grantee. See appendix III for a list of states that received SLDS grants and the amount of their awards. Though the SLDS grant requirements have varied over time, states generally could use SLDS funds to build K-12 longitudinal data systems or to expand these systems to include data from other sectors, such as early education, postsecondary education, or workforce (see table 2).
The long-term goal of the program is for states to create comprehensive "P20-W"--early learning through workforce--longitudinal data systems that, among other things, will allow states, districts, schools, educators, and other stakeholders to make informed decisions and conduct research to improve student academic achievement and close achievement gaps. Under the WDQI grant program--administered by DOL's Employment and Training Administration--states are expected to fully develop their workforce longitudinal data systems and then be able to match these data with available education data to analyze education and workforce outcomes. DOL has chosen to award WDQI grants to states that have received an SLDS grant or have a longitudinal data system in place. Among other requirements, all grantees are required to develop or improve workforce longitudinal data systems and enable workforce data to be matched with education data to ultimately follow individuals through school and into the workforce. DOL has provided funding for approximately $36 million in WDQI grants to 33 states since fiscal year 2010 (see table 3). The award period for each grant is 3 years. See appendix III for a list of states that received WDQI grants and the amount of their awards. After analyzing data from DQC's 2013 survey, we determined that over half of grantees have the ability to match data--reliably connect the same record in two or more databases--for some individuals from early education into the workforce. As shown in figure 1, individuals can take different paths to move from early education into the workforce: (1) via K-12 or (2) via K-12 and postsecondary. Regardless, as the match rate--that is, the percent of unique student records reliably connected between databases--increases, the number of grantees able to match data between sectors decreases.
For example, 31 of 48 grantees have the ability to track individuals between all sectors from early education to workforce to at least some degree, but only 6 grantees could do so at the highest match rate. Our analysis of the DQC survey data also shows that more grantees match data among the education sectors than between the education and workforce sectors, though--as was the case with matching data from early education to workforce--the number of grantees that match data decreases as the match rate increases (see table 4). For example, 43 grantees reported matching data between the K-12 and early education sectors, and 31 grantees reported matching data between the K-12 and workforce sectors at least to some degree; however, the number of grantees that reported matching data between these same sectors drops to 37 and 9, respectively, at a match rate of 95 percent or more. Not all grantees are matching data between all sectors, which may partially be the result of receiving grants with different grant requirements. For example, all 20 grantees that received a fiscal year 2009 SLDS ARRA grant were required to have longitudinal data systems that include individual student-level data from preschool through postsecondary education and into the workforce (see table 2). However, fiscal year 2012 grantees could choose from among three different grant priorities, so some grantees may be focused on building a K-12 longitudinal data system while others may be using their grant funds to link existing K-12 data to other sectors. In addition, grantees may have been in different stages of developing their longitudinal data systems prior to receiving a grant, which may help explain why some grantees are able to match data between more sectors than others.
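The match rate discussed above--the percent of unique student records in one database that can be reliably connected to records in another--can be computed with a minimal sketch. The record layout and the statewide student identifier field used here are hypothetical, for illustration only:

```python
def match_rate(records_a, records_b, key):
    """Share of unique keys in database A that also appear in database B."""
    keys_a = {key(r) for r in records_a}
    keys_b = {key(r) for r in records_b}
    return len(keys_a & keys_b) / len(keys_a) if keys_a else 0.0

# Hypothetical K-12 and workforce records sharing a statewide student ID.
k12_records  = [{"id": "S1"}, {"id": "S2"}, {"id": "S3"}, {"id": "S4"}]
wage_records = [{"id": "S1"}, {"id": "S3"}, {"id": "S9"}]

rate = match_rate(k12_records, wage_records, key=lambda r: r["id"])
print(f"{rate:.0%}")  # 50%: 2 of the 4 K-12 students appear in the wage data
```

A survey threshold such as "95 percent or more" is then simply this ratio evaluated against the threshold, which is why fewer grantees qualify as the bar rises.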
Programs Included in the Data Quality Campaign Survey, 2013
Early education: early intervention, Head Start/Early Head Start, special education, state prekindergarten, subsidized child care
K-12: elementary and secondary education
Postsecondary institutions: less than 2-year public, less than 2-year private not-for-profit, less than 2-year private for-profit, 2-year public, 2-year private not-for-profit, 2-year private for-profit, 4-year and above public, 4-year and above private not-for-profit, 4-year and above private for-profit
Workforce: unemployment insurance wage records, unemployment benefits claim data, Workforce Investment Act of 1998 (WIA) adult or dislocated worker program, WIA youth program, adult basic and secondary education, Wagner-Peyser Act employment services, Temporary Assistance for Needy Families (TANF)

Of those grantees that match data, we found that few generally do so for all of the possible programs between particular sectors (see sidebar), based on our analysis of DQC survey data (see table 5). For example, only 6 of 31 grantees reported that they were able to match data on all seven programs between the K-12 and workforce sectors, which include unemployment insurance wage records, unemployment benefit claims data, Workforce Investment Act of 1998 (WIA) adult or dislocated worker program, WIA youth program, adult basic and secondary education, Wagner-Peyser Act employment services, and Temporary Assistance for Needy Families (TANF). We also analyzed DQC's data to determine which programs are most commonly matched by grantees between particular sectors (see fig. 2). See appendix IV for a list of the specific programs matched by each grantee. Most grantees that match data also share data between sectors; that is, they exchange at least one type of data (e.g., demographic, enrollment, program participation, etc.) between two databases in at least one direction, based on our analysis of DQC data.
However, in general, few grantees share all possible types of data (see sidebar). For example, only 3 of 36 states that match data between the postsecondary and workforce sectors reported sharing all 10 types of data asked about by DQC, which include information on postsecondary degree completion, earnings and wages, and industry of employment, among others (see table 6). Officials in all five grantee states we spoke with said matching K-12 education and workforce data is challenging without using a Social Security number (SSN) that uniquely identifies an individual and, as a result, some states may have greater difficulty tracking particular groups of students over time. SLDS officials in three states--Ohio, Pennsylvania, and Virginia--said collecting a SSN in K-12 education data is prohibited either by state law or agency policy; in the other two states--South Dakota and Washington--officials said collecting a SSN is optional and whether to do so is determined at the district level. While establishing a unique statewide student identifier is a technical requirement of the SLDS grant program, states can choose the format of the identifier used. Education suggested, in a November 2010 SLDS Technical Brief, that states use a unique identifier distinct from a student's SSN for privacy reasons; however, Education also stated that states should maintain a student's SSN as a data element in order to link data between systems. According to a 2010 report from the Social Security Administration's Office of the Inspector General, 28 states collect a SSN in K-12 education data. Unlike the SLDS program, in its evaluation criteria for WDQI grants, DOL specifies that states use SSNs as a personal identifier, as they are already in use throughout the workforce system.
To match education and workforce data absent a SSN, state officials said they are developing algorithms to match individual records using other identifiers, which could include an individual's first name, last name, and date of birth. However, a person's last name can change, which Pennsylvania SLDS officials said can make it difficult to reliably track individuals over time. Further, Ohio WDQI officials explained that the absence of a SSN makes it particularly difficult to track students who drop out of high school or to track high school graduates who do not move on to the workforce. Similarly, Ohio SLDS officials said tracking students that do not go on to postsecondary education is a challenge because there is no readily available identifier to determine any workforce participation by those individuals. In four of five grantee states we spoke with, officials also cited data governance as a challenge. Data governance is the exercise of decision- making and authority for data-related matters using agreed-upon rules that describe who can take what actions with what information and when, under what circumstances, and using what methods. SLDS grantees are generally required to develop a governance structure involving both state and local stakeholders that includes a common understanding of data ownership, data management, as well as data confidentiality and access. All WDQI grantees are expected to establish partnerships with relevant workforce agencies and with state education agencies for the purposes of data sharing. Pennsylvania and Ohio officials said it has not been easy to get the various workforce agencies that maintain data on individual workforce programs to share their data as the agencies often operate independently from one another. As a result, Pennsylvania officials said agencies are territorial about their data, making it difficult to build consensus around developing a longitudinal data system. 
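The deterministic matching state officials described earlier, an exact join on identifiers such as first name, last name, and date of birth, can be sketched in a few lines. This is an illustration only: the record layout and field names are our assumptions, not any state's actual system, and real systems layer additional identifiers or probabilistic scoring on top.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PersonRecord:
    first_name: str
    last_name: str
    birth_date: str  # ISO format, e.g. "1999-05-17"

def match_key(rec: PersonRecord) -> tuple:
    # Normalize case and whitespace so "LOPEZ " and "Lopez" compare equal.
    return (rec.first_name.strip().lower(),
            rec.last_name.strip().lower(),
            rec.birth_date)

def match_records(k12, workforce):
    """Deterministically pair K-12 records with workforce records on
    (first name, last name, date of birth)."""
    index = {}
    for rec in workforce:
        index.setdefault(match_key(rec), []).append(rec)
    pairs = []
    for rec in k12:
        for candidate in index.get(match_key(rec), []):
            pairs.append((rec, candidate))
    return pairs
```

Note that a record whose last name has changed between systems produces no pair at all, which illustrates the Pennsylvania officials' point that name-based keys make it hard to reliably track individuals over time.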
In Ohio, officials said that each agency has to be approached separately to obtain commitment to share data in a longitudinal system. Similarly, officials in Virginia said collecting data on early education programs has been a challenge as the data are scattered across different agencies. An official from the Early Childhood Data Collaborative explained that it can be easier to facilitate data matching between early education programs under the purview of one agency, such as state prekindergarten and special education, which, like K-12 data, are generally overseen by state educational agencies. State officials we spoke with said they are in different stages of developing a data governance structure. For example, Pennsylvania WDQI officials said they have not yet established a formal data governance structure. In contrast, Virginia officials have established a data governance structure; officials said they spent 18 months working through the different priorities, cultures, and agendas of the various agencies providing data to the longitudinal data system. State officials in all five grantee states we spoke with also said they have had to manage public concerns about the purpose of data collection or about data privacy. For example, in Ohio, SLDS officials told us there is a lack of understanding about the value of building a longitudinal data system; officials have had to counter misperceptions about what data are being collected in the state's longitudinal data system, what the data will be used for, and why data need to be connected between the education and workforce sectors. South Dakota officials said they have had to respond to concerns from parents and other education stakeholders about the privacy of longitudinal data. Grantees have tried to provide information to the public about the purposes of the data system and steps taken to safeguard information.
Forty-six grantees reported using outreach tools to communicate the availability of the data to non-educator stakeholders, according to our analysis of the DQC survey data. These grantees reported using traditional outreach measures, which could include public service announcements, press conferences and news releases, and posting information about the data on the state education agency's website. For example, four of five grantee states we interviewed have web pages dedicated to their longitudinal data systems. These web pages can include overviews of the systems, answers to frequently asked questions, trainings on how to use or access the data, and examples of research studies that use the data. Further, 44 grantees reported on the DQC survey that they take advantage of in-person opportunities, which could include meetings, conferences, and presentations. Lastly, 35 grantees reported using electronic or social media to promote the data, which could include Facebook, Twitter, blogs, and webinars. In the context of discussing the challenge of managing public concerns about data collection or privacy, officials in three of the five grantee states we spoke with specifically said they have provided information about how they protect individual data. Pennsylvania SLDS officials said they took considerable time to convey to parents and taxpayers the steps they are taking to ensure data privacy. Similarly, Virginia officials from both grant programs said explaining all of the precautions the state is taking with respect to data privacy seems to help in reducing concerns. Ohio officials said the state's Department of Education has convened a new workgroup to see if there are better ways to address misperceptions about data collection and use. Lastly, state officials cited the importance of federal funding to their efforts to build their longitudinal data systems and expressed concerns about sustaining their systems after their grants end. 
Officials we interviewed in all five grantee states said they would not be as far along in developing their longitudinal data systems without the federal funding provided through the SLDS and WDQI programs. For example, officials in Washington said they used their initial SLDS and WDQI grants to focus on building their K-12 data system and workforce systems, respectively. They said the second SLDS grant they received was instrumental in building a P-20W system to connect data between all sectors. Ohio officials said the SLDS funds have provided, among other uses, critical funding for further development of the longitudinal data system, technological updates, and access to technical assistance. However, officials in all five grantee states also expressed concerns about sustaining the systems moving forward. For example, officials in Virginia said they have created a legislative committee to focus on sustainability efforts and will need to request additional funding to keep the system sustainable. Officials in Pennsylvania said they are trying to leverage the existing technical infrastructure and use other available resources, but it is difficult to find funding for their workforce data efforts. According to our analysis of the DQC survey data and our interviews with selected states, SLDS and WDQI grantees use longitudinal data to examine education outcomes and to inform policy decisions. All 48 grantees responded that their state educational agency uses the data to analyze aggregate education outcomes (see fig. 3). For example, the three most common types of analyses are related to high school feedback, cohort graduation or completion, and growth (i.e., changes in the achievement of the same students over time). These aggregate data are used to analyze a particular cohort of students and develop information on students' outcomes over time. They also help guide school-, district-, and state-level improvement efforts. 
For example, officials from three of the five grantee states we interviewed told us they have used the data to assess kindergarten readiness for children who attended state early education programs. Also, 27 grantees responded to the DQC survey that they use the data to analyze college and career readiness. More specifically, to better understand the courses and achievement levels that high school graduates need to be successful in college, Virginia followed students who graduated from high school from 2006 to 2008 and analyzed enrollment and academic achievement patterns for different groups of students. According to agency officials in Virginia, this analysis resulted in changes to the course requirements for graduation. In addition to examining education outcomes, states also use longitudinal data to assess how cohorts of students fare once they are in the workforce. Washington's Education Research and Data Center, a state center dedicated to analyzing education and workforce issues across the P-20W spectrum, has published several studies examining workforce outcomes for high school and college graduates. For example, one study compared earnings for workers with bachelor's degrees from Washington state colleges and universities to earnings of workers with only diplomas from public high schools. In addition to analyzing aggregate student outcomes, grantees also indicated that they analyze individual-level student outcomes. Our analysis of DQC survey data shows that 45 of 48 grantees examine outcomes for individual students (see fig. 4). Student-level data provide teachers and parents with information they can use to improve student achievement. For example, 32 grantees reported that the data are used in diagnostic analysis, which helps teachers identify individual students' strengths and academic needs.
Also, 29 grantees responded to the DQC survey that they produce early warning reports, which identify students who are most likely to be at risk of academic failure or dropping out of school. For example, Virginia's early warning report shows demographic and enrollment information about an individual student; flags for warning indicators such as attendance, GPA, and suspensions; and a record of interventions the school has taken to help the student (see fig. 5). Further, officials in three of the grantee states we interviewed told us that educators have access to student-level analyses. In Pennsylvania, teachers can use an educator dashboard, which includes longitudinal data, to determine the educational needs of their students and adjust their teaching plans. Forty-one of 48 grantees reported to the DQC that they use longitudinal data to inform policy and continuous improvement efforts. Specifically, grantees reported that they use the data to inform school turnaround efforts (34 grantees), evaluate intervention strategies or programs (14 grantees), or identify and reward schools that demonstrate high growth (27 grantees), among other things. Officials in three of five grantee states we spoke with provided more specific examples of how they use or plan to use longitudinal data to inform their efforts. Ohio officials told us they used longitudinal data to study students in remediation to help develop a remediation policy. They also said they have been working on a workforce success measures dashboard to compare outcomes across state programs. For example, the dashboard will allow policy makers to assess how successful the state's adult basic education program is compared to the state's vocational education program. Pennsylvania officials told us they will develop a similar dashboard. 
Washington state officials told us that longitudinal data helped address a concern in the state legislature about whether math and science teachers were leaving to work in the private sector. Researchers identified common teacher and school district characteristics associated with teachers who left for employment in other fields and found that math and science teachers did not leave the field at a higher rate than other teachers. Officials told us that this analysis prompted the state legislature to focus its attention on improving the recruitment of math and science teachers rather than improving retention. While many grantees reported on the DQC survey that they use longitudinal data to analyze outcomes for students and workers and to make policy decisions, officials from all five grantee states we interviewed told us that these analyses are limited because they are still developing their longitudinal data systems. In addition, only three of these states--Ohio, Virginia, and Washington--are conducting education-to-workforce analyses. Officials in Pennsylvania and South Dakota said they plan to do this type of analysis, but only after they finish putting all the education and workforce data into their systems and matching these data. Data from the 2013 DQC survey show that 39 SLDS or WDQI grantees have developed research agendas articulating and prioritizing research or policy questions that can be answered with longitudinal data. These research agendas were developed in partnership with higher education institutions, independent researchers, or others. Of the five grantee states we interviewed, only Virginia and Ohio have fully developed their research agendas. Pennsylvania, South Dakota, and Washington officials told us they are in the process of doing so. State officials shared two approaches for creating these agendas. Under the first approach, stakeholders from various state agencies comprise a committee that identifies research questions.
Virginia took this approach and drafted a list of "burning questions" to answer using longitudinal data. Officials in Virginia explained that they purposefully kept the agenda broad so that the questions will remain relevant over the long term. Washington's Education Research and Data Center has similarly developed a list of critical questions it would like to answer using longitudinal data. Under the second approach, state agencies use information requests and stakeholder feedback on sample reports to shape the research agenda. For example, officials from the South Dakota Department of Education told us they have solicited feedback after training districts on the data and reviewed requests from the governor's office and state legislators. They also told us that they are following the number of hits for individual reports on the state's Department of Education's electronic portal. Forty-three of 48 grantees reported that they have a process by which researchers who are not employees of the state can propose their own studies for approval, according to the 2013 DQC survey data. Four of the grantee states we interviewed have established a formal request process for researchers who would like to access longitudinal data and the fifth state is reviewing its protocols and expects to develop a formal application process. Officials in two grantee states told us that the request process is intended to streamline access to the data and make it easier for researchers to seek approval for data requests. In addition, officials in Ohio told us that when researchers apply for access to Ohio's data, they must include information in their application about how the study will meet the state's research priorities. Since fiscal year 2006, the federal government has made a significant investment--over $640 million in SLDS and WDQI grant funds--to help states build P-20W longitudinal data systems that track individuals from early education into the workforce.
The different grant requirements for linking data between sectors may have contributed to states being in different stages of developing their longitudinal data systems. That is, some grantees are just building their K-12 longitudinal data systems while others are matching data between education and workforce sectors. It remains to be seen whether all grantees will ultimately achieve the long-term goal of developing complete P-20W longitudinal data systems or how long that will take, particularly in light of unresolved concerns about limitations to matching data using a Social Security number and sustainability. Further, even among those grantees that can match data between sectors, most can only do so for a limited number of programs or data types. As grantees continue to refine their systems, maximizing the potential of these systems will rest, in part, with the ability to more fully match information on specific programs and characteristics of individuals that could help in further analyzing education and workforce outcomes. We provided a draft of this report to Education and DOL for their review. Each provided technical comments, which we incorporated as appropriate. As agreed with your office, unless you publicly announce its contents earlier, we plan no further distribution until 30 days after the date of this letter. At that time, we will send copies of this report to the appropriate congressional committees and the Secretaries of Education and Labor. In addition, the report is available at no charge on GAO's website at http://www.gao.gov. If you or your staff have any questions about this report, please contact me at (617) 788-0580 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made major contributions to this report are listed in appendix VII.
The objectives of this report were to examine: (1) the extent to which Statewide Longitudinal Data Systems (SLDS) and Workforce Data Quality Initiative (WDQI) grantees match individual student and worker records and share data between the education and workforce sectors; and (2) how grantees are using longitudinal data to help improve education and workforce outcomes. To answer our objectives, we analyzed state-level data from a 2013 survey conducted by the Data Quality Campaign (DQC), a nonprofit organization that works with state officials and others to support the effective use of data to improve student achievement. DQC's survey focused on 10 "State Actions" the DQC has developed to ensure effective data use (see table 7). DQC has conducted this annual survey since 2005. The survey data include self-reported information on how data are matched and shared between the early education, K-12, postsecondary education, and workforce sectors, as well as information on specific programs within these sectors, how states analyze and use the data, and who has access to the data. To conduct the survey, DQC used an online tool to collect information and invited the governor's office in all 50 states and the District of Columbia to participate. According to DQC, the governor's office is in the best position to bring stakeholders together to respond to the survey. As part of their survey response, states were asked to provide documents or website links as evidence of having specific policies or reports. After survey responses were received, DQC worked with each state to ensure the information reported was as accurate as possible. We analyzed data from eight survey questions (see table 8 in appendix II) to determine the extent to which SLDS and WDQI grantees match individual records and share data among the education sectors and between the education and workforce sectors. 
For the purposes of our report, a grantee is one of the 48 states that received a SLDS grant, a WDQI grant, or both and responded to the 2013 DQC survey. We considered the District of Columbia to be a state. We excluded Alabama, New Mexico, and California from our review because neither Alabama nor New Mexico received a SLDS or a WDQI grant and because California chose not to participate in DQC's 2013 survey. We excluded the U.S. Virgin Islands and Puerto Rico because, while these territories received SLDS grants, DQC did not include them in its survey. We analyzed data on SLDS and WDQI grantee states because the SLDS and WDQI grant programs provide federal funds for developing longitudinal data systems and are complementary. We considered a grantee as matching data between sectors if a grantee matched data from at least one program between sectors (for a list of programs included in the DQC survey, see questions 1, 4, 7, and 10 in table 8 in appendix II). We considered a grantee as sharing data if a grantee matched data according to our definition and also reported exchanging at least one data element between sectors, in either direction (for a list of data elements, see questions 2, 5, 8, and 11 in table 8 in appendix II). We also analyzed data from another 12 survey questions to identify how grantees are using longitudinal data to help improve education and workforce outcomes (see table 9 in appendix II). We conducted a data reliability assessment by reviewing the survey instrument and related documentation, interviewing officials responsible for administering the survey, and testing the data for obvious inaccuracies. We determined that these data are sufficiently reliable for the purposes of this report. In addition to our analysis of DQC survey data, we conducted interviews with a nongeneralizable sample of five grantees as well as relevant federal agencies and nonprofit organizations.
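The matching and sharing definitions above reduce to a simple threshold rule over a grantee's survey responses. The following sketch is ours, not GAO's analysis code; the function name and inputs (counts of programs matched and data elements exchanged between a given pair of sectors) are illustrative assumptions.

```python
def classify_grantee(programs_matched: int, elements_shared: int) -> str:
    """Apply the report's definitions for a pair of sectors:
    a grantee 'matches' data if it matched at least one program;
    it 'shares' data only if it both matches and exchanges at least
    one data element, in either direction."""
    if programs_matched >= 1 and elements_shared >= 1:
        return "shares"
    if programs_matched >= 1:
        return "matches"
    return "neither"
```

Note that under these definitions a grantee that exchanges data elements without matching any program is still classified as "neither," since sharing presupposes matching.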
During our interviews with the five grantee states--Ohio, Pennsylvania, South Dakota, Virginia, and Washington--we asked grantees to identify challenges they faced in building and implementing longitudinal data systems and discussed how grantees have used longitudinal data to inform decision-making in education and workforce programs. We selected these grantees based on factors including the differing levels of progress they have made in establishing data linkages and the federal funding they have received from the SLDS and WDQI programs. Within each state, we spoke with relevant K-12, workforce, postsecondary education, and early education officials. We also interviewed officials at Education, DOL, and the Department of Health and Human Services to obtain information about their roles in helping states build longitudinal data systems. In addition, we spoke with officials from nonprofit organizations to obtain their views on states' implementation of longitudinal data systems. These stakeholder organizations included the Early Childhood Data Collaborative, the State Higher Education Executive Officers Association, and the Workforce Data Quality Campaign. Finally, we reviewed relevant federal laws, regulations, requests for applications, and solicitations for grant applications to understand the requirements of these grants. As explained in appendix I, we analyzed data from DQC's 2013 survey to answer our research objectives. Table 8 and table 9 show the specific questions we analyzed from DQC's survey instrument. For some questions, DQC allowed states to select "other" as a response; we excluded these "other" responses from our analysis. In addition to the contact named above, Janet Mascia, Assistant Director, Jennifer Gregory, and Nisha R. Hazra made key contributions to this report. Also contributing to this report were Deborah Bland, David Chrisinger, Alex Galuten, Amanda Miller, Jeffrey G. Miller, Mimi Nguyen, Yunsian Tai, and Walter Vance.
This glossary is provided for reader convenience. It is not intended as a definitive, comprehensive glossary of related terms.

Aggregate data: Group statistics (numbers, percentages, averages, etc.) based on individual student data.

College and career readiness reports: Reports designed to identify students who are on track for readiness or success in college or careers.

Data governance: The exercise of decision-making and authority for data-related matters using agreed-upon rules that describe who can take what actions with what information and when, under what circumstances, and using what methods.

Diagnostic analysis: Information on individuals designed to identify each student's strengths and academic needs.

Early education: Programs that serve children prior to kindergarten. Programs include: early intervention, Head Start/Early Head Start, state prekindergarten, special education, and subsidized child care.

Early warning report: A report designed to identify students who are most likely to be at risk of academic failure or dropping out of school.

Feedback report: Information on outcomes for students after they graduate from a school or district.

Growth report: A report that shows changes in the achievement of the same students over time.

K-12: Elementary and secondary education.

Postsecondary: Institutions of higher education. Types of institutions include: less than 2-year public, 2-year public, 4-year and above public, less than 2-year private not-for-profit, 2-year private not-for-profit, 4-year and above private not-for-profit, less than 2-year private for-profit, 2-year private for-profit, and 4-year and above private for-profit.

A report that shows how students' success later in the education/workforce pipeline is related to the status of the same students earlier in the pipeline.

Matching: Reliably connecting the same individual record in two or more databases.

Match rate: The percent of unique individual records reliably connected across databases.

Sharing: Exchanging data between two databases, in either direction.
Types of data: Data elements that could be shared between early education and K-12 include: demographic, family characteristics, program participation, child-level development data; between K-12 and postsecondary: demographic, college readiness assessment scores, college placement assessment scores, high school transcript data, postsecondary enrollment, postsecondary remediation status, postsecondary progress, postsecondary credits earned, postsecondary enrollment intensity, postsecondary outcomes; between K-12 and workforce: demographic, enrollment, transcript data, earnings and wages, employment status, occupation, industry of employment; between postsecondary and workforce: demographic, enrollment, transcript data, financial aid, postsecondary degree completion, earnings and wages, employment status, occupation, industry of employment.

Workforce: Programs that serve individuals in the workforce. Programs include: adult basic and secondary education, TANF, unemployment benefits claims data, unemployment insurance wage records, Wagner-Peyser Act employment services, WIA adult or dislocated worker program, and WIA youth program.

From fiscal years 2006 through 2013, the Departments of Education and Labor provided over $640 million in grants to states through the SLDS and WDQI grant programs. These grants support states' efforts to create longitudinal data systems that follow individuals through their education and into the workforce. Analyzing data in these systems may help states improve outcomes for students and workers. GAO was asked to review the status of grantees' longitudinal data systems. This report examines (1) the extent to which SLDS and WDQI grantees match individual student and worker records and share data between the education and workforce sectors and (2) how grantees are using longitudinal data to help improve education and workforce outcomes. To answer these questions, GAO analyzed data from a 2013 survey conducted by the DQC.
This survey collected information from states on data linkages among education and workforce programs and on how states use longitudinal data. In addition, GAO interviewed a nongeneralizable sample of five grantees, which were selected based on the progress they have made in matching data and on the funding they have received from the SLDS and WDQI programs. GAO also reviewed relevant federal laws and regulations. GAO is not making recommendations in this report. GAO received technical comments on a draft of this report from the Department of Education and the Department of Labor, and incorporated them as appropriate. Over half of 48 grantee states that received a Statewide Longitudinal Data Systems (SLDS) or Workforce Data Quality Initiative (WDQI) grant have the ability to match data on individuals from early education into the workforce, based on GAO's analysis of 2013 Data Quality Campaign (DQC) survey data. The DQC is a nonprofit organization that supports the effective use of data to improve student achievement. In its survey, DQC collected self-reported information from states on their ability to match, or connect the same individual record, between (1) the K-12 sector and the early education, postsecondary, and workforce sectors and (2) the postsecondary and workforce sectors. However, as the match rate--that is, the percent of unique individual records reliably connected between databases--increases, the number of grantees able to match data decreases. GAO found that more grantees reported being able to match data among the education sectors than between the education and workforce sectors. Further, most grantees reported that they are not able to match data comprehensively. For example, only 6 of 31 grantees reported that they match K-12 data to all seven possible workforce programs covered by the DQC survey, which include adult basic and secondary education as well as unemployment insurance wage records.
State officials cited several challenges to matching data, including state restrictions on the use of a Social Security number. Specifically, officials in three of five grantee states GAO spoke with said state law or agency policy prohibits collecting a Social Security number in K-12 data, which can make it more difficult to directly match individuals' K-12 and workforce records. According to GAO's analysis of the DQC survey data, grantees use some longitudinal data to inform policy decisions and to shape research agendas. All 48 grantees reported analyzing aggregate-level data to help guide school-, district-, and state-level improvement efforts. For example, 27 grantees said they analyze data on college and career readiness to help schools determine whether students are on track for success in college or in the workforce. Grantees also reported using longitudinal data to analyze outcomes for individual students. For example, 29 grantees reported that they produce early warning reports that identify students who are most likely to be at risk of academic failure or dropping out of school. Data from the DQC survey also show that 39 grantees reported developing a research agenda in conjunction with their longitudinal data systems.
Several organizations are integrally involved in carrying out the Navy's financial management and reporting, including: (1) the Office of the Navy's Assistant Secretary for Financial Management and Comptroller, which has overall financial responsibility, (2) DFAS, which reports to the Department of Defense (DOD) Comptroller and provides accounting and disbursing services, and (3) Navy components, which initiate and authorize financial transactions. To help strengthen financial management, the Chief Financial Officers (CFO) Act of 1990 (Public Law 101-576) required that DOD prepare financial statements for its trust funds, revolving funds, and commercial activities, including those of the Navy. In response to experiences gained under the CFO Act, the Congress concluded that agencywide financial statements contribute to cost-effective improvements in government operations. Accordingly, when the Congress passed the Government Management Reform Act of 1994 (Public Law 103-356), it expanded the CFO Act's requirement for audited financial statements by requiring that all 24 CFO Act agencies, including DOD, annually prepare and have audited agencywide financial statements, beginning with those for fiscal year 1996. The Government Management Reform Act authorizes the Director of the Office of Management and Budget to identify component organizations of the 24 CFO Act agencies that will also be required to prepare financial statements for their operations and have them audited. Consistent with the act's legislative history, the Office of Management and Budget has indicated that it will identify the military services as DOD components required to prepare financial statements and have them audited. Therefore, fiscal year 1996 is the first year for which the Navy will be required to prepare servicewide financial statements for its general funds. 
At September 30, 1994, the Navy's reported real property account balance was overstated by at least $24.6 billion because DFAS personnel had erroneously double counted $23.9 billion of structures and facilities and $700 million of land. The DFAS, Cleveland Center, personnel compiling these data did not realize that the Center had received some of the same land and building accounting information from two separate sources and had incorrectly included the information from both of them in the consolidated financial reports. To help mitigate situations such as this, in September 1995, the DFAS Director called for the DFAS center directors to take specific steps to increase emphasis on basic internal controls. In November 1995, the DOD Comptroller clarified that DFAS and the Navy are both required to perform quality control reviews of the financial reports and statements. We believe that full and effective implementation of these directives could help to prevent future occurrences of double counting, such as the one noted during our review. For example, if the Navy and DFAS had reviewed reported financial information in that case, they would have found that real property was overstated. The Navy Comptroller Manual, which governs accounting and financial policy for the Navy's plant property, classifies and lists Navy activities as involving either general fund operations or DBOF operations. The Navy and DFAS, Cleveland Center, did not have effective processes in place to ensure that all financial information on plant property from only general fund activities was included in the Navy's consolidated financial reports on general fund operations or that plant property from DBOF operations was excluded. To compile consolidated financial reports on the Navy's general fund operations, a basic control would be to ensure that the reported figures include financial information received from all of the Navy activities identified in the manual as involving general fund operations. 
However, neither the Navy nor DFAS, Cleveland Center, used the listing as a control to help ensure the accuracy and completeness of the Navy's fiscal year 1994 consolidated financial reports on general fund operations. Although the Navy Comptroller Manual needs updating, as discussed later, it was the best available information at the time of our review and listed 1,226 general fund activities at September 30, 1994. Our comparison of the list and the information used to compile the Navy's fiscal year 1994 consolidated financial reports on general fund operations showed that the reports (1) included $34.9 billion for plant property at 936 activities that the manual listed as general fund activities but (2) did not include an indeterminable amount of plant property for the other 290 activities listed in the manual. Also, the financial reports improperly included $1.9 billion in plant property that belonged to 21 Navy activities engaged in DBOF operations. We identified these activities through discussions with Navy and DBOF officials. The activities had mistakenly reported to DFAS that their plant property related to general fund operations, and neither the Navy nor DFAS, Cleveland Center, detected the error. Navy activities engaged in general fund operations report their plant property account balances to either the Defense Accounting Office (DAO)-Norfolk or DAO-San Diego (DFAS now refers to the DAOs as operating locations). These DAOs compile the activity-level data and submit it to DFAS, Cleveland Center, which prepares both financial reports on the Navy's general fund operations and Navy DBOF financial statements. The DAOs did not compare the listings of reporting activities with those listed in the Navy Comptroller Manual when accumulating the data. Nor did DFAS, Cleveland Center, consult the listings when consolidating the Navy's fiscal year 1994 financial reports on its general fund operations. 
Officials from both the Navy Comptroller's office and DFAS, Cleveland Center, told us that they had not used the listing when the fiscal year 1994 financial reports on the Navy's general fund operations were prepared because the listing was inaccurate and outdated. Our work verified that the listing was inaccurate and outdated. We found that the reported plant property account balance included $607 million related to 47 general fund activities that were not listed in the manual. Also, the reports included $739 million related to 57 activities that the manual indicated were no longer operating. Updating the manual is the joint responsibility of the Comptroller of the Navy; DFAS, Cleveland Center; and the Naval Industrial Resources Support Activity, which maintains and reports information on government furnished property. According to the Navy and DFAS, because of downsizing and consolidating of activities, updating the manual section on plant property reporting responsibilities was about a year behind schedule. In March 1996, we recommended that the Navy and DFAS require financial information to be reviewed thoroughly to determine its reasonableness, accuracy, and completeness. When implementing this recommendation, an updated Navy Comptroller Manual listing of general fund activities could be used to review the Navy's financial reports for accuracy and completeness. In concurring with the recommendation to thoroughly review this financial information, the DOD Deputy Chief Financial Officer said that the DOD Comptroller's November 1995 clarification of the finance and accounting roles and responsibilities of DOD components and DFAS requires a review of reported financial information. Thus, both the Navy and DFAS are now required to verify the accuracy and completeness of financial reports. 
Also, the September 1995 DFAS Director's guidance calls for ensuring that component reports of property, equipment, and inventory are promptly submitted and certified as to accuracy. The Navy's plant property work-in-progress account, which is intended to control in-transit property and incomplete capital improvements, had a highly questionable $291 million balance. We found that (1) some Navy and DFAS activities were not properly recording plant property work-in-progress transactions and (2) many Navy activities had difficulty resolving millions of dollars of in-transit property recorded in their plant property work-in-progress accounts. Consequently, these accounts were not useful in providing accurate information to ensure the prompt receipt of in-transit property or monitoring the completion of capital improvements, as intended. The plant property work-in-progress account is designed to temporarily account for both nonmilitary equipment a Navy activity has paid for but not yet received and incomplete capital improvements to existing Navy-owned buildings. The Navy Comptroller Manual specifies that all plant property assets are to be recorded first in a work-in-progress account, with the balance then transferred to a plant property on-hand account within 2 months of in-transit property being received or 6 months of capital improvements being completed. First, we found the following instances where the Navy and DFAS were not properly recording plant property work-in-progress transactions in accordance with the Navy Comptroller Manual's requirements. The Naval Sea Systems Command and the Naval Air Systems Command miscoded disbursement transactions for nonmilitary equipment purchases by 75 Navy activities. As a result, the disbursements for these assets were recorded as neither plant property work-in-progress nor nonmilitary equipment but erroneously as expenditures for consumable items.
The plant property accounting staff at the Naval Submarine Base in Bangor, Washington, stated they were unaware of the requirement to, and thus did not, record incomplete capital improvements to existing buildings in the plant property work-in-progress account. As a result, for example, $290,000 relating to 22 garages being added to on-base housing had not been recorded in the base's plant property work-in-progress account. DAO-San Diego's computer system was not programmed to record construction on existing buildings in a Navy activity's plant property work-in-progress account. Thus, its work-in-progress account balance did not accumulate the correct data for these assets. When situations such as these occur, the Navy's financial reports are misstated. Further, the failure to properly use plant property work-in-progress accounts essentially circumvents an internal control feature designed to help ensure that nonmilitary equipment in-transit is received and to help monitor completion of capital improvement projects. Second, our analysis of the $291 million plant property work-in-progress reported on the Navy's fiscal year 1994 consolidated financial reports on general fund operations showed that about 73 percent, or $211.2 million, was related to five Navy activities. In at least the following two cases, the September 30, 1994, reported plant property work-in-progress account balances were questionable. The Naval Intelligence Command reported over $84 million in plant property work-in-progress, which is (1) an increase of more than 2,000 percent from the prior year and (2) inconsistent with the $370,000 account balance it reported for nonmilitary equipment and the $0 balance reported for other real property.
The Naval Criminal Investigative Service reported over $30 million in plant property work-in-progress, which is (1) an increase of more than 165 percent over the year before and (2) inconsistent with the Service's other reported plant property--about $400,000 in nonmilitary equipment. We discussed with officials of these activities the questionable nature of the amounts recorded for these accounts, which could have been identified by comparing year-to-year balances. They confirmed that these account balances were incorrect and said that the activities were attempting to resolve them. Further, our visits at other Navy activities identified additional instances where plant property work-in-progress accounts had grown substantially and resolving the large outstanding balances was a problem. Examples include the following: At the Fleet Combat Training Center-Atlantic, Virginia Beach, Virginia, the plant property work-in-progress account balance had been reported at about $29 million for 2 consecutive fiscal years ending with September 30, 1993, and had increased during the following 6 months to over $62 million. A concerted effort by the Center's civil engineering staff reduced this amount, but at September 30, 1994, over $34 million remained in the account. At the Tactical Training Group-Atlantic, Virginia Beach, Virginia, the plant property official said that resolving plant property work-in-progress was a problem. For instance, a persistent effort by the Center from November 1991 to September 1993, was necessary to fully resolve $3.5 million in transactions recorded in its plant property work-in-progress account as relating to land and buildings. The group owns no land or buildings and less than $200,000 in nonmilitary equipment. Plant property officials at other Navy activities--including those at the Naval Base in Norfolk, Virginia; the Naval Air Station in Millington, Tennessee; and the U.S. 
Naval Academy in Annapolis, Maryland--pointed to several factors contributing to problems such as these and making their resolution difficult. They told us, for example, that DAOs assign plant property work-in-progress to Navy activities when payments are made for such items. Quarterly plant property reports to Navy activities from the DAOs show amounts for all types of plant property, including work-in-progress. To identify items to be transferred to a plant property on-hand account, the activities are to match these reports with property received and construction completed. However, the detailed supporting records needed for this comparison, such as the disbursing vouchers the DAOs prepare, are often not available at the activity level. Also, they told us that large plant property work-in-progress account balances can result from data coding errors made by DAO disbursing personnel, causing in-transit property and incomplete construction to be recorded in the wrong activity's property records. These officials and DFAS accounting personnel said that errors can go undetected, and thus not be resolved, for years because, for instance, (1) they require a significant amount of time to identify and correct and are often given a low priority and (2) property accounting clerks lack training on resolving outstanding transactions. The Navy and DFAS maintain separate logistical, custodial, and accounting records for real property, which comprises more than a reported $17 billion in land, structures, and facilities. We found that information is entered separately into each of these three independently maintained sets of records. They are often not reconciled on a timely basis or, in some instances, never reconciled, resulting in undetected and uncorrected errors and unreliable financial information. The Naval Facilities Engineering Command (NAVFAC) maintains logistical records of real property located at all Navy activities. 
Because the commanding officer of each Navy activity is accountable for real property under his or her custody, each activity maintains real property custodial records. DFAS, through the DAOs, maintains the Navy's official real property accounting records. The Navy Comptroller Manual requires Navy activities to quarterly compare their real property custodial records with (1) official Navy accounting records and (2) NAVFAC logistical records. Any errors identified through these reconciliations are to be investigated and corrected. The Navy's consolidated financial reports on general fund operations at September 30, 1994, included $17.2 billion as the account balance for real property. This information was prepared using the Navy's official accounting records, which included the real property for 371 Navy activities. However, as of the same date, NAVFAC's logistical records included information on 406 general fund activities reporting $17.7 billion of real property. To determine the reasons for this difference, we reviewed the real property records at 10 activities that, for fiscal year 1994, had a total difference of $203 million between DFAS records and NAVFAC records. The following illustrates the types of errors identified at these activities. After the Boston Naval Shipyard was closed in the 1970s, NAVFAC removed the balance of the shipyard's real property accounts. However, DAO-Norfolk officials said they had not been notified of the shipyard's closing; thus, they had not removed the shipyard's $52 million in real property from DAO records. According to NAVFAC records, the Naval Training Center in Bainbridge, Maryland, had $37 million in land and buildings on-hand but under sales contract. However, Navy officials told us that this real property was excluded from the Navy's fiscal year 1994 financial reports because, before the sales contract was executed, DAO-Norfolk erroneously removed the activity from the list of reporting activities. 
Conversely, NAVFAC's records included $18.9 million for Bainbridge Training Center buildings that had been demolished. DAO and NAVFAC records were corrected when we advised officials of these errors. At DAO-Great Lakes, where the Navy's real property accounting records differed from NAVFAC logistic records by $124 million at September 30, 1994, plant property accounting staff did not demonstrate a basic understanding of Navy and DFAS plant property accounting and reconciliation procedures. In one case, for example, the DFAS staff said that a Navy activity did not tell them a difference existed. In another instance, we were told that a DFAS supervisor could not find property records to support an activity's reported plant property. Rather than contact the activity, the staff stopped reporting the property. Problems such as these are long-standing. In 1989, we recommended that the Navy's financial records and NAVFAC's central inventory of real property be reconciled to identify errors and help ensure accuracy. The Naval Audit Service has consistently reported similar problems in its audits of Navy DBOF financial statements under the CFO Act. For example, these audits found that the failure to reconcile Navy DBOF records and NAVFAC records resulted in a $134 million understatement of real property in Navy DBOF fiscal year 1992 financial statements. Differences were found between these records in fiscal years 1991 and 1994 as well. Most recently, in March 1996, we recommended that the Navy and DFAS place a high priority on implementing basic required financial controls, including reconciliations of accounts and records. The DOD Deputy Chief Financial Officer agreed with our recommendation and said that the DOD Comptroller's November 1995 guidance specifies the roles and responsibilities of DFAS and its customers with respect to reconciliations and resolution of discrepancies. 
Additionally, the September 1995 DFAS Director's guidance addresses DFAS's responsibility for performing reconciliations of account balances. The Navy's fiscal year 1994 accounting and reporting for plant property were highly unreliable. Accurately reporting the Navy's plant property account balance is especially important to help ensure the reliability of the consolidated financial statements DOD is statutorily required to prepare, beginning with those for fiscal year 1996. The recommendations we made in March 1996 were directed at avoiding the mistakes made in preparing the Navy's fiscal year 1994 consolidated financial reports and overarch many of the basic control weaknesses discussed in this report. These weaknesses underscore the need for the Navy and DFAS to fully and effectively implement the improvements that we recommended and that are required by the DOD Comptroller's and the DFAS Director's recent guidance. Additional specific actions are also necessary to improve plant property accounting and reporting. 
We recommend that the Navy Assistant Secretary for Financial Management and Comptroller and the DFAS Director require that by September 30, 1996, the Navy Comptroller Manual provision that lists the Navy's activities engaged in general fund operations and DBOF operations be updated and accurately maintained; the Navy and DFAS, Cleveland Center, use this listing as one analytical procedure to help ensure that the plant property account balances reported in the Navy's financial reports are complete and include information from only general fund activities; Navy activities and DFAS routinely monitor plant property work-in-progress accounts and promptly review and resolve large balances; Navy activities promptly request, and DFAS expeditiously provide, information to assist in transferring plant property work-in-progress items to on-hand accounts and in correcting errors; and Navy activities and DFAS personnel be trained to identify and resolve work-in-progress and other plant property problems. In written comments on a draft of this report, DOD generally concurred with our findings and recommendations. DOD said that groups have been established to identify and resolve issues involving the consistency of report information and establish and monitor a plan of action and milestones for improving property reporting and accounting. Also, DOD said that DFAS, Cleveland, has begun a training program for the plant property staff at various DAOs. DOD concurred with each of our recommendations and cited several planned corrective measures. 
For example, DOD said that improvements will be made to accurately maintain and periodically update information on all Navy activities that own plant property; develop a checklist to identify Navy and Marine Corps activities engaged in general fund operations, which will be used to help ensure that Navy reports provided to DFAS, Cleveland, are complete and include the appropriate general fund reporting activities; reiterate to all DFAS and Navy activities the policy on clearing work-in-progress accounts and ensure that work-in-progress information is promptly reconciled and recorded in DFAS financial records; and train plant property personnel, which has already begun at several DFAS locations. DOD concurred with two of our four findings. DOD partially concurred with two of the findings because it said that references were unclear for two figures cited in our draft report: (1) the 1,226 general fund activities shown in the Navy Comptroller Manual at the time of our review and (2) the $291 million plant property work-in-progress account balance. We provided a DFAS, Cleveland, representative with specific references in the Navy Comptroller Manual and the Navy's consolidated financial statements for fiscal year 1994 that we used as sources for these data. Also regarding our findings, DOD said that DFAS is emphasizing the need for internal and quality controls, such as identifying Navy and Marine Corps activities engaged in general fund operations. DOD also said that it is the goal of DFAS, the Navy, and the Marine Corps to develop and implement automated and integrated system interfaces for tracking work-in-progress accounts. Further, DOD said that the Navy recognizes that it should have removed property it no longer maintained from Navy records but had failed to do so. DOD said that most of its planned corrective actions will be accomplished within the next year and that many are planned to be completed by September 30, 1996. 
We believe that DOD's planned actions will fulfill the intent of our recommendations. Adhering to the projected completion schedule will help to improve the accuracy and completeness of the Navy's financial statements for general fund operations for fiscal year 1996 and subsequent fiscal years. The full text of DOD's comments is provided in appendix II. Our work was done as part of a broad-based review of various aspects of the Navy's financial management operations between August 1993 and February 1996 and was conducted in accordance with generally accepted government auditing standards. Our scope and methodology are discussed in appendix I and the locations where we conducted audit work are listed in appendix III. We are sending copies of this report to the Chairmen and the Ranking Minority Members of the Senate Committee on Governmental Affairs and the House Committee on Government Reform and Oversight, as well as its Subcommittee on Government Management, Information, and Technology. We are also sending copies to the Secretary of Defense, the Secretary of the Treasury, and the Director of the Office of Management and Budget. We will make copies available to others upon request. If you or your staffs have any questions, please contact me at (202) 512-9095. Major contributors to this report are listed in appendix IV. To gain an understanding of the systems and procedures used to account for and report on plant property, we reviewed applicable Navy Comptroller guidance, DOD and DFAS regulations, and instructions promulgated by Navy commands and activities. Also, we interviewed cognizant Navy, DFAS, and Treasury officials and discussed plant property management and reporting with cognizant Navy shore activity officials. 
To evaluate the DFAS, Cleveland Center's, process for compiling the Navy's plant property account balance, we obtained and analyzed the detailed schedules for the fiscal years 1993 and 1994 Navy plant property account balance reported by DFAS, Cleveland Center, and its DAOs. Specifically, we compared the number of Navy activities reporting general fund plant property to those listed in the Navy Comptroller Manual, volume 2, chapter 5; compared the account balance of each reporting activity for the 2 fiscal years to identify trends or fluctuations; and traced the reported account balance to the supporting documentation from the DAOs. We visited NAVFAC, Alexandria, Virginia, its Facilities Support Office in Port Hueneme, California, and its Southwest Engineering Field Division, San Diego, California, to examine how NAVFAC's central real property database (the Navy Facility Assets Data Base) works and interfaces with Navy activities and DAOs for reporting on land, facilities, and structures. We also visited the Naval Industrial Resources Support Activity in Philadelphia, Pennsylvania, to determine what property it reported to DFAS, Cleveland Center, for inclusion in the Navy's financial reports. To analyze the amounts reported by Navy for plant property work-in-progress, we obtained the plant property amounts reported for each activity by class--land, buildings, nonmilitary equipment, and work-in-progress. We contacted seven of the activities whose plant property work-in-progress amount appeared to be incorrect when compared with its other reported plant property amounts. At the activities we visited (see appendix III), we examined property accounting procedures and compliance with Navy Comptroller requirements, such as accounting for work-in-progress, reconciliations, and physical inventories. 
To compare and analyze the account balances and reporting activities among different sources of data that should agree, we obtained the consolidated financial report on general fund operations on real property as reported to DFAS, Cleveland Center, and compared it to NAVFAC's real property logistics records. For September 30, 1993 and 1994, we compared the detail of the reported account balances of land and facilities provided by DFAS, Cleveland Center, with those in NAVFAC's records to determine if they agreed. We did not verify the accuracy of the information in NAVFAC's database because, at the time of our work, the Naval Audit Service was reviewing the reasonableness of the database for estimating costs and savings resulting from base closure and realignment recommendations. In a February 1995 report, The Navy's Implementation of The 1995 Base Closure and Realignment Process, the Service said that the NAVFAC database was a reasonably accurate source of information for that purpose. We requested comments on a draft of this report from the Secretary of Defense or his designee. The DOD Deputy Chief Financial Officer provided us with written comments, which are discussed in the "Agency Comments and Our Evaluation" section and reprinted in appendix II. The following is GAO's comment on the Department of Defense letter dated June 14, 1996. 1. A representative of DFAS, Cleveland, contacted us regarding this figure and, on May 16, 1996, we provided additional information as to its source. DFAS, Cleveland, did not indicate that further clarification was necessary. Major contributors to this report: Pat L. Seaton, Catherine W. Arnold, Julianne Hartman Cutts, Karlin I. Richardson, and Patricia J. Rennie.
GAO reviewed the Navy's fiscal year (FY) 1994 consolidated financial reports, focusing on the areas contributing to the inaccurate financial reporting of the Navy's plant property account balance. GAO found that: (1) substantial weaknesses in the Navy's financial reporting systems caused the Navy to submit inaccurate FY 1994 financial reports; (2) the Defense Finance and Accounting Service (DFAS) erroneously counted $23.9 billion of structures and facilities and $700 million of land twice because it received the information from two separate sources and incorrectly included the information from both sources in the consolidated reports; (3) the Navy failed to ensure that all plant property from general fund activities was included in or that plant property from Defense Business Operations Fund (DBOF) activities was excluded from the reports because the list of general fund activities was outdated; (4) DFAS did not compare the activities included in the reports with the list of general fund activities when it consolidated the Navy's 1994 financial reports; (5) the Navy's reporting of the $291 million plant property work-in-progress balance was highly questionable because not all transactions were properly recorded, and Navy activities found it difficult to resolve in-transit property transactions; (6) the Navy did not
reconcile all of its logistics, custodial, and accounting records on a timely basis; and (7) the Navy and DFAS have taken actions to improve their internal controls, verify the accuracy and completeness of financial information, and reconcile plant property accounts.
In 2004, President George W. Bush announced his Vision for Space Exploration that included direction for NASA to pursue commercial opportunities for providing transportation and other services to support the space station after 2010. When the project was established in 2005, the approach that NASA laid out was a marked change in philosophy for how the agency planned to service the space station--by encouraging innovation in the private sector with the eventual goal of buying services at a reasonable price. As a result, the agency chose to utilize its other transaction authority under the National Aeronautics and Space Act of 1958, as opposed to a more traditional Federal Acquisition Regulation (FAR) based contract. Generally speaking, other transaction authority enhances the government's ability to acquire cutting-edge science and technology, in part through attracting companies that typically have not pursued government contracts because of the cost and impact of complying with government procurement requirements. These types of agreements are not considered federal government contracts, and are therefore generally not subject to those federal laws and regulations that apply to federal government contracts. NASA established the Commercial Crew and Cargo program office at Johnson Space Center in 2005 and budgeted $500 million for fiscal years 2006 through 2010 for the development and demonstration of cargo transport capabilities. COTS partners, Orbital Sciences Corporation (Orbital) and Space Exploration Technologies Corporation (SpaceX), have also made significant investments in developing these capabilities. The COTS project was originally intended to be executed in two sequential phases: (1) private industry development of cargo transport capabilities in coordination with NASA and (2) procurement of commercial resupply services to the space station once cargo transport capabilities had been successfully demonstrated. 
In August 2006, NASA competitively awarded a $278 million Space Act agreement to SpaceX to develop and demonstrate end-to-end transportation systems, including the development of the Falcon 9 launch vehicle and Dragon spacecraft, ground operations, and berthing with the space station. In February 2008, NASA awarded a $170 million Space Act agreement to Orbital to develop two COTS cargo capabilities, unpressurized and pressurized cargo delivery and disposal, to culminate in one demonstration flight of its Taurus II launch vehicle and Cygnus spacecraft. Before either partner had successfully demonstrated its COTS cargo transport capabilities, the International Space Station program office awarded two CRS contracts in December 2008 to Orbital and SpaceX under a separate competitive procurement from COTS. These FAR-based contracts were for the delivery of at least 40 metric tons (approximately 88,000 pounds) to the space station between 2010 and 2015. Orbital was awarded 8 cargo resupply missions for approximately $1.9 billion and SpaceX was awarded 12 cargo resupply missions for approximately $1.6 billion. In June 2009, we found that while SpaceX and Orbital had made progress against development milestones, the companies were working under aggressive schedules and had experienced schedule slips that delayed upcoming demonstration launch dates by several months. In addition, we reported that the vehicles being developed through the COTS project were essential to NASA's ability to fully utilize the space station after its assembly was completed and the space shuttle was retired. Finally, we found that NASA's management of the COTS project generally adhered to critical project management tools and activities. Since our 2009 report, the two COTS project partners, Orbital and SpaceX, have made progress in the development of their respective vehicles. 
SpaceX successfully flew its first COTS demonstration mission in December 2010 and Orbital is planning to fly its COTS demonstration mission in December 2011. Both providers, however, are behind schedule--SpaceX's first COTS demonstration mission slipped 18 months and Orbital's first mission was initially planned for March 2011. Such delays are not atypical of development efforts, especially efforts that are operating under such aggressive schedules. Nonetheless, the criticality of these vehicles to the space station's operations, as well as NASA's ability to affordably execute its science missions has heightened the importance of their timely and successful completion and lessened the level of risk that NASA is willing to accept in this regard. As a result, the project recently requested and received an additional $300 million to augment the partner development efforts with, according to NASA, risk reduction milestones. SpaceX has successfully completed 18 of 22 milestones to date, but has experienced lengthy delays in completing key milestones since we last reported on the company's progress in June 2009. SpaceX's agreement with NASA established 22 development milestones that SpaceX must complete in order to successfully demonstrate COTS cargo capabilities. SpaceX's first demonstration mission readiness review was completed 15 months behind schedule and its successful first demonstration mission was flown in December 2010, 18 months late. The company's second and third demonstration missions have been delayed by almost 2 years to November 2011 and January 2012, respectively. Several factors contributed to the delay in SpaceX's first demonstration mission readiness review and demonstration mission. 
These factors include, among others, delays associated with (1) launching the maiden Falcon 9 (non-COTS mission), such as Falcon 9 software and database development; (2) suppliers; (3) design instability and production; (4) Dragon spacecraft testing and software development; and (5) obtaining flight safety system approval. For example, SpaceX encountered welding issues during production of the Dragon propellant tanks and also had to redesign the Dragon's battery. In preparing for its second COTS demonstration flight, SpaceX has experienced additional design, development, and production delays. For example, several propulsion-related components needed to be redesigned, the Dragon spacecraft's navigation sensor experienced development testing delays, and delays were experienced with launch vehicle tank production. For example, SpaceX's decision to incorporate design changes to meet future CRS mission requirements has delayed the company's second demonstration mission. Integration challenges on the maiden Falcon 9 launch and the first COTS demonstration mission have also kept SpaceX engineers from moving on to the second COTS demonstration mission. SpaceX officials cited the completion of Dragon development efforts, NASA's safety verification process associated with berthing with the space station, and transitioning into efficient production of the Falcon 9 and Dragon to support space station resupply missions as key drivers of technical and schedule risk going forward. For completing 18 of the 22 milestones, SpaceX has received $258 million in milestone payments thus far, with $20 million yet to be paid. Appendix I describes SpaceX's progress meeting the COTS development milestones in its agreement with NASA. Orbital has successfully completed 15 of 19 COTS milestones to date--8 more than when we initially reported on the program in June 2009. 
Programmatic changes and developmental difficulties, however, have led to multiple delays of several months' duration and further delays are projected for completing the remaining milestones. For example, according to Orbital officials, the demonstration mission of Orbital's Taurus II launch vehicle and Cygnus spacecraft has been delayed primarily due to an increase in design effort to develop a pressurized cargo carrier in place of the original Cygnus unpressurized cargo design. After NASA awarded Orbital a CRS contract for eight pressurized cargo missions, NASA and Orbital amended their COTS demonstration agreement to replace the unpressurized cargo demonstration mission with a pressurized cargo demonstration. This delayed existing milestones, and the schedule was revised to shift the COTS demonstration mission from December 2010 to March 2011. Since that time, the schedule for some of Orbital's milestones has been revised again and the demonstration mission is now planned for December 2011. COTS program and Orbital officials also noted technical challenges as reasons for milestone delays. For example, Orbital officials said there are several critical Taurus II engine and stage one system tests that need to be completed by the end of the summer, but that the risk inherent in these tests is mitigated through an incremental approach to testing. Specifically, single engine testing has been successfully completed, and testing will be extended this summer to the full stage one (i.e., two-engine) testing. COTS program and Orbital officials also noted delays in Cygnus avionics manufacturing, primarily driven by design modifications aimed at increasing the safety and robustness of the system. According to these officials, integration and assembly of the first Cygnus spacecraft has begun and is now in the initial electrical testing phase. 
Additionally, the completion of the company's launch facilities at the Mid- Atlantic Regional Space Port in Wallops Island, Virginia, remains the key component of program risk. NASA COTS program and Orbital officials cite completion of the Wallops Island launch facilities as the critical factor for meeting the COTS demonstration mission schedule. Orbital officials said additional resources have been allocated to development of the launch complex to mitigate further slips, and an around-the-clock schedule will be initiated later this summer to expedite the completion of verification testing of the liquid fueling facility, which is the primary risk factor in completing the launch facility. For completing 15 of the 19 milestones, Orbital has received $157.5 million, with $12.5 million remaining to be paid. Appendix I depicts Orbital's progress in meeting the COTS development milestones in its agreement with NASA. In addition to the prior milestones negotiated under the COTS project, NASA has amended its agreements with SpaceX and Orbital to include a number of additional milestones aimed at reducing remaining developmental and schedule risks. COTS officials told us that some milestones reflect basic risk reduction measures, such as thermal vacuum testing, that NASA would normally require on launch vehicle or spacecraft development. A series of amendments were negotiated from December 2010 to May 2011 after Congress authorized $300 million for commercial cargo efforts in fiscal year 2011. These amendments add milestones to (1) augment ground and flight testing, (2) accelerate development of enhanced cargo capabilities, or (3) further develop the ground infrastructure needed for commercial cargo capabilities. These milestones were added incrementally due to NASA operating under continuing resolutions through the first half of fiscal year 2011. In May 2009, the President established a Review of U.S. 
Human Space Flight Plans Committee composed of space industry experts, former astronauts, government officials, and academics. In its report, the committee stated that it was concerned that the space station, and particularly its utilization, may be at risk after Shuttle retirement as NASA would be reliant on a combination of new international vehicles and as- yet-unproven U.S. commercial vehicles for cargo transport. The committee concluded that it might be prudent to strengthen the incentives to the commercial providers to meet the schedule milestones. NASA officials stated that if funding were available, negotiating additional, risk reduction milestones would improve the chance of mission success, referring specifically to the companies' COTS demonstration missions. Of the $300 million, $236 million, divided equally between SpaceX and Orbital, will be paid upon completion of the additional milestones. Additionally, NASA officials stated the International Space Station program office will pay SpaceX and Orbital $10 million each to fund early cargo delivery to the space station on the companies' final COTS demonstration missions. The COTS program manager stated that SpaceX and Orbital recognize their responsibility under the COTS agreements for any cost overruns associated with their development efforts, and that the companies did not come to NASA with a request for additional funding. SpaceX has completed 4 of its new milestones on time but has experienced minor delays in completing 3 others. SpaceX's agreement with NASA was amended three times between December 2010 and May 2011 to add 18 development milestones that SpaceX must complete in order to successfully demonstrate COTS cargo capabilities. 
Some of the new milestones, for example, are designed to increase NASA's confidence that SpaceX's Dragon spacecraft will successfully fly approach trajectories to the space station while others are intended to improve engine acceptance rates and vehicle production time frames. Milestones completed thus far include a test of the spacecraft's navigation sensor and thermal vacuum tests. For completing 7 of the 18 milestones, SpaceX has received $40 million in milestone payments thus far, with $78 million yet to be paid. Orbital has completed 4 of its 10 new milestones on schedule and 1 of the new milestones was delayed by about 1 month. In concurrence with NASA's request, Orbital agreed to add an initial flight test of the Taurus II launch vehicle to reduce overall cargo service risk. The test flight not only separates the risks of the first flight of Taurus II from the risks of the first flight of the Cygnus spacecraft, but provides the opportunity to measure the Taurus II flight environments using an instrumented Cygnus mass simulator. The Taurus II test flight is scheduled for October 2011. Overall technical risks associated with Cygnus development are expected to be reduced through additional software and avionics tests. Milestones completed thus far include early mission analyses and reviews, as well as delivery of mission hardware. For completing the first 5 new milestones, Orbital has received $69 million, with $49 million remaining to be paid. Appendix I describes SpaceX's and Orbital's progress meeting the new COTS development milestones in their agreements with NASA. Based on the current launch dates for SpaceX's and Orbital's upcoming COTS demonstration missions, it is likely that both commercial partners will not launch their initial CRS missions on time, but NASA has taken steps to mitigate the short-term impact to the space station. 
The launch window for SpaceX's first CRS flight is from April to June 2011 and from October to December 2011 for its second CRS flight. These launch windows are either scheduled to occur before or during SpaceX's upcoming COTS demonstration flights and thus will need to be rescheduled. In the case of Orbital, NASA officials told us that the launch window for its first CRS flight is from January to March 2012, but will likely slip from those dates given the Taurus II test flight added to its milestone schedule. NASA officials added that once SpaceX and Orbital have finished completing their COTS demonstration flights, NASA will have to renegotiate the number of flights needed from each partner and re- baseline the launch windows for future CRS missions. International Space Station program officials told us they have taken steps to mitigate the short-term impact of CRS flight delays through prepositioning of cargo on the last space shuttle flights, including cargo that is being launched on the planned contingency space shuttle flight in early July 2011. Officials added that these flights and the planned European Space Agency's Automated Transfer Vehicle and Japan's H-II Transfer Vehicle flights in 2012 will carry enough cargo to sustain the six person space station crew through 2012 and to meet science-related cargo needs through most of 2012. Despite these steps, NASA officials said they would still need one flight each from SpaceX's and Orbital's vehicles in order to meet science-related cargo needs in 2012. Beyond 2012, NASA is highly dependent on SpaceX's and Orbital's vehicles in order to fully utilize the space station. For example, we reported in April 2011 that 29 percent of the flights planned to support space station operations through 2020 were dependent on those vehicles. 
In addition, NASA officials confirmed that the agency has no plans to purchase additional cargo flights on Russian Progress vehicles beyond 2011 and the European Space Agency and the Japan Aerospace Exploration Agency have no current plans to manufacture additional vehicles beyond their existing commitments or to accelerate production of planned vehicles. We reported previously that if the COTS vehicles are delayed, NASA officials said they would pursue a course of "graceful degradation" of the space station until conditions improve. In such conditions, the space station would only conduct minimal science experiments. NASA's intended use of the COTS Space Act agreements was to stimulate the space industry rather than acquiring goods and services for its direct use. Traditional FAR contracts are to be used when NASA is procuring something for the government's direct benefit. NASA policy provides that funded Space Act agreements can only be used if no other instrument, such as a traditional FAR contract, can be used. Therefore, Space Act agreements and FAR-based contracts are to be used for different purposes. In considering the use of funded Space Act agreements for COTS, NASA identified several advantages. For example: The government can share costs with the agreement partner with fixed government investment. Payment to partner is made only after successful completion of performance-based milestones. The government can terminate the agreement if the partner is not reasonably meeting milestones. Limited government requirements allow optimization of systems to meet company's commercial business needs. These types of agreements can also have disadvantages, however. For example, Space Act agreements may have more limited options for oversight as compared to other science mission and human spaceflight development efforts that are accomplished under more traditional FAR contracts. NASA identified other disadvantages of using a Space Act agreement. 
For example: The government has limited ability to influence agreement partners in their approach. The government lacks additional management tools (beyond performance payments at milestones) to incentivize partners to meet technical and schedule performance. Given the intended goals of the project and the availability of alternative vehicles to deliver goods to the space station when the COTS agreements were signed, NASA was willing to accept the risks associated with the disadvantages of using a Space Act agreement. As the project has progressed, however, and these alternatives are no longer viable or available, NASA has become less willing to accept the risks involved. As a result, the agency took steps aimed at risk mitigation, primarily through additional funding. I would like to point out that neither Space Act agreements nor more traditional FAR contracts guarantee positive outcomes. Further, many of the advantages and disadvantages identified by NASA for using a Space Act agreement can also be present when using FAR-based contracts, depending on how the instrument is managed or written. For example, both a FAR contract and a Space Act agreement can provide for cost sharing and the government also has the ability to terminate a FAR contract or a Space Act agreement if it is dissatisfied with performance. The ineffective management of the instrument can be an important contributor to poor outcomes. For example, although a Space Act agreement may lack management tools to incentivize partners, we have reported in the past that award fees, which are intended to incentivize performance on FAR-based contracts, are not always applied in an effective manner or even tied to outcomes. Additionally, the oversight that NASA conducts under a FAR-based contract has not always been used effectively to ensure that projects meet cost and schedule baselines. 
Even with the advantages and disadvantages that can be present in various instruments, given a critical need, the government bears the risk for having to make additional investments to get what it wants, when it wants it. The additional investment required, however, can be lessened by ensuring that accurate knowledge about requirements, cost, schedule, and risks is achieved early on. We have reported for years that disciplined processes are key to ensuring that what is being proposed can actually be accomplished within the constraints that bind the project, whether they are cost, schedule, technical, or any other number of constraints. We have made recommendations to NASA and NASA is taking steps to address these recommendations to help ensure that these fundamentals are present in its major development efforts to increase the likelihood of success. Mr. Chairman, this concludes my prepared statement. I would be happy to respond to any questions you may have at this time. For questions about this statement, please contact me at (202) 512-4841 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this testimony. Individuals making key contributions to this statement include Shelby S. Oakley, Assistant Director; Jeff Hartnett; Andrew Redd; Megan Porter; Laura Greifner; and Alyssa Weir. This is a work of the U.S. government and is not subject to copyright protection in the United States. The published product may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. 
| Since the National Aeronautics and Space Administration (NASA) created the strategy for the Commercial Orbital Transportation Services (COTS) project in 2005, the space landscape has changed significantly--the Space Shuttle program is retiring and the Ares I will not be available--increasing the importance of the timely development of COTS vehicles. The lack of alternatives for supplying the International Space Station and launching science missions have all contributed to an increased need for the COTS vehicles. The two COTS project partners, Orbital and SpaceX, have made progress in the development of their respective vehicles; however, both providers are behind schedule. As a result, the project recently received an additional $300 million to augment development efforts with risk reduction milestones. This testimony focuses on: (1) COTS development activities, including the recent funding increase; (2) the extent to which any COTS demonstration delays have affected commercial resupply services (CRS) missions and NASA's plans for meeting the space station's cargo resupply needs; and (3) lessons learned from NASA's acquisition approach for COTS. To prepare this statement, GAO used its prior relevant work and conducted additional audit work, such as analyzing each partner's agreement with NASA and interviewing NASA officials. New data in this statement was discussed with agency and company officials who provided technical comments, which we included as appropriate. SpaceX and Orbital continue to make progress completing milestones under their COTS agreements with NASA, but both partners are working under aggressive schedules and have experienced delays in completing demonstration missions. 
SpaceX successfully flew its first demonstration mission in December 2010, but the mission was 18 months late and the company's second and third demonstration missions have been delayed by almost 2 years due to design, development, and production challenges with the Dragon spacecraft and Falcon 9 launch vehicle. Orbital faced technical challenges developing the Taurus II launch vehicle and the Cygnus spacecraft and in constructing launch facilities, leading to multiple delays in completing program milestones, including its demonstration mission. NASA has amended its agreements with the partners to include a number of new milestones, such as additional ground and flight tests, to reduce remaining developmental and schedule risks; most of the new milestones completed thus far were finished on time, but many milestones remain. Based on the current launch dates for SpaceX's and Orbital's upcoming COTS demonstration missions, it is likely that neither will launch its initial CRS mission on time, but NASA has taken steps to mitigate the short-term impact to the space station. The launch windows for SpaceX's first and second CRS flights are scheduled to occur either before or during its upcoming COTS demonstration flights and will need to be rescheduled. Orbital's first CRS flight will also likely shift due to a Taurus II test flight. NASA officials said that the agency will have to renegotiate the number of flights needed from each partner and re-baseline the launch windows for future CRS missions once COTS demonstration flights are completed. NASA has taken steps to mitigate the short-term impact of CRS delays through prepositioning of cargo, some of which will be delivered on the last space shuttle flight. Despite these efforts, NASA officials said they would still need one flight in 2012 from SpaceX's and Orbital's vehicles to meet science-related cargo needs. In considering the use of a Space Act agreement for COTS, NASA identified several advantages. 
These advantages include sharing costs with agreement partners and promoting innovation in the private sector. A disadvantage, however, is that NASA is limited in its ability to influence agreement partners in their approach. At the time the agreements were awarded, NASA was willing to accept the risks of using a Space Act agreement given the goals of the project and alternative vehicles that were available to deliver goods to the space station. As the project has progressed, however, and these alternatives are no longer viable or available, NASA has become less willing to accept the risk involved and has taken steps aimed at risk mitigation. Given a critical need, the risk is present that the government will be required to make additional investments to meet mission needs. The amount of investment can be lessened by ensuring that accurate knowledge about requirements, cost, schedule, and risks is achieved early on. GAO has made recommendations to NASA and NASA is taking steps to help ensure that these fundamentals are present in its major development efforts to increase the likelihood of success. | 4,323 | 947 |
Under authority of the Inspector General Act of 1978, the Defense Criminal Investigative Service (DCIS) and the military criminal investigative organization within each of the services investigate alleged procurement fraud. NCIS has primary responsibility for investigating alleged procurement fraud affecting the Navy. Within the Department of Justice, the Federal Bureau of Investigation (FBI) investigates fraud. Each of these investigating agencies provides evidence to support the prosecuting authorities. Between January 1989 and July 1996, NCIS agents participated in over 114,000 criminal investigations. In March 1997, 113 NCIS fraud agents were involved in the investigation of 811 cases for crimes such as antitrust violations, cost mischarging, product substitution, and computer intrusion. Although NCIS agents generally investigate procurement fraud cases independently, investigative jurisdiction in 320 of the 811 cases, or about 39 percent, was shared with DCIS, FBI, and other military or civilian criminal investigative organizations. Agents interview individuals to obtain evidence in criminal investigations. An interview is the formal questioning of an individual who either has or is believed to have information relevant to an investigation. Interviews are normally conducted with willing witnesses and informants. An interrogation is a special type of interview with the added purpose of securing an admission or confession of guilt regarding commission of, or participation in, the crime or obtaining pertinent knowledge of the crime. Interrogations are normally conducted with suspects or unwilling witnesses. According to NCIS officials, most testimonial evidence in fraud cases is acquired through interviews; however, policies covering areas such as agent demeanor and the display of weapons are the same whether the format of questioning is an interview or an interrogation. 
Over the years, allegations have been made regarding the use of inappropriate interview techniques by NCIS agents when questioning suspects and witnesses. In January 1995, a Department of Defense (DOD) advisory board, commissioned by the Secretary of Defense to review criminal investigations within the agency, reported that it had heard complaints of abusive interview techniques by NCIS agents. In its report, the advisory board noted that several defense attorneys suggested that subjects should be provided with additional protection against potential abuses by requiring that all interviews be videotaped. NCIS interview policies are consistent with those of both DCIS and FBI. Generally, policies of all three agencies seek to ensure that interviews of witnesses and suspects are done in a professional manner without the use of duress, force, and physical or mental abuse. More specifically, these policies prohibit agents from making promises or threats to gain cooperation; using deceit, which courts could view as overcoming an interviewee's free will; or indiscriminately displaying weapons. A detailed comparison of the policies is in appendix I. To ensure that constitutional rights are not violated, NCIS, DCIS, and FBI policies elaborate on the rights of individuals as witnesses and suspects and provide guidance and direction to agents. For example, NCIS policies emphasize that both military and civilian suspects must be informed that they have a right to remain silent and to consult with an attorney and that any statement made may be used against them. In addition, NCIS policies address an individual's right to have counsel present and to terminate the interview at any time. Under 10 U.S.C. 1585 and DOD Directive 5210.56, civilian officers and DOD employees may carry firearms while on assigned investigative duties. NCIS and DCIS policies authorize agents, unless otherwise prohibited, to carry firearms when conducting criminal investigations. 
FBI policies also require agents to be armed when on official duty. NCIS, DCIS, and FBI policies do not specifically prohibit carrying firearms during interviews. NCIS agents told us that they usually carry weapons during interviews because of the organization's policy requiring that firearms be carried when conducting criminal investigations. However, NCIS policy states that agents should avoid any unnecessary reference to the fact that they are carrying a firearm. In March 1996 correspondence to all NCIS agents, NCIS Headquarters noted that references to the carrying of a firearm include not only verbal, but also physical references, including display of the firearm. DCIS and FBI policies also prohibit the careless display of firearms in public. NCIS policy states that, unless unusual conditions prevail, an agent should not be armed during an interrogation and that the presence of two agents is preferable. NCIS fraud agents told us that, unlike witness interviews, which are typically held at a home or place of employment, formal interrogations of suspects in general crime cases are usually held in a controlled environment in an NCIS field office or a custodial environment, such as a jail. Procurement fraud investigations are usually very long, the target of the investigation is known early in the investigation and has normally obtained legal counsel, and an Assistant U.S. Attorney communicates directly with the suspect's counsel. Interrogations in procurement fraud cases are rare due to the nature of the investigation. NCIS, DCIS, and FBI policies also address agent ethics, conduct, and demeanor during interviews. For example, NCIS policy states that interviews should be conducted in a business-like manner. DCIS policy likewise notes that, when conducting an interview, the agent should maintain a professional demeanor at all times and protect the rights of persons involved in a case, as well as protect himself or herself from allegations of misconduct. 
The FBI has similar policies regarding agent conduct and demeanor during interviews. NCIS requires an investigation of allegations of agent misconduct. Between January 1989 and July 1996, the NCIS Office of Inspections investigated 304 allegations against agents. However, only 10 cases involved agent conduct during the interview process, and none involved cases of procurement fraud. Corrective actions, ranging from required counseling to job termination, were taken against NCIS agents in the six cases that were substantiated. DOD and NCIS have also established controls to protect individual rights and act as deterrents to inappropriate agent conduct during interviews. These controls include basic and continued agent training; a field office inspection program; and DOD Inspector General oversight of NCIS investigations, including alleged misconduct by agents. The judicial review inherent in the legal process also acts as a deterrent to inappropriate agent behavior. NCIS agents receive considerable training on interview techniques and appropriate interview behavior. At the basic agent course given at the Federal Law Enforcement Training Center, NCIS agents receive 18 hours of instruction concerning interviewing techniques. During their first 24 months with the agency, agents are exposed to a wide range of general crime investigations as they work with and are evaluated by more experienced agents. After the first 24-month period, selected agents are given the opportunity to specialize in procurement fraud investigations. Additional procurement fraud-specific training, both internal and external, and additional interview training is given throughout an agent's career. The internal and external training is supplemented by correspondence issued periodically to agents on various subjects, including interviewing techniques, updates on policy or procedural changes as a result of court cases, or lessons learned from completed investigations. 
The 23 dedicated fraud agents we interviewed at NCIS field offices in Los Angeles and Washington, D.C., had been with NCIS for an average of 12 years and had worked in the fraud area for an average of 6-1/2 years. NCIS conducts regular operational inspections of headquarters and field locations. Two objectives of the inspections are to assess compliance with established policies and procedures and to evaluate anomalies that prevent or inhibit compliance. NCIS guidelines require that these inspections include interviews with all agents and supervisors and a review of all ongoing case files and correspondence. In addition, inspections may include interviews with selected Assistant U.S. Attorneys, military prosecutors, and managers and agents of other federal criminal investigative agencies with whom NCIS agents work. Within 45 days of receipt of the inspection report, the special agent-in-charge of the field location is to report on actions taken, in progress, or proposed to correct all recommendations made during the inspection. Between January 1992 and December 1996, NCIS conducted 45 of these inspections. Our review of the reports for all 11 inspections conducted during the 3-year period ending December 1996 found no indications of problems with agent conduct regarding interviews. The Inspector General Act of 1978 gives the DOD Inspector General the responsibility for oversight of investigations performed by the military criminal investigative organizations, including NCIS. During the last 4 years, the DOD Inspector General completed oversight reviews of 29 NCIS cases involving allegations of misconduct against 11 NCIS agents. The Inspector General determined that none of these allegations were substantiated. In April 1996, the Secretary of Defense requested that the DOD Inspector General look into allegations of NCIS agent misconduct during a 4-year procurement fraud investigation that ended in acquittal of the two defendants in early 1995. 
At the time of our review, the inquiry into these allegations had not been completed. U.S. Attorneys and other prosecuting authorities rely on the results of NCIS investigations to be upheld in the courts. Under rights afforded under the fifth amendment to the U.S. Constitution and Article 31 of the Uniform Code of Military Justice, evidence acquired in violation of the rights of the accused can be inadmissible. Defendants and their attorneys have the right to petition the courts to suppress or exclude any evidence not legally obtained. In addition, civilian witnesses and suspects can bring civil suits against agents if they believe their rights have been violated or laws have been broken. According to the Navy's General Counsel, once a case is accepted for prosecution in federal court, the Assistant U.S. Attorney assumes responsibility for the investigation and determines the need for further investigation, the witnesses who will be interviewed, and the timetable for referring the case to the grand jury for indictment. Thus, the Assistant U.S. Attorney closely monitors the information obtained for its admissibility. We interviewed nine Assistant U.S. Attorneys, all of whom had many years of experience in working with NCIS agents. They characterized the NCIS agents as professional and could not recall any instances in which evidence was suppressed or cases were negatively impacted as a result of misconduct by NCIS agents during interviews. Some of the attorneys said they had attended interviews with NCIS fraud agents and observed nothing that was out of line. NCIS, DCIS, and FBI policies permit audio or video recordings of witness or suspect interviews in significant or controversial cases. However, little support exists for routine taping of interviews, except in particular kinds of cases. In fiscal year 1996, NCIS agents videotaped 56 interviews and 23 interrogations, 51 (or 65 percent) of which involved child abuse cases. 
Most of the remaining videotapings involved cases of assaults, homicides, and rapes. NCIS fraud agents said that they audiotape very few interviews. Neither DOD nor the Department of Justice favor routinely audio- or videotaping interviews. Both agencies believe that such a practice would not improve the quality of investigations or court proceedings and that the resources necessary to institute such a practice could be better used elsewhere. In its 1995 report, DOD's advisory board recognized that routine videotaping of interviews is a topic of debate within the law enforcement community. However, the board concluded that videotaping was unnecessary in all cases since its study found no widespread abuse of subjects' rights, but it might be advisable under some circumstances. The Navy's General Counsel, NCIS agents, and the Assistant U.S. Attorneys we spoke with expressed concern regarding the routine recording of interviews. They consider routine recording to be unnecessary because the courts do not require it; the practice would take time better used for more productive activities; and, given the large volume of cases, such recordings would be cost-prohibitive and add little value to the process. The Assistant U.S. Attorneys stressed that grand jury hearings and court proceedings are the most appropriate places to obtain testimonial evidence, since witnesses are under oath. NCIS agents and the Assistant U.S. Attorneys we spoke with favored the current NCIS policy of interviews being taped only when a specific reason exists for doing so. The attorneys favored recording interviews of small children in child abuse cases to preclude multiple interviews and possibly the need for the children to appear in court. The agents and attorneys also favored recording witnesses who were likely to be unavailable during court proceedings and those that might be expected to change their story. 
Officials told us that an NCIS pilot test of videotaping all interviews in the early 1970s did not support routine use because (1) the agents found that they were devoting disproportionate time and energy to the care of equipment rather than gathering facts; (2) the number and breadth of interviews declined, as did the overall quality of investigations; and (3) investigators' productivity decreased due to their inability to conduct a sufficient number of in-depth interviews. NCIS had not computed the additional cost of taping all interviews. However, the Navy's General Counsel noted that the expense of equipment, tapes, transcription, and duplication would be extremely high and could only be justified if no safeguards were already built into the legal system. As an example of the potential transcription cost that could be incurred, we were told that, in one case that was recorded, the interview lasted about 3 hours, filled 4 microcassettes, and ended up being 127 single-spaced typed pages. Information provided by the NCIS Los Angeles field office, one of the larger offices for procurement fraud cases, showed that about 7,600 interviews had been completed for the 117 cases assigned as of January 1997, which translates to an average of about 65 interviews per case. According to officials of the NCIS Washington, D.C., field office, 16 major procurement fraud cases that were essentially completed and awaiting some type of disposition had required 628 interviews--an average of about 39 interviews per case. NCIS closed 533 procurement fraud cases in fiscal year 1995 and 534 in fiscal year 1996. A 1990 study commissioned by the Department of Justice sought to determine the use of audio- and videotaping of interrogations by police and sheriff departments nationwide. 
The study concluded that videotaping was a useful tool and that one-third of the departments serving populations of 50,000 or more videotaped suspect interrogations and confessions in cases involving violent crime. The benefits claimed by the departments that taped interrogations and confessions included (1) better interrogations because agents prepared more extensively beforehand, (2) easier establishment of guilt or innocence by prosecutors, and (3) increased protection of subjects' rights against police misconduct. Local prosecutors tended to favor videotaping, but defense attorneys had mixed feelings. NCIS has no written policy that specifically addresses whether recordings or written transcriptions of interviews should be made available on demand to the subject or witness. However, NCIS, DCIS, and FBI policies regarding witness statements and confessions do not prohibit copies from being given to the individual making the statement. Also, a 1993 NCIS memorandum said that all witness statements must be provided to the defense counsel and that quotes from a witness are to be considered witness statements. The Assistant U.S. Attorneys we spoke with and NCIS officials believe that written transcripts of audio or video recordings, especially those taken during the early stages of an investigation, would not necessarily reflect all the known facts and might be misleading and subject to inappropriate use. Currently, interview writeups are not provided to witnesses or suspects for their review, since they are considered a summary of the interview results from the agent's perspective. According to the Navy's General Counsel, much of the information in interview writeups is likely to be irrelevant to the case after the issues are narrowed. 
This official also said that the potential increase in the accuracy of individual interviews would not contribute as much to the total accuracy of an investigation as verifying or disproving the information provided in initial interviews. DOD and the Department of Justice reviewed a draft of this report. The Department of Justice provided informal comments, which we incorporated as appropriate. DOD concurred with our findings. We interviewed officials responsible for fraud investigations at NCIS, DCIS, and FBI headquarters to identify policies and procedures relating to interviewing suspects and witnesses. We focused on the policies and procedures concerning agent conduct and demeanor, the carrying and display of weapons during interviews, and use of audio- and videotaping. To document actual NCIS interview practices, we interviewed fraud case supervisors and agents at the two NCIS field offices responsible for the highest number of closed procurement fraud investigations in fiscal years 1995 and 1996--Los Angeles and Washington, D.C. To determine whether NCIS policies are in line with generally accepted federal law enforcement standards, we compared NCIS interview policies, especially with regard to agent conduct and demeanor and the carrying and display of weapons, with those of DCIS and FBI--two of the larger federal law enforcement agencies involved in procurement fraud investigations. We also reviewed the Federal Law Enforcement Training Center's and NCIS internal training curriculum on interviews. In addition, we reviewed agent training records and discussed interview training with instructors at the Federal Law Enforcement Training Center and NCIS fraud supervisors and agents. To address agent adherence to guidance and identify controls in place to deter inappropriate agent conduct and demeanor during interviews, we interviewed NCIS headquarters officials and the Navy's General Counsel. 
Through discussions and document reviews, we compared these controls with those of DCIS and FBI. We reviewed cases of alleged agent misconduct investigated internally by NCIS' Office of Inspections and externally by the DOD Inspector General. We also reviewed and documented the results of the 11 operational inspections of NCIS field offices conducted since January 1994. In addition, we reviewed summaries of all NCIS procurement fraud cases closed during fiscal years 1995 and 1996. Regarding oversight of NCIS, we interviewed DOD Inspector General officials responsible for the oversight of NCIS investigative activities and examined cases of alleged NCIS agent misconduct that received oversight by the DOD Inspector General. We also reviewed documents regarding Navy policies and interviewed the Navy's General Counsel and the Navy's Principal Deputy General Counsel. The Assistant U.S. Attorneys we spoke with provided us with insight regarding the adequacy of policies and laws dealing with subject and witness interviews and the performance of NCIS agent interviewing practices, especially with regard to impact on the prosecution of procurement fraud cases. We discussed with NCIS and DCIS managers, NCIS agents, and Assistant U.S. Attorneys, the use of audio and video equipment to tape interviews and the desirability and feasibility of providing the transcripts to witnesses and subjects. We obtained the official positions of the Department of Justice and NCIS regarding these issues. We identified two studies that addressed using audio- and videotaping for recording interviews and discussed these issues with the studies' authors. We also discussed these issues with homicide detectives from one city police department that uses video equipment in interrogations. In addition, we discussed with appropriate DOD and Department of Justice officials any legal and practical ramifications of interviews being taped and transcriptions being provided to witnesses and suspects. 
We performed our work from July 1996 to March 1997 in accordance with generally accepted government auditing standards. We are sending copies of this report to other interested congressional committees; the Secretaries of Defense and the Navy; the General Counsel of the Navy; the Director of the Naval Criminal Investigative Service; and the Attorney General. Copies will also be made available to others on request. Please contact me at (202) 512-5140 if you or your staff have any questions concerning this report. Major contributors to this report are William E. Beusse, Hugh E. Brady, Kenneth Feng, Mark Speight, and Harry Taylor. Agents are required to carry firearms while on assigned investigative duties. Agents must carry firearms when conducting criminal investigations, except where prohibited or when carrying a firearm is inappropriate. Agents must be armed when on official duty, unless good judgment dictates otherwise. They are authorized to be armed anytime. Any unnecessary reference to the fact that an agent has a firearm on his or her person should be avoided. An agent should not be armed during an interrogation unless unusual conditions prevail. It is better to have two agents present than to be armed. Normally, agents may be armed during interviews because the policy requiring them to be armed while on investigative duties prevails. Area is not specifically addressed, but unnecessary display of firearms, which may heighten the sensitivity of non-law enforcement personnel, is prohibited. In addition, careless display of firearms in public is prohibited. Area is not specifically addressed, but unnecessary display of weapons in public is prohibited. Good judgment must be exercised in all situations. Military suspects must not be interrogated without having first been given the prescribed warning. 
For civilian suspects, Miranda warnings are applicable in custodial situations, and informing individuals of their right to terminate the interview at any time is required. In addition to the obligation to give the suspect the required warnings, agents are required to be familiar with civil and criminal laws and the Uniform Code of Military Justice so they can recognize an incriminating statement. In addition to the obligation to give the suspect the required warnings, the policies state that the suspect must be advised of the names and official identities of the interviewing agents and the nature of the inquiry. It is desirable that the suspects acknowledgement of the warnings be obtained in writing. Agents do not have the authority to make any promises or suggestions of leniency or more severe action to induce a suspect to make a statement. Agents must refrain from making or implying promises of benefits or rewards or threats of punishments to unlawfully influence the suspect. No attempt is to be made to obtain a statement by force, threats, or promises. Whether a suspect will cooperate is left entirely to the individual. The policies take into account that the court will decide whether the interrogation practices overpowered the accused's ability of self-determination. (continued) Although tricks or other tactics may not be used to prevent a suspect from exercising constitutional rights, once a suspect makes a valid waiver of rights, deceptions are allowable as long as they are not used to obtain an untrue confession. Playing one suspect against another is an allowable interrogation technique. However, agents must ensure that information developed conforms to rules regarding admissibility of evidence and that the rights of persons involved in a case are protected. The presence of trickery, ruse, or deception will not necessarily make a statement involuntary. 
The courts consider a number of factors in making this determination, including whether the statement resulted from a free and unconstrained choice or from interrogation practices that overpowered the individual's ability of self-determination. Interrogations should be conducted in a business-like and humane manner. Legal restrictions are based on the premise that a person will make false statements to stop any physical or mental discomfort. Agents should be friendly and business-like and maintain a professional demeanor at all times. Agents should also be receptive and sympathetic. Policies prohibit any tactics that may be considered coercive by courts, stressing that tactics that overpower a suspect's ability of self-determination should not be used. Recommended for interviews considered to be potentially significant or controversial but only with the knowledge and concurrence of the interviewee. Recommended for compelling situations with approval from the interviewee, the head of the DCIS field office, and prosecutor. Authorized on a limited, selective basis with approval of the special agent-in-charge and consent of the interviewee. In addition, recording equipment must be in plain view of the interviewee, tapes must not be edited or altered, and the chain of custody must be ensured. No policy. When the individual making a statement asks for a copy, one will be provided. However, prior approval for doing so must be obtained from the cognizant U.S. Attorney or military Staff Judge Advocate, as appropriate. Agents should not volunteer to furnish a copy of a confession or signed or unsigned statement to the subjects or their attorneys. However, if the confession or statement is requested and certain conditions are met, it should be provided. No policy. A determination is made on a case-by-case basis by the U.S. Attorney. No policy. A determination is made on a case-by-case basis by the U.S. Attorney. No policy. 
A determination is made on a case-by-case basis by the U.S. Attorney. The first copy of each GAO report and testimony is free. Additional copies are $2 each. Orders should be sent to the following address, accompanied by a check or money order made out to the Superintendent of Documents, when necessary. VISA and MasterCard credit cards are accepted, also. Orders for 100 or more copies to be mailed to a single address are discounted 25 percent. U.S. General Accounting Office P.O. Box 6015 Gaithersburg, MD 20884-6015 Room 1100 700 4th St. NW (corner of 4th and G Sts. NW) U.S. General Accounting Office Washington, DC Orders may also be placed by calling (202) 512-6000 or by using fax number (301) 258-4066, or TDD (301) 413-0006. Each day, GAO issues a list of newly available reports and testimony. To receive facsimile copies of the daily list or any list from the past 30 days, please call (202) 512-6000 using a touchtone phone. A recorded menu will provide information on how to obtain these lists. | Pursuant to a legislative requirement, GAO reviewed the Naval Criminal and Investigative Service's (NCIS) policies and practices regarding agent interviews of suspects and witnesses during procurement fraud investigations, focusing on: (1) NCIS's policies on interviewing, including agent conduct and demeanor and the carrying and display of weapons; (2) controls to deter inappropriate conduct by agents; and (3) the desirability and feasibility of audio- or videotaping interviews and making the recording or transcription available to the person interviewed. 
GAO noted that: (1) according to federal law enforcement experts, NCIS interview policies are in accordance with generally accepted federal law enforcement standards and applicable laws; (2) specifically, NCIS interview policies prohibit the indiscriminate display of weapons or the use of threats, promises, inducements, or physical or mental abuse by agents attempting to influence an individual during interviews; (3) NCIS has established controls to deter, detect, and deal with agent misconduct; (4) NCIS agents are trained in interview policies at their initial training at the Federal Law Enforcement Training Center and through in-house and contractor training; (5) other controls include periodic inspections of NCIS field offices, internal investigations of alleged agent misconduct, oversight of cases and allegations of agent misconduct by the Department of Defense (DOD) Inspector General, and the involvement of the U.S. Attorney's offices in grand jury investigations and prosecutions; (6) furthermore, judicial review of evidence presented also acts as a deterrent to inappropriate agent conduct since inappropriate or illegal behavior may result in the evidence obtained not being admissible in court; (7) the DOD Inspector General and NCIS could identify only six cases since January 1989 in which misconduct was substantiated, and none of those cases involved procurement fraud investigations; (8) NCIS policies do not prohibit audio- or videotaping of interviews or distributing the written or taped results to the interviewee; (9) the NCIS does not routinely tape interviews; (10) officials from NCIS, the Defense Criminal Investigative Service, the Federal Bureau of Investigation, and selected Assistant U.S. 
Attorneys did not support the idea of routinely taping interviews; (11) NCIS considers routine taping of interviews to be unjustified, given the equipment and transcription costs and the large volume of interviews associated with procurement fraud investigations; (12) DOD and Department of Justice officials noted that routine audio- or videotaping would not improve the quality of the investigation or court proceedings; and (13) the DOD advisory board agreed that the routine taping of interviews was unnecessary, given the lack of evidence supporting a widespread abuse of subjects' rights by agents from military criminal investigative organizations. | 5,509 | 569 |
To date, the commercial space launch industry has primarily focused on putting payloads, such as satellites, into orbit, using launch vehicles that are used only once. The number of launches for this purpose has, however, dropped off, and the industry appears to be increasing its focus on space tourism. Apart from the five manned flights in 2004, efforts thus far have consisted of tests for research and development purposes, but companies are continuing to develop vehicles for manned flights. Concurrently, companies and states are developing additional spaceports to accommodate anticipated commercial space tourism flights, with states providing economic incentives for development. As part of FAA's mission to promote the commercial space industry, federal funds have also supported infrastructure development at one spaceport. There are three main types of space launches--national security, civil, and commercial. National security launches are by the Department of Defense for defense purposes, and civil launches are by NASA for scientific and exploratory purposes. Commercial launch companies compete domestically and internationally for contracts to carry payloads, such as satellites, into orbit using expendable launch vehicles, which are unmanned, single-use vehicles. Except for the launches of SpaceShipOne in 2004, U.S. commercial space launches have been unmanned. Designed to carry crew and one passenger, SpaceShipOne was the first commercial reusable launch vehicle mission licensed by FAA. After reaching a peak of 22 launches in 1998 (see fig. 1), the number of commercial space launches began to fluctuate and generally decline following a downturn in the telecommunications services industry, which was the primary customer of the commercial space launch industry. In the last several years, two trends have emerged. First, there has been a drop-off in U.S. commercial orbital launches. In part, this may be because the U.S.
commercial space launch industry is not price competitive with foreign companies, some of which receive extensive government support, according to Department of Commerce officials. Second, FAA began issuing experimental permits in 2006 to companies seeking to conduct test launches of reusable launch vehicles. According to industry experts that we spoke with, over the past 3 years the commercial space launch industry has experienced a steady buildup of research and development efforts, including ground tests and low-altitude flight tests of reusable rocket-powered vehicles that are capable of takeoffs and landings. Manned commercial space launches took place for the first and only time with the five manned flights of SpaceShipOne in 2004. Although additional manned flights were anticipated, they have not materialized since we issued our report in 2006. A number of companies--including Scaled Composites, which is developing SpaceShipTwo--are continuing to develop vehicles for manned flights, but they are not yet developed to a testing stage, which would require a launch license or experimental permit. Since we reported in 2006, private companies and states have been developing additional spaceports to accommodate anticipated commercial space tourism flights and to expand the nation's launch capacity. In 2006, there were six FAA-licensed spaceports and eight proposed spaceports. Since then, one of the proposed spaceports (Spaceport America in New Mexico) has begun operating and one (Gulf Coast Regional Spaceport) has terminated its plans. Two new spaceports in Florida have applied for FAA licenses. Figure 2 shows the existing and proposed spaceports and federal launch sites used for commercial launches.
States have provided economic incentives to developers--including passing legislation to decrease liability and lower the tax burden for developers, according to FAA--to build spaceports to attract space tourism and provide economic benefits to localities; FAA has provided funding assistance for infrastructure development. For example, New Mexico provided $100 million to construct Spaceport America. According to an official from the Oklahoma spaceport, Oklahoma provides approximately $500,000 annually to the spaceport for operations, and the state paid for the environmental impact statement and the safety analysis needed to apply for an FAA license. The Florida Space Authority, a state agency, invested over $500 million in new space industry infrastructure development, including upgrades to the launch pad, a new space operations support complex, and a reusable launch vehicle support complex. The Mid-Atlantic Regional Spaceport receives half of its funding from Virginia and Maryland, with the remainder coming from revenue from operations. According to FAA, Florida and Virginia also passed bills that grant an exemption from state income tax for either launch services or gains achieved from providing services to the International Space Station. In addition, the Mojave Spaceport in California received an FAA Airport Improvement Program grant of $7.5 million to expand an existing runway to allow for the reentry of horizontally landing reusable vehicles. FAA faces challenges in ensuring that it has a sufficient number of staff with the necessary expertise to oversee the safety of commercial space launches and spaceport operations. In addition, FAA will need to determine whether its current safety regulations are appropriate for all types of commercial space vehicles, operations, and launch sites. FAA will also need to develop safety indicators and collect data to help it determine when to begin to regulate crew and passenger safety after 2012. 
Continuing to avoid conflicts between its dual roles as a safety regulator and an industry promoter remains another issue to consider as the space tourism industry develops. In 2006, we raised concerns that if the space tourism industry developed as rapidly as some industry representatives suggested, FAA's responsibility for licensing reusable launch vehicle missions would greatly expand. FAA's experience in this area is limited because its launch safety oversight has focused primarily on unmanned launches of satellites into orbit using expendable launch vehicles. Many companies are developing space hardware of different designs that are being tested for the first time, requiring that FAA have a sufficient level of expertise to provide oversight. In addition, FAA has to have an adequate number of staff to oversee the anticipated growth in the number of launches at various locations. We recommended that FAA assess the levels of expertise and resources that will be needed to oversee the safety of the space tourism industry and the new spaceports under various scenarios and timetables. In response to our recommendations, FAA's Office of Commercial Space Transportation hired 12 aerospace engineers, bringing its total staff to 71 full-time employees. In addition, since our report, FAA has established field offices at Edwards Air Force Base and NASA's Johnson Space Center in anticipation of increased commercial space launches. We believe FAA has taken reasonable steps to ensure that it has adequate resources to fulfill its safety oversight role. However, if the industry begins to expand, as senior FAA officials predict, to 200 to 300 annual launches, a reassessment of FAA's resources and areas of expertise would be appropriate. Moreover, as NASA-sponsored commercial space launches increase, FAA's need for regulatory resources and expertise may change, according to industry experts we spoke with. 
FAA faces the challenge of ensuring that its regulations on licensing and safety requirements for launches and launch sites, which are based on safety requirements for expendable launch vehicle operations at federal launch sites, will also be suitable for operations at spaceports. We reported that the safety regulations for expendable launch vehicles may not be suitable for space tourism flights because of differences in vehicle types and launch operations, according to experts we spoke with. Similarly, spaceport operators and experts we spoke with raised concerns about the suitability of FAA safety regulations for spaceports. Experts told us that safety regulations should be customized for each spaceport to address the different safety issues raised by various types of operations, such as different orbital trajectories and differences in the way that vehicles launch and return to earth--whether vertically or horizontally. To address these concerns, we reported that it will be important to measure and track safety information and use it to determine if the regulations should be revised. We did not make recommendations to FAA concerning these issues because the Commercial Space Launch Amendments Act of 2004 required the Department of Transportation (DOT) to commission an independent report to analyze, among other things, whether expendable and reusable vehicles should be regulated differently from each other, and whether either of the vehicles should be regulated differently if carrying passengers. The report, issued in November 2008, concluded that the launch of expendable vehicles, when used to lift reusable rockets carrying crew and passengers, as well as the launch and reentry of reusable launch vehicles with crew and passengers, should be regulated differently from the launch of expendable vehicles without humans aboard. 
Similar to our finding, the report noted that the development of a data system to monitor the development and actual performance of commercial launch systems and to better identify different launch risk factors and criteria would greatly assist the regulatory process. FAA has not developed such a data system because so few commercial launches have occurred. Although FAA is prohibited from regulating crew and passenger safety before 2012 except in response to serious injuries or fatalities or an event that poses a high risk of causing a serious or fatal injury, FAA is responsible for the protection of the uninvolved public, which could be affected by a failed mission. FAA has interpreted this limited authority as allowing it to regulate crew safety in certain circumstances and has been proactive in issuing a regulation concerning emergency training for crews and passengers. However, FAA has not developed indicators that it would use to monitor the safety of the developing space tourism sector and determine when to step in and regulate human space flight. To allow the agency to be proactive about safety, rather than responding only after a fatality or serious incident occurs, we recommended that FAA identify and continually monitor indicators of space tourism industry safety that might trigger the need to regulate crew and passenger safety before 2012. According to agency officials, FAA has not addressed our recommendation because there have been no launches with passengers. When such launches occur, those same officials told us, they intend to collect and analyze data on safety-related anomalies, safety-critical system failures, incidents, and accidents. Those officials also told us that they intend to develop a means to share information with and assess lessons learned from the private spaceflight industry. It is unclear when FAA will or should begin regulating crew and passenger safety, since data for evaluating risk do not exist. 
A senior FAA official told us that the agency does not plan to issue new regulations even after the 2012 prohibition is lifted and that it would like to see how the current procedures, which require passengers to sign an acknowledgement of informed consent, operate before deciding whether to issue new regulations. Nonetheless, FAA is taking steps that will enable it to be prepared to regulate. Space tourism companies that we spoke with stated that they now informally collect lessons learned and share best practices with each other and with FAA, which eventually could lead to industry standards. Senior FAA officials also told us that FAA is reviewing NASA's human rating of space launch vehicles as well as FAA's Office of Aviation Safety aircraft certification process as they consider possible future regulations on human spaceflight standards. In addition, FAA's Office of Commercial Space Transportation expects to work closely with its industry advisory group--the Commercial Space Transportation Advisory Committee--on the issue. We believe FAA is taking reasonable preliminary steps to regulate crew and passenger safety. In 2006, we reported that FAA faced the potential challenge of overseeing the safety of commercial space launches while promoting the industry. While we found no evidence that FAA's promotional activities--such as sponsoring an annual industry conference and publishing industry studies--conflicted with its safety regulatory role, we noted that potential conflicts may arise as the space tourism sector develops. We reported that as the commercial space launch industry evolves, it may be necessary to separate FAA's regulatory and promotional activities. Recognizing the potential conflict, Congress required the 2008 DOT-commissioned report to discuss whether the federal government should separate the promotion of human space flight from the regulation of such activity.
We suggested as a matter for congressional consideration that, if the report did not fully address the potential for a conflict of interest, Congress should revisit the granting of FAA's dual mandate for safety and promotion of human space flight and decide whether the elimination of FAA's promotional role is necessary to alleviate the potential conflict. The 2008 commissioned report concluded there was no compelling reason to remove promotional responsibilities from FAA in the near term (through 2012). Moreover, the report noted that the Office of Commercial Space Transportation's estimated resource allocation for promotional activities was approximately 16 percent of the office's budget in fiscal year 2008, which was significantly less than what the office allocated for activities directly related to safety. However, the report noted that the commercial space launch industry will experience significant changes in its environment in the coming decades; therefore, periodic review of this issue is warranted. We concur with the commissioned report's assessment and see no need for Congress to step in at this time to require a separation of regulatory and promotional activities. However, FAA and Congress must remain vigilant that any inappropriate relationship between FAA and industry--such as was alleged in 2008 between FAA and the airline industry--does not occur with the commercial space launch industry. The expected expansion of the U.S. commercial space launch industry--due to anticipated events such as the development of space tourism, the retirement of NASA's space shuttle, and the agency's shift to using the commercial sector to provide space transportation--will affect the federal role in various ways, such as increasing FAA's licensing and regulatory workload.
To assist in the expansion of the industry, other issues will emerge for federal agencies and Congress to consider, such as whether to assist the industry in lowering costs by extending existing liability indemnification and how to enhance the global competitiveness of the U.S. industry. Another issue that will emerge as the industry grows is how FAA will integrate space flights with aircraft traffic as part of efforts to develop the next generation air transportation system (NextGen). A national space launch strategy, which is currently lacking, could provide a cohesive framework for addressing such issues and establishing national priorities. Industry experts that we spoke with and senior officials at FAA expect that the number of commercial space launches will increase over the next several years because of the continued development of vehicles for human space flight and in response to prize competitions. Starting in the next 3 to 5 years, senior FAA officials expect several companies to begin offering paying customers the opportunity to fly onboard suborbital space flights, with numerous launches taking place each year. Virgin Galactic is among the companies that are undertaking research and development for launch vehicles designed to serve the anticipated space tourism market. FAA reported in 2008 that the company had sold 250 seats for its flights. Scaled Composites and Virgin Galactic formed a joint venture to develop SpaceShipTwo for Virgin Galactic. Other companies, such as XCOR Aerospace and Armadillo Aerospace, have announced plans to develop vehicles to serve the personal spaceflight market. In addition, prize competitions are expected to spur the growth of the space launch industry. For example, the Northrop Grumman Lunar Lander Challenge featured $1.65 million in prizes for vehicles that can simulate the liftoff and landing of a lunar spacecraft; prizes were awarded to Masten Space Systems and Armadillo Aerospace in November 2009. 
Both companies told us that they intend to apply for FAA experimental permits soon. In addition, the $30 million Google Lunar X PRIZE is offered to those who can safely land a robot on the surface of the moon, travel 500 meters, and send video images and data to earth by December 2014. Such competitions spur research and development and require FAA licensing or permitting to ensure the safety of the uninvolved public. Senior FAA officials also expect the agency's licensing and oversight responsibilities to increase as NASA begins to rely on foreign partners and private industry to deliver cargo, and eventually crewmembers, to the International Space Station after it retires the space shuttle in 2010 or shortly thereafter. Two companies--SpaceX and Orbital Sciences--have received NASA contracts to develop new launch vehicles that will service the International Space Station. According to FAA officials and industry experts, test flights for the new vehicles are expected to begin next year with SpaceX at the beginning of the year and Orbital Sciences near the end of the year. FAA is working with SpaceX on its launch license application and Orbital Sciences is in the pre-application phase. FAA has established a field office at the Johnson Space Center in response to the anticipated increase in launches. We reported in 2006 that as the commercial space launch industry expands, it will face key competitive issues concerning high launch costs and export controls that affect its ability to sell its services abroad. Foreign competitors have historically offered lower launch prices than U.S. launch providers, and the U.S. industry has responded by merging launch companies, forming international partnerships, and developing lower-cost launch vehicles. For example, Boeing and Lockheed Martin merged their launch operations to form United Launch Alliance, and SpaceX developed a lower-cost launch vehicle. The U.S. 
government has responded to the foreign competition by providing the commercial space launch industry support, including research and development funds, government launch contracts, use of its launch facilities, and third-party liability insurance through which it indemnifies launch operators. The continuation of such federal involvement will assist industry growth, according to industry experts that we spoke with. For example, industry players have called for the continuation of indemnification to support U.S. competitiveness. Indemnification secures another party against risk or damage. The U.S. government indemnifies launch operators by providing catastrophic loss protection covering third-party liability claims in excess of required launch insurance in the event of a commercial launch incident. Currently, launch operators are required to buy third-party liability insurance for up to $500 million in addition to insurance for their vehicle and its operations, and the U.S. government provides up to $1.5 billion in indemnification. The law that allows for indemnification expires in December 2009. Some industry experts have said that it is important that the law be extended because the cost of providing insurance for launches could be unaffordable without indemnification. According to a space insurance expert, as there has not been an incident requiring the U.S. government to pay out third-party claims, the cost to the government of providing indemnification has been only for administrative purposes. Nonetheless, according to a senior Commerce official, there is always a possibility of a launch mishap that could invoke indemnification. FAA has asked for the law's extension as a means to promote the growth of the industry, and the Department of Commerce supports this position. A senior Commerce official told us that without federal indemnification, smaller launch companies may go out of business. 
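The layered liability structure described above can be sketched as a simple calculation. This is a hypothetical illustration only: the $500 million insurance requirement and $1.5 billion indemnification cap come from the testimony, but the assumption that any claim above both layers falls back to the launch operator, and the function name itself, are illustrative simplifications of the actual statutory scheme.

```python
def allocate_claim(claim_millions):
    """Illustrative split of a third-party liability claim (in $ millions)
    under the layered structure described in the testimony:
    - first layer: operator's required insurance, up to $500 million
    - second layer: U.S. government indemnification, up to $1,500 million
    - remainder (an assumption here): borne by the launch operator
    Returns (insurance, indemnification, operator_residual)."""
    INSURANCE_CAP = 500
    INDEMNIFICATION_CAP = 1500
    insurance = min(claim_millions, INSURANCE_CAP)
    indemnification = min(max(claim_millions - INSURANCE_CAP, 0), INDEMNIFICATION_CAP)
    operator_residual = max(claim_millions - INSURANCE_CAP - INDEMNIFICATION_CAP, 0)
    return insurance, indemnification, operator_residual

# A hypothetical $700 million claim: insurance covers $500M, indemnification $200M.
print(allocate_claim(700))   # (500, 200, 0)
# A claim exceeding both layers leaves a residual above the $2 billion total.
print(allocate_claim(2200))  # (500, 1500, 200)
```

The sketch shows why industry experts consider the indemnification layer significant: without it, operators would need to insure the full second tier themselves, which the testimony notes could be unaffordable for smaller launch companies.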
In addition, industry representatives that we interviewed told us that export licensing requirements affect the ability of the U.S. commercial space launch industry to sell its services abroad. These regulations, which are designed to establish controls to ensure that arms exports are consistent with national security and foreign policy interests, include launch vehicles because they can deliver chemical, biological, and nuclear weapons. A senior Department of Commerce official told us that the U.S. industry has asked Congress to consider changing the statute that restricts space manufacturing items for export. A change in statute would allow the Departments of State and Defense to review individual items, as they do for other industries. As the space tourism industry develops, the issue will arise of establishing a foundation for a common global approach to launch safety. According to senior FAA officials, space tourism operations are planned to be international, with takeoffs and landings from U.S. spaceports to United Arab Emirates and Singapore spaceports, among others. Thus, the development, interoperability, and harmonization of safety standards and regulations, particularly concerning space tourism flights, will be important for the safety of U.S. and international space operations. In the future, if suborbital point-to-point space travel becomes a reality, entirely new issues will have to be addressed, including bilateral and international interoperability, air and space traffic integration, existing treaty and law implications, national security issues (such as friend or foe identification), customs, international technical standards, and other transportation issues. In response, FAA has established an international outreach program to promote FAA commercial space transportation regulations as a model for other countries to adopt.
The outreach program includes establishing initial contacts with interested countries and introductory briefings about FAA regulations. NextGen--FAA's efforts to transform the current radar-based air traffic management system into a more automated, aircraft-centered, satellite-based system--will need to accommodate spacecraft that are traveling to and from space through the national airspace system. As the commercial space launch industry grows and space flight technology advances, FAA expects that commercial spacecraft will frequently make that transition and the agency will need tools to manage a mix of diverse aircraft and space vehicles in the national airspace system. In addition, the agency will need to develop new policies, procedures, and standards for integrating space flight operations into NextGen. For example, it will have to define new upper limits to the national airspace system to include corridors for flights transitioning to space; establish new air traffic procedures for flights of various types of space vehicles, such as aircraft-ferried spacecraft and gliders; develop air traffic standards for separating aircraft and spacecraft in shared airspace; and determine controller workload and crew rest requirements for space operations. FAA has begun to consider such issues and has developed a concept of operations document. Finally, an overarching issue that has implications for the U.S. commercial space launch industry is the lack of a comprehensive national space launch strategy, according to federal officials and industry experts. Numerous federal agencies have responsibility for space activities, including FAA's oversight of commercial space launches, NASA's scientific space activities, the Department of Defense's national security space launches, the State Department's involvement in international trade issues, and the Department of Commerce's advocacy and promotion of the industry.
According to the National Academy of Sciences, aligning the strategies of the various civil and national security space agencies will address many current issues arising from or exacerbated by the current uncoordinated, overlapping, and unilateral strategies. A process of alignment offers the opportunity to leverage resources from various agencies to address such shared challenges as the diminished space industrial base, the dwindling technical workforce, and reduced funding levels, according to the Academy report. A national space launch strategy could identify and fill gaps in federal policy concerning the commercial space launch industry, according to senior FAA and Commerce officials. Our research has identified several gaps in federal policy for commercial space launches. For example, while FAA has safety oversight responsibility for the launch and re-entry of commercial space vehicles, agency officials told us that no federal entity has oversight of orbital operations, including the collision hazard while in orbit posed by satellites and debris (such as spent rocket stages, defunct satellites, and paint flakes from orbiting objects). Another issue that has not been resolved is the role of the National Transportation Safety Board (NTSB) in investigating any accidents that occur. NTSB does not have space transportation explicitly included in its statutory jurisdiction, although it does have agreements with FAA and the Air Force under which it will lead investigations of commercial space launch accidents. The 2008 commissioned report on human space flight suggested that Congress may want to consider explicitly designating a lead agency for accident investigations involving space vehicles to avoid potential overlapping jurisdictions. According to senior officials we spoke with at FAA and Commerce, the need for an overall U.S. 
space launch policy that includes commercial space launches is being discussed within DOT and across departments, as part of the administration's review of national space activities, but the development of a national policy has not yet begun. Mr. Chairman, this concludes my prepared statement. I would be pleased to respond to any questions from you or other Members of the Subcommittee. For further information on this testimony, please contact Dr. Gerald L. Dillingham at (202) 512-2834 or [email protected]. Individuals making key contributions to this testimony include Teresa Spisak, Maureen Luna-Long, Rosa Leung, Erica Miles, David Hooper, and Elizabeth Eisenstadt.

FAA has assessed resources and hired 12 additional aerospace engineers.

FAA's Office of Commercial Space Transportation should develop a formal process for consulting with the Office of Aviation Safety about licensing reusable launch vehicles. FAA has not developed a formal process, but the two offices signed a formal agreement for the licensing of SpaceShipTwo, which delineates the responsibilities for each office. Agency officials expect that a similar process will be used as future applications are received.

FAA should identify and continually monitor space tourism safety indicators that might trigger the need to regulate crew and flight participant safety before 2012. No action has been taken on monitoring safety indicators because commercial human space flights have not occurred since the SpaceShipOne launches in 2004. When commercial human space flights occur, FAA plans to monitor key safety indicators including safety-related anomalies, safety-critical system failures, incidents, and accidents. FAA officials plan to track these indicators, precursors, trends, or lessons learned that would warrant additional FAA regulation.

FAA should develop and issue guidance on the circumstances under which it would regulate crew and flight participant safety before 2012.
No action has been taken to issue guidance. However, senior FAA officials say that the agency has held internal discussions on the circumstances under which it would regulate crew and space flight participant safety before 2012 in the event of a casualty or close call. The officials noted that launch vehicle operators are required to report mishaps and safety-related anomalies and failures to FAA and take appropriate corrective actions prior to the next launch.

As long as it has a promotional role, FAA should work with the Department of Commerce to develop a memorandum of understanding that clearly delineates the two agencies' respective promotional roles in line with their statutory obligations and larger agency missions. FAA's Office of Commercial Space Transportation and Commerce's Office of Space Commercialization signed a memorandum of understanding in September 2007. FAA has no agreement with Commerce's International Trade Administration, which also has responsibilities for promoting the commercial space industry and its competitiveness.

This is a work of the U.S. government and is not subject to copyright protection in the United States. The published product may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. | Since the Government Accountability Office (GAO) reported on the commercial space launch industry in 2006, the industry has evolved and moved further toward space tourism. Commercial space tourism promises to make human space travel available to the public for the first time. The Federal Aviation Administration (FAA) oversees the safety of commercial space launches, licensing and monitoring the safety of such launches and of spaceports (sites for launching spacecraft), and also promotes the industry.
FAA is also responsible for overseeing the safety of space tourism, but it may not regulate crew and passenger safety before 2012 except in response to high-risk incidents, serious injuries, or fatalities. This testimony addresses (1) recent trends in the commercial space launch industry, (2) challenges that FAA faces in overseeing the industry, and (3) emerging issues that will affect the federal role. This statement is based on GAO's October 2006 report on commercial space launches, updated with information GAO gathered from FAA, the Department of Commerce, and industry experts in November 2009 on industry trends and recent FAA actions. In past work, GAO recommended that FAA take several actions to improve its oversight of commercial space launches, including assessing its future resource needs. FAA has taken some steps to address the recommendations.

Recent Trends: Historically, the commercial space launch industry focused primarily on putting payloads, such as satellites, into orbit, using launch vehicles that did not return to earth. Such launches have, however, dropped off, and the industry is increasing its focus on space tourism. Since five manned commercial flights demonstrated the potential for commercial space tourism in 2004, companies have pursued research and development and are further developing reusable vehicles for manned flights. Concurrently, companies and states are developing additional spaceports to accommodate anticipated increases in commercial space launches. States have provided economic incentives, and FAA has provided some funding for development.

Oversight Challenges: In overseeing the commercial space launch industry, including the safety of space tourism, FAA faces several challenges.
These include maintaining a sufficient number of staff with the necessary expertise to oversee the safety of launches and spaceport operations; determining whether FAA's current safety regulations are appropriate for all types of commercial space vehicles, operations, and launch sites; developing information to help FAA decide when to regulate crew and passenger safety after 2012; and continuing to avoid conflicts between FAA's regulatory and promotional roles.

Emerging Issues: The U.S. commercial space launch industry is expected to expand as space tourism develops and the National Aeronautics and Space Administration starts to rely on the commercial sector for space transportation. This expansion will affect the federal role. For example, FAA will face increases in its licensing and regulatory workload, and federal agencies and Congress will face decisions about whether to support the U.S. industry by continuing to provide liability indemnification to lower its costs. Additionally, FAA will face policy and procedural issues when it integrates the operations of spacecraft into its next generation air transportation system. Finally, coordinating the federal response to the commercial space industry's expansion is an issue for the federal government in the absence of a national space launch strategy for setting priorities and establishing federal agency roles.
According to ISS, over 28,000 publicly traded corporations globally send out proxy statements each year that contain important facts about more than 250,000 separate issues on which shareholders are asked to vote. Votes are solicited on a variety of key issues that could potentially affect the corporations' value, such as the election of directors, executive compensation packages, and proposed mergers and acquisitions, as well as other, more routine, issues that may not affect value, such as approving an auditor and changing a corporate name. The proxy statement typically includes a proxy ballot (also called a proxy card) that allows shareholders to appoint a third party (proxy) to vote on the shareholder's behalf if the shareholder decides not to attend the meeting. The shareholder may instruct the proxy how to vote the shares or may opt to grant the proxy discretion to make the voting decision. The proxy card may be submitted to the company by mail or online. The proxy advisory industry has grown over the past 20 years as a result of various regulatory and market developments. The management of a mutual fund's or pension plan's assets, including the voting of proxies, is often delegated to a person who is an investment adviser subject to the Investment Advisers Act of 1940. In a 1988 letter, known as the "Avon Letter," the Department of Labor took the position that the fiduciary act of managing employee benefit plan assets includes the voting of proxies associated with shares of stock owned by the plan. According to industry experts, managers of employee retirement plan assets began to seek help in executing their fiduciary responsibility to vote proxies in their clients' best interests. Consequently, the proxy advisory industry--particularly ISS, which had been established in 1985--started to grow.
According to industry experts, ISS's reputation and dominance in the proxy advisory industry continued to grow in the 1990s and early 2000s, fueled by the growing fiduciary requirements of institutional investors and increased shareholder activism. This increased shareholder activism has been attributed in part to reaction by investors to the massive financial frauds perpetrated by management of public companies, including the actions that led to the bankruptcies of Enron and WorldCom. Many institutional investors sought the services of proxy advisory firms to assist in their assessments of the corporate governance practices of publicly traded companies and to carry out the mechanics of proxy voting. Finally, in 2003, SEC adopted a rule and amendments under the Investment Advisers Act of 1940 that require registered investment advisers to adopt policies and procedures reasonably designed to ensure that proxies are voted in the best interests of clients, which industry experts also cited as a reason for the continued growth of the proxy advisory industry. Today, the proxy advisory industry comprises five major firms, with ISS serving as the dominant player with over 1,700 clients. The other four firms--Marco Consulting Group (MCG), Glass Lewis & Co. (Glass Lewis), Proxy Governance, Inc. (PGI), and Egan-Jones Proxy Services (Egan-Jones)--have much smaller client bases and are relatively new to the industry: Glass Lewis, PGI, and Egan-Jones were all created within the past 6 years. Founded in 1985, ISS serves clients with its core business, which includes analyzing proxy issues and offering research and vote recommendations. ISS also provides Web-based tools and advisory services to corporate issuers through ISS Corporate Services, Inc., a separate division established in 1997 that was spun out into a wholly-owned subsidiary in 2006. RiskMetrics Group, a financial risk management firm, acquired ISS in January 2007.
RiskMetrics Group provides risk management tools and analytics to assist investors in assessing risk in their portfolios. MCG was established in 1988 to provide investment analysis and advice to Taft-Hartley funds and has since expanded its client base to public employee benefit plans. Glass Lewis, established in 2003, provides proxy research and voting recommendations and was acquired by Xinhua Finance Limited, a Chinese financial information and media company, in 2007. Established in 2004, PGI offers proxy advice and voting recommendations and is a wholly-owned subsidiary of FOLIOfn, Inc., a financial services company that also provides brokerage services and portfolio management technology for individual investors and investment advisers. Egan-Jones was established in 2002 as a division of Egan-Jones Ratings Company, which was incorporated in 1992. Egan-Jones provides proxy advisory services to institutional clients to facilitate making voting decisions. Of the five major proxy advisory firms, three--ISS, MCG, and PGI--are registered with SEC as investment advisers and are subject to agency oversight, while, according to corporate officials, the other two firms are not. In their SEC registration filings, the three registered firms have identified themselves as pension consultants as the basis for registering as investment advisers under the Investment Advisers Act. Although Glass Lewis initially identified itself as a pension consultant and registered with SEC as an investment adviser, it withdrew its registration in 2005. According to SEC officials, an investment adviser is not required to disclose a reason for its decision to withdraw its registration in the notice of withdrawal filed with SEC. Officials from Glass Lewis and Egan-Jones did not elaborate on their decisions not to be registered with SEC as investment advisers, other than to note that their decisions were made with advice from their respective counsel.
In the proxy advisory industry, various conflicts of interest can arise that have the potential to influence the research conducted and voting recommendations made by proxy advisory firms. The most commonly cited potential for conflict involves ISS, which provides services to both institutional investor clients and corporate clients. Several other circumstances may lead to potential conflicts on the part of proxy advisory firms, including situations in which owners or executives of proxy advisory firms have an ownership interest in or serve on the board of directors of corporations that have proposals on which the firms are offering vote recommendations. Although the potential for these types of conflicts exists, in its examinations of proxy advisory firms that are registered as investment advisers, SEC has not identified any major violations, such as a failure to disclose a conflict, or taken any enforcement actions to date. Industry professionals and institutional investors we interviewed cited ISS's business model as presenting the greatest potential conflict of interest associated with proxy advisory firms because ISS offers proxy advisory services to institutional investors as well as advisory services to corporate clients. Specifically, ISS provides institutional investor clients with recommendations for proxy voting and ratings of companies' corporate governance. In addition, ISS helps corporate clients develop proposals to be voted on and offers corporate governance consulting services to help clients understand and improve their corporate governance ratings. Because ISS provides services to both institutional investors and corporate clients, there are various situations that can potentially lead to conflicts. For example, some industry professionals stated that ISS could help a corporate client design an executive compensation proposal to be voted on by shareholders and subsequently make a recommendation to investor clients to vote for this proposal. 
Some industry professionals also contend that corporations could feel obligated to subscribe to ISS's consulting services in order to obtain favorable proxy vote recommendations on their proposals and favorable corporate governance ratings. One industry professional further believes that, even if corporations do not feel obligated to subscribe to ISS's consulting services, they still could feel pressured to adopt a particular governance practice simply to meet ISS's standards even though the corporations may not see the value of doing so. ISS has disclosed and taken steps to help mitigate situations that can potentially lead to conflicts. For example, on its Web site, ISS explains that it is "aware of the potential conflicts of interest that may exist between proxy advisory service ... and the business of ISS Corporate Services, Inc." The Web site also notes that "ISS policy requires every ISS proxy analysis to carry a disclosure statement advising the client of the work of ICS and advising ISS's institutional clients that they can get information about an issuer's use of ICS's products and services." In addition, some institutional investors we spoke with noted that ISS has on occasion disclosed to them, on a case-by-case basis, the existence of a specific conflict related to a particular corporation. In addition to disclosure, ISS has implemented policies and procedures to help mitigate potential conflicts. For example, according to ISS, it has established a firewall that includes maintaining separate staff for its proxy advisory and corporate businesses, which operate in separate buildings and use segregated office equipment and information databases in order to help avoid discovery of corporate clients by the proxy advisory staff. ISS also notes on its Web site that it is a registered investment adviser and is subject to the regulatory oversight of SEC.
In addition, according to ISS's Web site, corporations purchasing advisory services sign an agreement acknowledging that use of such services does not guarantee preferential treatment from ISS's division that provides proxy advisory services. All of the institutional investors--both large and small--we spoke with that subscribe to ISS's services said that they are satisfied with the steps that ISS has taken to mitigate its potential conflicts. Most institutional investors also reported conducting due diligence to obtain reasonable assurance that ISS or any other proxy advisory firm is independent and free from conflicts of interest. As part of this process, many of these institutional investors said they review ISS's conflict policies and periodically meet with ISS representatives to discuss these policies and any changes to ISS's business that could create additional conflicts. Finally, as discussed in more detail later in this report, institutional investors told us that ISS's recommendations are generally not the sole basis for their voting decisions, which further reduces the chances that these potential conflicts would unduly influence how they vote. Although institutional investors said they generally are not concerned about the potential for conflicts from ISS's businesses and are satisfied with the steps ISS has taken to mitigate such potential conflicts, some industry analysts we contacted said there remains reason to question the steps' effectiveness. For example, one academic said that while ISS is probably doing a fair job managing its conflicts, it is difficult to confirm the effectiveness of the firm's mitigation procedures because ISS is a privately-held company, thereby restricting information access. Moreover, according to another industry analyst, because ISS's recommendations are often reported in the media, the corporate consulting and proxy advisory services units could become aware of the other's clients. 
In addition to the potential conflict of interest discussed above, several other situations in the proxy advisory industry could give rise to potential conflicts. Specifically: Owners or executives of proxy advisory firms may have a significant ownership interest in or serve on the board of directors of corporations that have proposals on which the firms are offering vote recommendations. A few institutional investors told us that such situations have been reported to them by ISS and Glass Lewis, both of which, in order to avoid the appearance of a conflict, did not make voting recommendations. Institutional investors may submit shareholder proposals to be voted on at corporate shareholder meetings. This raises concern that proxy advisory firms will make favorable recommendations to other institutional investor clients on such proposals in order to maintain the business of the investor clients that submitted these proposals. Several proxy advisory firms are owned by companies that offer other financial services to various types of clients, as is common in the financial services industry, where companies often provide multiple services to various types of clients. This is the case at ISS, Glass Lewis, and PGI, and may present situations in which the interests of different sets of clients diverge. SEC reviews registered investment advisers' disclosure and management of potential conflicts, as well as proxy voting situations where a potential conflict may arise. Specifically, SEC's Office of Compliance Inspections and Examinations monitors the operations and conducts examinations of registered investment advisers, including proxy advisory firms. An SEC official stated that, as part of these examinations, SEC may review the adequacy of disclosure of a firm's owners and potential conflicts; particular products and services that may present a conflict; the independence of a firm's proxy voting services; and the controls that are in place to mitigate potential conflicts. 
As discussed previously, three of the five proxy advisory firms (ISS, MCG, and PGI) are registered as investment advisers while Glass Lewis and Egan-Jones are not. According to SEC, to date, the agency has not identified any major violations of applicable federal securities laws in its examinations of proxy advisory firms that are registered as investment advisers and has not initiated any enforcement action against these firms. As the dominant proxy advisory firm, ISS has gained a reputation with institutional investors for providing reliable, comprehensive proxy research and recommendations, making it difficult for competitors to attract clients and compete in the market. As shown below in table 1, ISS's client base currently includes an estimate of 1,700 institutional investors, more than the other four major firms combined. Several of the institutional investors we spoke with that subscribe to ISS's services explained that they do so because they have relied on ISS for many years and trust it to provide reliable, efficient services. They said that they have little reason to switch to another service provider because they are satisfied with the services they have received from ISS over the years. Because of ISS's clients' level of satisfaction, other providers of proxy advisory services may have difficulty attracting their own clients. In addition, because of its dominance and perceived market influence, corporations may feel obligated to be more responsive to requests from ISS for information about proposals than they might be to other, less- established proxy advisory firms, resulting in a greater level of access by ISS to corporate information that might not be available to other firms. Industry analysts explained that, in addition to overcoming ISS's reputation and dominance in the proxy advisory industry, proxy advisory firms must offer comprehensive coverage of corporate proxies and implement sophisticated technology to attract clients and compete. 
For instance, institutional investors often hold shares in thousands of different corporations and may not be interested in subscribing to proxy advisory firms that provide research and voting recommendations on a limited portion of these holdings. As a result, proxy advisory firms need to provide thorough coverage of institutional holdings, and unless they offer comprehensive services from the beginning of their operations, they may have difficulty attracting clients. In addition, academics and industry experts we spoke with said that new firms need to implement a sophisticated level of technology to provide the research and proxy vote execution services that clients demand. The initial investment required to develop and implement such technology can be a significant expense for firms. Although newer proxy advisory firms may face challenges attracting clients and establishing themselves in the industry, several of the professionals we spoke with believed that these challenges could be overcome. For example, while firms may need to offer comprehensive coverage of corporate proxies in order to attract clients and although ISS might have access to corporate information that other firms do not, much of the information needed to conduct research and offer voting recommendations is easily accessible. Specifically, anyone can access corporations' annual statements and proxy statements, which are filed with SEC, are publicly available, and contain most of the information that is needed to conduct research on corporations and make proxy voting recommendations. Also, although developing and implementing the technology required to provide research and voting services can be challenging, various industry professionals told us that once a firm has done so, the marginal cost of providing services to additional clients and of updating and maintaining such technology is relatively low. 
Some of the competitors seeking to enter the proxy advisory industry in recent years that we spoke with have offered their services as alternatives to ISS. Specifically, they have attempted to differentiate themselves from ISS by providing only proxy advisory services to institutional investor clients. ISS's competitors have chosen not to provide corporate consulting services in part to avoid the potential conflicts that exist at ISS. Proxy advisory firms have also attempted to differentiate themselves from the competition on the basis of the types of services provided. For example, some firms have started to focus their research and recommendation services on particular types of proxy issues or on issues specific to individual corporations. The institutional investors we spoke with had a variety of opinions about the level of competition in the industry. Some questioned whether the existing number of firms is sufficient, while others questioned whether the market could sustain the current number of firms. However, many of the institutional investors believe that increased competition could help reduce the cost and increase the range of available proxy advisory services. For example, some institutional investors said that they have been able to negotiate better prices with ISS because other firms have recently entered the market. While some of these newer proxy advisory firms have attracted clients, it is too soon to tell what the firms' ultimate effect on competition will be. We conducted structured interviews with 31 randomly selected institutional investors to gain an understanding of the ways in which they use proxy advisory firms and the influence that such firms have on proxy voting. 
Of the 20 large institutional investors we interviewed, 19 reported that they use proxy advisory services in one or more ways that may serve to limit the influence that proxy advisory firms have on proxy voting results (see table 2), while only 1 reported relying heavily on a proxy advisory firm's research and recommendations. The following summarizes several of the reasons that large institutional investors' reliance on proxy advisory firms' research and recommendations is limited: Most of the large institutional investors we spoke with (15 out of 20) reported that they generally rely more on their own in-house research and analyses to make voting decisions than on the research and recommendations provided by their proxy advisory services providers. These institutional investors tend to have their own in-house research staffs, and their in-house research reportedly drives their proxy voting decisions. They explained that they use the research and recommendations provided by proxy advisory firms to supplement their own analysis and as one of many factors they consider when deciding how to vote. In addition, many (14) of the large institutional investors we contacted reported that they subscribe to a customized voting policy that a proxy advisory firm executes on the institutions' behalf. These institutional investors develop their own voting policies and guidelines that instruct the advisory firm how to vote on any given proxy issue. In such instances, the proxy advisory firms simply apply their clients' voting policies, which then drive the voting decisions. Further, 8 of the large institutional investors we contacted explained that they subscribe to more than one proxy advisory firm to help determine how to vote. These institutional investors said that they consider multiple sets of proxy advisory firm research and recommendations to gain a broader range of information on proxy issues and to help make well-informed voting decisions. 
We also interviewed representatives from 11 smaller institutional investors, and the results of these interviews suggest that proxy advisory firm recommendations are of greater importance to these institutions than they are to the large institutional investors we spoke with. In particular, representatives from smaller institutional investors were more likely to say that they rely heavily on their proxy advisory firm and vote proxies based strictly on the research and recommendations of their firm, given these institutions' limited resources. Consequently, the level of influence held by proxy advisory firms appears greater with these smaller institutional investors. However, whether large or small, all of the institutional investors we spoke with explained that they retain the fiduciary obligation to vote proxies in the best interest of their clients irrespective of their reliance on proxy advisory firms. Institutional investors emphasized that they do not delegate this responsibility to proxy advisory firms and retain the right to override any proxy advisory firm recommendations, limiting the amount of influence proxy advisory firms hold. In addition, large and small institutional investors reported that they tend to provide greater in-house scrutiny to, and rely even less on, proxy advisory firm recommendations about certain high-profile or controversial proxy issues, such as mergers and acquisitions or executive compensation. Institutional investors' perspectives on the limited influence of proxy advisory firms reflected what we heard from professionals that we spoke with who have knowledge of the industry. Many of these industry analysts and academics agreed that large institutional investors would be less likely than small institutional investors to rely on proxy advisory firms, because large institutions have the resources available to conduct research and subscribe to more than one proxy advisory service provider. 
These professionals also thought that large institutional investors would be likely to use proxy advisory firms as one of several factors they consider in the research and analysis they perform to help them decide how to vote proxies. Further, several believed that small institutional investors would be more likely to vote based strictly on proxy advisory firms' recommendations, because they do not have the resources to conduct their own research. The results of our work suggest that the overall influence of advisory firms on proxy vote outcomes may be limited. In particular, large institutional investors, which cast the great majority of proxy votes made by all institutional investors with over $1 billion in assets, reportedly place relatively less emphasis on the firms' research and recommendations than smaller institutional investors. However, we could not reach a definitive conclusion about the firms' influence because the institutional investors we contacted were not necessarily representative of all such investors. Further, we could not identify any studies that comprehensively isolated advisory firm research and recommendations from other factors that may influence institutional investors' proxy voting. We provided a draft of this report to SEC for its review and comment. SEC provided technical comments, which we incorporated into the final report, as appropriate. We also provided relevant sections of the draft to the proxy advisory firms for a technical review of the accuracy of the wording and made changes, as appropriate, based on the firms' comments. As agreed with your office, unless you publicly announce the contents of this report earlier, we plan no further distribution of this report until 30 days from the report date. 
At that time we will provide copies of this report to the Chairman and Ranking Member, Senate Committee on Banking, Housing, and Urban Affairs; the Chairman, House Committee on Financial Services; the Chairman, House Subcommittee on Capital Markets, Insurance, and Government Sponsored Enterprises, Committee on Financial Services; other interested committees; and the Chairman of the Securities and Exchange Commission (SEC). We will also make copies available to others upon request. In addition, the report will be available at no charge on the GAO Web site at http://www.gao.gov. If you or your staffs have any questions about this report, please contact me at (202) 512-8678 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. Key contributors to this report are listed in appendix II. Our objectives were to (1) identify potential conflicts of interest that exist with proxy advisory firms and the steps that the Securities and Exchange Commission (SEC) has taken to oversee these firms; (2) review the factors that might impede or promote competition in this industry; and (3) analyze institutional investors' use of proxy advisory services to help vote proxies and the influence proxy advisory firms may have on proxy voting. To determine the types of potential conflicts of interest that could arise in the proxy advisory industry, we conducted a literature review and examined studies relating to potential conflicts that may arise in this industry. Further, we interviewed various professionals with knowledge of the proxy advisory industry, including industry experts, academics, industry association representatives, and proxy advisory firm representatives, as well as institutional investors and officials at SEC. 
We selected these professionals based, in part, on literature searches we conducted on topics relating to proxy advisory and corporate governance services, as well as referrals by several of the professionals we met with. The professionals we spoke with represent a wide range of perspectives, and include experts from academia, business, government, and professional organizations. We did not attempt to assess any of the proxy advisory firms' conflict mitigation policies or procedures and, therefore, did not come to any conclusions about the adequacy of these policies or procedures. To gain an understanding of SEC's oversight of proxy advisory firms, we reviewed relevant investment adviser regulations and examinations conducted by SEC since 2000 and interviewed agency officials. We did not attempt to assess the adequacy of SEC's oversight. To identify the factors that might impede or promote competition in this industry, we reviewed the relevant literature and examined studies relating to the level of competition in the industry, and we spoke with various industry professionals. We did not attempt to evaluate the level of competition in this industry and, therefore, did not come to any conclusions about the extent to which competition exists. Finally, to explore institutional investors' use of proxy advisory services to help vote proxies and the influence proxy advisory firms may have on proxy voting, we conducted structured interviews with 31 institutional investors selected randomly by type, including mutual funds, corporate pension funds, government pension funds, and union pension funds, as well as asset management institutions. Our sample included several of the largest institutional investors and was derived from Standard & Poor's Money Market Directories (January 2006). The sample consisted of a population of mutual funds and pension funds with over $1 billion in assets, and included large and small institutional investors from each investor type. 
We defined "large" and "small" institutional investors as the top and bottom 15 percent of each institutional investor type. In total, these large and small institutional investors accounted for over 72 percent of assets under management held by mutual funds and pension funds with over $1 billion under management. Although we randomly selected these institutional investors, the size of the sample was small and may not necessarily be representative of the universe of institutional investors. As a result, we could not generalize the results of our analysis to the entire population of institutional investors. We conducted structured interviews with 20 large and 11 small institutional investors. Initially, we had contacted a total of 126 mutual funds and pension funds that were randomly selected from our sample of institutional investors and 20 (13 large and 7 small institutions) reported using proxy advisory firm services and agreed to participate in our structured interviews. The other 106 institutional investors we had initially contacted declined to participate in the structured interviews for several reasons. In particular, many of these institutions said that they do not vote proxies themselves, but rather hire asset management institutions to both manage their investment portfolios and vote proxies on their behalf. We conducted interviews with 11 (7 large and 4 small institutions) of these asset management institutions, which were referred to us by several of the pension funds we had initially contacted. The results of these asset manager interviews are included among the total of 20 large and 11 small institutional investors that we interviewed. In addition, some of the 106 institutional investors declined to participate because they vote proxies themselves or do not vote proxies at all, while others refused to participate or could not be reached. 
In our structured interviews with the 31 institutional investors, we spoke with officials from the organizations who are responsible for proxy voting activities. We asked these officials a variety of questions relating to their institutions' policies on proxy voting and use of proxy advisory firms. Further, we asked the officials to comment on potential conflicts of interest associated with proxy advisory firms, steps taken to mitigate such potential conflicts, and the level of competition in the proxy advisory industry. Finally, we spoke with various industry professionals discussed earlier to gain their perspectives on the influence of proxy advisory firms. We could not identify any studies that comprehensively measured the influence that these firms have on proxy voting. We conducted our work in Washington, D.C., between September 2006 and June 2007 in accordance with generally accepted government auditing standards. In addition to the above contact, Wes Phillips, Assistant Director; Emily Chalmers; Rudy Chatlos; Eric Diamant; Fred Jimenez; Yola Lewis; and Omyra Ramsingh made key contributions to this report. | At annual meetings, shareholders of public corporations can vote on various issues (e.g., mergers and acquisitions) through a process called proxy voting. Institutional investors (e.g., mutual funds and pension funds) cast the majority of proxy votes due to their large stock holdings. In recent years, concerns have been raised about a group of about five firms that provide research and recommendations on proxy votes to their institutional investor clients. 
GAO was asked to report on (1) potential conflicts of interest that may exist with proxy advisory firms and the steps that the Securities and Exchange Commission (SEC) has taken to oversee these firms; (2) the factors that may impede or promote competition within the proxy advisory industry; and (3) institutional investors' use of the firms' services and the firms' potential influence on proxy vote outcomes. GAO reviewed SEC examinations of proxy advisory firms, spoke with industry professionals, and conducted structured interviews with 31 randomly selected institutional investors. GAO is not making any recommendations. Various potential conflicts of interest can arise at proxy advisory firms that could affect vote recommendations, but SEC has not identified any major violations in its examinations of such firms. In particular, the business model of the dominant proxy advisory firm--Institutional Shareholder Services (ISS)--has been the most commonly cited potential conflict. Specifically, ISS advises institutional investors how to vote proxies and provides consulting services to corporations seeking to improve their corporate governance. Critics contend that corporations could feel obligated to retain ISS's consulting services in order to obtain favorable vote recommendations. However, ISS officials said they have disclosed and taken steps to mitigate this potential conflict. For example, ISS discloses the potential conflict on its Web site and the firm's policy is to advise clients of relevant business practices in all proxy vote analyses. ISS also maintains separate staff who are located in separate buildings for the two businesses. While all institutional investors GAO spoke with that use ISS's services said they are satisfied with its mitigation procedures, some industry analysts continue to question their effectiveness. SEC conducts examinations of advisory firms that are registered as investment advisers and has not identified any major violations. 
Although new firms have entered the market, ISS's long-standing position has been cited by industry analysts as a barrier to competition. ISS has gained a reputation for providing comprehensive services, and as a result, other firms may have difficulty attracting clients. Proxy advisory firms must offer comprehensive coverage to compete and need sophisticated systems to provide the services clients demand. But firms interested in entering the market do have access to much of the information needed to make recommendations, such as publicly available documents filed with SEC. Competitors have attempted to differentiate themselves from ISS by, for example, providing only proxy advisory services and not corporate consulting services. While these firms have attracted clients, it is too soon to tell what their ultimate effect on enhancing competition will be. Among the 31 institutional investors GAO spoke with, large institutions reportedly rely less than small institutions on the research and recommendations offered by proxy advisory firms. Large institutional investors said that their reliance on proxy advisory firms is limited because, for example, they have in-house staff to assess proxy vote issues and only use the research and recommendations offered by proxy advisory firms to supplement such research. In contrast, small institutional investors have limited resources to conduct their own research and tend to rely more heavily on the research and recommendations offered by proxy advisory firms. The fact that large institutional investors cast the great majority of proxy votes made by institutional investors and reportedly place relatively less emphasis on advisory firm research and recommendations could serve to limit the firms' overall influence on proxy voting results. | 5,894 | 771 |
The ManTech Program is designed to enable DOD to develop advanced technologies to use in manufacturing weapon systems. Such technologies, in turn, should reduce weapon system costs and improve quality. ManTech projects address development of technology in areas such as metals, composite materials, electronics, and munitions, as well as technology to sustain weapons systems. The users of the ManTech Program are service and DLA managers responsible for the development of new weapons systems and for the repair, maintenance, and overhaul of fielded systems. However, the projects are executed through agreements or contracts with several types of organizations, including defense contractors, government facilities, suppliers, consortia, centers of excellence, academia, and research institutes. The military services and DLA execute the ManTech Program under the general direction of the Director, Defense Research and Engineering, Office of the Deputy Under Secretary of Defense (Science & Technology), Office of Technology Transition. Each component has established a ManTech office within its organization to set policies and procedures for operating its ManTech program and determining which projects to fund. DOD established the Joint Defense Manufacturing Technology Panel, staffed by service and DLA ManTech office personnel, to set program objectives, promote effective integration and program management, conduct joint planning, and oversee program execution. The panel reports to and receives taskings from the Director of Defense Research and Engineering on manufacturing technology issues of multiservice concern and application. The panel organized the program into subpanels to serve as focal points for specific technology areas. ManTech Program appropriations have fluctuated significantly over the past several years, and annually since fiscal year 1991, the Congress has appropriated more funds to the program than the services requested in the Presidents' budgets.
The funding trends for the program since fiscal year 1991 are shown in figure 1. In addition, funding by DOD component has also fluctuated. Figure 2 shows the funding for the services from fiscal years 1997 to 2001. Users in the military services and DLA look to the ManTech Program to help meet certain needs related to weapons systems they are responsible for, such as developing technologies, products and processes that will reduce the total cost and improve the manufacturing quality of their systems. Users reported to us that the ManTech projects we selected in our analysis were generally addressing their needs. In addition, the military services and DLA have processes in place that include users in the project identification and selection process. Such processes increase the likelihood that projects will meet user needs. However, the extent to which some needs are being met is limited by factors related to each program, such as the amount of funding available. During fiscal years 1999 and 2000, DOD had a total of 234 active ManTech projects valued at about $372 million. From that list, we selected 52 projects in the DOD components valued at $206 million and discussed with users whether those projects were responding to their needs. These users told us that the ManTech Program is generally meeting their needs. The projects we selected resulted in improvements ranging from a project that developed new technology to reduce the time and cost required to produce submarine and surface ship propellers; to a project that increased the reliability of electrical circuits used in missile systems by protecting them against dirt and moisture; to a project that enabled the Air Force to replace 83 parts in its F-119 engine with one part and reduce the weight of the engine by 54 pounds. By implementing such projects, officials from the military services and DLA told us that they were able to save tens of millions of dollars. 
Table 1 provides detailed examples of projects that users reported to us met their needs. Congress has consistently provided more funding for DOD's ManTech Program than requested in the President's budget. For example, in fiscal years 2000 and 2001, the Army received an additional $66.5 million in ManTech funds, of which $45.5 million, or nearly 70 percent, was designated for the Army's ManTech munitions efforts. These efforts included such projects as developing a more cost-effective and safer manufacturing process for an advanced explosive compound. The Congress believed such efforts were not receiving sufficient funds in the past. The extent to which the ManTech Program meets users' needs is due partly to the process by which projects are identified and selected for funding. Furthermore, the statute requires the participation of the prospective technology users in establishing requirements for advanced manufacturing technology. The services and DLA have different planning cycles and criteria for project selection. However, they all have processes that include users in the identification and selection of projects. The processes generally include steps to determine and consolidate users' needs, select the projects to be funded, and perform the work. The following figure depicts the generic ManTech project identification and selection process. We found that the number of projects selected for inclusion in the ManTech Program differs from the number proposed because of funding limitations. Most of the funding each year is allocated to projects already underway that require multi-year funding. Only a few proposed projects are selected as new starts. Table 2 shows the number of projects proposed and selected for fiscal year 2001. Even though the services and DLA employ different types of selection mechanisms and criteria, they all include users in this process.
For example, the Army and the Navy annually solicit ideas for projects from the major subordinate commands where weapons systems are managed. The Air Force encourages users to submit ideas for projects on a continuing basis. All three services require that before a project can be considered for funding, prospective users of the technology endorse the project. DLA relies on regular dialog with its supply service centers to raise issues related to manufacturing technology for the programs for which it is responsible. Table 3 further details how the services and DLA identify, select, and fund their projects. Some factors limit the extent to which the services and DLA can respond to certain needs. Those limitations include canceling some projects that have not yet been started, terminating projects already underway, or postponing projects already approved for funding because of insufficient funding. For example, the Navy conducts its program through a network of Centers of Excellence and allocates program funding based on what each center received in the past. This strategy helps all of the centers remain viable through the life of their contracts, but demands for projects at a particular center in any given year may be greater than funding at that center. This outcome may result in some projects not being funded, and therefore some users' most urgent ManTech needs may not be met. For example, for fiscal year 2001, two lower-priority Naval Sea Systems Command projects were selected for funding because the command's higher-priority projects were for Centers of Excellence with insufficient funds to meet all demands. Also, several Army ManTech officials and one ManTech official in the Office of the Under Secretary of Defense with whom we talked expressed concern about the Army's requirement for a program manager cost share on certain projects and a validated cost analysis on all projects.
Two of the officials believed that there were projects that would benefit Army weapons systems but would not be selected for funding because (1) it was not possible to obtain a program manager cost share, or (2) a validated cost analysis could not be done for projects with environmental, health, or safety benefits. According to the officials, these projects would help meet user needs by reducing the total cost of ownership or improving the quality of weapons systems. However, our review of a number of Army projects did not reveal any that fell into these categories. Another Army ManTech official and an official from the Office of the Under Secretary of Defense believed that validated cost analyses served a useful purpose in weeding out projects without measurable financial benefits. One official expressed concern about the extent to which the Army relies on validated cost analyses to select projects for funding. The other official did not think the cost analysis was the best or only way to screen projects. However, neither official had alternative suggestions. Additionally, Air Force ManTech officials expressed concern that users' future needs may not be met to the same extent as they have been in recent years because the Air Force Materiel Command may have to absorb a budget shortfall of $100 million in science and technology funding, which includes the ManTech Program. As a result, the Materiel Command proposes reducing the Air Force ManTech Program by more than a quarter, or $77.6 million in total, over fiscal years 2003 through 2007. According to ManTech managers, the Air Force may have to terminate some ongoing projects or cancel planned projects to address the funding shortfall. For the most part, the services and DLA awarded work performed under the ManTech Program using competitive procedures. Of the 36 contracting actions we reviewed, 10 were awarded without competition.
In each case, there was a documented justification to award the work on a sole source basis. Table 4 further illustrates the extent to which the services and DLA award their projects competitively and details the rationale for specific sole source awards. DOD is not managing the ManTech Program as efficiently and effectively as possible. Specifically, it is not conducting as many joint projects as it could and therefore is missing opportunities to leverage the limited funding available for ManTech projects. Additionally, DOD does not effectively measure the program's success. Joint projects are those that are jointly funded; have planned implementation benefiting more than one component; or are managed with joint decision-making. These projects allow the services and DLA to leverage their programs by sharing the financial and managerial burdens for projects that can benefit more than one defense component. This is especially important given the limited ManTech budget and the small number of new projects each year that are approved for funding. For example, one currently funded joint project is expected to achieve affordability goals for forged components used on fighter aircraft. The project is expected to benefit the Joint Strike Fighter, the Navy's F/A-18, and the Air Force's F-22. The Navy's National Center for Excellence in Metalworking Technology is managing this project and both the Navy and the Air Force are providing ManTech funds. Another project is expected to achieve significant cost reductions by further developing composite friendly aircraft designs, simulation tools, and material and manufacturing processes. The Air Force, the Navy, and the Army are contributing funds for this project. In fiscal year 2001, joint projects represented 16 of 124 projects, or only 13 percent of all projects reviewed last year. Another 84 projects, or 68 percent, had potential to benefit more than one DOD component, but were not otherwise joint projects. 
For example, one project would improve, demonstrate, and implement a process for coating electrical circuits to seal them against dirt and moisture, which would increase the reliability of the circuits. This Army project would benefit a number of Army missile systems, such as the Javelin and the Patriot Advanced Capability-3, and the Program Executive Office for Army Tactical Missiles will contribute $750,000 over a 4-year period. In addition, the project could benefit various Air Force and Navy missile systems. Also, according to the Navy ManTech Director, more DOD-wide benefits could accrue through more joint participation in the Best Manufacturing Practices Center of Excellence. The objective of the center is to improve the quality, reliability, and performance of the U.S. defense industrial base. The center identifies and disseminates best practices used by industry to foster technology transfer and improve the competitiveness of the industrial base, thereby improving cost, schedule, and product performance. The Associate Director, Manufacturing Technology & Affordability, in the Office of the Deputy Under Secretary of Defense (Science & Technology), Office of Technology Transition, agreed that more joint programs would help the services and DLA leverage their funding and would facilitate the transfer of technology resulting from ManTech efforts. The Joint Defense Manufacturing Technology Panel, the organization DOD has charged with the joint oversight of the ManTech Program, recognizes the importance of jointly funded and managed programs. Annual reviews of ongoing projects conducted by various subpanels include, among other things, identification of the degree to which all projects are joint. Current guidance does not require that projects already funded and in process be reviewed for joint participation, but the panel is revising the guidance to include a review of projects that are being considered or have been selected for funding but have not yet started.
However, the draft guidance states that these types of projects would not be rated for their degree of jointness. Proposed topics for review would include a discussion of competing technologies or approaches and related work underway or completed, but the review stops short of identifying potential projects for joint funding or management. DOD does not know the full extent of the success of the ManTech Program because it does not track project outcomes past initial implementation. The statute requires that DOD prepare an annual report for the Congress that includes, among other things, an assessment of the effectiveness of the ManTech Program, including a description of all completed projects and the plans for and status of implementation of the technologies and processes being developed under the program. For each project listed, the report gives the objective for the project, the completion date, the amount of ManTech funding for the year, the potential beneficiaries of the project, the implementation site, and the expected return on the investment in terms of future cost avoidance. Nevertheless, while the report responds to a congressional requirement, it falls short of validating the long-term benefits predicted for the ManTech Program, and DOD currently lacks a methodology and process for doing so. The ManTech Program could be assessed by providing contractors with a financial incentive to track and report project results or by evaluating project proposals based on a contractor's plans to track and report on implementing the technology. In addition, DOD could periodically commission an independent survey or study. An external review of the ManTech Program in 1998 stated that while the data on the return on investment for selected projects was impressive, DOD should seek review of projects at the service and agency level by an independent third party.
By tracking and validating the long-term benefits of the program, DOD would be able to measure the actual return on investment of a particular project. The department would also know what technologies had been successfully transferred and the extent to which the ManTech Program improved the quality of weapons systems. Without soliciting an independent review or developing a standard for quantifying benefits, DOD cannot be sure that the ManTech Program is providing the financial benefits that have been estimated or that users' long-term needs are being met. Further, it will not have a reliable basis for making decisions on its budgetary priorities and tradeoffs. The Navy, Army, Air Force, and DLA all have processes that include users in establishing requirements for ManTech programs. Each service and DLA, however, separately selects, funds, and implements its ManTech program. While users report that the program has been meeting their technology needs, some ManTech officials expressed concern that funding was insufficient. At the same time, however, DOD has not been taking full advantage of opportunities to leverage funding by conducting joint projects. The Joint Defense Manufacturing Technology Panel's effort to revise its guidance on reviewing planned ManTech projects should provide an opportunity to identify candidates for joint funding and implementation. Finally, DOD does not currently have an effective means to measure the results of completed projects. Without a means for determining project benefits, DOD will not know whether the ManTech Program is meeting the long-term needs of users. DOD and the services need to build on existing efforts to identify and conduct joint ManTech projects. The Joint Defense Manufacturing Technology Panel's proposal to get involved earlier and review the services' planned projects is a constructive step toward facilitating more joint projects.
We recommend that DOD develop additional measures to coordinate the services' planning cycles, budgets, and project selection criteria to better position them to identify and conduct joint projects. We also recommend that DOD develop a more systematic means for determining the results of ManTech projects. This may be done, for example, by (1) using award or incentive fees to motivate contractor tracking of ManTech benefits over time, (2) including a requirement to track and report implementation as an evaluation criterion for awarding ManTech work, or (3) conducting or contracting for periodic surveys and/or studies of the industrial base to quantify the impact of ManTech projects. In written comments on a draft of this report, DOD partially concurred with our first recommendation on the need to build on existing efforts to conduct joint ManTech projects and concurred with our second recommendation on the need to develop a more systematic means to determine the results of ManTech projects. With respect to the first recommendation, DOD emphasized that the Joint Defense Manufacturing Technology Panel already provides an effective model for how to plan, coordinate, execute, fund, and implement joint ManTech activities and that this warrants positive recognition. DOD further stated that in comparison to other DOD programs that are overseen at the Office of the Under Secretary of Defense level but funded by the military services and defense agencies, the implementation of "only" 16 joint projects should be viewed in a more positive context. However, DOD acknowledged that more could be done to improve the process for developing joint projects. Toward that end, the panel is modifying its process and will review projects that have not yet started or that have recently begun and will rate these projects on the degree to which they are joint.
In addition, DOD stated the panel will review the services' and DLA's planning cycles to identify opportunities for more effective coordination of planned projects. We agree that the Joint Defense Manufacturing Technology Panel has helped to improve the coordination of the services' and DLA's programs and facilitate the implementation of certain joint projects. For example, the 16 jointly funded active projects are evidence that DOD does jointly plan and conduct ManTech projects. However, we continue to believe that additional opportunities exist for pursuing joint projects. This is reflected in the fact that the panel identified another 84 active projects that could benefit more than one DOD component but were not jointly funded, planned, or managed. The panel's new review process is a step in the right direction to facilitate more joint projects. However, as with the old process, projects will be reviewed for jointness only after the services and DLA have already selected them for funding. This could limit the extent to which a project can be jointly planned, funded, or managed since it is likely the requirements have already been determined. The action initiated by the Joint Defense Manufacturing Technology Panel to review the components' planning cycles is also a positive measure, provided that the results are used to facilitate more joint planning earlier in the process. DOD also provided technical comments that we incorporated into the report as appropriate. DOD's comments appear in appendix II. We will send copies of this report to the Chairmen and the Ranking Minority Members of other appropriate congressional committees; the Secretary of Defense; the Director, Office of Management and Budget; and other interested parties. We will also make copies available to others on request. Please contact me at (202) 512-4841 or John Oppenheim at (202) 512-3111 if you or your staff have any questions concerning this report.
Other major contributors to this report were Myra Watts Butler, Cristina Chaplain, Dayna Foster, Gaines Hensley, and Stephanie May. To determine if projects funded by the program are responsive to the needs of the military services and the Defense Logistics Agency, we reviewed the processes, policy memoranda, and guidance for identifying manufacturing needs, prioritizing those needs, and presenting them for consideration for funding at both the systems command ManTech program director level and the weapon system program office level. We discussed various manufacturing technology-related issues, including oversight responsibilities, with officials from the Office of the Deputy Under Secretary of Defense (Science & Technology), Office of Technology Transition; the Office of the Under Secretary of Defense for Acquisition and Technology, Deputy Under Secretary of Defense for Science and Technology; the Office of the Deputy Assistant Secretary of the Army for Research and Technology; the Office of Naval Research; and the Office of the Assistant Secretary of the Air Force (Acquisition), Science, Technology, and Engineering. At the ManTech program director level, we reviewed memoranda, guidance, and processes for identifying manufacturing needs, prioritizing those needs, and forming projects. We also met with management officials responsible for implementing the ManTech Program. For example, we met with officials from the Office of Naval Research, Industrial and Corporate Programs Detachment, Manufacturing Technology Program Office, in Arlington, Virginia, and Philadelphia, Pennsylvania; the Army Research Laboratory, Aberdeen Proving Ground, Maryland; the Air Force Research Laboratory, Materials and Manufacturing Directorate, at Wright-Patterson Air Force Base, Ohio; and the Defense Logistics Agency at Fort Belvoir, Virginia.
To further assess users' satisfaction, we spoke directly with ManTech users concerning their involvement in the ManTech Program and whether the projects were meeting their needs. However, we did not validate reported successes of the program. We identified the users from a selected number of active projects in fiscal years 1999 and 2000 for the Navy, Army, Air Force, and Defense Logistics Agency. Specifically, for the Navy, we met with officials of various program executive offices and program managers from the Naval Sea Systems Command at Arlington, Virginia; the Naval Air Systems Command at Patuxent River, Maryland; and the Marine Corps Systems Command at Quantico, Virginia. For the Army, we met with representatives from several missile and aviation weapon systems at the Army Aviation and Missile Command at Redstone Arsenal, Alabama; the Army Armaments Research and Development Center at Picatinny Arsenal, New Jersey; the Army Materiel Command in Alexandria, Virginia; the Air and Missile Defense Program Executive Office in Huntsville, Alabama; the Aviation Program Executive Office at Redstone Arsenal, Alabama; and the Ground Combat Support Systems Program Executive Office at Picatinny Arsenal, New Jersey. For the Air Force, we met with representatives from the Joint Air-To-Surface Standoff Missile Program and the Joint Direct Attack Munitions Program at Eglin Air Force Base, Florida; and the Joint Strike Fighter Program, the F-119 Engine Program, the Engine Directorate, and the Air Force Materiel Command logistics office at Wright-Patterson Air Force Base, Ohio. To determine whether work being performed under the ManTech Program is being awarded on a competitive basis, we first reviewed the guidance and policy for competitive awards. We interviewed contracting officials as well as engineers who manage ManTech projects to obtain their views concerning specific projects.
To assess the degree to which projects are awarded competitively, we randomly selected a sample of ManTech projects from the above list of fiscal years 1999 and 2000 projects for the Army, Navy, Air Force, and DLA based on levels of funding, length of the projects, and varying types of technologies and weapons systems. We then reviewed the contract files to determine whether competitive award procedures were used. Because of the way the Navy is organized, we also selected five of nine centers of excellence and reviewed their policies, guidance, and processes on competing projects. Specifically, we visited the Center of Excellence for Composites Manufacturing Technology (South Carolina Research Authority) in North Charleston, South Carolina; the Electronics Manufacturing Productivity Facility (American Competitiveness Institute) in Philadelphia, Pennsylvania; the Navy Joining Center (Edison Welding Institute) in Columbus, Ohio; the National Center for Excellence in Metalworking Technology (Concurrent Technologies Corporation) in Johnstown, Pennsylvania; and the Gulf Coast Region Maritime Technology Center (University of New Orleans College of Engineering) in New Orleans, Louisiana. We obtained the legal advice of our General Counsel on questionable sole source projects.

The Department of Defense (DOD) established the Defense Manufacturing Technology Program to develop and apply advanced manufacturing technologies to reduce the total cost and improve the manufacturing quality of weapon systems. By maturing and validating emerging manufacturing technology and transferring it to the factory floor, the program bridges the gap between technology invention and industrial application. The program, which has existed in various forms since the 1950s, received about $200 million in funding in fiscal year 2001.
DOD's Office of the Under Secretary of Defense provides guidance and oversight to the Army, Navy, Air Force, and the Defense Logistics Agency (DLA), but each establishes its own policies and procedures for running the program and determines which technologies to develop. Users told GAO that the program was responding to their needs by developing technologies, products, and processes that reduced the cost and improved the quality of weapons systems. To the extent practicable, DOD used competitive procedures to award the work done under the program. The Army, Air Force, and DLA competitively awarded most of the projects GAO reviewed for fiscal years 1999 and 2000, and the remaining non-competitive awards were based on documented sole source justifications. DOD is missing opportunities to conduct more joint programs and lacks effective measures of program success. Joint projects would enable the services to address the funding issue by leveraging limited funding and integrating common requirements and approaches for developing manufacturing technologies. | 4,955 | 292 |
Since the Social Security Act became law in 1935, workers have had the right to review their earnings records on file at SSA to ensure that they are correct. In 1988, SSA introduced the PEBES to better enable workers who requested such information to review their earnings records and obtain benefit estimates. According to SSA, less than 2 percent of workers who pay Social Security taxes request these statements each year. By fiscal year 2000, SSA plans to have mailed statements automatically to more than 70 million workers. By providing these statements, SSA's goals are to (1) better inform the public of benefits available under SSA's programs, (2) assist workers in planning for their financial future, and (3) better ensure that Social Security earnings records are complete and accurate. Correcting earnings records benefits both SSA and the public because early identification and correction of errors in earnings records can reduce the time and cost required to correct them years later when an individual files for retirement benefits. Issuing the PEBES is a significant initiative for SSA. The projected cost of more than $80 million in fiscal year 2000 includes $56 million for production costs, such as printing and mailing the statement, and $24 million for personnel costs. SSA estimates that 608 staff-years will be required to handle the PEBES workload in fiscal year 2000: SSA staff are needed to prepare the statements, investigate discrepancies in workers' earnings records, and respond to public inquiries. Since the PEBES was first developed, SSA has conducted several small-scale and national surveys to assess the general public's reaction to receiving an unsolicited PEBES. In addition, SSA has conducted a series of focus groups to elicit the public's and SSA employees' opinions of the statement and what parts of it they did and did not understand. For example, the statement originally included benefit estimates for retirement at age 70.
When SSA learned that many people were interested in the effect of early retirement on their benefits, SSA added an estimate for retirement at age 62. Overall public reaction to receiving an unsolicited PEBES has been consistently favorable. In a nationally representative survey conducted during a 1994 pilot test, the majority of respondents indicated they were glad to receive their statements. In addition, 95 percent of the respondents said the information provided was helpful to their families. Overall, older individuals reacted more favorably to receiving a PEBES than did younger individuals. In addition, SSA representatives who answer the toll-free telephone calls from the public have stated that most callers are pleased that they received a PEBES and say that the information is useful for financial planning. Although SSA has taken steps to improve the PEBES, we found that the current statement still provides too much information, which may overwhelm the reader, and presents the information in a way that undermines its usefulness. These weaknesses are attributable, in part, to the process SSA used to develop the PEBES. Additional information and expanded explanations have made the statement longer, but some explanations still confuse readers. Moreover, SSA has not tested for reader comprehension and has not collected detailed information from its front-line workers on the public's response to the PEBES. When readers need explanations to understand complex information, the explanations should appear with the information. Easy-to-understand explanations: Readers need explanations of complex programs and benefits in the simplest and most straightforward language possible. In the 1996 PEBES, the message from the Commissioner of Social Security does not clearly explain why SSA is providing the statement.
Although the message does include information on the statement's contents and the need for individuals to review the earnings recorded by SSA, its presentation is uninviting, according to the design expert we consulted. More specifically, the type is too dense; the lines are too long; white space is lacking; and the key points are not highlighted. If PEBES recipients do not read the Commissioner's message, they may not understand why reviewing the statement is important. The message also attempts to reassure people that the Social Security program will be there when they need it with the following reference (from the 1996 PEBES) to the system's solvency: The Social Security Board of Trustees projects that the system will continue to have adequate resources to pay benefits in full for more than 30 years. This means that there is time for the Congress to make changes needed to safeguard the program's financial future. I am confident these actions will result in the continuation of the American public's widespread support for Social Security. Some participants in SSA focus groups, however, thought the message suggested that the resources would not necessarily be there after 30 years. For example, one participant in a 1994 focus group reviewing a similar Commissioner's message said, ". . . first thing I think about when I read the message is, [it] is not going to be there for me." To help readers navigate the current statement, some focus group participants and benefit experts suggested that SSA add an index or a table of contents. SSA has not used the best layout and design to help the reader identify the most important points and move easily from one section to the next. The organization of the statement is not clear at a glance. Readers cannot immediately grasp what the sections of the statement are, and in which order they should read them, according to the design expert with whom we consulted.
The statement lacks effective use of features such as bulleting and highlighting that would make it more user friendly. In addition, the PEBES is disorganized: information does not appear where needed. The statement has a patchwork of explanations scattered throughout, causing readers to flip repeatedly from one page to another to find needed information. For example, page two begins by referring the reader to page four, and page three contains six references to information on other pages. Furthermore, to understand how the benefit estimates were developed and any limitations to these estimates, a PEBES recipient must read explanations spread over five pages. The statement's spreading of benefit estimate explanations over several pages may result in individuals missing important information. This is especially true for people whose benefits are affected by special circumstances, which SSA does not take into consideration in developing PEBES benefit estimates. For example, the PEBES estimate is overstated for federal workers who are eligible for both the Civil Service Retirement System and Social Security benefits. For these workers, the law requires a reduction in their Social Security retirement or disability benefits according to a specific formula. In 1996, this reduction may be as much as $219 per month; however, PEBES' benefit estimates do not reflect this reduction. The benefit estimate appears on page three; the explanation of the possible reduction does not appear until the bottom of page five. Without fully reviewing this additional information, a reader may not realize that the PEBES benefit estimate could be overstated. Because PEBES addresses complex programs and issues, explaining these points in simple, straightforward language is challenging. 
Although SSA made changes to improve the explanation of work credits, for example, many people still do not understand what these credits are, the relevance of the credits to their benefits, and how they are accumulated. The public also frequently asks questions about the PEBES' explanation of family benefits. Family benefits are difficult to calculate and explain because the amount depends on several different factors, such as the age of the spouse and the spouse's eligibility for benefits on his or her own work record. Informing the public about family benefits, however, is especially important: a 1995 SSA survey revealed that as much as 40 percent of the public is not aware of these benefits. A team of representatives from a cross section of SSA offices governed SSA's decisions on the PEBES' development, testing, and implementation. The team revised and expanded the statement in response to feedback on individual problems. The design expert we consulted observed that the current statement "appears to have been the result of too many authors, without a designated person to review the entire piece from the eyes of the readers. It seems to have developed over time, piecemeal . . . ." In addition, the information collected does not provide sufficient detail for SSA to understand the problems people are having with the PEBES. Although the public and benefit experts agree that the current statement contains too much information, no standard benefit statement model exists in the public or private sector, nor is there a clear consensus on how best to present benefit information. The Canadian government chose to use a two-part document when it began sending out unsolicited benefit statements in 1985. The Canada Pension Plan's one-page statement provides specific individual information, including the earnings record and benefit estimates. A separate brochure details the program explanations.
The first time the Plan mails the statement, it sends both the one-page individual information and the detailed brochure; subsequent mailings contain only the single page with the individual information. Although some focus group participants and benefit experts prefer a two-part format, others believe that all information should remain in a single document, fearing that statement recipients will lose or might not read the separate explanations. SSA has twice tested the public's reaction to receiving two separate documents. On the basis of a 1987 focus group test, SSA concluded that it needed to either redesign the explanatory brochure or incorporate the information into one document. SSA chose the latter approach. In a 1994 test, people indicated that they preferred receiving one document; however, the single document SSA used in the test had less information and a more readable format than the current PEBES. SSA, through the Government Printing Office, has awarded a 2-year contract for printing the fiscal years 1997 and 1998 statements. These statements will have the same format as the current PEBES with only a few wording changes. SSA is planning a more extensive redesign of the PEBES for the fiscal year 1999 mailings but only if it will save money on printing costs. By focusing on reduced printing costs as the main reason for redesigning the PEBES, SSA is overlooking the hidden costs of the statement's existing weaknesses. For example, if people do not understand why they got the statement or have questions about information provided in the statement, they may call or visit SSA, creating more work for SSA staff. Furthermore, if the PEBES frustrates or confuses people, it could undermine public confidence in SSA and its programs. Our work suggests, and experts agree, that the PEBES' value could be enhanced by several changes. 
Yet SSA's redesign team is focusing on reducing printing costs without considering all of the factors that would ensure that PEBES is a cost-effective document. The PEBES initiative is an important step in better informing the public about SSA's programs and benefits. To improve the statement, SSA can quickly make some basic changes. For example, SSA officials told us that, on the basis of our findings, they have revised the Commissioner's message for the 1997 PEBES to make it shorter and less complex. More extensive revisions are needed, however, to ensure that the statement communicates effectively. SSA will need to start now to complete these changes before its 1999 redesign target date. The changes include improving the layout and design and simplifying certain explanations. These revisions will require time to collect data and to develop and test alternatives. SSA can help ensure that the changes target the most significant weaknesses by systematically obtaining more detailed feedback from front-line workers. SSA could also ensure that the changes clarify the statement by conducting formal comprehension tests with a sample of future PEBES recipients. In addition, we believe SSA should evaluate alternative formats for communicating the information presented in PEBES. For example, SSA could present the Commissioner's message in a separate cover letter accompanying the statement, or SSA could consider a two-part option, similar to the approach of the Canada Pension Plan. To select the most cost-effective option, SSA needs to collect and assess additional cost information on available options and test different PEBES formats. Our work suggests that improving PEBES will demand attention from SSA's senior leadership. For example, the question of how best to balance the public's need for information with the problems resulting from providing too much information is too difficult and complex to resolve without senior-level SSA involvement. Mr.
Chairman, this concludes my formal remarks. I would be happy to answer any questions from you and other members of the Subcommittee. Thank you. For more information on this testimony, please call Diana S. Eisenstat, Associate Director, Income Security Issues, at (202) 512-5562 or Cynthia M. Fagnoni, Assistant Director, at (202) 512-7202. Other major contributors include Evaluators Kay Brown, Nora Perry, and Elizabeth Jones. The first copy of each GAO report and testimony is free. Additional copies are $2 each. Orders should be sent to the following address, accompanied by a check or money order made out to the Superintendent of Documents, when necessary. VISA and MasterCard credit cards are also accepted. Orders for 100 or more copies to be mailed to a single address are discounted 25 percent.

U.S. General Accounting Office
P.O. Box 6015
Gaithersburg, MD 20884-6015

Room 1100
700 4th St. NW (corner of 4th and G Sts. NW)
U.S. General Accounting Office
Washington, DC

Orders may also be placed by calling (202) 512-6000 or by using fax number (301) 258-4066, or TDD (301) 413-0006. Each day, GAO issues a list of newly available reports and testimony. To receive facsimile copies of the daily list or any list from the past 30 days, please call (202) 512-6000 using a touchtone phone. A recorded menu will provide information on how to obtain these lists.

GAO discussed the Social Security Administration's (SSA) Personal Earnings and Benefit Estimate Statement (PEBES).
GAO noted that: (1) the public has reacted favorably to unsolicited PEBES, and SSA has improved the statement in response to public feedback; (2) the public generally feels that the statement is a valuable tool for retirement planning, but the statement does not clearly convey its purpose and related information on SSA programs and benefits; (3) PEBES weaknesses have resulted from its piecemeal development and the lack of testing for comprehension; (4) there is no consensus on the best model for PEBES; (5) SSA plans to redesign PEBES only if the redesign results in lower printing costs; (6) this approach fails to recognize the hidden costs arising from the need to answer public inquiries about statement information and the undermining of public confidence in SSA programs by the statement's poor design; (7) SSA needs to improve PEBES layout and design and simplify certain explanations, obtain more detailed feedback from its frontline workers, conduct comprehension tests, and consider alternative statement formats; and (8) SSA senior management attention is needed to ensure the success of the statement initiative by redesigning PEBES to present benefits information more effectively.
To improve the health and nutrition education of American schoolchildren, USDA began the Team Nutrition initiative (commonly known as Team Nutrition) in fiscal year 1995, obtaining $20.3 million in funding. The Congress made another $10.5 million available for Team Nutrition in fiscal year 1996 and $10 million for fiscal year 1997. Elementary and secondary schools can participate in the initiative--and become Team Nutrition schools--by agreeing to support the initiative's mission and principles and by making a commitment to meet USDA's dietary guidelines for Americans. Once a school joins the "team," it can obtain nutrition education materials on healthy eating habits. As of August 16, 1996, over 14,000 schools spread across all 50 states, the District of Columbia, Puerto Rico, and the U.S. Virgin Islands had become Team Nutrition schools. The Secretary of Agriculture is encouraging the remaining 80,000 schools across the nation to become Team Nutrition schools so that they can also obtain the materials being developed under the initiative. USDA considers Team Nutrition to be a key vehicle for promoting one of its top priorities: integrating the latest nutrition knowledge into each of USDA's food assistance programs. At Team Nutrition's inception, the Under Secretary for Food, Nutrition and Consumer Services decided that Team Nutrition would be administered by USDA's Food and Consumer Service (FCS). Until August 1996, when it was placed in FCS' Special Nutrition Programs, Team Nutrition was managed by the Office of the FCS Administrator. The initiative is composed of two basic components: (1) training and technical assistance, which is managed by the Associate Administrator for Food and Consumer Services, and (2) nutrition education, which is managed by Team Nutrition's Acting Project Manager. Much of Team Nutrition's efforts are carried out through contracts and cooperative agreements.
These contracts and agreements support both of Team Nutrition's components. Each component received approximately half of the funds appropriated for Team Nutrition. We reviewed two contracts for support services, one with Global Exchange, Inc. (Global), and one with Prospect Associates, Ltd. (Prospect); a cooperative agreement with Buena Vista Pictures Distribution, Inc., a subsidiary of the Walt Disney Company (Disney), to use animated characters to promote healthy eating; and a grant to an author to write a children's book about nutrition. Table 1 provides details on these activities. FCS obtains support services for Team Nutrition from Global and Prospect through task order contracts. A task order contract is used when the procuring agency knows the type, but not the precise quantities, of services that will be required during the contract period. These contracts permit flexibility in (1) scheduling tasks and (2) ordering services after the requirements materialize. The Federal Acquisition Regulation stipulates that task order contracts may be used when the agency anticipates a recurring need for the contractor's services. FCS awarded the Global and Prospect contracts to provide (1) marketing and consumer research on how to best market nutrition education; (2) message development, design, and production services for multimedia nutrition education materials; and (3) ways to create and maintain partnerships with organizations concerned about nutrition education. The cooperative agreement with Disney allows FCS to use two of Disney's popular animated characters in Team Nutrition's media campaigns. In accordance with the agreement, Disney developed and distributed four animated public service announcements and additional nutrition education materials featuring Pumbaa and Timon, characters from its recent film, The Lion King. Finally, FCS awarded a $25,000 grant to an author to write a children's book promoting good nutrition. 
Our review of two contracts, a cooperative agreement, and a grant under the Team Nutrition initiative revealed poor management and, in some cases, a violation of federal procurement law and ethics regulations. The problems we found with each of these efforts are discussed below. We found no irregularities in the manner in which FCS awarded the contract to Global. However, we believe that Team Nutrition officials acted improperly in assigning tasks under the Global contract that were beyond the contract's scope of work. These officials also did not follow normal contracting procedures in dealing with subcontractors under the Global contract. Federal procurement law requires that an agency conduct a separate procurement when it wishes to acquire services that are beyond the scope of an existing contract. A matter is outside the scope of the original contract when it is materially different from the original purpose or nature of the contract. In our view, Team Nutrition officials assigned Global two tasks--tasks 9 and 10--under its contract that materially deviated from the original contract's overall scope of work. Under its contract with FCS, Global was to provide support services to assist Team Nutrition in conducting a national nutrition education campaign, including the planning and development of educational materials and communication efforts related to nutrition. As we discussed in our May 1996 testimony, task 9 was to conduct focus group research to assess the reactions of the general public and food stamp recipients to USDA's proposals to change the Food Stamp Program. We concluded that this work, which cost FCS about $33,000, was outside the scope of Global's support services contract. 
Similarly, task 10--to evaluate the success of the San Francisco County Jail's garden project and to develop a guidebook on the project to show other communities how to implement similar programs--has no substantive relationship to nutrition education or the dissemination of sound nutrition information. The garden project is a program to rehabilitate former prisoners by having them grow produce that is either donated to the needy or sold to restaurants. This evaluation, for which FCS has budgeted about $49,000, differs materially from the subject matter of the Global contract, which is to assist FCS in its efforts to provide "effective nutrition education" and to communicate "sound nutrition information." Furthermore, contrary to normal contracting practices, Team Nutrition officials directed Global to hire specific subcontractors and did not give Global the opportunity to perform the work itself. Generally, once an agency awards a contract, the contractor is responsible for performing the work, either by using its own resources or by hiring a subcontractor. Team Nutrition officials negotiated directly with five firms to perform work for certain elements of its nutrition education campaign before the five firms signed subcontract agreements with Global. Representatives from three of these firms also met with the Under Secretary to discuss their work before any contractual arrangement had been made between these firms and Global. All five firms then started work for Team Nutrition without the knowledge of, or any signed agreements with, Global. These firms were later added as subcontractors to the Global contract. Because Team Nutrition officials directed Global to hire these firms, Global did not obtain competitive offers, nor did it conduct a cost-reasonableness analysis of their proposed budgets. After they signed subcontract agreements with Global, these subcontractors continued to be directed by Team Nutrition officials instead of Global. 
These officials often did not include Global in planning meetings with the subcontractors and did not provide the subcontractors with well-defined tasks that had specific deliverables. As a result, Global had little control over its subcontractors' work and costs. Furthermore, Global and FCS officials told us that they did not understand what work one of the subcontractors had done to justify the $40,000 payment it had received. Only after the subcontractor had been paid did Global and FCS officials ask the subcontractor to document the tasks it had performed. As with Global, we found that FCS' contract with Prospect was awarded in a fashion consistent with applicable procurement regulations. However, the history of the Prospect contract indicates a pattern of careless management. This careless management may have reduced the contract's contributions to Team Nutrition. When the Prospect contract was awarded, Team Nutrition officials provided only minimal technical direction for the contract's tasks. The Contracting Officer's Representative (COR), who was not the Team Nutrition Project Manager, did not have a clear understanding of how Prospect was to support the Team Nutrition mission. Therefore, the COR did not provide the technical direction that Prospect needed to effectively perform several tasks. Moreover, without notifying the Contracting Officer, and without having the authority to do so, the COR allowed a number of unauthorized individuals to provide technical direction to Prospect and/or to change the scope of the work defined in at least two tasks. In one instance, the director of a USDA division unrelated to Team Nutrition directed Prospect to conduct focus group research worth about $78,000 without the Contracting Officer's approval. In another instance, a Contracting Officer's Technical Representative directed a significant change in a task's scope of work without authorization. 
The Contracting Officer and the COR did not become aware of this directed change until Prospect submitted a revised cost proposal to increase the cost of the task by about $500,000. Furthermore, a change to one effort under the Prospect contract, while within the scope of the contract, involved work that was more complex than anticipated, given the statement of work and the projected budget in the contract's task orders. Team Nutrition officials expanded a relatively basic $173,000 evaluation of the effectiveness of Team Nutrition to a more comprehensive $2.3 million effort. FCS contracting officials told us that while this work was within the scope of the contract, it would have been preferable for the agency to obtain this expanded work through a separate, competitive procurement. They believed that a separate procurement was preferable because of the magnitude of the change and the addition of work that required a higher degree of technical expertise than was originally specified. However, FCS contracting officials told us that, given Team Nutrition's desire to move quickly in initiating the work, they did not have sufficient time to solicit and award a new competitive contract. We found no problem with the process FCS used to award the cooperative agreement to Disney. However, once again, we found weaknesses in FCS' performance in managing this cooperative agreement. FCS entered into this agreement, which allows it to use two Disney characters from The Lion King to promote good nutrition, while these characters were also being used in advertisements and in-store promotions for a national fast food restaurant chain. To assess the impact of these characters on the Team Nutrition nutrition education campaign, FCS had Global conduct focus groups to determine what messages children were receiving from these characters. However, in conducting this evaluation, FCS did not test the possible messages children could receive from the fast food advertisements. 
Therefore, the information gathered from this research may be inconclusive. Furthermore, the Disney agreement, originally scheduled to expire on September 30, 1996, required Team Nutrition to return to Disney all materials that used the animated characters at the expiration of the agreement. These materials are included in the nutrition education kits that FCS is distributing to Team Nutrition schools. When we questioned the potential impact of this requirement on Team Nutrition's goals, we discovered that Team Nutrition officials had not been attentive to the fact that the agreement was about to expire. They acknowledged our concerns, subsequently contacted Disney, and sought Disney's consent to extend the agreement's expiration date. On August 8, 1996, Team Nutrition officials told us that Disney had agreed to a 1-year extension; but as of September 16, 1996, no contract extension had been executed. Even with this extension, under the current terms of the agreement, FCS will be required to return the materials in September 1997. Since Team Nutrition officials had planned to distribute these materials to schools through February 1998, the requirement to return the Disney materials before that date may curtail some elements of the nutrition education campaign. We found that the process FCS followed in the award of a $25,000 sole-source grant to an author to write a children's book on nutrition was consistent with departmental criteria. These criteria allow sole-source grant awards for amounts less than $75,000, and FCS contracting officials exercised their authority under these criteria. However, the Under Secretary for Food, Nutrition and Consumer Services, through her involvement in the administration of this grant, violated federal ethics regulations. These regulations prohibit employees from using public office for the private gain of their friends. 
Specifically, to ensure that an employee's actions do not create the appearance of the use of public office for private gain, or of giving preferential treatment, these regulations require the employee whose official duties would affect the financial interests of a friend to comply with certain other regulations. These latter regulations prohibit an employee from participating in a specific matter likely to have a direct and predictable effect on the financial interests of the friend, unless that employee has informed the agency's designated ethics official of the appearance problem and received authorization from that official to participate in the matter. The grantee and the Under Secretary have known one another for 15 years and are close personal friends. Despite this relationship, the Under Secretary did not inform USDA's ethics officials about her friendship with the author, nor did she recuse herself from approving the grantee's performance before payment was made to the author, or from other actions that would financially benefit the author. The Under Secretary maintained close personal involvement throughout the period of the grantee's performance. For example, her staff regularly kept her informed of the discussions and developments between FCS and the author's agent, and the Under Secretary provided comments to her staff on these matters. In addition, under the terms of the grant, the author was to receive interim payments based on her performance in writing the book. These interim payments depended upon the Department's review and approval of the author's manuscript. Our review showed that the Under Secretary was given the manuscript for her approval and that her Executive Assistant--although not the COR for this effort--personally conveyed the Department's final approval to the author's agent. 
Moreover, during the development of the manuscript, the Under Secretary met in person with the author at USDA to convey the Department's comments on the manuscript. To date, FCS has paid the author $11,250. The final payment of $13,750 will be made, as specified by the terms of the grant, when the book is published. Furthermore, the author's grant application explicitly stated that the author hoped and expected to earn "considerably more" through sales of the book. Thus, the publication of the book would provide income to the author in two ways: (1) the final payment under the grant and (2) the sales of the book. In this connection, at least as early as February 1994, the author's agent raised the idea with the Under Secretary's office that USDA would at some point purchase a significant quantity of the published books. During the period in which the manuscript was being developed, there were frequent and insistent communications from the author's agent to USDA about the need for a purchase commitment from USDA for a large quantity of these books as part of the initial production run. The Under Secretary's staff informed her several times about this issue. These developments culminated in October 1995, shortly after USDA gave final approval to the manuscript. The Team Nutrition Project Manager and the COR prepared a procurement request on October 2, 1995, for approximately 25,000 copies of the book, at a cost of approximately $50,000. However, the FCS Budget Division questioned the request because, in less than 1 year, FCS would be able to copy the books itself. When informed of these concerns, the Under Secretary replied, in writing, that "the Need in Schools is Now" and advised that "If justification is adequate, we proceed." However, when told of the circumstances, the FCS Administrator directed that this procurement not go forward. To date, the book has not been published. 
In our August 1996 report, we identified a number of irregularities in the process used to hire the former Project Manager, set her salary, and collect financial disclosure statements from her and the former Assistant Project Manager. As we previously reported, FCS complied with the federal regulatory procedures for establishing, advertising, and considering applicants for the positions to which the Project Manager, Assistant Project Manager, and Project Coordinator were subsequently appointed. FCS judged each of these employees as qualified for the positions for which they applied, and the Office of Personnel Management certified that these applicants met the general standards for the positions for which they applied. However, our review of the former Project Manager's employment application raised several concerns about her qualifications for the position she held. These concerns included the very short period of time she had spent in a previous job that FCS considered to be crucial experience in judging her qualifications, her apparent misrepresentation of her academic credentials, and her lack of answers to some questions on her application and her incomplete answers to others. Because FCS performed only a perfunctory review of the former Manager's paperwork, it was unaware of the potential problems with her experience and her academic credentials. In addition, we found that FCS did not have an adequate basis for establishing the former Project Manager's salary. FCS did not require her to submit documentation sufficient for it to assess her salary history, as required by USDA's procedures. The former Project Manager may have overstated her prior salary by including in it the estimated value of pro bono consulting work, payments allegedly made to her husband, and projected earnings for several months in which she did not earn a salary. FCS was unaware of the former Manager's apparent overstatement of her prior salary. 
As a result of her representation of her prior salary, FCS appointed her to a significantly higher pay level than might have otherwise been justified. Finally, although the former Project Manager and the former Assistant Project Manager were required to submit financial disclosure statements within the first 30 days of their employment at FCS, neither employee did so. The former Project Manager did not submit a statement until a year after it was due, and the statement covered only a small portion of the period in which she was employed at FCS. The former Assistant Manager submitted a completed form 5 months after being hired, but only after the threat of disciplinary action. USDA's problems in managing its Team Nutrition procurement and personnel hiring practices can be attributed largely to the failure to follow the agency's procedures and the lack of a strategic plan for the Team Nutrition initiative. From Team Nutrition's inception, the Under Secretary has provided continual and specific direction of the initiative. The Under Secretary suggested the hiring of the former Project Manager and made decisions on procurements and a grant that demonstrated poor judgment and, in some cases, violated federal procurement law and ethics regulations. In addition, even though the initiative has been in effect and operating for nearly 2 years, there is no documented strategic plan to guide its operations. Without a strategic plan in place, FCS has had difficulty in determining how its contracts would be used to support Team Nutrition's goals. The Under Secretary for Food, Nutrition and Consumer Services considers Team Nutrition to be an important initiative that requires her personal leadership. Therefore, from its inception, the Team Nutrition initiative did not operate within FCS' existing program management structure. Instead, the Under Secretary placed the initiative within the Office of the FCS Administrator. 
According to the Under Secretary, she made this decision so that the new initiative would not be lost among the agency's competing priorities and so that it could benefit from high-level support and attention. The Under Secretary required all Team Nutrition managers to take programmatic direction from her through meetings and weekly reports. She made specific recommendations about whom to hire and how funds should be spent. The agency's normal internal controls and reporting and review processes were not followed for decisions on Team Nutrition. For example, contractors typically select their own subcontractors and monitor their subcontractors' performance. This situation did not occur in the Global contract because the Under Secretary selected some subcontractors and, in some cases, directly managed their work. Consequently, Global had little control over these subcontractors' work and costs. As we noted earlier, FCS and Global officials did not understand what work one subcontractor had done to justify his $40,000 payment. Team Nutrition officials were hampered in their efforts to manage the contracts, cooperative agreement, and grant because they had no documented strategic plan to guide these actions and measure their progress. Without a strategic plan, Team Nutrition officials had little understanding of the specific tasks that should be performed, the order in which these tasks should occur, and the way in which these tasks should be integrated to support Team Nutrition's goals. For example, the COR told us that he was unable to provide Prospect with meaningful, substantive work because Team Nutrition had no documented strategic plan. With no strategic plan to guide their decision-making, Team Nutrition officials added tasks and funds to the Prospect contract in a haphazard fashion. For example, the Team Nutrition Project Manager decided to add six new tasks totaling $3 million to the contract 1 week before its expiration date for adding new work. 
She requested the work despite the fact that she had informed the FCS contracting officials 14 days earlier that no new work would be added to the contract. According to the FCS contracting officials, they had to rush to complete the modification before the expiration date for adding new work. This time pressure precluded any meaningful price negotiations with the contractor before work began. Similarly, under the Global contract, Team Nutrition officials directed Global to hire five subcontractors but did not clearly define the tasks these subcontractors were to perform, including the products that were to result from these tasks. This lack of clear instructions resulted in duplication of effort and uncertain contributions to the Team Nutrition mission. For example, duplication occurred when FCS asked Global to hire two different firms to develop plans for the June 1995 launch of Team Nutrition. These two subcontracts totaled about $50,000, but neither plan was ever used, according to an FCS official. FCS recognized that it had a number of problems with its procurement administration and personnel management and has begun improvement efforts. In June 1995, FCS took steps to improve its management of the Global and Prospect contracts. These steps included establishing new operational procedures and increasing reporting responsibilities. Nearly a year later, FCS formed a Contract Management Review Task Force that assessed FCS' policies and procedures for contract management. The task force recommended changes to improve FCS' contract management. On June 21, 1996, the FCS Administrator issued numerous directives resulting from the task force's recommendations. Several of these directives recommend that the agency adhere to existing policies. New policies include training requirements for all staff involved with procurement and the establishment of an agency ombudsman for staff to contact about potential procurement improprieties. 
To sustain the Team Nutrition initiative, on July 26, 1996, the FCS Administrator recommended to the Under Secretary that, in the short term, Team Nutrition's activities be placed in FCS' existing programmatic structure--as part of Special Nutrition Programs. Until a new director for the Nutrition and Technical Services Division is appointed, the Deputy Administrator of Special Nutrition Programs will oversee the initiative's day-to-day operations. She will report to the Associate Administrator for Food and Consumer Services, who will, in turn, report to the FCS Administrator. However, according to the Associate Administrator, although the Under Secretary approved this recommendation on August 8, 1996, the Under Secretary has continued to provide programmatic direction to Team Nutrition managers. With respect to personnel management, as we reported earlier, FCS plans to (1) tighten procedures for examining the qualifications of applicants for senior-level positions; (2) strengthen its procedures for obtaining and properly reviewing documentation submitted by applicants that is sufficient for making appointments at salaries above the minimum rate; and (3) intensify its efforts to collect financial disclosure statements by aggressively following through with disciplinary action if its requests are not successful. In addition, the Administrator told us that he has directed the Human Resources Division to conduct an internal review of its personnel practices and that the Under Secretary had directed the Regional Administrator for FCS' Mid-Atlantic Region to conduct a similar review. The actions FCS has taken so far to address procurement and personnel problems are steps in the right direction. However, it is too soon to determine whether these actions are sufficient to correct the problems that we identified. 
In conclusion, we found that the Team Nutrition contracts, cooperative agreement, grant, and personnel management practices we examined demonstrate a pattern of poor management and, in some cases, violated federal procurement law and ethics regulations. The problems in the management of the Team Nutrition initiative can be attributed largely to the failure to follow the agency's procedures and the lack of a strategic plan for the initiative. FCS has taken some actions to address its procurement and personnel problems. However, unless better management judgment is exercised and the agency's procedures are adhered to, these problems are likely to persist. Mr. Chairman, this completes my prepared statement. I would be pleased to respond to any questions you or Members of the Subcommittee may have. Provide support for the Team Nutrition launch, including strategic counsel and management of the event. Provide strategic planning for Team Nutrition's public relations campaign and for coordinating the entertainment industry's participation in Team Nutrition. Provide research and development for the Team Nutrition launch, participation in strategic planning, development of press materials, and coordination of invitation mailing lists for the launch. Lake Research, Inc. Conduct focus group research to assess the reactions of the general public and food stamp recipients to the U.S. Department of Agriculture's proposals to change the Food Stamp Program. Podesta Associates, Inc. Develop and execute the U.S. Department of Agriculture's Great Nutrition Adventure, including strategic development, organization of national events, press relations, preparation of press materials, and follow-up contacts. The first copy of each GAO report and testimony is free. Additional copies are $2 each. Orders should be sent to the following address, accompanied by a check or money order made out to the Superintendent of Documents, when necessary. 
VISA and MasterCard credit cards are accepted, also. Orders for 100 or more copies to be mailed to a single address are discounted 25 percent. U.S. General Accounting Office P.O. Box 6015 Gaithersburg, MD 20884-6015 Room 1100 700 4th St. NW (corner of 4th and G Sts. NW) U.S. General Accounting Office Washington, DC Orders may also be placed by calling (202) 512-6000 or by using fax number (301) 258-4066, or TDD (301) 413-0006. Each day, GAO issues a list of newly available reports and testimony. To receive facsimile copies of the daily list or any list from the past 30 days, please call (202) 512-6000 using a touchtone phone. A recorded menu will provide information on how to obtain these lists. | Pursuant to a congressional request, GAO discussed the Department of Agriculture's (USDA) Team Nutrition contracts, cooperative agreement, and grant for multimedia nutrition education. GAO noted that: (1) Team Nutrition officials acted improperly in assigning tasks under the the Global contract and did not follow normal contracting procedures in dealing with Global's subcontractors; (2) Team Nutrition officials did not provide the technical direction that another contractor needed to perform several tasks; (3) Team Nutrition failed to determine the message that children were receiving from the Team Nutrition Initiative advertisement; (4) the Under Secretary for Food, Nutrition, and Consumer Services (FCS) violated federal ethics regulations by participating in the administration of this grant; (5) FCS improperly reviewed the former project manager's employment application, academic credentials, and financial disclosure statements; (6) USDA management problems resulted from USDA failure to follow agency procedures and its lack of a strategic plan for the Team Nutrition Initiative; and (7) FCS has taken steps to improve its procurement and personnel practices, including establishing new operational procedures, increasing reporting responsibilities, requiring procurement 
administration training, establishing an agency ombudsman to handle procurement improprieties, tightening procedures for the applications process, reviewing applicant documentation, and intensifying collection of employee financial disclosure statements.
The overall process used to implement USERRA is as follows. Outreach and resolution of informal complaints. DOD and DOL share responsibility for outreach--the education of servicemembers and employers about their respective responsibilities under USERRA. Much of DOD's outreach is accomplished through the Employer Support of the Guard and Reserve (ESGR), which performs most of its work through over 4,000 volunteers. DOL conducts outreach through its Veterans' Employment and Training Service (VETS) investigators, who are located nationwide. These investigators conduct briefings to educate employers and servicemembers about USERRA requirements and responsibilities and handle service-related employment and reemployment questions that are directed to their offices. Servicemembers who have USERRA-related issues with their employers can file informal complaints with DOD's ESGR. In our February 2007 report, we noted that nearly 10,000 informal complaints had been filed with ESGR in fiscal years 2004 and 2005. A subgroup of ESGR's specially trained volunteers serves as impartial ombudsmen who informally mediate USERRA issues that arise between servicemembers and their employers. Formal complaints and prosecution. When ESGR ombudsmen cannot resolve complaints informally, they notify servicemembers about their options. Servicemembers can file a formal complaint with DOL or file complaints directly in court (if the complaint involves nonfederal employers) or with the Merit Systems Protection Board (if it involves federal executive branch employers). Under a federal sector demonstration project established by the Veterans Benefits Improvement Act of 2004, DOL investigates complaints against federal executive branch agencies for individuals whose social security numbers end in even numbers, and OSC is authorized to directly receive and investigate complaints and seek corrective action for individuals whose social security numbers end in odd numbers.
When a servicemember files a formal complaint with DOL, one of VETS's 115 investigators examines and attempts to resolve it. If VETS's investigators are unable to resolve servicemember complaints, DOL is to inform servicemembers that they may request to have their complaints referred to DOJ (for complaints against private sector employers or state and local governments) or to OSC (for complaints against federal executive branch agencies). Before complaints are sent to DOJ or OSC, they are reviewed by a VETS regional office for accuracy and sufficiency and by a DOL regional Office of the Solicitor, which assesses the legal basis for complaints and makes an independent recommendation. If DOJ or OSC determines that the complaint has merit, it will attempt to resolve the complaint without litigation and, if unsuccessful, represent the complainant in court (for those referred to DOJ) or before the Merit Systems Protection Board (for those referred to OSC). Figure 1 shows servicemembers' options for obtaining federal assistance with their USERRA complaints. Agency databases and reporting requirement. Each of the four federal agencies responsible for assisting servicemembers under USERRA maintains an automated database with complaint information. Both DOD and DOL have electronic complaint files that are stored in automated systems with query capabilities. The Secretary of Labor, in consultation with the U.S. Attorney General and the Special Counsel, prepares and transmits a USERRA annual report to Congress on, among other matters, the number of USERRA claims reviewed by DOL (and, during the current demonstration project, by OSC), along with the number of claims referred to DOJ or OSC. The annual report is also to address the nature and status of each claim, state whether there are any apparent patterns of violation of the USERRA provisions, and include any recommendations for administrative or legislative action that the Secretary of Labor, the U.S.
Attorney General, or the Special Counsel consider necessary to effectively implement USERRA. Although USERRA defines individual agency roles and responsibilities, it does not make any single individual or office accountable for maintaining visibility over the entire complaint resolution process. In our October 2005 report, we noted that the ability of federal agencies to monitor the efficiency and effectiveness of the complaint process was hampered by a lack of visibility resulting, in part, from the segmentation of responsibility for addressing complaints among multiple agencies. Moreover, from the time informal complaints are filed with DOD's ESGR through final resolution of formal complaints at DOL, DOJ, or OSC, no one entity has visibility over the entire process. We found that the agency officials who are responsible for the complaints at various stages of the process generally have limited or no visibility over the other parts of the process. As a result, federal agencies have developed agency-specific output goals rather than cross-cutting goals directed toward resolving servicemembers' complaints. For example, agency goals address the complaint processing times of each stage of the process, rather than the entire time that elapses while servicemembers wait to have their complaints addressed. The servicemember, meanwhile, experiences the full amount of time that passes from the filing of the initial complaint. In October 2005, we reported that more than 430 of the 10,061 formal complaints filed with DOL between October 1, 1996, and June 30, 2005, were closed and reopened, and 52 complaints had been closed and reopened two or more times.
Our analysis of those 52 complaints showed that the processing times averaged about 3 to 4 months, but the total elapsed times that servicemembers waited to have their complaints fully addressed averaged about 20 to 21 months from the time they first filed their initial formal complaints with DOL until the time the complaints were fully addressed by DOL, DOJ, or OSC. We have previously suggested and continue to believe that Congress should consider designating a single individual or office to maintain visibility over the entire complaint resolution process from DOD through DOL, DOJ, and OSC. We believe this would encourage agencies to focus on overall results rather than agency-specific outputs and thereby improve federal responsiveness to servicemember complaints that are referred from one agency to another. In responding to this matter in our 2005 report, both DOL and OSC were supportive, and both agencies noted that they had the expertise to oversee the USERRA complaint resolution process. However, DOL stated that with the mandated demonstration project ongoing, it would be premature to make any suggestions or recommendations for congressional or legislative action until the project has been completed. DOD and DOJ did not provide comments on this matter. Integral to getting servicemembers the help they need is educating them and their employers on their respective responsibilities under USERRA. Since 2002, we have reported on DOD's need to obtain complete and accurate information on reservists' civilian employers to better target its outreach efforts. Accurate, complete, and current civilian employer information is important to DOD to improve its ability to target outreach to employers, to make informed decisions concerning which reservists should be called for active duty to minimize the impact that mobilizations might have on occupations such as law enforcement, and to determine how businesses may be affected by reserve activation.
As we recommended in our 2002 report, DOD implemented regulations that required the reporting and collection of employer information for reserve personnel. Additionally, DOD established compliance goals for these servicemembers. We noted in our February 2007 report that the percentage of servicemembers reporting employer information to DOD had increased, but most reserve components had still not reached their compliance goals. In addition, we found that employment data were not necessarily current because some reservists were not aware of requirements to update their employer information and the services had not established a formal mechanism to remind reservists to update this personnel information as necessary to reflect changes in their current employment. To improve the reporting of National Guard and Reserve employment information, we recommended that the Secretary of Defense direct the Office of the Assistant Secretary of Defense for Reserve Affairs to establish specific time frames for reservists to report their employment data, set specific time frames for reserve components to achieve the established compliance reporting goals, and direct the service components to take action to ensure reporting compliance. In response to this recommendation, DOD indicated at the time of our report that its current policy on employer reporting established compliance goals. We noted in our report that DOD needed to establish a new deadline by which reservists must report their employer information to DOD and set specific time frames for reserve components to achieve the established compliance reporting goal. In addition, to encourage reservists to keep their employer data current, we recommended that DOD instruct all military departments to establish a formal review mechanism that would require all reservists to review and update at least annually their reported employment-related information. 
At the time of our February 2007 report, DOD was in the process of revising its policy on civilian employer reporting to require an annual review of reported employer information. DOD provides USERRA outreach and education to servicemembers using several mechanisms, including a toll-free information line and individual and group briefings. DOD monitors the extent to which it reaches this population and the occurrence of USERRA-related problems by including questions on these areas in its Status of the Forces survey, which is periodically conducted to identify issues that need to be addressed or monitored. We noted in our 2005 report that survey questions offer the potential to provide insight into compliance and employer support issues. However, questions on the surveys vary from year to year and have not always included those pertaining to USERRA compliance and employer support. To gauge the effectiveness of federal actions to support USERRA by identifying trends in compliance and employer support, we recommended that the Secretary of Defense direct the Under Secretary of Defense for Personnel and Readiness to include questions in DOD's periodic Status of Forces Surveys to determine the extent to which servicemembers experience USERRA-related problems; if they experience these problems, from whom they seek assistance; if they do not seek assistance, why not; and the extent to which servicemembers' employers provide support beyond that required by the law. In response to this recommendation, DOD incorporated these additional USERRA-related questions in its June 2006 Status of the Forces survey. Because the resolution of servicemember complaints could involve multiple federal agencies, it is important that the agencies be able to effectively share and transfer information to efficiently process servicemember complaints. 
In October 2005, we found that the automated systems that DOD, DOL, DOJ, and OSC used to capture data about USERRA complaints were not compatible with each other. As a result, information collection efforts were sometimes duplicated, which slowed complaint processing times. To increase federal agency responsiveness to USERRA complaints, we recommended that DOD, DOL, DOJ, and OSC develop a system to allow the electronic transfer of complaint information. At the time of our report, DOL and OSC concurred with this recommendation, DOJ did not provide comments, and DOD deferred to the other agencies. We noted in our February 2007 report that DOL had implemented an enhancement to its USERRA database in October 2006 to enable the four USERRA coordinating agencies to electronically transfer case information between agencies. The database enhancement allowed DOD, DOL, DOJ, and OSC to access and update the status of cases using the Internet and produce a report containing aggregate USERRA complaint data on the cases over which they have jurisdiction. We further noted in this report that, despite these enhancements to the USERRA database to allow the electronic transfer of complaint information between agencies, DOD only had visibility over those cases that originated with informal complaints to DOD. Even though DOD shares responsibility with DOL for USERRA complaints, DOD did not have access to all USERRA complaint data, and we recommended that DOL provide these data to DOD's ESGR. In response to this recommendation, in October 2007, DOL provided DOD with the ability to view and download aggregate information on all USERRA cases in its database. In addition, in October 2005, we reported that when a complaint is referred from DOL to OSC or DOJ, the agencies are unable to efficiently process complaints because they are forced to create, maintain, copy, and mail paper files to other DOL offices and to OSC and DOJ. 
To reduce administrative burden and improve oversight of USERRA complaints processing, we recommended that DOL develop a plan to reduce reliance on paper files and fully adopt the agency's automated complaint file system. DOL concurred with this recommendation and, as a result, is developing an electronic case record system, scheduled for completion in October 2008, that will allow all agencies assigned to the case an opportunity to review documents and add investigative notes or records. To effectively identify trends in issues facing servicemembers, it is important in a segmented complaint resolution process that the complaint data generated by each of the federal agencies be sufficiently comparable. In our February 2007 report, we noted that the complaint categories used by each of the four agencies could not be uniformly categorized to reveal trends in USERRA complaints. In particular, we noted that the complaint data collected by DOD and DOL, the two agencies that see the highest volume of cases, were not categorized in a way that is conducive to meaningful comparison. Specifically, we found that the two agencies use different categories to identify reservists' USERRA complaints for issues such as being refused job reinstatement, denied an appropriate pay rate, or being denied vacation time. To allow for the analysis of trends in reporting USERRA complaints, we recommended that DOD and DOL adopt uniform complaint categories in the future that would allow aggregate trend analysis to be performed across the databases. At the time of our report, both DOD and DOL agreed with this recommendation. Since that time, DOD and DOL have collaborated to identify common complaint categories that will allow both agencies to match similar USERRA complaints. According to officials from both DOD and DOL, these complaint categories are expected to be pilot tested in fiscal year 2008. 
As reservists continue to be exposed to serious injury in operations in Iraq and Afghanistan, the ability to identify disability reemployment complaints becomes more critical. However, we noted in our February 2007 report that the four federal agencies responsible for assisting servicemembers with USERRA complaints could not systematically record and track disability-related complaints. Additionally, we found that these agencies do not distinguish disability-related complaints from other types of complaints for tracking and reporting purposes. For example, the servicemember must indicate that the case involves a disability for it to be classified as such, and these complaints may not be distinguishable from any other types of complaints because a single USERRA complaint may involve a number of issues that complicates the classification of the case by the agency. Further, disability-related complaints are not identified using consistent and compatible complaint categories. DOD classifies USERRA disability-related complaints within three categories including medical benefits, job placement, and time limits for reemployment, while DOL uses one category, reasonable accommodation and retraining for disabled, to classify USERRA disability-related complaints. To provide agencies with better information about disability-related employment complaints, we recommended that DOL develop a system for recording and tracking these complaints and share it with the other agencies that implement USERRA. DOL concurred with this recommendation at the time of this report. According to DOL officials, DOL's USERRA database identifies disability claims, and the agency has recently provided DOD, OSC, and DOJ with access to this system. As previously mentioned, the Secretary of Labor is required to provide an annual report to Congress that includes information on the number of USERRA complaints reviewed by DOL, along with the number of complaints referred to DOJ or OSC. 
We noted in our February 2007 report that DOL's report to Congress does not include information on informal complaints filed with ESGR. Therefore, the complaint data that DOL reported to Congress for fiscal years 2004 and 2005 did not include 80 percent (9,975) of the 12,421 total informal and formal USERRA complaints filed by reservists during that period. Without data from ESGR, Congress has limited visibility over the full range of USERRA issues that reservists face following deployment. Further, without these data, Congress may lack the information it needs for its oversight of reserve employment matters. To gain a full perspective of the number and nature of USERRA complaints filed by reservists in gaining reemployment upon returning from active duty, we suggested that Congress consider amending the reporting requirement to require DOL to include data from DOD's ESGR in its annual report to Congress. In response to this matter for congressional consideration, Members of Congress are considering changes to the legislation. In addition to DOL's report to Congress not reflecting informal USERRA complaints, we identified data limitations in our July 2007 report that affected the quality of information reported to Congress and that could adversely affect Congress's ability to assess how well federal sector USERRA complaints are processed and whether changes are needed. DOL provides information in its annual report to Congress on the number and percentage of complaints opened by type of employer, issues raised--such as discrimination or refusal to reinstate--outcomes, and total time to resolve. We found that the number of federal sector complaints shown in DOL's USERRA database from February 8, 2005, through September 30, 2006, exceeded the number of unique claims it processed during the period of our review. Duplicate, reopened, and transferred complaints accounted for most of this difference.
Also, in our review of a random sample of case files, we found the dates recorded for case closure in DOL's USERRA database did not reflect the dates on the closure letters in 22 of 52 sampled complaints and the closed code, which DOL uses to describe the outcomes of USERRA complaints (e.g., granted, settled, no merit, or withdrawn), was not sufficiently reliable for reporting specific outcomes of complaints. To ensure that accurate information on USERRA complaints' processing is available to DOL and to Congress, we recommended in our July 2007 report that the Secretary of Labor direct the Assistant Secretary of Veterans' Employment and Training to establish a plan of intended actions with target dates for implementing internal controls to ensure that DOL's USERRA database accurately reflects the number of unique USERRA complaints filed annually against federal executive branch agencies, the dates those complaints were closed, and the outcomes of those complaints. In response to our recommendation, DOL issued a memo from the Assistant Secretary of Veterans' Employment and Training in July 2007 instructing investigators to ensure that the closed date entered into DOL's USERRA database match the date on the closure letter to the servicemember, and DOL conducted mandatory training on this memo beginning in August 2007. Further, DOL officials told us that DOL's fiscal year 2007 annual report will count reopened complaints as a single complaint if brought by the same individual, against the same employer, and on the same issue. We reported in July 2007 that in cases where servicemembers sought assistance from DOL and the agency could not resolve the complaints, DOL did not consistently notify servicemembers in writing of their right to have their unresolved complaints against federal executive branch agencies referred to OSC or to bring their claims directly to the Merit Systems Protection Board. 
Specifically, our review of a random sample of complaint files showed that DOL failed to notify servicemembers in writing in half of the unresolved complaints and notified others of only some of their options. In addition, we found that DOL's USERRA Operations Manual failed to provide clear guidance to its investigators on when to notify servicemembers of their rights and the content of the notifications. In July 2007, we also reported that DOL has no internal process to routinely review investigators' determinations before claimants are notified of them and noted that this lack of review could have caused DOL's inconsistent practice of notifying servicemembers of their rights to referral. We recommended that the Secretary of Labor direct the Assistant Secretary for Veterans' Employment and Training to (1) require VETS's investigators to undergo mandatory training on the procedures to be followed concerning notification of rights to referral, (2) incorporate into the formal update to DOL's USERRA Operations Manual guidance concerning notification of rights to referral, and (3) develop and implement an internal review mechanism for all unresolved complaints before servicemembers are notified of determinations and complaints are closed. Since that time, DOL has taken the following actions: issued a memo in July 2007 from the Assistant Secretary for Veterans' Employment and Training to regional administrators, senior investigators, and directors concerning case closing procedure changes, including standard language to use to ensure that servicemembers (federal and nonfederal) are apprised of their rights; began conducting mandatory training on the memo in August 2007; incorporated the policy changes into the revised Manual, which according to DOL officials is expected to be released in January 2008; and, according to DOL officials, beginning in January 2008, all claims are to be reviewed before the closure letter is sent to the claimant. These are positive steps.
It is important for DOL to follow through with its plans to ensure that clear and uniform guidance is available to all involved in processing USERRA complaints. Mr. Chairman, Senator Enzi, and Members of the Committee, this concludes our remarks. We will be pleased to take questions at this time. For further information regarding this statement, please contact Brenda Farrell at 202-512-3604 or [email protected] or George Stalcup at 202-512- 9490 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this testimony. Individuals making contributions to this testimony include Laura Durland, Assistant Director; Belva Martin, Assistant Director; James Ashley; Karin Fangman; K. Nicole Harms; Kenya Jones; Mae Jones; Ronald La Due Lake; Joseph Rutecki; Tamara F. Stenzel; and Kiki Theodoropoulos. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. | Since September 11, 2001, the Department of Defense (DOD) has mobilized more than 500,000 National Guard and Reserve members. As reservists return to civilian life, concerns exist about difficulties with their civilian employment. The Uniformed Services Employment and Reemployment Rights Act (USERRA) of 1994 protects the employment rights of individuals, largely National Guard and Reserve members, as they transition back to their civilian employment. GAO has issued a number of reports on agency efforts to carry out their USERRA responsibilities. DOD, the Department of Labor (DOL), the Department of Justice (DOJ), and the Office of Special Counsel (OSC) have key responsibilities under the act. 
GAO was asked to discuss the overall process that the agencies use to implement USERRA. Specifically, this testimony addresses (1) organizational accountability in the implementation of USERRA and (2) actions that the agencies have taken to improve their processes to implement USERRA. For this testimony, GAO drew from its most recent reports on USERRA. USERRA defines individual agency roles and responsibilities; however, it does not designate any single individual or office as accountable for maintaining visibility over the entire complaint resolution process. From the time informal complaints are filed with DOD's Employer Support of the Guard and Reserve through final resolution of formal complaints at DOL, DOJ, or OSC, no one entity has visibility over the entire process. The four agencies have generally been responsive to GAO's recommendations to improve the implementation of USERRA--on outreach to employers, data sharing and trend information, reporting to Congress, and the internal review of DOL's investigators' determinations of USERRA claims.
The Nuclear Waste Policy Act of 1982 directed DOE to identify and recommend to the President three sites for detailed investigation as a potential permanent repository for nuclear waste. In May 1986, the President selected three candidate sites, including Yucca Mountain, Nevada. However, faced with escalating costs and public resistance to the disposal program, in December 1987 the Congress amended the act by, among other actions, directing DOE to investigate only the Yucca Mountain site. Before the Congress enacted the 1987 amendments, DOE's Office of Civilian Radioactive Waste Management (OCRWM) had decided that a successful disposal program could best be ensured if DOE had a long-term partnership with a management contractor. DOE expected that the proposed management contractor would develop waste storage and transportation capabilities and manage the investigation of candidate repository sites. DOE also expected the number of contractors on the program to decline by transferring the work of some existing contractors to the management contractor. In December 1988, DOE selected a team of contractors--headed by Bechtel Systems Management, Inc. and including SAIC--as the disposal program's management contractor. However, in a bid protest, TRW Environmental Safety Systems, Inc. (TRW) asserted that the chairman of DOE's Source Evaluation Board for the contract, a former SAIC employee, had a serious conflict of interest. In an August 24, 1989, decision on the bid protest, the court agreed, stating that DOE could award the contract to TRW or cancel the procurement action. (See app. I and II.) In February 1991, DOE awarded TRW a 10-year management contract for an estimated $1 billion to perform systems engineering, development, and management of a system to transport and permanently dispose of highly radioactive waste.
Even though there were strong indications that relationships between DOE employees and contractor employees might result in ethical problems, OCRWM officials failed to diligently monitor such relationships. The two most senior DOE officials in OCRWM's Yucca Mountain Project at the time--the Project Manager (1987-Oct. 1993) and the Deputy Project Manager (Oct. 1990-Jan. 1994)--had personal relationships with contractor employees that violated Executive Order 12674 and DOE regulations by creating at least the appearance of a loss of impartiality. For example, this Project Manager opposed the transition of work from SAIC to the management contractor, TRW, including the work performed by the SAIC official with whom he had a personal relationship. Additional relationships between DOE and contractor employees involved almost 18 percent of DOE's employees at the project. DOE's standards of conduct direct employees to ". . . avoid any action, whether or not specifically prohibited by the regulations, which might result in, or create the appearance of: (1) using public office for private gain; (2) giving preferential treatment to any person; (3) impeding government efficiency or economy; (4) losing complete independence or impartiality; (5) making a government decision outside official channels; or (6) affecting adversely the confidence of the public in the integrity of the government." DOE's Manager for the Yucca Mountain Project from 1987 to 1993 had a personal relationship with a female official of a major project contractor, SAIC. Our investigation and an April 1995 report by the DOE Office of Inspector General (OIG) concluded that because of this relationship, the Project Manager, as the Fee-Determining Official and the Contracting Officer's Technical Representative for the SAIC contract, had lost the appearance of impartiality in the performance of his official duties, contrary to regulations regarding the ethical conduct of employees.
Our investigation and the OIG report disclosed that the Project Manager and the SAIC official had traveled together frequently on official business (over 60 trips in fiscal years 1992 and 1993). Some of these trips involved little apparent business-related justification for the SAIC official, according to one of the Project Manager's supervisors. Despite denials of anything other than a professional relationship, the officials' public behavior repeatedly caused DOE, SAIC, and industry officials to raise concerns. According to the DOE Yucca Mountain Project Special Assistant for Institutional Affairs, the SAIC official functioned primarily as an administrative assistant to the Project Manager, rather than reporting to the Special Assistant as called for within the Yucca Mountain Project organizational structure. One of the Project Manager's supervisors told us she was astonished to find that an SAIC official, while on official trips with the Yucca Mountain Project Manager, would do trivial tasks while her staff went unsupervised. The Project Manager opposed having several SAIC functions--among them the institutional and external affairs functions headed by the SAIC official--transitioned to TRW, the management contractor. He communicated that opposition to individuals who either were in a position to influence or participated in the decision not to transition certain functions, including that for which the SAIC official was responsible. According to SAIC lawyers, if the work had transitioned to TRW as planned, any SAIC employees forced to leave the company would have lost substantial pension and stock/stock option benefits and may have incurred tax liabilities arising from the forced sale of their SAIC stock. The Yucca Mountain Project Manager's opposition to the transition of SAIC work to TRW put him in direct conflict with OCRWM's then Director (Apr. 1990-Jan. 1993) and then Deputy Director (Nov. 1988-Oct. 1993). 
According to this former OCRWM Director, the Project Manager took SAIC's side in its dispute with OCRWM management over transitioning SAIC work to TRW. The OCRWM Director also told us that he wanted the Project Manager to implement the management contract with TRW; and although the Project Manager never said no, he delayed repeatedly. The OCRWM Director stated that he did not recognize some of these problems until the end of his tenure as Director. Although OCRWM and Yucca Mountain Project officials had reason to be concerned about the relationship between the Yucca Mountain Project Manager and the SAIC official by 1991 or earlier, they took no formal action regarding the relationship until late 1993. In 1990 or 1991, an industry official expressed concern to the then OCRWM Deputy Director about the relationship between the Project Manager and the SAIC official. The Deputy Director took no action other than warning the Project Manager that he was traveling too much with the SAIC official. In 1990 or 1991, the DOE Director of Public Affairs for the Yucca Mountain Project Office cautioned the Project Manager about an appearance problem. Although the Director of Public Affairs stated that he had discussed this with OCRWM's then Deputy Director, no action was taken, such as reporting this to the DOE OIG. In April 1993, OCRWM's Deputy Director, based on his observations, cautioned the Project Manager. Further, although the then DOE Associate Director for Geologic Disposal, based in Las Vegas, Nevada, became aware of rumors about the relationship in June 1993, no investigation of the relationship was undertaken. During this time, the Project Manager disregarded the warnings he had received. In mid-September 1993, the Project Manager and the SAIC official engaged in a public altercation at the Phoenix, Arizona, airport. Shortly after that incident, the then Acting Director of OCRWM (Jan. 1993-Oct. 
1993) requested that the DOE OIG evaluate the relationship between the Project Manager and the SAIC official. On September 27, 1993, the Project Manager was removed from professional contact with the SAIC official and directed to meet with DOE counsel to discuss the relationship. Because the Project Manager told the counsel that he and the SAIC official were "only good friends," the counsel concluded a recusal was not necessary. The counsel did, however, suggest to the Project Manager that he contact a DOE ethics counselor at headquarters for advice and counsel, which he never did. In October 1993, DOE took further action, removing the Project Manager from his position and detailing him to another DOE site. He was subsequently reassigned to the DOE Nevada Operations Office at a reduced grade. The Deputy Project Manager from 1990 to 1994 had a personal relationship with a female SAIC employee, beginning in 1984 when the deputy was a Yucca Mountain Branch Chief. Even though this open relationship was public knowledge as early as 1986, no action was taken to ensure that the relationship did not violate federal standards of conduct until 1991. DOE acted again in 1993 and January 1994, shortly after a report of the relationship was aired nationally on the MacNeil/Lehrer NewsHour. During the Deputy Project Manager's relationship, the previously discussed Project Manager did not act on his deputy's potential ethical problem. However, the deputy did execute a recusal in 1991 to meet a condition of his associate's employment by a prospective employer. His associate was seeking a job with the project's management contractor, TRW; and TRW had requested assurances of the Deputy Project Manager's impartiality. Although a DOE general counsel told him that the recusal suggested by the then Acting OCRWM Director was unnecessary, the deputy recused himself.
His recusal removed him from decisions regarding the transition of work from SAIC to TRW; TRW's contract award fee evaluation; and any decisions regarding his associate's salary, bonuses, and benefits. A subsequent August 1993 recusal somewhat broadened these areas with regard to his associate's position with TRW. However, in early 1994, the newly appointed Project Manager raised concerns about the adequacy of the 1993 recusal with regard to the expanded duties that he envisioned for the deputy position. The project's newly appointed Chief Counsel/ethics officer determined that the recusal was not sufficient to ensure the deputy's impartiality in the new duties. Thus, in late January 1994, the new Yucca Mountain Project Manager placed the Deputy Project Manager in a senior advisory position for which DOE deemed the recusal was sufficient. The former deputy retired in late 1994. Days before the September 1993 public incident involving the Project Manager and the SAIC official, OCRWM began to enforce DOE's ethics regulations more actively. In doing so, it exposed a number of other relationships between DOE and contractor employees that posed potential ethical problems. In September 1993, the then Acting Director of OCRWM issued a memorandum entitled, "Ethics Requirements, Federal-Contractor Employee Relationships." All OCRWM employees were required to sign and date the memorandum, indicating that they were aware of their responsibilities. By mid-1994, an internal memorandum by the Yucca Mountain Project Chief Counsel listed 14 relationships between DOE employees and employees of several contractors that might have created the appearance of a lack of impartiality and independence. These were in addition to the previously discussed relationships of the Project Manager and Deputy Project Manager and represented almost 18 percent of the 80 DOE Yucca Mountain Project employees.
Upon examination, the Chief Counsel determined that four of these relationships required a recusal or waiver. The others were told that if they had any changes in positions or responsibilities, their cases would require a reexamination. The former Yucca Mountain Project Manager took other questionable actions while in that position. Specifically, he precipitated SAIC's hiring of a project subcontractor, Integrated Resources Group (IRG), primarily because of IRG's political connections that could provide him an opportunity to promote his positions, which were contrary to those of DOE. With those connections, the Project Manager went outside official channels to lobby the Congress for his concept of how the project should be run and funded. Further, the Project Manager's lobbying activities included his improper attendance at a meeting with congressional and contractor officials to discuss the project's future. The Project Manager disagreed with the information that OCRWM's Directors were conveying to the Congress and the Secretary of Energy about the Yucca Mountain Project. He was concerned that the Secretary of Energy did not consider the waste program a major priority and that OCRWM's then Acting Director (Nov. 1988-Mar. 1990) was not effective in communicating the progress being made on the project. The Project Manager also believed that opponents of the project were very effective in implying that the project was making little advancement. He encouraged project contractors to convey to the Congress and the Secretary of Energy the improvements that were being made on the project. Further, the Project Manager opposed the project's management contract with TRW. Under the contract, SAIC, with whose official the Project Manager had a personal relationship, would have relinquished much of its work. According to OCRWM's subsequent Director (Apr. 1990-Jan. 
1993), the Yucca Mountain Project Manager did not think that the OCRWM directorate knew what was best for the project. The Project Manager, according to this OCRWM Director, wanted to run the program, independent of Washington. The Project Manager's desire to be the OCRWM director became a point of contention between the Project Manager and his then immediate supervisor, the OCRWM Deputy Director (Nov. 1988-Oct. 1993). According to this Deputy Director, he told the Project Manager several times to stop "seeking the OCRWM directorship." The then OCRWM Director (Apr. 1990-Jan. 1993) said that the Project Manager would come to Washington just to lobby the Congress for himself and other things of interest to him. In early 1990, the Yucca Mountain Project Manager saw an opportunity to provide the Congress his perspective on the Yucca Mountain Project when he was approached by the president of IRG, a management consulting company, about doing technical work in the project. IRG's president promoted his political connections, and the Project Manager said that the IRG's involvement would be in the best interest of the project. After the Project Manager determined that the IRG president did have political connections, he referred the individual to SAIC officials and encouraged them to hire IRG as a subcontractor. SAIC's initial contract award to IRG--to evaluate project training requirements relative to the Nuclear Regulatory Commission's licensing process--was made in March 1990 for $15,000. The SAIC Assistant Vice President responsible for licensing support activities, including work that was to be subcontracted to IRG, told us he doubted that SAIC would have contracted with IRG had it not been for the political contacts of IRG's president and the Project Manager's desire to have IRG in the project. 
He said that when SAIC considered IRG for a subcontract, it looked at IRG's corporate capabilities, i.e., IRG had considerable expertise in nuclear facility licensing support and regulatory commitment tracking systems. He added, however, that the Project Manager's expressed desire was the motivation behind SAIC's consideration of IRG and, except for that expressed desire, SAIC probably would not have subcontracted the work. Another SAIC official recalled clear direction from the Project Manager to SAIC that, if it was procedurally and legally possible, he wanted IRG in the project. Further, once IRG was under contract to SAIC, as IRG's president told us, he became a direct congressional contact for the Project Manager. IRG's president also told us that he believed his efforts, and those of SAIC's hired lobbyists, were instrumental in bringing about a high-level DOE review of the management contract's transition plan. As we reported in December 1994, DOE deferred transferring some SAIC work addressed in the plan until after a June 1993 performance assessment of SAIC. Once the assessment was performed, none of the assessed work was transferred from SAIC to TRW. SAIC awarded a second subcontract in July 1990 to IRG for over $224,000 after receiving consent from a DOE Contracting Officer pursuant to F.A.R. part 44. That part prescribes policies and procedures for consent to subcontract. "Consent to subcontract" is defined at 44.101 as the Contracting Officer's written consent for the prime contractor to enter into a particular subcontract. In a May 30, 1990, letter, SAIC originally requested DOE's consent to add a $185,000 amendment to IRG's March 1990 subcontract for $15,000. According to a Yucca Mountain Project Contracting Officer in 1994, such a request was "irregular"; the officer stated that any modification over 20 percent of a contract's value is of "concern" under the Competition in Contracting Act. DOE apparently never acted on SAIC's request.
In early July 1990, SAIC requested bids from the two predetermined firms that had bid on the March 1990 contract--IRG and a larger business in which SAIC held a 49-percent interest and whose unsalaried Chief Financial Officer at the time was an SAIC official in contracting. On July 12, 1990, SAIC requested by letter that DOE approve its decision to award the second time-and-materials subcontract to IRG as the low bidder for $224,450. In that letter, SAIC advised the Contracting Officer that only two firms had been solicited, largely to perform regulatory compliance strategy reviews and to develop/present related training at the project but also to recommend methods for successful interaction with various entities, including the Congress. On July 13, 1990, the DOE Contracting Officer approved the subcontract award. In determining whether to consent to a subcontract award on a time-and-materials basis, the Contracting Officer must exercise particularly careful and thorough consideration of several factors, including whether the contractor has a sound basis for selecting and determining the responsibility of the proposed subcontractor. (F.A.R. 44.202(a)(7)) Further, the "Competition in Subcontracting" clause at F.A.R. 52.244-5, which provides that contractors must select subcontractors on a competitive basis to the maximum extent practical and consider the objectives and requirements of each contract, was in SAIC's contract. Although the second subcontract called for different services and the resulting amount of the award was significantly higher than that of the first subcontract, the Contracting Officer apparently did not object to SAIC's method of competition. However, according to the project's Chief Counsel, it was highly unusual for SAIC to have only two companies bid for the work that was subcontracted to IRG. The work was not very specialized, and a large pool of companies could have been considered.
To have solicited only two bids, she said, defeats the purpose of competition to get the best price for the government. In April 1992, the DOE Yucca Mountain Project Manager engaged in lobbying activities outside proper official channels by attending a meeting that included congressional officials and representatives from SAIC and IRG to discuss the project's future. The meeting--for which IRG's president told us he was the catalyst--breached DOE policy on congressional contacts by senior DOE officials because the Project Manager did not obtain prior Secretarial approval to attend the meeting and because the meeting was not carried out in accordance with the existing policy. Participants stated that discussions at the meeting included (1) future funding for the Yucca Mountain Project and (2) how the Congress could alter the way the project was funded. The evidence shows that the Project Manager argued that the project was substantially underfunded, needing additional funding to meet its scheduled completion date, and discussed how best to use that and other funding. According to the IRG president, he believed that he too was helpful in explaining how additional funding would be used at the project. The Project Manager also discussed removing the project from the annual budget appropriations process and going to an off-budget funding that would give DOE direct access to the Nuclear Waste Fund, financed by the owners and generators of nuclear waste. This latter proposal would have required legislation to accomplish. The then Secretary of Energy told us that this meeting was a breach of DOE policy for interacting with Members of Congress and was unethical on the Project Manager's part. The meeting was neither coordinated with DOE officials beforehand nor carried out according to the existing policy. 
When the Secretary learned after the fact that SAIC representatives had been present at the meeting, he was concerned because of the previously discussed corporate struggle over project work that was taking place between SAIC and the OCRWM management contractor, TRW. According to the former Secretary, the Project Manager acknowledged that he should have left the meeting when he saw who was there. The current Director, OCRWM; Deputy Director, OCRWM; and other DOE officials provided us their comments on a draft of this report. They were in general agreement with the contents of the draft but expressed concern that, with the draft's identification of DOE officials by title alone, readers may incorrectly attribute the actions discussed to previous or subsequent officeholders. To address that overall concern, we have included in the report's text the dates during which the respective individuals held office. (See also app. II.) In addition, where appropriate, we have clarified sections for which the officials provided additional details. We conducted this inquiry between May 1994 and April 1996 at several locations including the DOE/Office of Civilian Radioactive Waste Management, Washington, D.C.; DOE/Yucca Mountain Project Office and Nevada Operations Office, Las Vegas, Nevada; SAIC Corporate Headquarters, La Jolla, California, and SAIC, Las Vegas, Nevada; and IRG, Metairie, Louisiana, and Las Vegas, Nevada. We interviewed current and former DOE officials and staff and current SAIC and IRG officials. We reviewed DOE, SAIC, and IRG contract files, including solicitations for bids, evaluations of proposals, contractual scopes of work, and contract awards; IRG time and expense reports, and SAIC management and support services charges to DOE; documentary materials regarding the award and implementation of the OCRWM management and operating contract; and federal law and regulation regarding conflicts of interest and lobbying activities.
In the course of our investigation, we coordinated with the DOE OIG. We will provide the OIG a copy of this report. As arranged with your office, unless you announce its contents earlier, we plan no further distribution of this report until 30 days after the date of the letter. At that time, we will send copies of the report to interested congressional committees and the Secretary of Energy. We will also make copies available to others on request. If you have further questions or concerns, please contact me at (202) 512-6722. Major contributors are listed in appendix III. An ethical problem surfaced in 1987 at the highest levels of OCRWM management: A conflict of interest by OCRWM's chairman of the Source Evaluation Board for a Yucca Mountain management contract severely undermined OCRWM's effort to award the contract in a timely manner. The board chairman, after returning to DOE from private industry, did not, as initially instructed by DOE, recuse himself from participation as a supervisory employee in certain DOE actions involving SAIC. This resulted in a bid protest and subsequent set-aside of the contract award. The board chairman also served as OCRWM's Acting Director from November 1988 to March 1990. The chairman of the Source Evaluation Board for the Yucca Mountain contract, a longtime DOE employee, left the agency in about 1983 to work in private industry and returned to DOE on June 2, 1986. One employer while he was in the private sector was SAIC. Immediately prior to his return to DOE and while still in SAIC's employ, DOE's Office of General Counsel advised him by letter that for 1 year after returning to DOE he could not participate as a supervisory employee in any DOE action in which SAIC was substantially, directly, or materially involved. However, DOE's Office of General Counsel subsequently prepared an interoffice memorandum which concluded that its earlier advice was in error. 
The individual had become chairman of the Source Evaluation Board for the OCRWM management contract on May 1, 1987, which was about 1 month before the restriction was to expire. In December 1988, DOE selected Bechtel Systems Management, Inc., which had teamed with SAIC and other companies, as the management contractor. Shortly thereafter, TRW, an unsuccessful bidder, filed a bid protest and motion to enjoin DOE from awarding the contract to Bechtel. These were based, in part, on allegations that the chairman of the Source Evaluation Board had violated the DOE Reorganization Act's conflict-of-interest provision at 42 U.S.C. 7216 by participating in a procurement that involved a previous employer within 1 year of joining DOE. That provision prohibits a supervisory employee for 1 year from participating in any DOE proceeding in which his former employer is substantially, directly, or materially involved. In August 1989, the Claims Court held that the board chairman/Acting Director had violated 42 U.S.C. 7216 by participating in the procurement involving SAIC. In its decision, the court rejected DOE's pre-hearing attempt to reverse its first instruction. It said, "[O]ne might reasonably have expected that [the chairman], out of an abundance of caution, would have recused himself in any matter in which SAIC was involved during the restricted period. Unfortunately, such did not occur. . . ." (TRW Envtl. Safety Sys., Inc. v. United States, 18 Cl. Ct. 33, 63 (1989)). TRW, therefore, was granted its motion for a permanent injunction. The court ruled that DOE could not award the contract to any original bidder other than TRW. DOE awarded the management contract to TRW in February 1991. Barbara C. Coles, Senior Attorney

Pursuant to a congressional request, GAO investigated allegations of conflicts of interest at the Department of Energy's (DOE) Yucca Mountain Project, focusing on whether: (1) the DOE Office of Civilian Radioactive Waste Management (OCRWM) properly implemented and adequately enforced federal standards of ethical conduct and DOE ethics regulations; and (2) failure to implement DOE ethics standards may have contributed to contract award and management abuses.
GAO found that: (1) the Principles of Ethical Conduct for federal employees contained in Executive Order 12674 and DOE's regulations for ethical conduct by its employees prohibit, among other things, any action that might result in or create the appearance of the loss of impartiality or independence; (2) however, GAO's investigation and DOE's own reviews revealed the appearance of the loss of impartiality by DOE officials at the Yucca Mountain Project; (3) for example, both the Manager of DOE's Yucca Mountain Project from 1987 to October 1993 and the Deputy Manager from October 1990 to January 1994 had long-term personal relationships with personnel of major project contractors, including the Science Applications International Corporation (SAIC); (4) moreover, by 1994, DOE had learned that 14 additional, or almost 18 percent of, DOE employees at the project were engaged in relationships that might have created problems concerning the lack of impartiality and independence; (5) DOE determined that four of these relationships represented potential ethical problems, requiring recusal or waiver; (6) although senior OCRWM officials in Washington, D.C., knew by 1991 that potential ethical problems existed at the Yucca Mountain Project, they did not act to resolve the situation until late 1993; (7) further, GAO's investigation disclosed that this Yucca Mountain Project Manager had engaged in other questionable actions; (8) evidence shows that he encouraged SAIC to hire a certain subcontractor largely because of the subcontractor's stated political connections that could be used to promote the Project Manager's, as well as SAIC's, priorities for the project rather than DOE's priorities; (9) SAIC awarded a small subcontract to the firm after soliciting bids from it and a second firm in which SAIC held a major interest; (10) within a few months, and after soliciting bids from the same two firms, SAIC received DOE's consent to award a second contract, much larger in cost and 
different in scope, to the same subcontractor; (11) the Project Manager also violated DOE policy by improperly participating in a meeting with congressional and contractor officials, where he lobbied for his own positions concerning the project without, as required, first notifying his superiors. | 5,814 | 570 |
The term "e-cigarettes" refers to a wide range of products that share the same basic design and generally consist of three main parts: a power source (typically a battery), a heating element containing a wick (to deliver liquid to the heating element), and a cartridge or tank containing liquid solution. Cartridges and liquid are often sold separately from e- cigarette devices containing the battery and heating element. Liquid typically contains nicotine, a solvent (e.g., propylene glycol, glycerin, or both), and flavorings. E-cigarettes heat liquids to deliver aerosol that usually contains nicotine and other chemical substances to the user by inhalation. E-cigarettes come in two main forms: Closed systems that include disposable e-cigarettes or require users to buy e-cigarette components, including the cartridge with liquid, from the same manufacturer or brand. Open systems that enable users to purchase the heating element, battery, tank, and liquid separately and from different manufacturers or brands. Industry experts we interviewed estimated that the size of the U.S. e- cigarette market in 2014 was about $2.5 billion. Although there are no definitive data on the relative proportions of imported and domestically manufactured e-cigarettes, industry experts we interviewed told us that the majority of e-cigarettes sold in the United States are imported from China. The U.S. e-cigarette market has developed rapidly in the last decade. U.S. Customs and Border Protection issued a customs ruling for the classification of e-cigarette imports to the United States as early as 2006. USPTO issued a registration for a trademark applied to e-cigarettes as early as May 2008 and had recorded more than 1,600 U.S. trademark registrations for e-cigarette devices, parts, liquid, and services as of March 2015. Hundreds of e-cigarette companies participate in the U.S. e- cigarette market. Large tobacco companies began entering the U.S. 
e-cigarette market in 2012 and now manufacture some of the leading closed system e-cigarette brands, according to industry experts we interviewed. Some industry experts we spoke with predict that the U.S. e-cigarette market will continue to grow, although factors such as the extent of federal and state regulation create uncertainty about the rate of growth. E-cigarettes are sold in multiple types of outlets, including traditional retail stores, such as convenience stores and grocery stores, as well as at "vape stores" and over the Internet. According to industry experts, closed system e-cigarette products are mainly sold in traditional retail outlets, while open system e-cigarette products are often sold online and at vape stores. Private companies collect point-of-sale data on the quantities and prices of e-cigarettes sold at traditional retail stores, according to documentation from these companies; however, these data do not cover online sales or "vape store" sales. Financial analysts from one firm estimate that 40 to 60 percent of e-cigarettes are sold online or at vape stores. In 2014, CDC reported a statistically significant increase in the percentage of U.S. adults who had used e-cigarettes in the preceding 30 days, from 1 percent in 2010 to 2.6 percent in 2013. Past-month e-cigarette use was especially prominent among current adult cigarette smokers and grew in this population, at a statistically significant level, from 4.9 percent in 2010 and 2011 to 9.4 percent in 2012 and 2013. Past-month e-cigarette use by former adult cigarette smokers also rose, from 1 percent to 1.3 percent during the same period, although the increase was not statistically significant. The National Youth Tobacco Survey by CDC and FDA showed a statistically significant increase in high school students' past-month e-cigarette use, from 1.5 percent in 2011 to 13.4 percent in 2014.
In addition, the survey found that in 2014, high school students' past-month e-cigarette use surpassed their use of cigarettes and other tobacco products at a statistically significant level (see fig. 2). The survey further found a statistically significant increase in past-month e-cigarette use among middle school students. In April 2014, FDA issued a proposed rule to deem e-cigarettes and other products meeting the Tobacco Control Act's definition of "tobacco product" to be subject to the agency's regulation. FDA received more than 135,000 comments about the proposed deeming rule during the public comment period, which ended in August 2014. FDA announced its intent to issue the final rule in June 2015 in the spring 2015 semiannual regulatory agenda. The final rule had not been issued as of August 2015. The Tobacco Control Act aimed to, among other things, promote cessation to decrease health risks and social costs associated with tobacco-related diseases. According to the act, FDA can, by regulation, require restrictions on the sale, distribution, advertising, and promotion of a tobacco product if the agency determines that the proposed regulation is appropriate for the protection of public health, based on a consideration of the risks and benefits to the population as a whole, including users and nonusers of tobacco products. In the act, Congress recognized that virtually all new users of tobacco products are under the age of 18. In the proposed deeming rule, FDA stated that it was researching the effect of e-cigarette use on public health. FDA noted that e-cigarettes could have a positive net impact if using them resulted in minimal initiation by children and adolescents and in significant numbers of smokers' quitting. 
FDA also noted that e-cigarette use could have a negative net impact if it resulted in significant initiation by young people, minimal quitting, or significant dual use of combustible products, such as cigarettes, and noncombustible products, such as e-cigarettes. The IRC, which defines tobacco products subject to FET and sets rates of tax, does not specifically define or list a tax rate for e-cigarettes. However, two states--Minnesota and North Carolina--have imposed an excise tax on e-cigarettes or vapor products containing nicotine. The Minnesota Department of Revenue issued a notice in 2012 stating its position that e-cigarettes are subject to the tobacco products tax; the current tax rate is 95 percent of the wholesale price of the nicotine-containing liquid or, if the liquid cannot be sold separately, of the complete e-cigarette. North Carolina has taxed vapor products at 5 cents per milliliter of nicotine-containing liquid or other material since June 2015. In addition, at least 18 states and the District of Columbia have proposed legislation to tax e-cigarettes, vapor products, nicotine vapor products, or e-cigarette cartridges since 2013. For example, a bill in Maine proposed to include e-cigarettes in its definition of cigarettes and to apply the same tax rate to cigarettes and e-cigarettes, and a bill in Montana proposed a tax on vapor products, such as e-cigarettes, that would be partially based on the weight in milligrams of the nicotine present in the product. As of January 2015, three countries--Italy, Portugal, and South Korea--imposed national-level taxes on e-cigarettes that contain nicotine, and each of these countries applies its tax to nicotine-containing e-cigarette liquid, according to an industry expert. In addition, according to research by the Law Library of Congress, Serbia recently enacted legislation to introduce an excise tax on e-cigarette liquid, which went into effect in August 2015.
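The two state tax bases described above (an ad valorem tax on wholesale price versus a per-milliliter volume tax) can be compared on a hypothetical product. In the sketch below, the rates come from the report, but the wholesale price and liquid volume are invented for illustration:

```python
# Illustrative comparison of the two state e-cigarette tax bases.
# Rates are as described in the report; the product figures are assumptions.
MN_RATE = 0.95          # Minnesota: 95 percent of the wholesale price
NC_RATE_PER_ML = 0.05   # North Carolina: 5 cents per milliliter of liquid

wholesale_price = 8.00  # assumed wholesale price of a nicotine cartridge, in dollars
liquid_ml = 2.0         # assumed volume of nicotine-containing liquid, in mL

mn_tax = MN_RATE * wholesale_price   # ad valorem base: scales with price
nc_tax = NC_RATE_PER_ML * liquid_ml  # volume base: scales with liquid quantity
print(f"MN: ${mn_tax:.2f}  NC: ${nc_tax:.2f}")  # prints: MN: $7.60  NC: $0.10
```

The contrast shows why the choice of base matters: the ad valorem tax rises with product price, while the volume tax depends only on how much liquid the product contains.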
Our analysis of Treasury data on cigarette FET revenue found no current evidence that e-cigarette use has affected the historical decreasing trend in FET collections over the past 6 years. We used a time series regression to determine the change in cigarette FET revenue from April 2009, when the last increase in FET on cigarettes and other tobacco products became effective, through December 2014. Variables in the model control for (1) historical decreases in cigarette FET revenue over the last 6 years; (2) quantities of cigars, pipe tobacco, and roll-your-own tobacco removed from domestic factories or released from customs custody for distribution in the United States; and (3) monthly seasonality effect. Our model tests for the inclusion of e-cigarettes at different points in time and tests for any significant changes from the historical trend. We found no significant evidence that e-cigarettes have decreased the collection of FET revenue from cigarettes at a rate greater than the 6-year historical trend. Specifically, we found that, when other variables in the model are held constant, the 6-year historical trend of cigarette FET revenue decreased at a rate between $4.4 million and $5.5 million per month (see fig. 3). However, we found no significant evidence of a decrease in FET revenue from cigarettes at a rate greater than the 6-year historical trend during the time frame when e-cigarettes have been on the U.S. market. We estimate that cigarette FET revenue would need to decrease by an additional $2 million to $3 million per month to signal a significant effect from e-cigarettes. Several factors may explain why our analysis did not detect an effect of e-cigarette use on cigarette FET revenue. First, the e-cigarette market--estimated at $2.5 billion in sales in 2014--is relatively small compared with the cigarette market, which had $80 billion in sales in the same year.
As a result, without a substantial increase in the e-cigarette market, any effect on the cigarette market would be too minor to significantly affect cigarette FET revenue. Second, comprehensive and reliable data on e- cigarette sales and prices--which would enable us to corroborate the size of the e-cigarette market and accurately identify when it became significant--are not available. Third, comprehensive and reliable data about the extent to which e-cigarettes are used as substitutes for cigarettes are also not available. Without such data and information, estimating the effect of e-cigarette use on cigarette FET revenue will be difficult, even if the e-cigarette market continues to grow. How consumers' use of e-cigarettes relates to their use of cigarettes-- whether e-cigarettes are substitutes, complements, or unrelated--may determine any effect of e-cigarette use on cigarette FET revenue. The relationship between the use of e-cigarettes and cigarettes is currently unknown, according to public health officials. Table 1 describes these three possible relationships and summarizes their potential revenue effects. The most recent data from the National Youth Tobacco Survey by CDC and FDA showing high school students' increasing use of e-cigarettes and decreasing use of cigarettes (see fig. 2), suggest that cigarette FET revenue could decline further if these trends continue. If the percentage of high school students using cigarettes continues to decline, and if other factors such as current levels of regulation remain constant, the number of cigarette smokers could dwindle further in the coming years as the current cohort of high school students ages. A continued decline in cigarette smoking among high school students--which could be due, in part, to increased use of e-cigarettes--would reduce cigarette FET revenue at a greater rate than the average historical trend. 
FDA and CDC are undertaking efforts that, over time, may enable them to better understand e-cigarettes' relationship to cigarettes and other tobacco products, according to agency officials. For example, FDA and CDC are refining survey instruments that they use to measure adults' and youths' use of e-cigarettes, cigarettes, and other tobacco products, such as the National Health Interview Survey and the National Youth Tobacco Survey. In addition, FDA, in collaboration with the National Institutes of Health, is funding a longitudinal cohort study, the Population Assessment of Tobacco and Health, which asks detailed questions about adults' and youths' use of e-cigarettes, cigarettes, and other tobacco products. FDA officials said that they expect to receive the data from the first year of the study in the summer of 2015. Further, according to FDA and CDC officials, other national surveys, state-level surveys, results of National Institutes of Health and other studies currently under way, and, if available, e-cigarette quantity data could help researchers analyze trends and observe statistical relationships. Treasury and FDA do not collect data on quantities of e-cigarettes on the U.S. market, and we did not identify any other federal agencies that do so. However, Treasury collects data on quantities of domestically manufactured tobacco products that are subject to FET to ensure that the proper FET amount is paid. FDA collects data on quantities of tobacco products that it regulates under its tobacco product authorities to calculate user fees that fund FDA's tobacco regulation activities. Treasury and FDA collect data on quantities for different sets of tobacco products because their authorities to regulate tobacco products stem from different statutes: Treasury's authorities stem from the IRC. The IRC defines "tobacco products" as cigarettes, roll-your-own tobacco, smokeless tobacco, cigars, and pipe tobacco and sets FET rates for these products. 
The IRC defines each of these products as containing or consisting of tobacco. FDA's tobacco product authorities stem from the Federal Food, Drug, and Cosmetic Act as amended by the Tobacco Control Act. The Tobacco Control Act defines "tobacco product," in part, as any product made or derived from tobacco. The act granted FDA immediate authority over cigarettes, cigarette tobacco, roll-your-own tobacco, and smokeless tobacco. The act also gave FDA authority to deem by regulation any other product meeting the Tobacco Control Act's definition of tobacco product to be subject to FDA's tobacco product authorities. Under this authority, in April 2014 FDA proposed to deem additional products, including e-cigarettes, to be subject to its tobacco product regulation. Treasury collects data on quantities of cigarettes and other federally taxed tobacco products from domestic manufacturers of these products, but does not collect such data for e-cigarettes, because the IRC does not define or list a tax rate for e-cigarettes. According to Treasury officials, on the basis of definitions of the tobacco products enumerated in the IRC, Treasury's ability to tax e-cigarettes--and collect data for them--depends on whether e-cigarettes contain tobacco. Treasury officials said that for e- cigarettes that do not contain tobacco, Treasury could not assert federal taxation and any related data collection by regulation; instead, such authority would require an act of Congress. As of August 2015, Treasury had not collected any FET or data associated with e-cigarettes, according to Treasury officials. FDA does not collect data on quantities of e-cigarettes sold on the U.S. market. 
FDA's preliminary economic impact analysis accompanying the proposed deeming rule states that when the deemed products become subject to FDA's tobacco product authorities, the agency can begin collecting data to determine the number of regulated entities and to monitor the number and type of unique products sold to the public. At present, FDA collects data on quantities of four tobacco products (cigarettes, cigarette tobacco, roll-your-own tobacco, and smokeless tobacco) that it regulates under its tobacco product authorities to apply the legally mandated method for allocating user fees among the domestic manufacturers and importers of those products. In July 2014, FDA stated that if additional products are deemed subject to its tobacco regulation, the agency would conduct a new rulemaking to make appropriate changes to the user fee regulation. FDA also stated that it recognized that the issue of whether it had authority to assess user fees on some deemed products was controversial and that it intended to solicit public comment to further explore issues related to user fee assessments on tobacco products that may be deemed subject to FDA's tobacco product authorities. According to FDA officials, if e-cigarettes become subject to user fees, FDA would likely need data on quantities of e- cigarettes sold on the U.S. market, comparable to data that the agency collects for the four products currently subject to user fees. Table 2 summarizes information about Treasury's and FDA's collection of data on quantities of cigarettes, other tobacco products, and e-cigarettes. The Department of Labor's Bureau of Labor Statistics (BLS) began collecting limited e-cigarette price information in September 2014 as part of its ongoing data collection for the Consumer Price Index. The Consumer Price Index provides monthly data on changes in the prices paid by urban consumers for a representative "basket" of goods and services. 
The index is divided into more than 200 categories representing the goods and services that an urban consumer might typically buy. BLS collects e-cigarette price information, under the category "tobacco products other than cigarettes," for disposable e-cigarettes, starter kits, liquid refills, and e-cigarette replacement cartridges. These items may or may not contain nicotine and may have any flavor. According to BLS officials, the number of observations on e-cigarette prices is too small to calculate a reliable national average price or reliable state-level prices. According to the officials, U.S. consumers' e-cigarette expenditures, while increasing, represent a small share of total expenditures in the representative basket of goods and services. Additionally, BLS officials explained that the Consumer Price Index sample for "tobacco products other than cigarettes" is refreshed over a 4- year cycle; the length of time it takes to fully replace samples causes Consumer Price Index sample shares (the percentage of the sample composed of the prices of a given product) to lag real-world percentages for items for which consumers' expenditures are changing rapidly. The Consumer Price Index sample included 10 e-cigarette price observations as of June 2015 and, according to the BLS officials, will increase to 14 e- cigarette price observations by October 2015. BLS would require more resources in order to collect substantially more data on e-cigarettes, according to BLS officials. Our analysis shows no current effect of the growing e-cigarette market on FET revenue from cigarettes. Given the limited information about the e- cigarette market, it is difficult to accurately estimate this market's size or analyze its potential effect on FET revenue from cigarettes and other tobacco products. The increased regulation of tobacco products at the federal and state level, among other things, has contributed to a decline in cigarette use and FET revenue. 
Recent CDC studies show that e- cigarette use has significantly increased among high school students, while cigarette use has significantly declined. As the regulation of e- cigarettes unfolds and the market develops, e-cigarette use patterns may change. Federal agencies' efforts to develop a better understanding of the relationship between e-cigarette and cigarette use will help analysts and government officials develop a more complete understanding of the e-cigarette market and its effect on cigarette FET revenue. We provided a draft of this report to DOL, HHS, and Treasury. We also provided relevant portions to U.S. Customs and Border Protection and USPTO. We received technical comments from DOL, HHS, and Treasury and incorporated the comments as appropriate. We are sending copies of this report to the appropriate congressional committees; the Secretaries of Health and Human Services, Labor, and the Treasury; and other interested parties. In addition, the report is available at no charge on the GAO website at http://www.gao.gov. If you or your staff have any questions about this report, please contact me at (202) 512-3149 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made key contributions to this report are listed in appendix III. This report examines the extent to which (1) use of electronic cigarettes (also known as e-cigarettes) affects federal excise tax (FET) revenue from cigarettes and (2) data on quantities and prices of e-cigarettes on the U.S. market are available from federal agencies. To address these objectives, we reviewed documents and interviewed officials from the Department of the Treasury's (Treasury) Alcohol and Tobacco Tax and Trade Bureau and Treasury's Office of Tax Analysis, the Food and Drug Administration (FDA), the Centers for Disease Control and Prevention (CDC), the U.S. Bureau of Labor Statistics (BLS), and the U.S. 
Patent and Trademark Office (USPTO) to obtain information and views about e-cigarette and tobacco sales and revenue trends and regulation. We determined the reliability of USPTO e-cigarette trademark registration data by interviewing cognizant USPTO officials. We also interviewed industry experts, including e-cigarette industry members, tobacco industry members, financial analysts, researchers, and representatives of public health organizations. We interviewed organizations and companies that represent a range of perspectives. We spoke with industry associations that represent small and midsized e- cigarette companies; we also spoke with representatives of leading companies that produce e-cigarettes, as measured by dollar share from available data, including an independent e-cigarette company and tobacco companies. The views expressed by these representatives are not generalizable and do not represent the views of the entire e-cigarette industry. We also attended an e-cigarette industry conference as well as three FDA public workshops featuring current research on e-cigarette product science and implications of e-cigarette use for individual health and population health. To determine whether e-cigarette use affects cigarette FET revenue, we examined cigarette FET revenue from April 2009 through December 2014. For this analysis, we used monthly data obtained from Treasury on FET revenue from cigarettes removed from domestic factories or released from customs custody for distribution in the United States. In addition, using these removals data and testimonial evidence, we constructed a multivariate model that estimates the effect of e-cigarette use on cigarette FET revenue. In particular, we regressed cigarette FET revenue on a number of variables, including other tobacco products, a trend, presence of e-cigarettes on the market, and seasonality. We assessed the reliability of the data by checking the data for inconsistency errors and for completeness. 
We determined that the cigarette removals data were sufficiently reliable for the purposes of this report. See appendix II for more explanation of our analysis. To examine the extent to which data on quantities and prices for e- cigarettes on the U.S. market are available from federal agencies, we interviewed cognizant officials from Treasury, FDA, CDC, BLS, and the Congressional Budget Office, as well as industry experts. To describe Treasury's collection of data on quantities of federally taxed tobacco products, we reviewed documents and interviewed officials from Treasury's Alcohol and Tobacco Tax and Trade Bureau. To describe FDA's collection of data on quantities of tobacco products regulated by the agency, we examined FDA's regulatory actions, including its April 2014 proposed rule to deem additional products, including e-cigarettes, to be subject to the agency's tobacco product authorities, and the July 2014 final user fee rule, and we interviewed cognizant FDA officials. To describe BLS's collection of data on e-cigarette prices, we reviewed documents and interviewed BLS officials. We conducted this performance audit from September 2014 to September 2015 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. We constructed a multivariate model to estimate the effect of electronic cigarette (e-cigarette) use on federal excise tax (FET) revenue from traditional cigarettes. 
The model uses monthly data obtained from the Department of the Treasury's (Treasury) Alcohol and Tobacco Tax and Trade Bureau and controls for a 6-year historical trend in cigarette FET revenue, from April 2009 through December 2014; the presence of other tobacco products (cigars, roll-your-own tobacco, and pipe tobacco); and seasonality effects. The model also includes a time variable that tests for the presence of e-cigarettes. We used five different dates during the period January 2012 to October 2013 for the time variable, and we estimated regressions for each date. In particular, the model uses the following equation: and imported cigarettes collected in period t; CCCCCC_rrrrrrtt=+ccCCCCccrrcctt+rr_ppCCpprrtt+ttrrrrtttttt+rrccCCCC_ccsspprrtt CCCCCC_rrrrrrtt= the amount of FET revenue, in nominal dollars, from domestic where = an intercept; ccCCCCccrrcctt = the sum of small and large cigar removals in number of sticks rr_ppCCpprrtt = the sum of roll-your-own and pipe tobacco removals in in period t; ttrrrrtttttt = a monthly trend that controls for the historical changes in pounds in period t; rrccCCCC_ccsspprrtt = a dummy variable that equals one for each month on or cigarette revenue at period t; after the date that indicates the presence of e-cigarettes in the market (because there is no clear indicator of this presence, we selected five different dates to indicate the beginning of this presence); monthly seasonality, with June as the reference month; and mmtttthss_CCttttCCccccttrrcctt = a set of eleven dummy variables controlling for utt = an error term assumed to be heteroskedastic and possibly autocorrelated. David Gootnick, (202) 512-3149 or [email protected]. In addition to the contact named above, Christine Broderick (Assistant Director), Christina Werth, Sada Aksartova, Pedro Almoguera, Grace Lui, and Srinidhi Vijaykumar made key contributions to this report. In addition, Tina Cheng and Reid Lowe provided technical assistance. 
| While use of traditional cigarettes in the United States continues to decline, use of e-cigarettes is increasing. Treasury collects FET on cigarettes and other tobacco products manufactured in the United States. The Internal Revenue Code of 1986, as amended, does not specifically define or list a tax rate for e-cigarettes. The decline in cigarette use has led to a decline in cigarette FET revenue, from $15.3 billion in fiscal year 2010 to $13.2 billion in fiscal year 2014. FDA currently regulates four tobacco products. In April 2014, FDA proposed to deem additional tobacco products, including e-cigarettes, subject to its tobacco product authorities. GAO was asked to examine issues related to the U.S. e-cigarette market. This report examines the extent to which (1) e-cigarette use affects cigarette FET revenue, and (2) data on quantities and prices of e-cigarettes on the U.S. market are available from federal agencies. GAO conducted a regression analysis to assess the effect of e-cigarette use on cigarette FET revenue from April 2009 through December 2014, using Treasury data on FET revenue from cigarettes. GAO also reviewed agency documents and interviewed agency officials and industry experts. GAO's analysis found no evidence that use of electronic cigarettes (e-cigarettes) has affected federal excise tax (FET) revenue from traditional cigarettes, which has been declining over time (see figure). Possible reasons for the lack of a detectable effect include the small size of the e-cigarette market (estimated at $2.5 billion in 2014) relative to the cigarette market ($80 billion in the same year); lack of comprehensive and reliable data on e-cigarette quantities and prices; and lack of comprehensive and reliable information about the extent to which e-cigarettes substitute for cigarettes. If users consume e-cigarettes instead of cigarettes, cigarette FET revenue would decline as fewer cigarettes are consumed. 
Data from a recent survey by the Centers for Disease Control and Prevention showing high school students' increasing use of e-cigarettes and decreasing use of cigarettes suggest that these students may substitute e-cigarettes for cigarettes to some extent. If the percentage of high school students using cigarettes continues to decline, cigarette FET revenue could also decrease at a greater rate than the average historic trend observed since April 2009, when FET on cigarettes and other tobacco products was last increased. Comprehensive data on e-cigarette quantities and prices are not available from federal agencies. The Department of the Treasury (Treasury) and Food and Drug Administration (FDA) do not collect data on e-cigarette quantities comparable to data that they collect for cigarettes and some other tobacco products. According to FDA officials, if e-cigarettes are deemed subject to FDA's tobacco product authorities as a result of a rule proposed in April 2014, the agency could start collecting some data on the types of e-cigarettes on the U.S. market but will not collect data on the quantities of e-cigarettes sold. The Bureau of Labor Statistics began collecting data on e-cigarette prices in September 2014 as part of its data collection for the Consumer Price Index, but these data are limited. GAO is not making recommendations in this report. | 5,808 | 694 |
Our analysis of FDIC data showed that while the profitability of most minority banks with assets greater than $100 million nearly equaled the profitability of all similarly sized banks (peers), the profitability of smaller minority banks and African-American banks of all sizes did not. Profitability is commonly measured by return on assets (ROA), or the ratio of profits to assets, and ROAs are typically compared across peer groups to assess performance. Many small minority banks (those with less than $100 million in assets) had ROAs that were substantially lower than those of their peer groups in 2005 as well as in 1995 and 2000. Moreover, African-American banks of all sizes had ROAs that were significantly below those of their peers in 2005 as well as in 1995 and 2000 (African-American banks of all sizes and other small minority banks account for about half of all minority banks). Our analysis of FDIC data identified some possible explanations for the relatively low profitability of some small minority banks and African-American banks, such as relatively higher reserves for potential loan losses and administrative expenses and competition from larger banks. Nevertheless, the majority of officials from banks across all minority groups were positive about their banks' financial outlook, and many saw their minority status as an advantage in serving their communities (for example, in providing services in the language predominantly used by the minority community). The bank regulators have adopted differing approaches to supporting minority banks, and, at the time of our review, no agency had assessed the effectiveness of its efforts through regular and comprehensive surveys of minority banks or outcome-oriented performance measures. FDIC--which supervises more than half of all minority banks--had the most comprehensive program to support minority banks and led an interagency group that coordinates such efforts.
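The ROA measure described above can be sketched as a simple ratio. The function and dollar figures below are hypothetical illustrations, not FDIC data:

```python
# Illustrative sketch of return on assets (ROA), the profitability
# measure discussed above. All figures are hypothetical, not FDIC data.

def return_on_assets(net_income: float, total_assets: float) -> float:
    """ROA is the ratio of profits (net income) to total assets."""
    return net_income / total_assets

# A hypothetical small bank earning $0.4 million on $100 million in assets,
# versus a peer group averaging $1.0 million on the same asset base.
small_bank_roa = return_on_assets(0.4e6, 100e6)   # 0.004, i.e., 0.4 percent
peer_group_roa = return_on_assets(1.0e6, 100e6)   # 0.010, i.e., 1.0 percent

# By historical banking industry standards, an ROA of 1 percent or more
# generally has been considered adequate.
print(f"small bank ROA: {small_bank_roa:.2%}, peer ROA: {peer_group_roa:.2%}")
```

Comparing each bank's ROA against the average for similarly sized institutions is what the peer-group analysis in this section does at scale.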
Among other things, FDIC has designated officials in the agency's headquarters and regional offices to be responsible for minority bank efforts, held periodic conferences for minority banks, and established formal policies for annual outreach to the banks it regulates to make them aware of available technical assistance. OTS also designated staff to be responsible for the agency's efforts to support minority banks, developed outreach procedures, and focused its efforts on providing technical assistance. OCC and the Federal Reserve, while not required to do so by Section 308 of FIRREA, undertook some efforts to support minority banks, such as holding occasional conferences for Native American banks, and were planning additional efforts. FDIC proactively sought to assess the effectiveness of its support efforts; for example, it surveyed minority banks. However, these surveys did not address key activities, such as the provision of technical assistance, and the agency had not established outcome-oriented performance measures for its support efforts. Furthermore, none of the other regulators comprehensively surveyed minority banks on the effectiveness of their support efforts or established outcome-oriented performance measures. Consequently, the regulators were not well positioned to assess the results of their support efforts or identify areas for improvement. Our survey of minority banks identified potential limitations in the regulators' support efforts that likely would be of significance to agency managers and warrant follow-up analysis. About one-third of survey respondents rated their regulators' efforts for minority banks as very good or good, while 26 percent rated the efforts as fair, 13 percent as poor or very poor, and 25 percent responded "do not know." FDIC-regulated banks were more positive about their agency's efforts than banks that other agencies regulated. 
However, only about half of the FDIC-regulated banks and about a quarter of the banks regulated by other agencies rated their agency's efforts as very good or good. Although regulators may emphasize the provision of technical assistance to minority banks, fewer than 30 percent of these institutions reported having used such agency services within the last 3 years. Therefore, the banks may have been missing opportunities to address problems that limited their operations or financial performance. As we found in our 1993 report, some minority bank officials also said that examiners did not always understand the challenges that the banks may face in providing services in their communities or operating environments. Although the bank officials said they did not expect special treatment in the examination process, they suggested that examiners needed to undergo more training to improve their understanding of minority banks and the customer base they serve. To allow the regulators to better understand the effectiveness of their support efforts, our October 2006 report recommended that the regulators review such efforts and, in so doing, consider employing the following methods: (1) regularly surveying the minority banks under their supervision on all efforts and regulatory areas affecting these institutions; or (2) establishing outcome-oriented performance measures to evaluate the extent to which their efforts are achieving their objectives. Subsequent to the report's issuance, the regulators have reported taking steps to better assess or enhance their minority bank support efforts. For example, all of the regulators have developed surveys or are in the process of consulting with minority banks to obtain feedback on their support efforts. I also note that some regulators plan to provide additional training to their examiners on minority bank issues. These initiatives are positive developments, but it is too soon to evaluate their effectiveness.
We encourage agency officials to ensure that they collect and analyze relevant data and take steps to enhance their minority bank support efforts as may be warranted. Many minority banks are located in urban areas and seek to serve distressed communities and populations that financial institutions traditionally have underserved. For example, after the Civil War, banks were established to provide financial services to African-Americans. More recently, Asian-American and Hispanic-American banks have been established to serve the rapidly growing Asian and Hispanic communities in the United States. In our review of regulators' lists of minority banks, we identified a total minority bank population of 195 for 2005 (see table 1). Table 2 shows that the distribution of minority banks by size is similar to the distribution of all banks by size. More than 40 percent of all minority banks had assets of less than $100 million. Each federally insured depository institution, including each minority bank, has a primary federal regulator. As shown in table 3, FDIC serves as the primary federal regulator for more than half of minority banks--109 of the 195 banks, or 56 percent--and the Federal Reserve regulates the fewest. The federal regulators primarily focus on ensuring the safety and soundness of banks and do so through on-site examinations and other means. Regulators may also close banks that are deemed insolvent and posing a risk to the Deposit Insurance Fund. FDIC is responsible for ensuring that the deposits in failed banks are protected up to established deposit insurance limits. While the regulators' primary focus is bank safety and soundness, laws and regulations can identify additional goals and objectives. Recognizing the importance of minority banks, Section 308 of FIRREA outlined five broad goals toward which FDIC and OTS, in consultation with Treasury, are to work to preserve and promote minority banks. 
These goals are: preserving the present number of minority banks; preserving their minority character in cases involving mergers or acquisitions of minority banks; providing technical assistance to prevent insolvency of institutions that are not currently insolvent; promoting and encouraging the creation of new minority banks; and providing for training, technical assistance, and education programs. Technical assistance is typically defined as one-to-one assistance that a regulator may provide to a bank in response to a request. For example, a regulator may advise a bank on compliance with a particular statute or regulation. Regulators also may provide technical assistance to banks that is related to deficiencies identified in safety and soundness examinations. In contrast, education programs typically are open to all banks regulated by a particular agency or all banks located within a regulator's regional office. For example, regulators may offer training for banks to review compliance with laws and regulations. As shown in figure 1, our 2006 report found that, according to FDIC data, most minority banks with assets exceeding $100 million had ROAs in 2005 that were close to those of their peer groups, while many smaller banks had ROAs that were significantly lower than those of their peers. Minority banks with more than $100 million in assets accounted for 58 percent of all minority banks, while those with less than $100 million accounted for 42 percent. Each size category of minority banks with more than $100 million in assets had a weighted average ROA that was slightly lower than that of its peers, but in each case their ROAs exceeded 1 percent. By historical banking industry standards, an ROA of 1 percent or more generally has been considered to indicate an adequate level of profitability. 
We found that the profitability of the larger minority banks--Hispanic-American, Asian-American, Native American, and women-owned--was close to, and in some cases exceeded, the profitability of their peers in 2005. In contrast, small minority banks (those with assets of less than $100 million) had an average ROA of 0.4 percent, and their peers had an average ROA of 1 percent. Our analysis of FDIC data for 1995 and 2000 also indicated some similar patterns, with minority banks with assets greater than $100 million showing levels of profitability that generally were close to those of their peers, or ROAs of about 1 percent, and minority banks with assets of less than $100 million showing greater differences with their peers. The profitability of African-American banks generally has been below that of their peers in all size categories (see fig. 2). For example, African-American banks with less than $100 million in assets--which constitute 61 percent of all African-American banks--had an average ROA of 0.16 percent, while their peers averaged 1.0 percent. Our analysis of FDIC data for 2000 and 1995 also found that African-American banks of all sizes had lower ROAs than their peers. Our analysis of 2005 FDIC data also suggests some possible reasons for the differences in profitability between some minority banks and their peers. For example, our analysis of 2005 FDIC data showed that African-American banks with assets of less than $300 million--which constitute 87 percent of all African-American banks--had significantly higher loan loss reserves as a percentage of their total assets than the average for their peers (see fig. 3). Although having higher loan loss reserves may be necessary for the safe and sound operation of any particular bank, they lower bank profits because loan loss reserves are counted as expenses. We also found some evidence that higher operating expenses might affect the profitability of some minority banks.
Operating expenses-- expenditures for items such as administrative expenses and salaries-- typically are compared to an institution's total earning assets, such as loans and investments, to indicate the proportion of earning assets that banks spend on operating expenses. As figure 4 indicates, many minority banks with less than $100 million in assets had higher operating expenses than their peers in 2005. Academic studies we reviewed generally reached similar conclusions. Officials from several minority banks we contacted also described aspects of their operating environment, business practices, and customer service that could result in higher operating costs. In particular, the officials cited the costs associated with providing banking services in low-income urban areas or in communities with high immigrant populations. Bank officials also told us that they focus on fostering strong customer relationships, sometimes providing financial literacy services. Consequently, as part of their mission these banks spend more time and resources on their customers per transaction than other banks. Other minority bank officials said that their customers made relatively small deposits and preferred to do business in person at bank branch locations rather than through potentially lower-cost alternatives, such as over the phone or the Internet. Minority bank officials also cited other factors that may have limited their profitability. In particular, in response to Community Reinvestment Act (CRA) incentives, the officials said that larger banks and other financial institutions were increasing competition for minority banks' traditional customer base. The officials said that larger banks could offer loans and other financial services at more competitive prices because they could raise funds at lower rates and take advantage of operational efficiencies. 
In addition, officials from some African-American and Hispanic banks cited attracting and retaining quality staff as a challenge to their profitability. Despite these challenges, officials from banks across minority groups were optimistic about the financial outlook for their institutions. When asked in our survey to rate their financial outlook compared to those of the past 3 to 5 years, 65 percent said it would be much or slightly better; 21 percent thought it would be about the same, and 11 percent thought it would be slightly or much worse, while 3 percent did not know. Officials from minority banks said that their institutions had advantages in serving minority communities. For example, officials from an Asian-American bank said that the staff's ability to communicate in the customers' primary language provided a competitive advantage. Our report found that FDIC--which supervises 109 of 195 minority banks--had developed the most extensive efforts to support minority banks among the banking regulators (see fig. 5). FDIC had also taken the lead in coordinating regulators' efforts in support of minority banks, including leading a group of all the banking regulators that meets semiannually to discuss individual agency initiatives, training and outreach events, and each agency's list of minority banks. OTS had developed a variety of support programs, including developing a minority bank policy statement and staffing support structure. OCC had also taken steps to support minority banks, such as developing a policy statement. OCC and the Federal Reserve had also hosted events for some minority banks. The following highlights some key support activities discussed in our October 2006 report. Policy Statements. FDIC, OTS, and OCC all have policy statements that outline the agencies' efforts for minority banks. 
They discuss how the regulators identify minority banks, participate in minority bank events, provide technical assistance, and work toward preserving the character of minority banks during the resolution process. OCC officials told us that they developed their policy statement in 2001 after an interagency meeting of the federal banking regulators on minority bank issues. Both FDIC and OTS issued policy statements in 2002. Staffing Structure. FDIC has a national coordinator in Washington, D.C. and coordinators in each regional office from its Division of Supervision and Consumer Protection to implement the agency's minority bank program. Among other responsibilities, the national coordinator regularly contacts minority bank trade associations about participation in events and other issues, coordinates with other agencies, and compiles quarterly reports for the FDIC chairman based on regional coordinators' reports on their minority bank activities. Similarly, OTS has a national coordinator in its headquarters and supervisory and community affairs staff in each region who maintain contact with the minority banks that OTS regulates. While OCC and the Federal Reserve did not have similar staffing structures, officials from these agencies had contacted minority banks among their responsibilities. Minority Bank Events and Training. FDIC has taken the lead role in sponsoring, hosting, and coordinating events in support of minority banks. For example, in August 2006 FDIC sponsored a national conference for minority banks in which representatives from OTS, OCC, and the Federal Reserve participated. FDIC also has sponsored the Minority Bankers Roundtable (MBR) series, which agency officials told us was designed to provide insight into the regulatory relationship between minority banks and FDIC and explore opportunities for partnerships between FDIC and these banks. In 2005, FDIC held six roundtables around the country for minority banks supervised by all of the regulators. 
To varying degrees, OTS, OCC, and the Federal Reserve also have held events to support minority banks, such as Native American Institutions. Technical Assistance. All of the federal banking regulators told us that they provided their minority banks with technical assistance if requested, but only FDIC and OTS have specific procedures for offering this assistance. More specifically, FDIC and OTS officials told us that they proactively seek to make minority banks aware of such assistance through established outreach procedures outside of their customary examination and supervision processes. FDIC also has a policy that requires its regional coordinators to ensure that examination case managers contact minority banks from 90 to 120 days after an examination to offer technical assistance in any problem areas that were identified during the examination. This policy is unique to minority banks. OCC and the Federal Reserve provide technical assistance to all of their banks, but had not established outreach procedures for all their minority banks outside of the customary examination and supervision processes. However, OCC officials told us that they were in the process of developing an outreach plan for all minority banks regulated by the agency. Federal Reserve officials told us that Federal Reserve districts conduct informal outreach to their minority banks and consult with other districts on minority bank issues as needed. Policies to Preserve the Minority Character of Troubled Banks. FDIC has developed policies for failing banks that are consistent with FIRREA's requirement that the agency work to preserve the minority character of minority banks in cases of mergers and acquisitions. For example, FDIC maintains a list of qualified minority banks or minority investors that may be asked to bid on the assets of troubled minority banks that are expected to fail. 
However, FDIC is required to accept the bids on failing banks that pose the lowest expected cost to the Deposit Insurance Fund. As a result, all bidders, including minority bidders, are subject to competition. OTS and OCC have developed written policies that describe how the agencies will work with FDIC to identify qualified minority banks or investors to acquire minority banks that are failing. While the Federal Reserve does not have a similar written policy, agency officials say that they also work with FDIC to identify qualified minority banks or investors. All four agencies also said that they try to assist troubled minority banks improve their financial condition before it deteriorates to the point that a resolution through FDIC becomes necessary. For example, agencies may provide technical assistance in such situations or try to identify other minority banks willing to acquire or merge with the troubled institutions. While FDIC was proactive in assessing its support efforts for minority banks, none of the regulators routinely and comprehensively surveyed their minority banks on all issues affecting the institutions, nor have the regulators established outcome-oriented performance measures. Evaluating the effectiveness of federal programs is vitally important to manage programs successfully and improve program results. To this end, in 1993 Congress enacted the Government Performance and Results Act, which instituted a governmentwide requirement that agencies report on their results in achieving their agency and program goals. As part of its assessment methods, FDIC conducted roundtables and surveyed minority banks on aspects of its minority bank efforts. For example, in 2005, FDIC requested feedback on its efforts from institutions that attended the agency's six MBRs (which approximately one-third of minority banks attended). 
The agency also sent a survey letter to all minority banks to seek their feedback on several proposals to better serve such institutions, but only 24 minority banks responded. The proposals included holding another national minority bank conference, instituting a partnership program with universities, and developing a minority bank museum exhibition. FDIC officials said that they used the information gathered from the MBRs and the survey to develop recommendations for improving programs and developing new initiatives. While FDIC had taken steps to assess the effectiveness of its minority bank support efforts, we identified some limitations in its approach. For example, in FDIC's surveys of minority banks, the agency did not solicit feedback on key aspects of its support efforts, such as the provision of technical assistance. Moreover, FDIC has not established outcome- oriented performance measures to gauge the effectiveness of its various support efforts. None of the other regulators had surveyed minority banks recently on support efforts or developed performance measures. By not taking such steps, we concluded that the regulators were not well positioned to assess their support efforts or identify areas for improvement. Further, the regulators could not take corrective action as necessary to provide better support efforts to minority banks. Minority bank officials we surveyed identified potential limitations in the regulators' efforts to support them and related regulatory issues, such as examiners' understanding of issues affecting minority banks, which would likely be of significance to agency managers and warrant follow-up analysis. Some 36 percent of survey respondents described their regulators' efforts as very good or good, 26 percent described them as fair, and 13 percent described the efforts as poor or very poor (see fig. 6). A relatively large percentage--25 percent--responded "do not know" to this question. 
Banks' responses varied by regulator, with 45 percent of banks regulated by FDIC giving very good or good responses, compared with about 25 percent of banks regulated by other agencies. However, more than half of FDIC-regulated banks and about three-quarters of the other minority banks responded that their regulator's efforts were fair, poor, or very poor or responded with a "do not know." In particular, banks regulated by OTS gave the highest percentage of poor or very poor marks, while banks regulated by the Federal Reserve most often provided fair marks. Nearly half of minority banks reported that they attended FDIC roundtables and conferences designed for minority banks, and about half of the 65 respondents that attended these events found them to be extremely or very useful (see fig. 7). Almost a third found them to be moderately useful, and 17 percent found them to be slightly or not at all useful. One participant commented that the information was useful, as was the opportunity to meet the regulators. Many banks also commented that the events provided a good opportunity to network and share ideas with other minority banks. While FDIC and OTS emphasized technical services as key components of their efforts to support minority banks, less than 30 percent of the institutions they regulate reported using such assistance within the last 3 years (see fig. 8). Minority banks regulated by OCC and the Federal Reserve reported similarly low usage of technical assistance services. However, of the few banks that used technical assistance--41--the majority rated the assistance provided as extremely or very useful. Further, although small minority banks and African-American banks of all sizes have consistently faced financial challenges and might benefit from certain types of assistance, the banks also reported low rates of usage of the agencies' technical assistance. 
While our survey did not address the reasons that relatively few minority banks appear to use the technical assistance and banking regulators cannot compel banks under their supervision to make use of offered technical assistance, the potential exists that many such institutions may be missing opportunities to learn how to correct problems that limit their operational and financial performance. More than 80 percent of the minority banks we surveyed responded that their regulators did a very good or good job of administering examinations, and almost 90 percent felt that they had very good or good relationships with their regulator. However, as in our 1993 report, some minority bank officials said in both survey responses and interviews that examiners did not always understand the challenges the banks faced in providing services in their particular communities. Twenty-one percent of survey respondents mentioned this issue when asked for suggestions about how regulators could improve their efforts to support minority banks, and several minority banks that we interviewed elaborated on this topic. The bank officials said that examiners tended to treat minority banks like any other bank when they conducted examinations and thought such comparisons were not appropriate. For example, some bank officials whose institutions serve immigrant communities said that their customers tended to do business in cash and carried a significant amount of cash because banking services were not widely available or trusted in the customers' home countries. Bank officials said that examiners sometimes commented negatively on the practice of customers doing business in cash or placed the bank under increased scrutiny relative to the Bank Secrecy Act's requirements for cash transactions. 
While the bank officials said that they did not expect preferential treatment in the examination process, several suggested that examiners undergo additional training so that they could better understand minority banks and the communities that these institutions served. FDIC has conducted such training for its examiners. In 2004, FDIC invited the president of a minority bank to speak to about 500 FDIC examiners on the uniqueness of minority banks and the examination process. FDIC officials later reported that the examiners found the discussion helpful. Many survey respondents also said that a CRA provision that was designed to assist their institutions was not effectively achieving this goal. The provision allows bank regulators conducting CRA examinations to give consideration to banks that assist minority banks through capital investment, loan participation, and other ventures that help meet the credit needs of local communities. Despite this provision, only 18 percent of survey respondents said that CRA had--to a very great or great extent-- encouraged other institutions to invest in or form partnerships with their institutions, while more than half said that CRA encouraged such activities to some, little, or no extent (see fig. 9). Some minority bankers attributed their view that the CRA provision has not been effective, in part, to a lack of clarity in interagency guidance on the act's implementation. They said that the interagency guidance should be clarified to assure banks that they will receive CRA consideration in making investments in minority banks. Our 2006 report recommended that the bank regulators regularly review the effectiveness of their minority bank support efforts and related regulatory activities and, as appropriate, make changes necessary to better serve such institutions. 
In conducting such reviews, we recommended that the regulators consider conducting periodic surveys of minority banks or developing outcome-oriented performance measures for their support efforts. In conducting such reviews, we also suggested that the regulators focus on the overall views of minority banks about support efforts, the usage and effectiveness of technical assistance (particularly assistance provided to small minority and African-American banks), and the level of training provided to agency examiners on minority banks and their operating environments. Over the past year, bank regulatory officials we contacted identified several steps that they have initiated to assess the effectiveness of their minority bank support efforts or to enhance such support efforts. They include the following actions: A Federal Reserve official told us that the agency has established a working group that is developing a pilot training program for minority banks and new banks. The official said that three training modules have been drafted for different phases of a bank's life, including starting a bank, operating a bank during its first 5 years of existence, and bank expansion. The official said that the program will be piloted throughout the U.S. beginning in early November 2007. Throughout the course of developing, drafting, and piloting the program, Federal Reserve officials said they have, and will continue to, consult with minority bankers to obtain feedback on the effort. An OCC official said that the agency recently sent a survey to minority banks on its education, outreach, and technical assistance efforts that should be completed by the end of October. OCC also plans to follow up this survey with a series of focus groups. 
In addition, the official said OCC just completed an internal survey of certain officials involved in supervising minority institutions, and plans to review the results of the two surveys and focus groups to improve its minority bank support efforts. FDIC officials told us that the agency has developed a survey to obtain feedback on the agency's minority bank support efforts. They estimate that the survey will be sent out to all minority institutions (not just those minority banks FDIC supervises) in mid-December 2007. An OTS official told us that the agency will send out a survey to the minority banks the agency supervises on its efforts in the next couple weeks and that it has also conducted a series of roundtables with minority banks in the past year. The federal banking agencies have also taken some steps to address other issues raised in our report. For example, Federal Reserve and FDIC officials told us that that the agencies will provide additional training on minority bank issues to their examiners. In addition, in July 2007 the federal banking agencies published a CRA Interagency Notice that requested comments on nine new "Questions and Answers" about community reinvestment. One question covers how majority banks may engage in and receive positive CRA consideration for activities conducted with minority institutions. An OCC official said that the comments on the proposed "Q and As" are under review. While the regulators' recent efforts to assess and enhance their minority bank support efforts and other activities are encouraging, it is too soon to assess their effectiveness. For example, the Federal Reserve's pilot training program for minority and new banks is not scheduled to begin until later this year. Further, the other regulators' efforts to survey minority banks on support efforts generally also are at an early stage. 
We encourage agency officials to ensure that they collect and analyze relevant data and take steps to enhance their minority bank support efforts as warranted. Mr. Chairman, this concludes my prepared statement. I would be happy to address any questions that you or subcommittee members may have. For further information about this testimony, please contact George A. Scott on (202) 512-7215 or at [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this statement. Individuals making key contributions include Wesley M. Phillips, Assistant Director; Allison Abrams; Kevin Averyt; and Barbara Roesmann. This is a work of the U.S. government and is not subject to copyright protection in the United States. The published product may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. | Minority banks can play an important role in serving the financial needs of historically underserved communities and growing populations of minorities. For this reason, the Financial Institutions, Reform, Recovery, and Enforcement Act of 1989 (FIRREA) established goals that the Federal Deposit Insurance Corporation (FDIC) and the Office of Thrift Supervision (OTS) must work toward to preserve and promote such institutions (support efforts). While not required to do so by FIRREA, the Board of Governors of the Federal Reserve System (Federal Reserve) and Office of the Comptroller of the Currency (OCC) have established some minority bank support efforts. 
This testimony, based on a 2006 General Accountability Office (GAO) report, discusses the profitability of minority banks, regulators' support and assessment efforts, and the views of minority banks on the regulators' efforts as identified through responses from a survey of 149 such institutions. GAO reported in 2006 that the profitability of most large minority banks (assets greater than $100 million) was nearly equal to that of their peers (similarly sized banks) in 2005 and earlier years, according to FDIC data. However, many small minority banks and African-American banks of all sizes were less profitable than their peers. GAO's analysis and other studies identified some possible explanations for these differences, including relatively higher loan loss reserves and operating expenses and competition from larger banks. Bank regulators had adopted differing approaches to supporting minority banks, but no agency had regularly and comprehensively assessed the effectiveness of its efforts. FDIC--which supervises over half of all minority banks--had the most comprehensive support efforts and leads interagency efforts. OTS focused on providing technical assistance to minority banks. While not required to do so by FIRREA, OCC and the Federal Reserve had taken some steps to support minority banks. Although FDIC had recently sought to assess the effectiveness of its support efforts through various methods, none of the regulators comprehensively surveyed minority banks or had developed performance measures. Consequently, the regulators were not well positioned to assess their support efforts. GAO's survey of minority banks identified potential limitations in the regulators' support efforts that would likely be of significance to agency managers and warrant follow-up analysis. 
Only about one-third of survey respondents rated their regulators' efforts for minority banks as very good or good, while 26 percent rated the efforts as fair, 13 percent as poor or very poor, and 25 percent responded "don't know". Banks regulated by FDIC were more positive about their agency's efforts than banks regulated by other agencies. However, only about half of the FDIC-regulated banks and about a quarter of the banks regulated by other agencies rated their agency's efforts as very good or good. Although regulators may have emphasized the provision of technical assistance to minority banks, less than 30 percent of such institutions have used such agency services within the last 3 years and therefore may be missing opportunities to address problems that limit their operations or financial performance. | 6,271 | 645 |
The public faces a high risk that critical services provided by the government and the private sector could be severely disrupted by the Year 2000 computing crisis. Financial transactions could be delayed, flights grounded, power lost, and national defense affected. Moreover, America's infrastructures are a complex array of public and private enterprises with many interdependencies at all levels. These interdependencies among governments and within key economic sectors could cause a single failure to have adverse repercussions. Key economic sectors that could be seriously affected if their systems are not Year 2000 compliant include information and telecommunications; banking and finance; health, safety, and emergency services; transportation; power and water; and manufacturing and small business.

The information and telecommunications sector is especially important. In testimony in June, we reported that the Year 2000 readiness of the telecommunications sector is one of the most crucial concerns to our nation because telecommunications are critical to the operations of nearly every public-sector and private-sector organization. For example, the information and telecommunications sector (1) enables the electronic transfer of funds, the distribution of electrical power, and the control of gas and oil pipeline systems; (2) is essential to the service economy, manufacturing, and efficient delivery of raw materials and finished goods; and (3) is basic to responsive emergency services. Reliable telecommunications services are made possible by a complex web of highly interconnected networks supported by national and local carriers and service providers, equipment manufacturers and suppliers, and customers.

In addition to the risks associated with the nation's key economic sectors, one of the largest, and largely unknown, risks relates to the global nature of the problem.
With the advent of electronic communication and international commerce, the United States and the rest of the world have become critically dependent on computers. However, there are indications of Year 2000 readiness problems in the international arena. For example, a June 1998 informal World Bank survey of foreign readiness found that only 18 of 127 countries (14 percent) had a national Year 2000 program, 28 countries (22 percent) reported working on the problem, and 16 countries (13 percent) reported only awareness of the problem. No conclusive data were received from the remaining 65 countries surveyed (51 percent).

In addition, a survey of 15,000 companies in 87 countries by the Gartner Group found that the United States, Canada, the Netherlands, Belgium, Australia, and Sweden were the Year 2000 leaders, while nations including Germany, India, Japan, and Russia were 12 months or more behind the United States. The Gartner Group's survey also found that 23 percent of all companies (80 percent of which were small companies) had not started a Year 2000 effort. Moreover, according to the Gartner Group, "insurance, investment services and banking are industries furthest ahead. Healthcare, education, semiconductor, chemical processing, agriculture, food processing, medical and law practices, construction and government agencies are furthest behind. Telecom, power, gas and water, software, shipbuilding and transportation are laggards barely ahead of furthest-behind efforts."

The following are examples of some of the major disruptions the public and private sectors could experience if the Year 2000 problem is not corrected. Unless the Federal Aviation Administration (FAA) takes much more decisive action, flights could be grounded or delayed, safety degraded, customers inconvenienced, and airline costs increased. Aircraft and other military equipment could be grounded because the computer systems used to schedule maintenance and track supplies may not work.
Further, the Department of Defense (DOD) could incur shortages of vital items needed to sustain military operations and readiness. Medical devices and scientific laboratory equipment may experience problems beginning January 1, 2000, if the computer systems, software applications, or embedded chips used in these devices contain two-digit fields for year representation. According to the Basle Committee on Banking Supervision--an international committee of banking supervisory authorities--failure to address the Year 2000 issue would cause banking institutions to experience operational problems or even bankruptcy.

Recognizing the seriousness of the Year 2000 problem, on February 4, 1998, the President signed an executive order that established the President's Council on Year 2000 Conversion led by an Assistant to the President and composed of one representative from each of the executive departments and from other federal agencies as may be determined by the Chair. The Chair of the Council was tasked with the following Year 2000 roles: (1) overseeing the activities of agencies, (2) acting as chief spokesperson in national and international forums, (3) providing policy coordination of executive branch activities with state, local, and tribal governments, and (4) promoting appropriate federal roles with respect to private-sector activities.

Addressing the Year 2000 problem in time will be a tremendous challenge for the federal government. Many of the federal government's computer systems were originally designed and developed 20 to 25 years ago, are poorly documented, and use a wide variety of computer languages, many of which are obsolete. Some applications include thousands, tens of thousands, or even millions of lines of code, each of which must be examined for date-format problems. The federal government also depends on the telecommunications infrastructure to deliver a wide range of services.
For example, the route of an electronic Medicare payment may traverse several networks--those operated by the Department of Health and Human Services, the Department of the Treasury's computer systems and networks, and the Federal Reserve's Fedwire electronic funds transfer system. In addition, the year 2000 could cause problems for the many facilities used by the federal government that were built or renovated within the last 20 years and contain embedded computer systems to control, monitor, or assist in operations. For example, building security systems, elevators, and air conditioning and heating equipment could malfunction or cease to operate. Agencies cannot afford to neglect any of these issues. If they do, the impact of Year 2000 failures could be widespread, costly, and potentially disruptive to vital government operations worldwide.

Nevertheless, overall, the government's 24 major departments and agencies are making slow progress in fixing their systems. In May 1997, the Office of Management and Budget (OMB) reported that about 21 percent of the mission-critical systems (1,598 of 7,649) for these departments and agencies were Year 2000 compliant. A year later, in May 1998, these departments and agencies reported that 2,914 of the 7,336 mission-critical systems in their current inventories, or about 40 percent, were compliant. However, unless agency progress improves dramatically, a substantial number of mission-critical systems will not be compliant in time.

In addition to slow governmentwide progress in fixing systems, our reviews of federal agency Year 2000 programs have found uneven progress. Some agencies are significantly behind schedule and are at high risk that they will not fix their systems in time. Other agencies have made progress, although risks continue and a great deal of work remains. The following are examples of the results of some of our recent reviews.
Last month, we testified about FAA's progress in implementing a series of recommendations we had made earlier this year to assist FAA in completing overdue awareness and assessment activities. These recommendations included assessing how the major FAA components and the aviation industry would be affected if Year 2000 problems were not corrected in time and completing inventories of all information systems, including data interfaces. Officials at both FAA and the Department of Transportation agreed with these recommendations, and the agency has made progress in implementing them. In our August testimony, we reported that FAA had made progress in managing its Year 2000 problem and had completed critical steps in defining which systems needed to be corrected and how to accomplish this. However, with less than 17 months to go, FAA must still correct, test, and implement many of its mission-critical systems. It is doubtful that FAA can adequately do all of this in the time remaining. Accordingly, FAA must determine how to ensure continuity of critical operations in the likely event of some systems' failures. In October 1997, we reported that while the Social Security Administration (SSA) had made significant progress in assessing and renovating mission-critical mainframe software, certain areas of risk in its Year 2000 program remained. Accordingly, we made several recommendations to address these risk areas, which included the Year 2000 compliance of the systems used by the 54 state Disability Determination Services that help administer the disability programs. SSA agreed with these recommendations and, in July 1998, we reported that actions to implement these recommendations had either been taken or were underway. Further, we found that SSA has maintained its place as a federal leader in addressing Year 2000 issues and has made significant progress in achieving systems compliance. However, essential tasks remain.
For example, many of the states' Disability Determination Service systems still had to be renovated, tested, and deemed Year 2000 compliant. Our work has shown that much likewise remains to be done in DOD and the military services. For example, our recent report on the Navy found that while positive actions have been taken, remediation progress had been slow and the Navy was behind schedule in completing the early phases of its Year 2000 program. Further, the Navy had not been effectively overseeing and managing its Year 2000 efforts and lacked complete and reliable information on its systems and on the status and cost of its remediation activities. We have recommended improvements to DOD's and the military services' Year 2000 programs with which they have concurred. In addition to these examples, our reviews have shown that many agencies had not adequately acted to establish priorities, solidify data exchange agreements, or develop contingency plans. Likewise, more attention needs to be devoted to (1) ensuring that the government has a complete and accurate picture of Year 2000 progress, (2) setting governmentwide priorities, (3) ensuring that the government's critical core business processes are adequately tested, (4) recruiting and retaining information technology personnel with the appropriate skills for Year 2000-related work, and (5) assessing the nation's Year 2000 risks, including those posed by key economic sectors. I would like to highlight some of these vulnerabilities, and our recommendations made in April 1998 for addressing them. First, governmentwide priorities in fixing systems have not yet been established. These governmentwide priorities need to be based on such criteria as the potential for adverse health and safety effects, adverse financial effects on American citizens, detrimental effects on national security, and adverse economic consequences. 
Further, while individual agencies have been identifying mission-critical systems, this has not always been done on the basis of a determination of the agency's most critical operations. If priorities are not clearly set, the government may well end up wasting limited time and resources in fixing systems that have little bearing on the most vital government operations. Other entities have recognized the need to set priorities. For example, Canada has established 48 national priorities covering areas such as national defense, food production, safety, and income security. Second, business continuity and contingency planning across the government has been inadequate. In their May 1998 quarterly reports to OMB, only four agencies reported that they had drafted contingency plans for their core business processes. Without such plans, when unpredicted failures occur, agencies will not have well-defined responses and may not have enough time to develop and test alternatives. Federal agencies depend on data provided by their business partners as well as services provided by the public infrastructure (e.g., power, water, transportation, and voice and data telecommunications). One weak link anywhere in the chain of critical dependencies can cause major disruptions to business operations. Given these interdependencies, it is imperative that contingency plans be developed for all critical core business processes and supporting systems, regardless of whether these systems are owned by the agency. Our recently issued guidance aims to help agencies ensure such continuity of operations through contingency planning. Third, OMB's assessment of the current status of federal Year 2000 progress is predominantly based on agency reports that have not been consistently reviewed or verified. Without independent reviews, OMB and the President's Council on Year 2000 Conversion have little assurance that they are receiving accurate information. 
In fact, we have found cases in which agencies' systems compliance status as reported to OMB has been inaccurate. For example, the DOD Inspector General estimated that almost three-quarters of DOD's mission-critical systems reported as compliant in November 1997 had not been certified as compliant by DOD components. In May 1998, the Department of Agriculture (USDA) reported 15 systems as compliant, even though these were replacement systems that were still under development or were planned for development. (The department removed these systems from compliant status in its August 1998 quarterly report.) Fourth, end-to-end testing responsibilities have not yet been defined. To ensure that their mission-critical systems can reliably exchange data with other systems and that they are protected from errors that can be introduced by external systems, agencies must perform end-to-end testing for their critical core business processes. The purpose of end-to-end testing is to verify that a defined set of interrelated systems, which collectively support an organizational core business area or function, will work as intended in an operational environment. In the case of the year 2000, many systems in the end-to-end chain will have been modified or replaced. As a result, the scope and complexity of testing--and its importance--is dramatically increased, as is the difficulty of isolating, identifying, and correcting problems. Consequently, agencies must work early and continually with their data exchange partners to plan and execute effective end-to-end tests. So far, lead agencies have not been designated to take responsibility for ensuring that end-to-end testing of processes and supporting systems is performed across boundaries, and that independent verification and validation of such testing is ensured. We have set forth a structured approach to testing in our recently released exposure draft.
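The kind of date-arithmetic failure that such testing is designed to surface can be illustrated with a minimal sketch (the function and field names are hypothetical, not drawn from any agency's actual software): legacy logic that stores only two-digit years computes a nonsensical negative interval once dates cross into 2000, while remediated four-digit logic does not.

```python
# Hypothetical illustration of the two-digit-year defect that
# Year 2000 testing is meant to catch. Not any agency's real code.

def interval_years_noncompliant(start_yy: int, end_yy: int) -> int:
    """Legacy logic: years stored as two digits, so 00 - 99 = -99."""
    return end_yy - start_yy

def interval_years_remediated(start_yyyy: int, end_yyyy: int) -> int:
    """Remediated logic: four-digit years yield the correct interval."""
    return end_yyyy - start_yyyy

# A benefit period opened in 1999 and closed in 2000 spans 1 year.
legacy = interval_years_noncompliant(99, 0)    # -99 (wrong)
fixed = interval_years_remediated(1999, 2000)  # 1 (correct)
print(legacy, fixed)
```

An end-to-end test would exercise a chain of such systems with dates spanning the century boundary and verify the final result, rather than testing each system in isolation.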
In our April 1998 report on governmentwide Year 2000 progress, we made a number of recommendations to the Chair of the President's Council on Year 2000 Conversion aimed at addressing these problems. These included establishing governmentwide priorities and ensuring that agencies set priorities accordingly; developing a comprehensive picture of the nation's Year 2000 readiness; requiring agencies to develop contingency plans for all critical core business processes; requiring agencies to develop an independent verification strategy to involve inspectors general or other independent organizations in reviewing Year 2000 progress; and designating lead agencies responsible for ensuring that end-to-end operational testing of processes and supporting systems is performed. We are encouraged by actions the Council is taking in response to some of our recommendations. For example, OMB and the Chief Information Officers Council adopted our guide providing information on business continuity and contingency planning issues common to most large enterprises as a model for federal agencies. However, as we recently testified before this Subcommittee, some actions have not been fully addressed--principally with respect to setting national priorities and end-to-end testing. State and local governments also face a major risk of Year 2000-induced failures to the many vital services--such as benefits payments, transportation, and public safety--that they provide. For example, food stamps and other types of payments may not be made or could be made for incorrect amounts; date-dependent signal timing patterns could be incorrectly implemented at highway intersections, and safety severely compromised, if traffic signal systems run by state and local governments do not process four-digit years correctly; and criminal records (i.e., prisoner release or parole eligibility determinations) may be adversely affected by the Year 2000 problem. Recent surveys of state Year 2000 efforts have indicated that much remains to be completed.
For example, a July 1998 survey of state Year 2000 readiness conducted by the National Association of State Information Resource Executives, Inc., found that only about one-third of the states reported that 50 percent or more of their critical systems had been completely assessed, remediated, and tested. In a June 1998 survey conducted by USDA's Food and Nutrition Service, only 3 and 14 states, respectively, reported that the software, hardware, and telecommunications that support the Food Stamp Program, and the Women, Infants, and Children program, were Year 2000 compliant. Although all but one of the states reported that they would be Year 2000 compliant by January 1, 2000, many of the states reported that their systems are not due to be compliant until after March 1999 (the federal government's Year 2000 implementation goal). Indeed, 4 and 5 states, respectively, reported that the software, hardware, and telecommunications supporting the Food Stamp Program, and the Women, Infants, and Children program would not be Year 2000 compliant until the last quarter of calendar year 1999, which puts them at high risk of failure due to the need for extensive testing. State audit organizations have identified other significant Year 2000 concerns. 
For example, (1) Illinois' Office of the Auditor General reported that significant future efforts were needed to ensure that the year 2000 would not adversely affect state government operations, (2) Vermont's Office of Auditor of Accounts reported that the state faces the risk that critical portions of its Year 2000 compliance efforts could fail, (3) Texas' Office of the State Auditor reported that many state entities had not finished their embedded systems inventories and, therefore, it is not likely that they will complete their embedded systems repairs before the year 2000, and (4) Florida's Auditor General has issued several reports detailing the need for additional Year 2000 planning at various district school boards and community colleges. State audit offices have also made recommendations, including the need for increased oversight, Year 2000 project plans, contingency plans, and personnel recruitment and retention strategies. In the course of these field hearings, states and municipalities have testified about Year 2000 practices that could be adopted by others. For example: New York established a "top 40" list of priority systems having a direct impact on public health, safety, and welfare, such as systems that support child welfare, state aid to schools, criminal history, inmate population management, and tax processing. According to New York, "the Top 40 systems must be compliant, no matter what." The city of Lubbock, Texas, is planning a Year 2000 "drill" this month. To prepare for the drill, Lubbock is developing scenarios of possible Year 2000-induced failures, as well as more normal problems (such as inclement weather) that could occur at the change of century. Louisiana established a $5 million Year 2000 funding pool to assist agencies experiencing emergency circumstances in mission-critical applications and that are unable to correct the problems with existing resources. 
Regarding Ohio, our review of the state's Year 2000 Internet World Wide Web site found that it had developed a detailed Year 2000 certification checklist. The checklist included items such as the first potential failure date, date fields, interfaces, and testing. However, according to Ohio's Year 2000 Administrator, implementation of this checklist is voluntary. According to Ohio's Year 2000 Internet World Wide Web site, while many of the state's agencies estimated that they would complete their Year 2000 remediation in late 1998 or early 1999, several critical agencies are not due to be compliant until mid-1999. For example, Ohio's (1) Department of Education reported it was 35 percent complete as of June 1998 and planned to be complete in July 1999, (2) Department of Health reported that it was 70 percent complete as of August 1998 and planned to be complete in July 1999, and (3) Department of Transportation reported that it was 70 percent complete as of April 1998 and planned to be complete in June 1999. To fully address the Year 2000 risks that states and the federal government face, data exchanges must also be confronted--a monumental issue. As computers play an ever-increasing role in our society, exchanging data electronically has become a common method of transferring information among federal, state, and local governments. For example, SSA exchanges data files with the states to determine the eligibility of disabled persons for disability benefits. In another example, the National Highway Traffic Safety Administration provides states with information needed for driver registrations. As computer systems are converted to process Year 2000 dates, the associated data exchanges must also be made Year 2000 compliant. If the data exchanges are not Year 2000 compliant, data will not be exchanged or invalid data could cause the receiving computer systems to malfunction or produce inaccurate computations. 
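One technique exchange partners used to keep a noncompliant two-digit date format flowing safely is a "windowing" bridge that expands two-digit years to four digits at the exchange boundary. The sketch below is purely illustrative; the pivot value and the MMDDYY record layout are assumptions for the example, not a prescribed federal standard.

```python
# Illustrative windowing bridge for a two-digit-year data exchange.
# PIVOT and the record layout are hypothetical choices for this sketch.

PIVOT = 30  # two-digit years 00-29 map to 20xx; 30-99 map to 19xx

def expand_year(yy: int) -> int:
    """Expand a two-digit year to four digits using a fixed window."""
    return 2000 + yy if yy < PIVOT else 1900 + yy

def bridge_record(record: str) -> str:
    """Convert an incoming MMDDYY date field to MMDDYYYY before the
    receiving system processes it."""
    mm, dd, yy = record[:2], record[2:4], int(record[4:6])
    return f"{mm}{dd}{expand_year(yy)}"

print(bridge_record("010100"))  # "01012000"
print(bridge_record("123199"))  # "12311999"
```

A bridge like this only works if both partners agree on the window; a sender and receiver using different pivot values would silently reintroduce the century ambiguity the bridge is meant to remove.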
Our recent report on actions that have been taken to address Year 2000 issues for electronic data exchanges revealed that federal agencies and the states use thousands of such exchanges to communicate with each other and other entities. For example, federal agencies reported that their mission-critical systems have almost 500,000 data exchanges with other federal agencies, states, local governments, and the private sector. To successfully remediate their data exchanges, federal agencies and the states must (1) assess information systems to identify data exchanges that are not Year 2000 compliant, (2) contact exchange partners and reach agreement on the date format to be used in the exchange, (3) determine if data bridges and filters are needed and, if so, reach agreement on their development, (4) develop and test such bridges and filters, (5) test and implement new exchange formats, and (6) develop contingency plans and procedures for data exchanges. At the time of our review, much work remained to ensure that federal and state data exchanges will be Year 2000 compliant. About half of the federal agencies reported during the first quarter of 1998 that they had not yet finished assessing their data exchanges. Moreover, almost half of the federal agencies reported that they had reached agreements on 10 percent or fewer of their exchanges, few federal agencies reported having installed bridges or filters, and only 38 percent of the agencies reported that they had developed contingency plans for data exchanges. Further, the status of the data exchange efforts of 15 of the 39 state-level organizations that responded to our survey was not discernable because they were not able to provide us with information on their total number of exchanges and the number assessed. Of the 24 state-level organizations that provided actual or estimated data, they reported, on average, that 47 percent of the exchanges had not been assessed. 
In addition, similar to the federal agencies, state-level organizations reported having made limited progress in reaching agreements with exchange partners, installing bridges and filters, and developing contingency plans. However, we could draw only limited conclusions on the status of the states' actions because data were provided on only a small portion of states' data exchanges. To strengthen efforts to address data exchanges, we made several recommendations to OMB. In response, OMB agreed that it needed to increase its efforts in this area. For example, OMB noted that federal agencies had provided the General Services Administration with a list of their data exchanges with the states. In addition, as a result of an agreement reached at an April 1998 federal/state data exchange meeting, the states were supposed to verify the accuracy of these initial lists by June 1, 1998. OMB also noted that the General Services Administration is planning to collect and post information on its Internet World Wide Web site on the progress of federal agencies and states in implementing Year 2000 compliant data exchanges. In summary, federal, state, and local efforts must increase substantially to ensure that major service disruptions do not occur. Greater leadership and partnerships are essential if government programs are to meet the needs of the public at the turn of the century. Mr. Chairman, this concludes my statement. I would be happy to respond to any questions that you or other members of the Subcommittee may have at this time.

FAA Systems: Serious Challenges Remain in Resolving Year 2000 and Computer Security Problems (GAO/T-AIMD-98-251, August 6, 1998).
Year 2000 Computing Crisis: Business Continuity and Contingency Planning (GAO/AIMD-10.1.19, August 1998).
Internal Revenue Service: Impact of the IRS Restructuring and Reform Act on Year 2000 Efforts (GAO/GGD-98-158R, August 4, 1998).
Social Security Administration: Subcommittee Questions Concerning Information Technology Challenges Facing the Commissioner (GAO/AIMD-98-235R, July 10, 1998).
Year 2000 Computing Crisis: Actions Needed on Electronic Data Exchanges (GAO/AIMD-98-124, July 1, 1998).
Defense Computers: Year 2000 Computer Problems Put Navy Operations at Risk (GAO/AIMD-98-150, June 30, 1998).
Year 2000 Computing Crisis: A Testing Guide (GAO/AIMD-10.1.21, Exposure Draft, June 1998).
Year 2000 Computing Crisis: Testing and Other Challenges Confronting Federal Agencies (GAO/T-AIMD-98-218, June 22, 1998).
Year 2000 Computing Crisis: Telecommunications Readiness Critical, Yet Overall Status Largely Unknown (GAO/T-AIMD-98-212, June 16, 1998).
GAO Views on Year 2000 Testing Metrics (GAO/AIMD-98-217R, June 16, 1998).
IRS' Year 2000 Efforts: Business Continuity Planning Needed for Potential Year 2000 System Failures (GAO/GGD-98-138, June 15, 1998).
Year 2000 Computing Crisis: Actions Must Be Taken Now to Address Slow Pace of Federal Progress (GAO/T-AIMD-98-205, June 10, 1998).
Defense Computers: Army Needs to Greatly Strengthen Its Year 2000 Program (GAO/AIMD-98-53, May 29, 1998).
Year 2000 Computing Crisis: USDA Faces Tremendous Challenges in Ensuring That Vital Public Services Are Not Disrupted (GAO/T-AIMD-98-167, May 14, 1998).
Securities Pricing: Actions Needed for Conversion to Decimals (GAO/T-GGD-98-121, May 8, 1998).
Year 2000 Computing Crisis: Continuing Risks of Disruption to Social Security, Medicare, and Treasury Programs (GAO/T-AIMD-98-161, May 7, 1998).
IRS' Year 2000 Efforts: Status and Risks (GAO/T-GGD-98-123, May 7, 1998).
Air Traffic Control: FAA Plans to Replace Its Host Computer System Because Future Availability Cannot Be Assured (GAO/AIMD-98-138R, May 1, 1998).
Year 2000 Computing Crisis: Potential for Widespread Disruption Calls for Strong Leadership and Partnerships (GAO/AIMD-98-85, April 30, 1998).
Defense Computers: Year 2000 Computer Problems Threaten DOD Operations (GAO/AIMD-98-72, April 30, 1998).
Department of the Interior: Year 2000 Computing Crisis Presents Risk of Disruption to Key Operations (GAO/T-AIMD-98-149, April 22, 1998).
Tax Administration: IRS' Fiscal Year 1999 Budget Request and Fiscal Year 1998 Filing Season (GAO/T-GGD/AIMD-98-114, March 31, 1998).
Year 2000 Computing Crisis: Strong Leadership Needed to Avoid Disruption of Essential Services (GAO/T-AIMD-98-117, March 24, 1998).
Year 2000 Computing Crisis: Federal Regulatory Efforts to Ensure Financial Institution Systems Are Year 2000 Compliant (GAO/T-AIMD-98-116, March 24, 1998).
Year 2000 Computing Crisis: Office of Thrift Supervision's Efforts to Ensure Thrift Systems Are Year 2000 Compliant (GAO/T-AIMD-98-102, March 18, 1998).
Year 2000 Computing Crisis: Strong Leadership and Effective Public/Private Cooperation Needed to Avoid Major Disruptions (GAO/T-AIMD-98-101, March 18, 1998).
Post-Hearing Questions on the Federal Deposit Insurance Corporation's Year 2000 (Y2K) Preparedness (AIMD-98-108R, March 18, 1998).
SEC Year 2000 Report: Future Reports Could Provide More Detailed Information (GAO/GGD/AIMD-98-51, March 6, 1998).
Year 2000 Readiness: NRC's Proposed Approach Regarding Nuclear Powerplants (GAO/AIMD-98-90R, March 6, 1998).
Year 2000 Computing Crisis: Federal Deposit Insurance Corporation's Efforts to Ensure Bank Systems Are Year 2000 Compliant (GAO/T-AIMD-98-73, February 10, 1998).
Year 2000 Computing Crisis: FAA Must Act Quickly to Prevent Systems Failures (GAO/T-AIMD-98-63, February 4, 1998).
FAA Computer Systems: Limited Progress on Year 2000 Issue Increases Risk Dramatically (GAO/AIMD-98-45, January 30, 1998).
Defense Computers: Air Force Needs to Strengthen Year 2000 Oversight (GAO/AIMD-98-35, January 16, 1998).
Year 2000 Computing Crisis: Actions Needed to Address Credit Union Systems' Year 2000 Problem (GAO/AIMD-98-48, January 7, 1998).
Veterans Health Administration Facility Systems: Some Progress Made In Ensuring Year 2000 Compliance, But Challenges Remain (GAO/AIMD-98-31R, November 7, 1997).
Year 2000 Computing Crisis: National Credit Union Administration's Efforts to Ensure Credit Union Systems Are Year 2000 Compliant (GAO/T-AIMD-98-20, October 22, 1997).
Social Security Administration: Significant Progress Made in Year 2000 Effort, But Key Risks Remain (GAO/AIMD-98-6, October 22, 1997).
Defense Computers: Technical Support Is Key to Naval Supply Year 2000 Success (GAO/AIMD-98-7R, October 21, 1997).
Defense Computers: LSSC Needs to Confront Significant Year 2000 Issues (GAO/AIMD-97-149, September 26, 1997).
Veterans Affairs Computer Systems: Action Underway Yet Much Work Remains To Resolve Year 2000 Crisis (GAO/T-AIMD-97-174, September 25, 1997).
Year 2000 Computing Crisis: Success Depends Upon Strong Management and Structured Approach (GAO/T-AIMD-97-173, September 25, 1997).
Year 2000 Computing Crisis: An Assessment Guide (GAO/AIMD-10.1.14, September 1997).
Defense Computers: SSG Needs to Sustain Year 2000 Progress (GAO/AIMD-97-120R, August 19, 1997).
Defense Computers: Improvements to DOD Systems Inventory Needed for Year 2000 Effort (GAO/AIMD-97-112, August 13, 1997).
Defense Computers: Issues Confronting DLA in Addressing Year 2000 Problems (GAO/AIMD-97-106, August 12, 1997).
Defense Computers: DFAS Faces Challenges in Solving the Year 2000 Problem (GAO/AIMD-97-117, August 11, 1997).
Year 2000 Computing Crisis: Time Is Running Out for Federal Agencies to Prepare for the New Millennium (GAO/T-AIMD-97-129, July 10, 1997).
Veterans Benefits Computer Systems: Uninterrupted Delivery of Benefits Depends on Timely Correction of Year-2000 Problems (GAO/T-AIMD-97-114, June 26, 1997).
Veterans Benefits Computer Systems: Risks of VBA's Year-2000 Efforts (GAO/AIMD-97-79, May 30, 1997).
Medicare Transaction System: Success Depends Upon Correcting Critical Managerial and Technical Weaknesses (GAO/AIMD-97-78, May 16, 1997).
Medicare Transaction System: Serious Managerial and Technical Weaknesses Threaten Modernization (GAO/T-AIMD-97-91, May 16, 1997).
Year 2000 Computing Crisis: Risk of Serious Disruption to Essential Government Functions Calls for Agency Action Now (GAO/T-AIMD-97-52, February 27, 1997).
Year 2000 Computing Crisis: Strong Leadership Today Needed To Prevent Future Disruption of Government Services (GAO/T-AIMD-97-51, February 24, 1997).
High-Risk Series: Information Management and Technology (GAO/HR-97-9, February 1997).

GAO discussed the year 2000 computer system risks facing the nation, focusing on: (1) GAO's major concerns with the federal government's progress in correcting its systems; (2) state and local government year 2000 issues; and (3) critical year 2000 data exchange issues.
GAO noted that: (1) the public faces a high risk that critical services provided by the government and the private sector could be severely disrupted by the year 2000 computing crisis; (2) the year 2000 could cause problems for the many facilities used by the federal government that were built or renovated within the last 20 years and contain embedded computer systems to control, monitor, or assist in operations; (3) overall, the government's 24 major departments and agencies are making slow progress in fixing their systems; (4) in May 1997, the Office of Management and Budget (OMB) reported that about 21 percent of the mission-critical systems for these departments and agencies were year 2000 compliant; (5) in May 1998, these departments reported that 40 percent of the mission-critical systems were year 2000 compliant; (6) unless progress improves dramatically, a substantial number of mission-critical systems will not be compliant in time; (7) in addition to slow governmentwide progress in fixing systems, GAO's reviews of federal agency year 2000 programs have found uneven progress; (8) some agencies are significantly behind schedule and are at high risk that they will not fix their systems in time; (9) other agencies have made progress, although risks continue and a great deal of work remains; (10) governmentwide priorities in fixing systems have not yet been established; (11) these governmentwide priorities need to be based on such criteria as the potential for adverse health and safety effects, adverse financial effects on American citizens, detrimental effects on national security, and adverse economic consequences; (12) business continuity and contingency planning across the government has been inadequate; (13) in their May 1998 quarterly reports to OMB, only four agencies reported that they had drafted contingency plans for their core business processes; (14) OMB's assessment of the status of federal year 2000 progress is predominantly based on agency 
reports that have not been consistently reviewed or verified; (15) GAO found cases in which agencies' systems' compliance status as reported to OMB had been inaccurate; (16) end-to-end testing responsibilities have not yet been defined; (17) state and local governments also face a major risk of year 2000-induced failures to the many vital services that they provide; (18) recent surveys of state year 2000 efforts have indicated that much remains to be completed; and (19) at the time of GAO's review, much work remained to ensure that federal and state data exchanges will be year 2000 compliant. | 7,126 | 567 |
The Economy Act, as amended (31 U.S.C. 1535), authorizes the head of an agency to place an order with another agency for goods or services if, among other requirements, a decision is made that the items or services cannot be obtained by contract as conveniently or cheaply from a commercial enterprise. The interagency ordering practice authorized by the Economy Act, sometimes referred to as "contract off-loading," can save the government duplicative effort and costs when appropriately used. Examples of appropriate use may include circumstances of one agency already having a contract for goods and services similar to those needed by another agency, or an agency having unique capabilities or expertise that qualify it to enter into or administer a contract. In July 1993, the Subcommittee on Oversight of Government Management, Senate Governmental Affairs Committee, held a hearing to examine the practice of off-loading at federal agencies and the abuses of this practice. Its hearing record, which included testimony from the Inspectors General of DOD, the Department of Energy, and the Tennessee Valley Authority, was critical of DOD's and other agencies' off-loading practices. Subsequently, the National Defense Authorization Act for Fiscal Year 1994 required the Secretary of Defense to prescribe regulations governing DOD's use of the Economy Act that included specific statutory limitations intended to rectify identified abuses. The Volpe Center is a federally owned and operated facility located in Cambridge, Massachusetts, and was established in 1970 to fulfill the need of the newly formed Department of Transportation for an in-house systems research capability. Since then, the center's research, analysis, and project management expertise has been applied to a wide variety of transportation and logistics problems. Its only funding is through formal reimbursable agreements negotiated with individual agencies for specific tasks. 
Initially, the center's services were provided almost exclusively to the Office of the Secretary of Transportation and the operating administrations within the Department of Transportation. As its capabilities evolved and its systems approach became better known, demand grew within non-Department of Transportation agencies. Through a formal memorandum of understanding with DOD, the Secretary broadened the center's mission in 1985 to include work on transportation and logistics problems facing other agencies, including the Joint Chiefs of Staff and the U.S. Transportation Command. Similar arrangements were made with civilian agencies. The Volpe Center's current labor pool consists of about 1,500 personnel evenly divided among 3 labor categories: federal employees, on-site contractor employees, and off-site contractor employees. On-site contractors provide services in computer analysis, technical information support, and documentation support. The off-site contractor employees comprise a "multiple contractor resource base," which allows quick, competitive access to a broad range of high technology capabilities and skills needed to meet the Volpe Center's programmatic requirements. Volpe Center contracting is regulated by the Federal Acquisition Regulation. In response to an audit conducted by the Department of Transportation's Inspector General, the Volpe Center issued formal work acceptance criteria in February 1995. According to Volpe Center management, the criteria are designed to assure that the center will not accept projects unless it can make substantive contributions derived from its status as part of the federal government. Examples of substantive contributions include project definition and planning in cooperation with the requesting agency, and support of contracts awarded and administered by the Volpe Center. 
In advance of promulgating regulations, the Secretary issued a policy memorandum in February 1994 that imposed limitations on the use of Economy Act orders by DOD activities. The Secretary's policy, which addressed Economy Act orders released outside of DOD for contract action, was, however, more stringent than either the National Defense Authorization Act for Fiscal Year 1994 or the Economy Act in the area of cost considerations by requiring a determination that the supplies or services cannot be provided "as conveniently and cheaply" by contracting directly with a private source. The Authorization Act did not address this cost issue, and the Economy Act uses the phrase "as conveniently or cheaply." The Secretary's use of the "and" rather than the "or" introduces more cost analysis into the decision-making process. The Secretary also changed the level of approval authority for Economy Act purchases. Instead of having contracting officers or other officials designated by the agency head approve Economy Act transactions, the Secretary's memorandum placed the approval level no lower than a senior executive service official, a general or flag officer, or an activity commander. The Coast Guard, which is a component of the Department of Transportation, has acquired services from the Volpe Center. In November 1994, the Coast Guard issued an instruction providing guidance on its use of the center. The instruction established a review, justification, and approval process to ensure that acquisitions of Volpe Center services are in the Coast Guard's best economic interest. The instruction designates the Director of Finance and Procurement, a senior executive service position within the Office of the Chief of Staff, as the approving official for all Coast Guard work performed through the center. The guidance requires a demonstration that the cost to use the Volpe Center is at least roughly comparable to commercial cost.
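As an illustration, the "roughly comparable to commercial cost" test can be sketched as a simple threshold check. The function name and the 10 percent tolerance below are assumptions for illustration only; the Coast Guard instruction does not define the comparison numerically.

```python
def roughly_comparable(volpe_cost: float, commercial_estimate: float,
                       tolerance: float = 0.10) -> bool:
    """Return True if the Volpe Center cost is within `tolerance`
    (assumed here to be 10 percent) of an independent commercial
    cost estimate.

    The tolerance is an illustrative assumption, not a Coast Guard rule.
    """
    return volpe_cost <= commercial_estimate * (1 + tolerance)

# A $1.05 million Volpe quote against a $1.0 million independent estimate
print(roughly_comparable(1_050_000, 1_000_000))  # True: within 10 percent
```

In practice the independent estimate would come from the engineering analysis, market research, or prior-project cost data that the instruction requires sponsors to develop.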
To document this comparability, Coast Guard sponsors must develop an independent estimate of expected project costs using recognized techniques such as engineering analysis, market research, or application of actual cost data from prior projects. While the Coast Guard instruction acknowledges that the Volpe Center offers convenience, it is Coast Guard policy that the center shall be used when there are clear economic, technical, and mission-essential reasons for doing so. For example, officials informed us that in one area of the country the Coast Guard is now completing 5 years of environmental compliance and restoration work with the Volpe Center. They explained that the center's support was critical in the early years of this work for the Coast Guard to gain an understanding of the various technologies involved in restoring contaminated areas at Coast Guard installations. Coast Guard officials said the service has acquired the technical expertise and is now ready, at least in that area of the country, to transition away from the center for this work and contract directly with private companies. The Air Force, Army, and Navy have each taken a different approach to implementing the Secretary's policy memorandum. Collectively, however, they are producing similarly mixed results. While there is considerable up-to-date guidance available to contracting officials on interagency purchases, not all DOD files on Volpe Center projects we reviewed contained required information. In addition, DOD has not yet implemented a statutorily mandated monitoring system for interagency purchases; the monitoring system is currently scheduled for implementation in October 1995. The Air Force introduced the Secretary of Defense's policy changes in June 1994 through a revision to its Federal Acquisition Regulation Supplement.
The supplement states that the Air Force shall not place an order with another agency unless adequate supporting documentation, including a Determination and Finding (D&F), is prepared. The D&F must be approved at a level no lower than senior executive service, flag or general officer, or activity commander. The activity's contracting office is required to retain a record copy of each D&F in a central file. The supplement offers a model format for the D&F, which requires that 12 specific findings be listed, including 1 that states "the supplies or services cannot be provided as conveniently and more economically by private contractors under an Air Force contract." The Army implemented the Secretary's policy changes in an August 1994 policy letter from the Office of the Assistant Secretary, Army Contracting Support Agency. The letter states that before an Economy Act order for supplies or services is released outside DOD for contracting action, a written determination prepared by the requiring activity that addresses the elements in the Defense Secretary's memorandum shall be approved by the head of the requesting agency or their designee. The D&Fs are required to be prepared in the same format required by the Air Force, to include that "the supplies or services cannot be provided at the time required and more economically by contractors under an Army contract." In contrast to the Air Force and Army's delegation of approval authority, the Navy initially did not delegate approval authority below the Assistant Secretary of the Navy for Research, Development, and Acquisition. Toward the end of 1994, the Assistant Secretary delegated approval authority to the Deputy for Acquisition and Business Management. In January 1995, as permitted by the Secretary of Defense's memorandum, the Deputy redelegated authority to approve D&Fs to eight activities with contracting authority. 
However, approval authority for Economy Act orders placed with the Volpe Center and with agencies not subject to the Federal Acquisition Regulation was retained by the Deputy for Acquisition and Business Management. Despite efforts by the services to strengthen controls over Economy Act purchases, our review of fiscal year 1995 Air Force, Army, and Navy projects with the Volpe Center indicated that the controls were not fully implemented. Of the 13 purchase requests we reviewed, 7 lacked approved D&Fs. The results of our review are summarized in table 1. In two of the three Air Force cases where a D&F was not prepared, the project managers were not aware of the requirement to prepare a D&F; in the other case, a draft D&F was prepared by the requiring activity, reviewed by a contracting officer, but never completed or signed. In the Army case where a D&F was not prepared, Army officials offered no explanation other than that they "just missed it." One official suggested that some Army activities may not have understood the August 1994 policy letter. In one of the Navy cases without an approved D&F, ordering officials justified the transfer of 1995 funds on the basis of a D&F that covered 1993 and 1994 funding; subsequent to the transfer of funds, reviewing officials rejected this justification. The other two Navy cases involved purchases by a Marine Corps ordering activity. Similar to the Army case, Marine Corps officials explained that, regarding the preparation of D&Fs, the purchases "just fell through the cracks." The documentation for services' projects with approved D&Fs showed different approaches to meeting the Defense Secretary's requirement to elevate the consideration of cost. The Air Force D&Fs mainly emphasized that the estimated general and administrative expense rate of 9 percent charged by the Volpe Center appeared reasonable and did not exceed the actual cost of entering into and administering the interagency agreement under which the order is filled.
The Air Force documentation also showed that business reviews were performed by the contracting officers; the business reviews indicated that independent government cost estimates had been completed. The documentation for the approved Army project computed the dollar value of the administrative fee and included an "information paper" prepared for the general officer who signed the D&F. The information paper indicated that the project would be transitioning from Volpe Center support to the Army's on-site contractor support in about 4 months. The approval for the two Navy cases involved purchases by a Marine Corps ordering activity different from the one above that lacked approved D&Fs. These purchases were covered by a D&F prepared shortly after the Secretary's memorandum, but approved under the criteria in effect before the memorandum. Thus, the D&F did not contain a finding on the cost comparison cited in the Secretary's memorandum. However, Navy officials said that they concurred with approvals such as these because, at that time, no new detailed implementing guidance was available to ordering activities. The National Defense Authorization Act for Fiscal Year 1994 directed that DOD establish a monitoring system for Economy Act purchases not later than 1 year after the November 30, 1993, enactment of the act. That monitoring system has not yet been implemented. An official from the Office of the Under Secretary of Defense for Acquisition and Technology informed us, however, that the monitoring system has been developed and is now awaiting approval. The monitoring system is currently scheduled for implementation in October 1995. DOD Economy Act orders placed with the Volpe Center peaked in fiscal year 1991 at $93.2 million, which accounted for about 39 percent of the center's budget. By fiscal year 1994, DOD funding dropped to $26.5 million, which accounted for only about 13 percent of the center's budget.
Funding transfers for the first 8 months of fiscal year 1995 indicate that DOD funding will only be one-half of fiscal year 1994 funding. The funding data are summarized in figure 1. It is difficult to pinpoint exact causes for the downward trend. However, the more recent declines may be a result of the 1993 Subcommittee hearing, resulting legislation, and the 1994 implementation of a more restrictive contracting environment by the Secretary of Defense. Coast Guard orders placed with the Volpe Center reached their highest levels in fiscal years 1992 and 1993 when over $21 million in new obligation authority was transferred each year. Funding dropped by almost half in fiscal year 1994. Fiscal year 1995 new obligation authority may be about half of the fiscal year 1994 total. The funding data are summarized in figure 2. As with the DOD data, it is difficult to identify exact causes for the downward trend. However, the November 1994 instruction with its cost and approval requirements may have been a contributing factor. FASA required that the Federal Acquisition Regulation be revised to include statutory requirements governing the exercise of Economy Act authority. The requirement is virtually identical to that required of DOD by the National Defense Authorization Act for Fiscal Year 1994. In March 1995, a proposed draft regulation was published in the Federal Register. The proposed regulation requires a determination that the ordered goods or services cannot be provided by contract as conveniently or cheaply by the requesting agency from a commercial enterprise. FASA did not require the more stringent "and" language applicable within DOD. 
The regulation authorizes determination approval authority to reside with the contracting officer or another official designated by agency regulation, except that if the servicing agency is not covered by the Federal Acquisition Regulation, approval authority may not be delegated below the senior procurement executive of the requesting agency. Such procedures are consistent with FASA. FASA also requires that by mid-October 1995 the Administrator for Federal Procurement Policy establish a monitoring system for Economy Act purchases for Federal civilian agencies, similar to the requirement for DOD. In commenting on a draft of this report, both the Departments of Defense and Transportation concurred with the report. Both suggested some technical changes to the draft, and we have incorporated them, where appropriate. DOD's comments are presented in appendix I. The Department of Transportation's comments were provided orally. We interviewed management officials and examined project management and budget documents, statements of work, cost summaries, military interdepartmental purchase requests, project plan agreements, and other program documentation. We performed work at the Department of Transportation's Volpe National Transportation Systems Center, Cambridge, Massachusetts, and Headquarters, United States Coast Guard, Washington, D.C. We also contacted policy representatives within the Office of the Assistant Secretary of the Air Force for Acquisition; the Office of the Assistant Secretary of the Army for Research, Development, and Acquisition; and the Office of the Assistant Secretary of the Navy for Research, Development, and Acquisition. Our review was performed in accordance with generally accepted government auditing standards and includes information obtained through May 1995. 
We are sending copies of this report to the Chairman, Subcommittee on Oversight of Government Management and the District of Columbia, Senate Committee on Governmental Affairs; other interested congressional committees; and the Secretaries of Defense and Transportation. Copies will also be available to others on request. Please contact me at (202) 512-4587 if you or your staff have any questions concerning this report. Major contributors to this report were Charles W. Thompson, Paul M. Greeley, and Paul G. Williams.

Pursuant to a congressional request, GAO examined the: (1) impact of the Department of Defense's (DOD) policy changes for interagency orders on the Department of Transportation's Volpe National Transportation Systems Center; and (2) Coast Guard's recent initiatives and legislative changes extending the statutory requirements on interagency orders to other federal agencies.
GAO found that: (1) because of past practices, the National Defense Authorization Act for Fiscal Year 1994 required the Secretary of Defense to issue regulations that strengthened controls over DOD's interagency orders for goods and services; (2) in a February 1994 memorandum, and in advance of the statutorily required regulations, the Secretary took additional steps to increase DOD's interagency transaction controls by requiring, among other things, that DOD's interagency orders be as convenient and cheap as other alternatives and approved at a level no lower than senior executive service, general officer, flag officer, or activity commander; (3) in November 1994, the Coast Guard independently developed reforms that paralleled these DOD initiatives; (4) DOD is still adjusting to the changes introduced by Congress and the Secretary; (5) there is an abundance of guidance available to Air Force, Army, and Navy contracting activities, but a sample of fiscal year (FY) 1995 Volpe Center purchases showed that not all files contained the information required by the Secretary's memorandum; (6) in addition, DOD has not yet implemented a statutorily mandated monitoring system for its interagency purchases; (7) the monitoring system is currently scheduled for implementation in October 1995; (8) DOD contracting with the Volpe Center has been declining since FY 1992; (9) while it is difficult to pinpoint exact causes for the downward trend, more recent declines appear to be a result of DOD's implementation of the more restrictive environment for interagency orders; (10) likewise, a similar recent decline in Coast Guard purchases at the Volpe Center appears to be related to the introduction of the Coast Guard reforms; (11) the Federal Acquisition Streamlining Act (FASA) generally extended the restrictive interagency transaction controls applicable to DOD to other federal agencies; and (12) the implementing draft regulation, while consistent with FASA, is not as stringent as the 
DOD or Coast Guard cost policies. | 3,600 | 499 |
VA's mission is to promote the health, welfare, and dignity of all veterans in recognition of their service to the nation by ensuring that they receive medical care, benefits, social support, and lasting memorials. It is the second largest federal department and, in addition to its central office located in Washington, D.C., has field offices throughout the United States, as well as the U.S. territories and the Philippines. The department has three major components that are primarily responsible for carrying out its mission: the Veterans Benefits Administration (VBA), which provides a variety of benefits to veterans and their families, including disability compensation, educational opportunities, assistance with home ownership, and life insurance; the Veterans Health Administration (VHA), which provides health care services, including primary care and specialized care, and performs research and development to serve veterans' needs; and the National Cemetery Administration (NCA), which provides burial and memorial benefits to veterans and their families. Collectively, the three components rely on approximately 340,000 employees to provide services and benefits. These employees work in 167 VA medical centers, approximately 800 community-based outpatient clinics, 300 veterans centers, 56 regional offices, and 131 national and 90 state or tribal cemeteries. For fiscal year 2016, VA reported about $176 billion in net outlays, an increase of about $16 billion from the prior fiscal year. VBA and VHA account for about $102 billion (about 58 percent) and $72 billion (about 41 percent) of VA's reported net outlays, respectively. The remaining net outlays were for NCA and VA's administrative costs. The fiscal year 2017 appropriations act that covered VA provided approximately $177 billion to the agency, about a $14 billion increase from the prior fiscal year. As we recently reported, improper payments remain a significant and pervasive government-wide issue. 
Since fiscal year 2003--when certain agencies began reporting improper payments as required by IPIA--cumulative reported improper payment estimates have totaled over $1.2 trillion, as shown in figure 1. For fiscal year 2016, agencies reported improper payment estimates totaling $144.3 billion, an increase of about $7.6 billion from the prior year's estimate of $136.7 billion. The reported estimated government-wide improper payment error rate was 5.1 percent of related program outlays. As shown in figures 2 and 3, the government-wide reported improper payment estimates--both dollar estimates and error rates--have increased over the past 3 years, largely because of increases in Medicaid's reported improper payment estimates. For fiscal year 2016, overpayments accounted for approximately 93 percent of the government-wide reported improper payment estimate, according to www.paymentaccuracy.gov, with underpayments accounting for the remaining 7 percent. Although primarily concentrated in three areas (Medicare, Medicaid, and the Earned Income Tax Credit), the government-wide reported improper payment estimates for fiscal year 2016 were attributable to 112 programs spread among 22 agencies. (See fig. 4.) We found that not all agencies had developed improper payment estimates for all of the programs they identified as susceptible to significant improper payments. Eight agencies did not report improper payment estimates for 18 risk-susceptible programs. (See table 1.) As we have previously reported, the federal government faces multiple challenges that hinder its efforts to determine the full extent of and reduce improper payments. These challenges include potentially inaccurate risk assessments, agencies that do not report improper payment estimates for risk-susceptible programs or report unreliable or understated estimates, and noncompliance issues.
For fiscal year 2016, VA's reported improper payment estimate totaled $5.5 billion, an increase of about $500 million from the prior year. The reported VA improper payment error rate was 4.5 percent of related program outlays for fiscal year 2016, a slight increase from the 4.4 percent reported error rate for fiscal year 2015. As shown in table 2, VA's Community Care and Purchased Long-Term Services and Support programs accounted for the majority of VA's estimated improper payments. Specifically, for fiscal year 2016, VA's reported improper payment estimate for VA's Community Care was approximately $3.6 billion (about 65 percent of VA's total reported improper payments estimate) and for VA's Purchased Long-Term Services and Support was approximately $1.2 billion (about 22 percent of VA's total reported improper payments estimate). As shown in figures 5 and 6, VA's reported improper payment estimates have increased over the past 3 years, and the reported improper payment error rates have increased over the past 2 years. The significant increase in VA's reported improper payment estimates and error rates primarily occurred, according to the VA OIG, because VA changed its sample evaluation procedures in fiscal year 2015, which resulted in more improper payments being identified. In response to a finding by the VA OIG, VA began classifying every payment as improper when it made a payment that did not follow all applicable Federal Acquisition Regulation (FAR) and Veterans Affairs Acquisition Regulation (VAAR) provisions. The OIG reported that when those purchases do not follow applicable legal requirements, such as having FAR-compliant contracts in place, the resulting payments are improper because they "should not have been made or were made in an incorrect amount under statutory, contractual, administrative, or other legally applicable requirements, according to the definition of improper payments set forth in OMB Circular A-123, Appendix C." 
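The program shares and the error rate reported above follow directly from the dollar figures; a minimal sketch of the arithmetic (dollar amounts in billions, taken from the report; the implied outlays figure is derived, not separately reported):

```python
va_total_estimate = 5.5   # VA's FY 2016 improper payment estimate ($ billions)
community_care = 3.6      # Community Care program estimate
long_term_support = 1.2   # Purchased Long-Term Services and Support estimate

# Program shares of VA's total reported estimate
print(round(100 * community_care / va_total_estimate))     # ~65 percent
print(round(100 * long_term_support / va_total_estimate))  # ~22 percent

# Related outlays implied by the reported 4.5 percent error rate
print(round(va_total_estimate / 0.045))  # ~$122 billion in related outlays
```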
As a result of the change in its sample evaluation procedures, VA reported significant increases in estimated improper payments for both its Community Care and Purchased Long-Term Services and Support programs. As shown in table 3, VA's Community Care and Purchased Long-Term Services and Support programs' reported improper payment error rates are the two highest reported error rates government-wide for fiscal year 2016. Specifically, VA's Community Care and Purchased Long-Term Services and Support programs had reported improper payment error rates of about 75.9 percent and 69.2 percent, respectively. The reported improper payment error rates for these two programs were each over 45 percentage points higher than the reported improper payment error rate for the next highest federal program--the Department of the Treasury's Earned Income Tax Credit program. In its fiscal year 2016 agency financial report, VA did not report improper payment estimates for four programs it identified as susceptible to significant improper payments. These four programs were Communications, Utilities, and Other Rent; Medical Care Contracts and Agreements; VA Community Care Choice; and payments made from the Veterans Choice Fund. Because VA did not report improper payment estimates for these risk-susceptible programs, VA's improper payment estimate is understated and the agency is hindered in its efforts to reduce improper payments in these programs. In its fiscal year 2016 agency financial report, VA stated that it will report improper payment estimates for these programs in its fiscal year 2017 agency financial report. According to OMB guidance, to reduce improper payments, VA can use root cause analysis to identify why improper payments are occurring and develop effective corrective actions to address those causes.
In addition, our two prior reports identified problems with how VA processed its claims to reasonably assure the accuracy of, or eligibility for, disability benefits, increasing the risk of improper payments. VA can implement our recommendations from these two reports to better ensure the accuracy of, or eligibility for, disability benefits. Root cause analysis is key to understanding why improper payments occur and to developing and implementing corrective actions to prevent them. In 2014, OMB established new guidance to assist agencies in better identifying the root causes of improper payments and assessing their relevant internal controls. Agencies across the federal government began reporting improper payments using these more detailed root cause categories for the first time in their fiscal year 2015 financial reports. Figure 7 shows the root causes of VA's estimated improper payments for fiscal year 2016, as reported by VA. According to VA's fiscal year 2016 agency financial report, the root cause for over three-fourths of VA's reported fiscal year 2016 improper payment estimates was program design or structural issues. As noted above, most of the improper payments occurred in VA's Community Care and Purchased Long-Term Services and Support programs. In the fiscal year 2016 agency financial report, VA provided details on how it plans to correct some program design issues by making its procurement practices compliant with relevant laws and regulations. The agency stated that it has made certain changes, such as issuing new policies that can reduce the amount of improper payments in this area. For example, in VA's fiscal year 2016 agency financial report, VA stated that it issued guidance in May 2015 to appropriately purchase care, such as hospital care or medical services, in the community through the use of VAAR-compliant contracts.
VA stated that the implementation of this guidance is ongoing with full impact and compliance anticipated during fiscal year 2017. According to VA's fiscal year 2016 agency financial report, the second largest root cause for VA's reported improper payments was administrative or process errors made by the federal agency. VA reported that most of these errors occurred in its Compensation program. These errors, such as failure to reduce benefits appropriately, affected the payment amounts that veterans and beneficiaries received. To address this root cause, VA stated in its fiscal year 2016 agency financial report that it is updating procedural guidance to reflect such things as changes in legislation and policy. In addition, VA stated that it will train employees on specific subjects related to errors found during improper payment testing and quality reviews. Accurate claim decisions help ensure that VA is paying disability benefits only to those eligible for such benefits and in the correct amounts. Thus, it is critical that VA follow its claims processes accurately and consistently. However, we previously reported problems with how VA processed its claims to reasonably assure the accuracy of, or eligibility for, disability benefits, increasing the risk of improper payments. In November 2014, we reported that while VA pays billions of dollars to millions of disabled veterans, there were problems with VA's ability to ensure that claims were processed accurately and consistently by its regional offices. VA measures the accuracy of disability compensation claim decisions mainly through its Systematic Technical Accuracy Review (STAR). Specifically, for each of the regional offices, completed claims are randomly sampled each month and the data are used to produce estimates of the accuracy of all completed claims.
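To illustrate the kind of estimate such a review produces, the sketch below draws a simple random sample of completed claims and computes an accuracy rate with an approximate 95 percent confidence interval. The population, sample size, and normal-approximation method are illustrative assumptions, not STAR's actual design.

```python
import math
import random

def estimate_accuracy(claims, sample_size, seed=1):
    """Estimate the share of accurately decided claims from a simple
    random sample, with an approximate 95 percent confidence interval.

    `claims` is a list of booleans (True = decided accurately).
    Illustrative sketch only; not STAR's actual methodology.
    """
    rng = random.Random(seed)
    sample = rng.sample(claims, sample_size)
    p = sum(sample) / sample_size
    half_width = 1.96 * math.sqrt(p * (1 - p) / sample_size)
    return p, (p - half_width, p + half_width)

# Hypothetical population: 10,000 completed claims, 90 percent accurate
population = [True] * 9_000 + [False] * 1_000
rate, interval = estimate_accuracy(population, sample_size=400)
print(f"estimated accuracy {rate:.1%}, "
      f"95% CI ({interval[0]:.1%}, {interval[1]:.1%})")
```

As our November 2014 report noted, the precision of such estimates depends on the sampling design being applied according to generally accepted statistical practices.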
In our November 2014 report, we found that VA had not always followed generally accepted statistical practices when calculating accuracy rates through STAR reviews, resulting in imprecise performance information. We also identified shortcomings in quality review practices that could reduce their effectiveness. We made eight recommendations to VA to review the multiple sources of policy guidance available to claims processors and evaluate the effectiveness of quality assurance activities, among other things. In response to the draft report, VA agreed with each of our recommendations and identified steps it planned to take to implement them. To date, VA has implemented six of the report's eight recommendations. For example, VA has revised its sampling methodology and has made its guidance more accessible. VA has initiated action on the remaining two recommendations related to quality review of the claims processes. VA reported that it is in the process of making systems modifications to its electronic claims processing system that will allow VA to identify deficiencies during the claims process. In addition, VA is developing a new quality assurance database that will capture data from all types of quality reviews at various stages of the claims process. VA stated that this new database will support increased data analysis capabilities and allow the agency to evaluate the effectiveness of quality assurance activities through improved and vigorous error rate trend analysis. VA stated that it anticipates deploying the systems modifications and the new quality assurance database by July 2017. In June 2015, we reported that VA's procedures did not ensure that Total Disability Individual Unemployability (TDIU) benefit decisions were well-supported. To begin receiving and remain eligible for TDIU benefits, veterans must meet the income eligibility requirements.
VA first determines a claimant's income by requesting information on the last 5 years of employment on the claim form and subsequently requires beneficiaries to annually attest to any income changes. VA uses the information provided by claimants to request additional information from employers and, when possible, verifies the claimant's reported income, especially for the year prior to applying for the benefit. To obtain this verification, VA sends a form to the employers identified on the veteran's benefit claim and asks them to provide the amount of income earned by the veteran. However, VA officials indicated that employers provided the requested information only about 50 percent of the time. In our 2015 report, we found that VA previously conducted audits of beneficiaries' reported income by obtaining income verification matches from Internal Revenue Service (IRS) earnings data through an agreement with the Social Security Administration (SSA), but was no longer doing so despite the standing agreement. In 2012, VA suspended income verification matches in order to develop a new system that would allow for more frequent, electronic information sharing. VA officials told us that they planned to roll out a new electronic data system that would allow for compatibility with SSA data sources in fiscal year 2015. They noted that they planned to use this system to conduct more frequent and focused income verifications to help ensure beneficiaries' continued entitlement. VA officials also anticipated being able to use the system to conduct income verifications for initial TDIU applicants. However, at the time of our 2015 report, VA could not provide us with a plan or timeline for implementing this verification system.
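A data match of the kind described above (comparing self-reported income against IRS earnings data obtained through SSA) can be sketched as follows. The identifiers, dollar amounts, and $1,000 discrepancy tolerance are hypothetical assumptions for illustration; they do not reflect VA's actual matching rules.

```python
def flag_income_discrepancies(reported, irs_earnings, tolerance=1_000):
    """Flag beneficiaries whose self-reported income differs from
    matched IRS earnings data by more than `tolerance` dollars.

    Hypothetical sketch; the tolerance and record layout are
    assumptions, not VA's actual matching rules. Beneficiaries
    absent from the IRS data are skipped.
    """
    flagged = []
    for beneficiary_id, amount in reported.items():
        actual = irs_earnings.get(beneficiary_id)
        if actual is not None and abs(actual - amount) > tolerance:
            flagged.append(beneficiary_id)
    return flagged

reported = {"A1": 8_000, "A2": 12_000, "A3": 9_500}  # self-reported income
irs = {"A1": 8_200, "A2": 25_000}                    # matched IRS earnings
print(flag_income_discrepancies(reported, irs))  # ['A2']
```

Flagged cases would still require follow-up review before any benefit adjustment; the match only identifies discrepancies, not their cause.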
In the 2015 report, we recommended that VA verify the self-reported income provided by veterans (1) applying for TDIU benefits and (2) undergoing the annual eligibility review process by comparing such information against IRS earnings data, which VA currently has access to for this purpose. To date, VA is developing processes to use IRS earnings data from SSA in verifying income eligibility requirements. According to VA, in February 2016, it launched a national workload distribution tool within its management system to improve its overall production capacity and assist with reaching claims processing goals that will be used in implementing our recommendation. To determine if new beneficiaries are eligible for TDIU benefits, VA stated that it is expanding the data-sharing agreement with SSA to develop an upfront verification process. Specifically, when VA receives a TDIU claim, it will electronically request the reported IRS income information from SSA and receive a response within 16 days. In addition, according to VA, it is also planning to begin a process for checking incomes of veterans to determine whether they remain eligible for TDIU benefits. Specifically, VA has reinstituted the data match agreement with SSA that was set to expire in December 2016 to allow VA to compare reported income earnings of TDIU beneficiaries to earnings actually received. According to VA, it also has drafted a new guidance manual for the annual eligibility review process. VA stated that it planned to fully implement the upfront and annual eligibility verification processes by the summer of 2017. In conclusion, in light of VA's significant financial management challenges, we continue to be concerned about VA's ability to reasonably ensure its resources are being used cost-effectively and efficiently. 
Because VA's payment amounts are likely to increase with the increase in appropriations for fiscal year 2017, it is critical that VA takes actions to reduce the risks of improper payments. While VA has taken several actions to help prevent improper payments, further efforts are needed to help minimize the risks of improper payments across its programs. Chairman Bergman, Ranking Member Kuster, and Members of the Subcommittee, this completes my prepared statement. I would be pleased to respond to any questions that you may have at this time. If you or your staff have any questions about this testimony, please contact Beryl H. Davis, Director, Financial Management and Assurance, at (202) 512-2623 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this statement. GAO staff who made key contributions to this testimony are Matthew Valenta (Assistant Director), Daniel Flavin (Analyst in Charge), Marcia Carlsen, Francine Delvecchio, Robert Hildebrandt, Melissa Jaynes, Jason Kelly, and Jason Kirwan. This is a work of the U.S. government and is not subject to copyright protection in the United States. The published product may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.

For several years, GAO has reported in its audit reports on the consolidated financial statements of the U.S. government that the federal government is unable to determine the full extent to which improper payments occur and reasonably assure that actions are taken to reduce them. Strong financial management practices, including effective internal control, are important for federal agencies to better detect and prevent improper payments. VA faces significant financial management challenges.
In 2015, GAO designated VA health care as a high-risk area because of concern about VA's ability to ensure that its resources are being used cost effectively and efficiently to improve veterans' timely access to health care and to ensure the quality and safety of that care. Further, improving and modernizing federal disability programs has been on GAO's high-risk list since 2003, in part because of challenges that VA has faced in providing accurate, timely, and consistent disability decisions related to disability compensation. In addition, in VA's fiscal year 2016 agency financial report, the independent auditor cited material weaknesses in internal control over financial reporting. This statement discusses improper payments on both the government-wide level and at VA. The statement also discusses certain actions that VA has taken and other actions that VA can take to reduce improper payments. This statement is based on GAO's recent work on improper payments and its analysis of agency financial reports and VA's Office of Inspector General reports. Improper payments, which generally include payments that should not have been made, were made in the incorrect amount, or were not supported by sufficient documentation, remain a significant and pervasive government-wide issue. Since fiscal year 2003--when certain agencies began reporting improper payments as required by the Improper Payments Information Act of 2002--cumulative improper payment estimates have totaled over $1.2 trillion. For fiscal year 2016, agencies reported improper payment estimates totaling $144.3 billion, an increase of about $7.6 billion from the prior year's estimate of $136.7 billion. For fiscal year 2016, the Department of Veterans Affairs' (VA) reported improper payment estimate totaled $5.5 billion. 
VA's Community Care and Purchased Long-Term Services and Support programs accounted for reported improper payment estimates of $3.6 billion and $1.2 billion, respectively, or about 87 percent of VA's reported improper payment estimate for fiscal year 2016. VA's reported improper payment estimates increased significantly from $1.6 billion for fiscal year 2014 to $5.0 billion for fiscal year 2015. According to the VA Office of Inspector General, this increase was primarily due to a change in VA's evaluation procedures, which resulted in more improper payments being identified. In accordance with Office of Management and Budget guidance, to reduce improper payments, VA can use detailed root cause analysis to identify why improper payments are occurring and to develop corrective actions. For example, according to VA, the root cause for over 75 percent of VA's reported improper payments for fiscal year 2016 was program design or structural issues. Most of these errors occurred in VA's health care area. To reduce these improper payments, VA stated that it will make its procurement practices compliant with Federal Acquisition Regulation provisions. GAO has also recommended steps that VA can take to reduce the risk of improper payments related to disability benefits. For example, in November 2014, GAO reported that VA had shortcomings in quality review practices that could reduce its ability to ensure accurate and consistent processing of disability compensation claim decisions, and GAO made eight related recommendations to improve the program. To date, VA has implemented six of the report's eight recommendations and expects to implement the other two recommendations related to the effectiveness of quality assurance activities later this summer.
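The estimates above can be cross-checked with simple arithmetic. A minimal sketch, using only the dollar figures reported in this statement (variable names are ours):

```python
# Cross-check of the improper payment figures quoted in this statement
# (all amounts in billions of dollars, as reported).
gov_wide_2016, gov_wide_2015 = 144.3, 136.7
increase = gov_wide_2016 - gov_wide_2015
print(f"Government-wide increase: ${increase:.1f} billion")  # -> $7.6 billion

va_total_2016 = 5.5
community_care, purchased_ltss = 3.6, 1.2
share = (community_care + purchased_ltss) / va_total_2016
print(f"Two programs' share of VA's FY 2016 estimate: {share:.0%}")  # -> 87%
```

The 87 percent figure is the $4.8 billion accounted for by the two programs taken as a share of VA's $5.5 billion total.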
Tax expenditures are provisions of the tax code that are viewed as exceptions to the "normal structure" of the individual and corporate income tax (i.e., exceptions to taxing income). They take the form of exemptions, exclusions, deductions, credits, deferrals, and preferential tax rates; however, not all such provisions are tax expenditures. For example, some provisions that determine tax liability, such as business expense deductions, are not considered to be tax expenditures because costs of earning income are usually deducted in calculating taxable income for businesses. Generally, tax expenditures grant special tax relief for certain kinds of behavior by taxpayers or for taxpayers in special circumstances. Holding tax rates constant, tax expenditures result in revenue that the government forgoes by granting the relief. Many of these provisions may, in effect, be viewed as spending programs channeled through the tax system. Congress updated the statutory framework for performance management in the federal government, the Government Performance and Results Act of 1993 (GPRA), with the GPRA Modernization Act of 2010 (GPRAMA). Both acts require agencies to set goals and measure and report the performance of their programs. GPRAMA introduced a more integrated approach to performance measurement that cuts across organizational boundaries. The act requires that OMB, in coordination with agencies, develop long-term crosscutting priority goals to improve performance and management of the government. OMB is to coordinate annually with agencies to develop a federal government performance plan that establishes performance indicators for achieving these goals. Moreover, GPRAMA requires that this plan identify the tax expenditures that contribute to each crosscutting priority goal. As we noted in a recent report, sporadic progress has been made along these lines.
OMB Circular A-11 guidance directs agencies to list tax expenditures among the various programs and activities that contribute to the subset of performance goals that are designated as agency priority goals. A performance evaluation of a tax expenditure program would use largely the same concepts, methods, and types of data as an evaluation of an outlay program. In prior reports, we have described in some detail how such program evaluations would be conducted to measure progress toward achieving the program's intended purpose. Even if a tax expenditure is meeting its intended purpose, broader questions can be asked about its effects beyond that purpose. Specifically, the long-standing criteria of fairness, economic efficiency, transparency, simplicity, and administrability can be used to evaluate whether a tax expenditure is good tax policy. Some agencies may be better positioned to collect tax expenditure information and make it available for analysis than others. As we said in our Guide for Evaluating Tax Expenditures, for a tax expenditure that is part of a crosscutting agency priority goal, the responsible agencies identified in the related performance plan may be the logical agencies responsible for evaluating the tax expenditure. Although IRS is the federal agency responsible for administering tax expenditures, it is not responsible for the program areas targeted by many tax expenditures. The information available at IRS is generally limited by the Paperwork Reduction Act to data used for tax administration, not for performance evaluation. Of the 163 tax expenditures identified by Treasury for tax year 2011, 102, or 63 percent, were not on a tax return, information return, or other tax form; or they were on these tax forms but did not have their own line item, as shown in table 1. For these tax expenditures, the tax forms do not capture information on who claimed the tax expenditures and how much they claimed.
An example of a tax expenditure not on a tax form is the exclusion of interest on life insurance savings, where the taxpayer is not asked to report the amount of the exclusion anywhere on a tax form, while an example of a tax expenditure without its own line item is the credit for holding clean renewable energy bonds, where the credit is aggregated with other credits on a single line item. Nearly all deferrals or exclusions were either not on a tax form or did not have their own line item. For information on our classification by specific tax expenditure, see appendix II. If a tax expenditure has its own line item on a tax form, the IRS can identify the claimant and the amount of the claim. In dollar terms, these account for about half of the total estimated revenue loss. As shown in figure 1, these accounted for $501 billion of the almost $1 trillion of revenue estimated by Treasury in 2011 for the tax expenditures that we analyzed. The remaining $492 billion were not on tax forms or did not have their own line items. Having such basic information about tax expenditures can facilitate certain kinds of analysis. Specifically, when a tax expenditure has its own line item, the claimant can be matched to his or her income, which is also reported on the tax return. This linkage facilitates analyses of the distributional effects of a tax expenditure by showing tax expenditure use by income category. The sum of tax expenditure revenue loss estimates that appear in figure 1 approximates the total revenue forgone through tax expenditure provisions. While sufficiently reliable as a gauge of general magnitude, the sum of the individual revenue loss estimates has important limitations in that any interactions between tax expenditures will not be reflected in the sum.
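A quick tally of the figures above, as a sketch. The sum is only an approximation of total revenue forgone, for the reason just noted: interactions between provisions are not reflected.

```python
# Tally of the revenue loss estimates discussed above (billions of dollars,
# Treasury estimates for 2011). The simple sum ignores interactions between
# provisions, per the caveat in the text.
with_own_line_item = 501   # claimant and amount identifiable on a tax form
without_line_item = 492    # not on a form, or on a form without its own line item

total = with_own_line_item + without_line_item
print(f"Approximate total revenue forgone: ${total} billion")  # -> $993 billion

# Of the 163 tax expenditures analyzed, 102 lacked their own line item.
print(f"Share lacking their own line item: {102 / 163:.0%}")  # -> 63%
```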
Data necessary to assess how often a tax expenditure is used and by whom generally would not be collected on tax returns unless IRS needs the information to know the correct amount of taxes owed or is legislatively mandated to collect or report the information. IRS is obligated under the Paperwork Reduction Act to keep the administrative burden on taxpayers as low as possible, while still fulfilling its mission. In prior reports, we identified tax expenditures that could not be evaluated because appropriate data were not available from any source, including sources other than IRS. One example is Indian reservation depreciation (IRD), where IRS did not collect information on the identity of claimants, amounts claimed, or the location of the qualified investment. In addition, we could not find reliable data at other agencies on which taxpayers use IRD, how much IRD investment was made, or whether the provision was having a positive effect on economic development. For some tax expenditures, IRS data limitations can be remedied to some extent by information available from other federal agencies. For example, for Empowerment Zone (EZ) employment tax credits, IRS cannot separate the total credits claimed to show how much was claimed for specific EZ communities. This limitation is partially remedied by the Department of Housing and Urban Development (HUD), which collects community level information for some EZ-related tax expenditures. However, as we have previously reported, HUD was unable to validate the information on the use of some of these tax expenditures and it tracks only a portion of the EZ employment credits. HUD and IRS have begun collaborating to produce better data on the use of EZ tax credits. For some tax expenditures, it may be possible to estimate missing IRS information using other sources such as public records, state agency records, and surveys. However, in general, such estimates cannot be expected to be as precise as data from tax returns. 
For example, in the case of an evaluation of the Research Tax Credit, a measure of spending that qualifies for the credit derived from research spending as reported on corporate annual reports will not be as accurate as a measure derived from corporate tax returns because of differences in the tax and accounting rules for reporting the spending. The less accurate data can lead to less reliable conclusions from the evaluation. After reviewing the GPRAMA-mandated cross-agency priority (CAP) goals established by OMB and federal agencies, we chose four outlay programs--three addressing energy efficiency and one addressing job training--that we considered to be comparable to certain tax expenditures based on their similar purposes. Table 2 provides descriptions of these tax expenditures and comparable outlay programs. As shown in table 3, the four comparable outlay programs and tax expenditures associated with DOE and DOL had broadly similar purposes in the areas of energy conservation and employment. As shown in table 4, DOE and DOL produced performance measures and goals for outlay programs in their annual reports but did not do so for the comparable tax expenditures. Agencies were not required by GPRAMA to produce these measures for the tax expenditures. Also, IRS collects the basic information about claimants and the amounts claimed for the four tax expenditures in our case studies. (See table 5 in appendix II.) But since IRS is not tasked with evaluating tax expenditures, it has not formulated performance measures or goals for these tax expenditures. The performance measures shown in table 4 can track the progress of the outlay programs, on an ongoing basis, toward specific goals (stated in terms of number of gallons produced, number of turbines installed, etc.). However, additional data may be needed for an assessment of broader purposes and the impact of the programs.
For example, for the vehicle technologies program, the purpose of reducing petroleum consumption can be measured by the performance measure in table 4 (gallons of petroleum saved) but additional data are needed to measure the outlay program's broader purpose of reducing environmental impacts. With so much spending going through the tax code in the form of tax expenditures, the need to determine whether this spending is achieving its purpose becomes more pressing. This report identifies gaps in the data required to evaluate tax expenditures but makes no recommendations on how to fill these gaps. A key step in collecting the data is first determining who should undertake this task. As we said in our guide for evaluating tax expenditures, the agency or agencies responsible for the program ought to determine what data should be collected to evaluate tax expenditures relevant to their goals. We recommended in our 2005 report that the Director of OMB, in consultation with the Secretary of the Treasury, determine which agencies will have leadership responsibilities to review tax expenditures and how to address the lack of credible performance information on tax expenditures. However, these agencies have not yet been identified. GPRAMA may make a start on answering the question of who should evaluate tax expenditures by requiring that the responsible agencies identify the various program activities that contribute to their goals, which we believe should include tax expenditures. The IRS provided technical comments after viewing a draft of this report, which we incorporated as appropriate. As agreed with your offices, unless you publicly announce the contents of this report earlier, we plan no further distribution until 30 days from the report date. At that time, we are sending copies of this report to the Acting Commissioner of Internal Revenue and other interested parties. This report will also be available at no charge on GAO's website at http://www.gao.gov. 
If you have any questions on this report, please contact me at (202) 512-9110 or [email protected]. Contact points for our offices of Congressional Relations and Public Affairs may be found on the last page of this report. Key contributors to this report are listed in appendix III. To determine what Internal Revenue Service (IRS) data are available for evaluating tax expenditures, we used the list of 173 tax expenditures for fiscal year 2011 that was developed by the Department of the Treasury (Treasury) and reported by the Office of Management and Budget (OMB) in Analytical Perspectives, Budget of the United States Government, Fiscal Year 2012. We reviewed IRS tax returns, tax forms, information returns, and publications for tax year 2011 and categorized the tax expenditures based on whether they were (1) not listed on tax forms, (2) listed on tax forms but did not have their own line item, or (3) listed on tax forms and had their own line item so the claimants and the amount claimed could be identified. The tax returns we reviewed were primarily Form 1040 for individual taxpayers, Form 1120 for corporate taxpayers, and Form 990 for tax-exempt organizations. Although the tax expenditure concept can also be applied to other kinds of taxes, such as excise taxes, this report covers only tax expenditures for the federal income tax system. We sent a list of the tax expenditures that we initially identified as not appearing on a tax form to IRS for verification of our assignment of these tax expenditures to this category. (IRS was not able to verify our assignment of all of the tax expenditures to their categories due to time constraints encountered as the agency readied for the tax filing season.) In addition, when IRS verified those not listed on a tax form, it generally used information from tax returns but not from information returns.
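The three-way categorization described above can be sketched as a small data structure. The categories are the report's own; the example entries are the two provisions cited earlier (the life insurance interest exclusion and the clean renewable energy bond credit), and everything else here is illustrative.

```python
# Sketch of the report's three-way classification of tax expenditures.
from enum import Enum

class FormStatus(Enum):
    NOT_ON_FORM = 1        # not listed on any tax form
    NO_OWN_LINE_ITEM = 2   # on a form, but aggregated with other items
    OWN_LINE_ITEM = 3      # on a form with its own line item

# Two classifications taken from examples in the report.
classified = {
    "Exclusion of interest on life insurance savings": FormStatus.NOT_ON_FORM,
    "Credit for holding clean renewable energy bonds": FormStatus.NO_OWN_LINE_ITEM,
}

# Only OWN_LINE_ITEM provisions let IRS identify claimants and amounts claimed.
evaluable = [name for name, status in classified.items()
             if status is FormStatus.OWN_LINE_ITEM]
print(evaluable)  # -> []
```

Neither example provision is evaluable from IRS line-item data, which is the report's point: for such provisions, the forms do not capture who claimed the expenditure or how much.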
Our assignment of the tax expenditures to the three categories sometimes required that we make judgments about the adequacy of the information on the tax forms. For example, according to IRS, the amount of deferred income from installment sales can be obtained from Form 6252 (Installment Sale Income) by subtracting the installment sale income line item from the gross profit line item. However, according to OMB, this difference does not represent the amount of the tax expenditure. The tax expenditure is the deferred amount less than $5 million for which non-dealers are not required to pay interest on their deferred taxes. Therefore, since we could not identify the deferral amount for non-dealers, and the amount of the tax expenditure deferral does not have its own line item, we classified it as a tax expenditure that is on a tax form but does not have its own line item. During our matching, we identified 10 tax expenditures that we did not include in our analysis because (1) they were not available in tax year 2011, such as the Hope Tax Credit, which was temporarily replaced by the American Opportunity Tax Credit; (2) some but not all parts of the tax expenditure were on a tax form, such as the Exclusion of Benefits and Allowances to Armed Forces Personnel, where only the combat pay portion was reported on a tax form--Form W-2 (Wage and Tax Statement); and (3) reporting of the tax expenditure was optional, such as Employer Plans on Form W-2. Some tax expenditures use multiple tax forms, multiple line items, or both on the forms to account for all parts of the tax expenditure. When multiple forms or line items were used, we considered the tax expenditure as having its own line item when all parts of the tax expenditure were on tax forms and had their own line items. For example, the Adoption Credit and Exclusion tax expenditure lists the credit and exclusion on different line items of Form 8839 (Qualified Adoption Expenses).
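The Form 6252 arithmetic described above can be sketched as follows. The dollar amounts are hypothetical; the point is that the computed deferral cannot be split into the non-dealer, under-$5 million portion that constitutes the tax expenditure.

```python
# Sketch of the Form 6252 computation described above. Amounts are hypothetical.
gross_profit = 200_000            # gross profit line item on Form 6252
installment_sale_income = 50_000  # installment sale income recognized this year

deferred_income = gross_profit - installment_sale_income
print(f"Deferred income: ${deferred_income:,}")  # -> $150,000

# The tax expenditure is only the portion of deferrals under $5 million held by
# non-dealers -- a split the form's line items do not capture, which is why this
# provision was classified as being on a tax form without its own line item.
```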
Therefore, we considered this tax expenditure as having its own line item. To identify the tax expenditure amounts, we used the fiscal year 2011 revised estimates reported in the fiscal year 2013 budget. We chose the tax expenditure estimates reported in the budget for our analysis because Treasury develops revised estimates based on changes in tax policy and economic activity for the year prior to the reported fiscal budget year (i.e., retrospective estimates). Even though Treasury's estimates are retrospective, the final reported numbers are still estimates and may not reflect additional policy changes. In addition, tax expenditure revenue loss estimates for specific provisions do not take into account potential behavioral responses to changes in these provisions on the part of taxpayers. These revenue loss estimates do not represent the amount of revenue that would be gained if certain tax expenditures were repealed, since repeal would probably change taxpayer behavior in some way that would affect revenue. For tax expenditures that were listed on tax forms, we reviewed IRS's Statistics of Income (SOI) Proposed Tax Year 2010 Forms and Schedules to determine whether SOI collected data for tax year 2010, the latest year available. We also reviewed SOI publications to identify the types of available information and whether they included tax expenditures. We reviewed our prior reports to identify instances where IRS data available for evaluating tax expenditures were limited and the limitations were not remedied by data from other sources. To analyze examples of data that agencies used to evaluate outlay programs that are comparable to tax expenditures, we visited the performance.gov website on October 19, 2012.
We reviewed the Government Performance and Results Act Modernization Act of 2010-mandated crosscutting, or what OMB calls cross-agency priority (CAP), goals for contributing agencies and programs that explicitly included tax expenditures among their policy initiatives. As examples of tax expenditures, we chose five tax credits from the sixteen tax expenditures listed under the CAP goal of energy efficiency and the one tax credit listed under the CAP goal of job training. As examples of what we considered to be comparable outlay programs, we chose three energy efficiency and renewable energy outlay programs--Weatherization, Vehicle Technologies, and Renewable Energy--and one Department of Labor-based outlay program, the Workforce Investment Act as it pertained to dislocated workers. We then reviewed the 2011 annual performance reports from the Department of Energy and the Department of Labor. We used fiscal year 2011 performance reports so the performance data would be comparable to the tax expenditure data we analyzed for tax year 2011. These performance reports were not available for fiscal year 2012. We identified their performance measures and goals, as well as the data they used to evaluate and assess these outlay programs. Lastly, we used our own criteria for performance measures and examples of data used to construct them. Table 2 provides a more detailed description of the tax expenditures and comparable outlay programs. To determine whether tax expenditures were included on tax forms and had their own line items, we matched the Department of the Treasury's list of tax expenditures for fiscal year 2011 to Internal Revenue Service tax forms for tax year 2011. The relationships of the tax expenditures to tax forms are shown in table 5.
In addition to the contact name above, Kevin Daly (Assistant Director), Laurie King (Analyst-in-Charge), Jeff Arkin, Elizabeth Curda, Robert Gebhart, Lois Hanshaw, Benjamin Licht, Ed Nannenhorn, Karen O'Conor, Michael O'Neill, Robert Robinson, Alan Rozzi, MaryLynn Sergent, Stephanie Shipman, and Anne Stevens all made contributions to this report.

By one measure, tax expenditures resulted in an estimated $1 trillion of revenue forgone by the federal government in fiscal year 2011. GAO has recommended greater scrutiny of tax expenditures, as periodic reviews could help determine how well specific tax expenditures achieve their goals and how their benefits and costs compare to those of other programs with similar goals. To assist with this, GAO recently issued a guide (GAO-13-167SP) for evaluating the performance of tax expenditures. GAO was asked to identify data needed for evaluating tax expenditures and its availability. This report: (1) determines the information available from IRS for evaluating tax expenditures; and (2) compares, for a few case studies, the information identified by federal agencies for evaluating outlay programs with similar purposes to tax expenditures. To address these objectives, GAO analyzed 173 tax expenditures, and information from IRS tax forms, federal agency performance reports, and prior GAO reports. Internal Revenue Service (IRS) data are not sufficient for identifying who claims a tax expenditure and how much they claim for $492 billion, or almost half the dollar value, of all tax expenditures that GAO examined. Such basic data are not available at IRS for tax expenditures because they do not have their own line item on a tax form.
This included $102 billion of tax expenditures that were not on tax forms, such as the exclusion of interest on life insurance savings, and $390 billion of tax expenditures that were on tax forms but did not have their own line items, such as the credit for holding clean renewable energy bonds, which is aggregated with other credits on a single line item. In four cases in which the Office of Management and Budget (OMB) identified outlay programs and comparable tax expenditure programs that shared similar purposes, the related agencies produced performance measures and goals only for the outlay programs and not for the comparable tax expenditures. For example, OMB identified the Alternative Technology Vehicle Credit as having a comparable purpose to the Department of Energy (DOE) Vehicle Technologies outlay program--both are intended to create more fuel efficient modes of transportation. DOE produced a performance measure and goal for the outlay program--petroleum consumption reduced by 570 million gallons per year by 2011--as required under the provisions of the Government Performance and Results Act of 1993 and the Government Performance and Results Act Modernization Act of 2010. However, DOE did not produce measures and goals for the comparable tax expenditure, as neither act requires DOE or other federal agencies to do so. Although IRS is responsible for administering these tax expenditures, it is required by law, unless otherwise directed by Congress, to collect only data which are required for administration of the tax code. GAO has recommended that the agencies responsible for tax expenditures be identified and the lack of credible performance data be addressed. GAO made no recommendations in this report. IRS provided technical comments that were incorporated as appropriate.
According to the American Financial Services Association (AFSA), some of its members have been issuing live loan checks since the 1980s. Live loan checks are delivered in the mail and are preapproved offers of credit. Consumers are selected to receive the loan offers if they meet certain credit criteria. These preapproved offers of credit are based, in part, on a consumer's credit score. Credit bureaus develop scores by assessing various types of information collected from a large pool of borrowers, including borrowers with good payment histories and others with poor payment histories, to estimate the credit risk associated with different types of loans. Credit scoring systems use statistical analysis to identify and weigh the characteristics of borrowers who have been most likely to make loan payments. For example, borrowers with little or no history of delinquent payments receive higher credit scores than borrowers with many delinquent payments. Most widely used credit scoring systems have a range of scores from 350 to 900. Borrowers with higher scores are considered more creditworthy because they are more likely to repay the loan on time and in full than are borrowers with lower credit scores. While comprehensive data on live loan checks are not available, data provided by one lender depict its loans as amortizing loans with interest rates below credit card rates. According to this lender, the recipients of its live loan checks had high credit scores and good credit histories. Chase and Fleet officials provided us with the materials they sent to the recipients of live loan checks. The materials include information disclosing that the check represents a loan and presenting the terms and conditions of the loan. Voluntary industry standards also call for such disclosure. Comprehensive industry data on the average live loan check and the borrower using this product are not available. 
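The weighting of borrower characteristics described above can be illustrated with a minimal scorecard sketch. The characteristics and point weights here are hypothetical illustrations, not any credit bureau's actual model; real systems fit their weights statistically to pooled repayment histories, as the text describes.

```python
# A minimal, hypothetical scorecard sketch of the weighting described above.
# Real scoring systems derive weights statistically from borrower payment data.
BASE_SCORE = 350          # bottom of the commonly cited 350-900 range
scorecard = {
    "no_delinquencies_24mo": 220,  # clean recent payment history weighs heaviest
    "low_utilization": 150,        # using little of available revolving credit
    "long_credit_history": 120,
    "no_public_records": 60,       # no bankruptcies, liens, or garnishments
}

def score(borrower: dict) -> int:
    """Sum the points for each characteristic the borrower exhibits."""
    return BASE_SCORE + sum(pts for key, pts in scorecard.items()
                            if borrower.get(key))

best = {key: True for key in scorecard}
print(score(best))  # -> 900
print(score({}))    # -> 350
```

The design point is simply that higher scores accumulate from characteristics associated with reliable repayment, which is why borrowers with few delinquencies score higher than those with many.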
Fleet and Chase officials, however, provided us with information on their live loan check profile. According to Fleet, borrowers receive live loan checks ranging from $3,000 to $10,000, based on the lender's estimate of the recipient's predicted ability to repay the loan. Prior to selection, recipients had demonstrated their ability to manage debt by having satisfactory payment histories. According to Fleet and Chase officials, interest rates on loans resulting from live loan checks have ranged from 12.9 percent to 15.9 percent. The repayment terms for these loans ranged from 48 months to 60 months and are amortized. In addition, Fleet's live loan checks generally were only valid for 6 weeks from the date of issuance; this provision is intended to lessen the risk of using outdated credit data as a basis for assessing a potential borrower's creditworthiness. Borrowers' credit scores were used as the primary factor in determining whether to offer a live loan check, and credit criteria were conservative in the lender's view. Fleet officials told us that their borrowers had an average credit score of 730, with a minimum cut-off of 690. A credit score of 730, for example, implies odds of 125 to 1 against defaulting on an unsecured loan--that is, the estimated probability of default is less than 1 percent. The officials said that Fleet borrowers primarily resided within the established franchise area where the bank offers retail banking services. The borrower had a median household income of $44,000. In addition to having to meet a minimum credit score, borrowers also were to meet minimum requirements set by proprietary risk and bankruptcy models, according to Fleet officials. The borrower's average debt utilization--that is, the proportion of available credit limits actually used in unsecured debt on current revolving credit sources--was 29 percent, which the bank believes estimates the borrower's propensity to use credit. 
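The quoted odds and utilization figures translate as follows. Odds of 125 to 1 against default correspond to a probability of 1 in 126; the balance and limit amounts in the utilization example are hypothetical, chosen to reproduce the reported 29 percent average.

```python
# Translating the Fleet figures quoted above. Odds of 125 to 1 against default
# imply a default probability of 1 in 126.
odds_against = 125
p_default = 1 / (odds_against + 1)
print(f"Implied default probability: {p_default:.2%}")  # -> 0.79%, under 1 percent

# Debt utilization: unsecured revolving balances as a share of available limits.
# These balance and limit amounts are hypothetical illustrations.
balances, limits = 5_800, 20_000
print(f"Utilization: {balances / limits:.0%}")  # -> 29%
```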
Also, borrowers had no prior record of bankruptcy, foreclosure, tax liens, or garnishments. According to AFSA, its members who offer live loan check programs reported that borrowers extended live loan check offers are generally between 35 and 50 years of age, with income levels between $35,000 and $55,000. Interviews with lenders, bank regulators, and the Federal Trade Commission (FTC), which is responsible for, among other things, fostering free and fair business competition and preventing monopolies and activities in restraint of trade, revealed few complaints that live loan check terms were not disclosed to borrowers. According to lenders, disclosure requirements are intended to protect the borrower and the lender. Office of the Comptroller of the Currency (OCC) officials consider live loan checks to be like any other small consumer loans in needing to meet Truth in Lending Act requirements ensuring that creditors disclose credit terms and the cost of credit as an annual percentage rate (APR). We spoke with lenders about the disclosure features of their live loan check programs. Fleet and Chase officials provided us with copies of their disclosure materials, which contained information that identified the loan check as a loan and clearly specified the interest rate and the terms and conditions of the loan. In the lenders' solicitation materials, for example, there were several statements such as "this is a check for a loan" or "loan check." The interest rate, repayment terms, and other terms were displayed. The live loan checks were labeled "non-transferable" and "for deposit only" to help ensure that the customers would take the checks directly to their own banks for deposit. Chase officials told us that, under their policy, a customer is to be called by a Chase bank official when the check is presented by the depository bank to Chase for payment, to ensure that the intended person actually deposited the check.
AFSA issued voluntary standards for live loan checks on September 17, 1997, and expanded them on October 29, 1997. According to an AFSA official, the voluntary standards for live loan checks are intended to provide extra protection for consumers. Bank officials told us that they abide by the voluntary standards to avoid the risk of creating a negative image of the live loan check program. AFSA voluntary standards are as follows:

- Live loan checks sent by mail or other similar instruments offered by AFSA members are to be negotiable up to 6 months after receipt.
- A lender's printed material accompanying the offer must advise the consumer to void and destroy the instrument if it is not going to be negotiated.
- Live loan checks sent by mail must include the following disclosure: "This is a solicitation for a loan--read the enclosed disclosures before signing and cashing this check."
- Solicitations are to be mailed in envelopes with no indication that a negotiable instrument is inside.
- Envelopes are to be marked with instructions informing the Postal Service not to forward the item if the intended recipient is no longer at the address on the envelope.
- In the event a live loan check-by-mail offer is stolen or fraudulently cashed, the intended recipient is to have no liability for the loan obligation.
- In order to deter theft or forgery, a consumer is to be asked to complete a confirmation statement provided by the creditor.

Public and private sector officials told us that, while there was no comprehensive list of institutions with live loan check programs, several institutions were known to have offered such programs. Banks included Fleet in Boston, Massachusetts; Chase Manhattan Bank in New York, New York; Signet Bank in Richmond, Virginia; First USA in Wilmington, Delaware; and BancOne Corporation in Columbus, Ohio. Nonbanks included Capital One in Falls Church, Virginia, and Beneficial Corporation in Wilmington, Delaware.
First Chicago NBD had conducted test marketing of live loan checks; a First Chicago official told us that the bank discontinued the program because the level of loss in a pilot program was not acceptable. Regulators and industry officials we interviewed also told us that no comprehensive data show the volume of live loan check activity. These officials also believed that it would be difficult for nonregulators to compile such industrywide information because individual financial institutions might be reluctant to release their proprietary data. Although comprehensive industry data were not available, Fleet officials provided us with information on Fleet's live loan check program history. (See table 1.) Although a similar number of checks were mailed in 1997 as in 1996, Fleet experienced far fewer acceptances in 1997 compared with 1996. Fleet officials said that the decline in acceptances occurred because in 1997 the potential borrowers were primarily non-Fleet customers, who were less likely to recognize Fleet's name. Public and private sector officials identified some benefits and risks associated with live loan checks for both borrowers and lenders. In general, the benefit for borrowers was the ease of obtaining the loan; the risks to a borrower were comparable to those for other unsecured loans. The Consumer Federation of America (CFA) told us that these loans could compound problems caused by high consumer debt. For lenders, the loans were often seen as profitable, with manageable risks. However, limited data exist on the losses associated with live loan checks. Fraud did not appear to be a widespread problem, although there was some concern among industry officials about how a potential borrower might be inconvenienced by fraud. First Chicago, however, discontinued making the loans because the losses during a pilot program were "not acceptable." 
In the view of lenders, borrowers enjoyed benefits and risks comparable to those associated with conventionally marketed unsecured loans. Borrowers accepted unsecured live loan checks at interest rates identical to or lower than those they would receive at a local loan office of the lender. These loans had predictable, fixed monthly repayment terms of 48 to 60 months. According to lenders, borrowers experience little risk beyond that normally associated with a loan because they are protected against all liability from fraud or misuse. Some public and private sector officials said that live loan checks could potentially increase the possibility of default and bankruptcy if the borrower misused credit by running up credit card balances. The executive director of CFA said that live loan checks would only compound the problems created by the abundance of unsecured, high-cost credit card debt. Two lenders, however, said that there was no evidence to show that borrowers would file for bankruptcy more quickly as a result of accepting live loan checks instead of using credit cards. To date, it does not appear that many potential borrowers have been exposed to the risk of fraudulently cashed loan checks. Lenders we spoke with told us that the bank does not hold a consumer responsible if the check mailed to that consumer is deposited or forged by another individual. For example, Chase officials told us that, in the event that a live loan check were stolen, the intended recipient would not be charged if he or she signed an affidavit stating that the check had not been cashed by him or her. AFSA officials said that state and federal laws shield consumers from liability related to live loan checks and that lenders' credit selection practices help reduce the rate of fraud. AFSA reported that the actual fraud on live loan checks has been extremely low, less than one-tenth of one percent of total mailings.
AFSA believes that its voluntary standards ensure minimum inconvenience to the consumer in the event that a check is not cashed by the intended consumer. Public and private sector officials have not seen large levels of fraud involving live loan checks. OCC had no reported cases of live loan checks being fraudulently cashed. Federal Reserve officials said they do not believe that there is a significant problem with losses associated with live loan checks. Federal Reserve officials noted that the primary reason for the low rate of fraud is that rules governing check cashing practices act to deter fraud. The recipient's rights in the case of a forged endorsement are generally governed by state law. Articles 3 and 4 of the Uniform Commercial Code have been adopted in almost every state and determine check negotiation procedures and liability for invalid checks. FTC and Federal Reserve officials said that they had not received many complaints about live loan checks that involved theft and fraud issues over a 2-year period. The executive director of CFA testified, however, that the consumer may experience considerable inconvenience if the live loan check is cashed by someone other than the intended recipient, and believed that a consumer should not experience any inconvenience if fraud occurs. Fleet and Chase officials told us that their live loan check programs met corporate profitability requirements and expanded their lines of credit and their loan business. According to bank officials, live loan check programs are attractive because they enable lenders to provide a broader range of consumer loan products. Lenders viewed live loan checks as a convenient means of delivering a fixed-rate, closed-end, unsecured loan product to a consumer. Fleet officials said that live loan checks were moderately profitable loans. They said that the results of these loans were provided monthly to senior management to assess the results against expectations.
Chase officials said that a benefit of the bank's loan check program was that the net interest margin for live loan checks was higher than that for mortgage lending. Chase officials told us that prepayment rates for live loan checks are lower than those for mortgages. When interest rates decline, lower payments help cash flows remain more stable, which helps Chase to better manage its loan portfolio. In contrast, a decline in interest rates generally results in a rise in prepayments of some other loans. Chase officials also said that, by using good underwriting practices, they were able to manage credit risk. With regard to cases involving fraud, both lenders and bank regulatory agency officials said that lenders are to absorb all losses. With 155,000 loans accepted between 1995 and 1997, for example, Fleet reported 68 confirmed cases of fraud. Generally, in these cases, an unauthorized household member cashed the check. In order to prevent fraud, Fleet required that the borrower access funds only by depositing a check into a personal bank account. Once the live loan check was cleared, Fleet created an installment loan for the borrower. To reduce the risk of fraud, Fleet's live loan check offers were only valid for 6 weeks. Federal bank regulators do not have any special supervisory programs for live loan checks. As noted earlier, OCC officials said that they review these loans in the same way as they do other small consumer loans. Fleet officials told us that monthly reports on these loans, which are distributed to senior Fleet management, are also provided to OCC, Fleet's regulator. Federal Reserve examiners do not specifically monitor live loan check activities at Federal Reserve-regulated institutions. As part of their safety and soundness examinations, Federal Reserve examiners are to review risk models or other risk management systems to assess whether banks practice prudent behavior in their lending. 
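Fleet's reported fraud experience can be expressed as a rate with back-of-the-envelope arithmetic. The helper function and variable names below are ours, not the report's; the input figures (68 confirmed fraud cases, roughly 155,000 accepted loans, 4.35 million checks mailed from 1995 through 1997) come from Fleet's reported program data:

```python
def rate_percent(cases, base):
    # Express a count of cases as a percentage of a base population.
    return 100.0 * cases / base

# Fraud rate per accepted loan and per check mailed, using Fleet's figures.
per_loan = rate_percent(68, 155_000)
per_mailing = rate_percent(68, 4_350_000)
print(round(per_loan, 3), round(per_mailing, 4))  # 0.044 0.0016
```

On either basis the confirmed fraud rate is a small fraction of one percent, in line with the industry's characterization of fraud as rare.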
Federal regulatory officials told us that industrywide live loan check activities are not tracked specifically. While Chase officials believed it was too soon to estimate their losses on live loan checks, we received data from Fleet concerning loss rates for its live loan check program. In 1996 and 1997, according to Fleet officials, the bank's loss rates on live loan checks were lower than the credit card industry national averages. Using year-end balances, in 1996, Fleet said, it experienced a 1 percent loss rate compared to 5.96 percent in the credit card industry. In 1997, Fleet experienced a loss rate of 4.20 percent compared to 6.04 percent in the credit card industry. Fleet projected its 1998 live loan check losses to be similar to the credit card industry's at 5 percent. Fleet officials said that they had set aside adequate reserves to cover anticipated losses. Fleet officials explained that the reason for the reported loss increase for live loan checks from 1996 to 1997 is that, typically, there are not many losses in the early years with a new loan product, and that Fleet was more cautious in marketing live loan checks. In the first year, 1995, Fleet marketed all of its live loan checks to its bank customers. According to Fleet officials, as the loans resulting from their live loan checks begin to mature, losses could increase. A First Chicago official told us that the bank discontinued its live loan check program because the level of loss was not acceptable. First Chicago conducted a live loan check pilot program in the summer of 1995 to determine whether offering immediate access to funds via checks would increase the likelihood that consumers would borrow money. The actual loss rate was not disclosed to us. To determine the characteristics of live loan checks, we gathered information on various aspects of individual loans, as well as on the average live loan check profile and the average borrower's profile. 
To do this, we interviewed officials representing three live loan check lending institutions, an industry association, and a rating agency. We also reviewed publicly available information, including published articles that reported such characteristics. Although we did not independently verify these--or any--industry data, we corroborated evidence with other independent sources whenever possible. To identify the major organizations that mail live loan checks, we interviewed public and private sector officials. We selected officials to talk to, in part, on the basis of information obtained from other industry sources. For example, we talked with officials at Fleet and Chase. In addition, we spoke with First Chicago officials about whether a live loan check program existed at that institution because officials of other banks had informed us that this institution had cancelled its live loan check program. Moreover, we conducted a literature search and reviewed selected articles that reported on live loan check lenders and their activities. We also spoke to officials representing federal banking and thrift regulatory agencies. We obtained Fleet's volume of live loan check lending in 1995, 1996, and 1997 and the expected volume in 1998 by interviewing Fleet officials; other lenders were not willing to provide volume data. We attempted to identify comprehensive, industrywide data for the volume of live loan checks by talking with officials representing an industry association, a consumer advocacy group, a rating agency, federal banking and thrift regulatory agencies, and two investment banks. In addition, we contacted officials representing another lender to corroborate information and to obtain additional volume data. To identify the benefits and risks of live loan checks for borrowers and lenders, we interviewed officials representing federal regulatory agencies and representatives from lending institutions, industry associations, and one rating agency. 
We reviewed articles and studies that reported benefits and risks associated with live loan check lending. We interviewed public and private sector officials, and reviewed selected federal and state regulations and laws, to gain an understanding of lender protection laws relevant to live loan checks. We also spoke with banking officials about losses associated with live loan checks. As agreed with your office, unless you announce the contents of this report earlier, we plan no further distribution until 30 days after the date of this letter. At that time, we will send copies of this report to the Ranking Minority Member of your Subcommittee, the Chairmen and Ranking Minority Members of other congressional committees with jurisdiction over financial issues, the Chairman of the Board of Governors of the Federal Reserve System, the Comptroller of the Currency, the Director of the Office of Thrift Supervision, and other interested parties. We will also make copies available to others upon request. This report was prepared under the direction of James M. McDermott, Assistant Director, Financial Institutions and Markets Issues. Major contributors include Edwin J. Lane, Evaluator-in-Charge; Mitchell B. Rachlis, Senior Economist; and Becky K. Kennedy, Senior Evaluator. If you have any questions about this report, please call me on (202) 512-8678. Susan S. Westin, Associate Director, Financial Institutions and Markets Issues

The first copy of each GAO report and testimony is free. Additional copies are $2 each. Orders should be sent to the following address, accompanied by a check or money order made out to the Superintendent of Documents, when necessary. VISA and MasterCard credit cards are accepted, also. Orders for 100 or more copies to be mailed to a single address are discounted 25 percent.

U.S. General Accounting Office
P.O. Box 37050
Washington, DC 20013

Room 1100
700 4th St. NW (corner of 4th and G Sts. NW)
U.S. General Accounting Office
Washington, DC

Orders may also be placed by calling (202) 512-6000 or by using fax number (202) 512-6061, or TDD (202) 512-2537. Each day, GAO issues a list of newly available reports and testimony. To receive facsimile copies of the daily list or any list from the past 30 days, please call (202) 512-6000 using a touchtone phone. A recorded menu will provide information on how to obtain these lists.

Pursuant to a congressional request, GAO provided information on live loan checks, focusing on the: (1) characteristics of live loan checks and the major organizations that provide unsolicited loan checks; (2) volume of live loan checks in 1995, 1996, and 1997 and the expected volume in 1998; and (3) benefits and risks of live loan checks for the borrowers and lenders. GAO noted that: (1) once cashed, live loan checks result in unsecured consumer loans; (2) bank officials GAO interviewed told it that live loan checks are aimed at the most creditworthy customers--that is, those least likely to be delinquent or in default in making loan payments; (3) according to bank officials, such loans are made at interest rates ranging from 12.9 percent to 15.9 percent, compared to an average 16 percent for credit cards; (4) Fleet Bank officials told GAO that it has sent potential borrowers live loan checks ranging from $3,000 to $10,000 based on its estimate of the borrower's ability to repay the loan; (5) the repayment terms for these loans ranged from 48 months to 60 months, and the loans were amortized; (6) Fleet officials stated that borrowers generally have used the loan amounts for expenses such as home improvements, debt consolidation, and school expenses; (7) according to bank officials GAO interviewed, at least eight financial institutions have offered live loan checks; (8) of these eight financial institutions, six were banks: Chase Manhattan, Fleet, First USA Bank, Signet Bank, BancOne Corporation, and First Chicago NBD; (9) two were nonbanks:
Capital One and Beneficial Corporation; (10) First Chicago stopped offering these loans after suffering a level of losses that it considered not acceptable during a pilot program; (11) public- and private-sector officials told GAO that comprehensive data on the volume of live loan checks were not available; (12) Fleet provided GAO with quantitative data on its live loan check program; (13) between 1995 and 1997, Fleet mailed 4.35 million live loan checks; (14) of these, approximately 155,000 borrowers cashed the checks and accepted the loans; (15) Fleet made over $680 million in loans through this program; (16) Fleet officials told GAO that it experienced 68 confirmed cases of fraud, which generally involved someone other than the intended recipient cashing the check; (17) public- and private-sector officials identified benefits and risks associated with live loan checks; (18) borrowers benefit from live loan checks because these checks meet their needs for immediate access to funds at interest rates competitive with those offered by credit cards; (19) risks to the borrowers include the potential for these loans to compound problems associated with high levels of consumer borrowing; and (20) Fleet and Chase informed GAO that, while loans initiated from cashing live loan checks were a small percentage of their bank assets, the programs thus far have been profitable, with manageable risks.
The federal government's civilian workforce faces large losses over the next several years, primarily through retirements. Expected retirements in the SES, which generally represents the most senior and experienced segment of the workforce, are expected to be even higher than the governmentwide rates. In our January 2003 report, we estimated that more than half of the government's 6,100 career SES members on board as of October 2000 will have left the service by October 2007. Estimates for SES attrition at 24 large agencies showed substantial variations in both the proportion that would be leaving and the effect of those losses on the gender, racial, and ethnic profile. We estimated that most of these agencies would lose at least half of their corps. The key source of replacements for the SES--the GS-15 and GS-14 workforce--also showed significant attrition governmentwide and at the 24 large agencies by fiscal year 2007. While this workforce is generally younger, and those who leave do so for somewhat different reasons than SES members, we estimate that almost half, 47 percent, of the GS-15s on board as of October 2000 will have left federal employment by October 2007 and about a third, 34 percent, of the GS-14s will have left. While past appointment trends may not continue, they do present a window into how the future might look. In developing our estimates of future diversity of the SES corps, we analyzed appointment trends for the federal government and at 24 large agencies to determine the gender, racial, and ethnic representation of the SES corps in 2007 if appointment trends that took place from fiscal years 1995 through 2000 continued. We found that, governmentwide, the only significant change in diversity by 2007 would be an increase in the number of white women, from 19.1 to 23.1 percent, and a corresponding decrease in white men, from 67.1 to 62.1 percent. 
The proportion of the SES represented by minorities would change very little, from 13.8 to 14.5 percent. Table 1 presents the results by gender, racial, and ethnic groups of our simulation of SES attrition and projection of SES appointments using recent trends. The table also shows that the racial and ethnic profile of those current SES members who will remain in the service through the 7-year period will be about the same as it was for all SES members in October 2000. This is because minorities are projected to be leaving at essentially the same rate overall as white members. Thus, any change in minority representation will be the result of new appointments to the SES. However, as the last columns of table 1 show, if recent appointment trends continue, the result of replacing over half of the SES will be a corps whose racial and ethnic profile changes very little. The outlook regarding gender diversity is somewhat different--while the percentage represented by SES white women is estimated to increase by 4 percentage points, the percentage of minority women is estimated to increase by 0.5 percentage point--from 4.5 to 5.0 percent. While white men are estimated to decrease by 5 percentage points, minority men are estimated to increase by 0.2 percentage point, from 9.3 to 9.5 percent. The results of our simulation of SES attrition and our projection of appointments to the SES over the 7-year period showed variation across the 24 Chief Financial Officers (CFO) Act agencies, as illustrated in table 2. However, as with the governmentwide numbers, agencies tend to increase the proportion of women in the SES, particularly white women, and decrease the proportion of white men. The proportion represented by minorities tended to change relatively little. Our estimates of SES attrition at individual agencies by gender, racial, and ethnic groups are likely to be less precise than for our overall SES estimates because of the smaller numbers involved.
Nevertheless, the agency-specific numbers should be indicative of what agency profiles would look like on October 1, 2007, if recent appointment trends continue. The gender, racial, and ethnic profiles of the career SES at the 24 CFO Act agencies varied significantly on October 1, 2000. The representation of women ranged from 13.7 percent to 36.1 percent with half of the agencies having 27 percent or fewer women. For minority representation, rates varied even more and ranged from 3.1 percent to 35.6 percent with half of the agencies having less than 15 percent minorities in the SES. Our projection of what the SES would look like if recent appointment trends continued through October 1, 2007, showed variation, with 12 agencies having increased minority representation and 10 having less. While projected changes for women are often appreciable, with 16 agencies having gains of 4 percentage points or more and no decreases, projected minority representation changes in the SES at most of the CFO Act agencies are small, exceeding a 2 percentage point increase at only 6 agencies. At most agencies, the diversity picture for GS-15s and GS-14s is somewhat better than that for the SES. To ascertain what the gender, racial, and ethnic profile of the candidate pool for SES replacements would look like, we performed the same simulations and projections for GS-15s and GS-14s as we did for the SES. Over 80 percent of career SES appointments of federal employees come from the ranks of GS-15s. Similarly, over 90 percent of those promoted to GS-15 are from the GS-14 workforce. Table 3 presents the results of our analysis for GS-15s, and table 4 presents the results for GS-14s. The results show a somewhat lower proportion of this workforce will leave. Minority representation among those GS-15s who remain by 2007 will be about the same as it was at the beginning of fiscal year 2001, indicating that whites and minorities will leave at about the same rates. 
However, the proportion of minority GS-14s would increase somewhat (by 1.5 percentage points) and the proportion of both grades represented by white and minority women will also increase. Moreover, if recent promotion trends to GS-15 and GS-14 continue, marginal gains by almost all of the racial and ethnic groups would result. Our simulation shows that significant numbers of current minority GS-15s and GS-14s will be employed through fiscal year 2007, and coupled with our projection of promotions, shows there will be substantial numbers of minorities at both the GS-15 (8,957) and GS-14 (15,672) levels, meaning that a sufficient number of minority candidates for appointment to the SES should be available. With respect to gender, the percentage of white women at GS-15 is projected to increase by 2.6 percentage points to 22 percent and at GS-14 by 0.9 percentage point to 23.5 percent. The proportions of minority women will increase by 0.9 percentage point to 6.5 percent for GS-15s and 0.5 percentage point to 8.1 percent for GS-14s, while those for minority men will increase 0.8 percentage point to 10.8 percent for GS-15s and 0.5 percentage point to 10.7 percent for GS-14s. At 60.6 percent, white men will represent 4.2 percentage points less of GS-15s and, at 57.5 percent, 2.1 percentage points less of GS-14s than in fiscal year 2001. Again, our estimates for the GS-15 and GS-14 populations at individual agencies are likely to be less precise than our governmentwide figures because of the smaller numbers involved but should be indicative of what agency profiles would look like in October 2007. During fiscal years 2001 through 2007, the wave of near-term retirements and normal attrition for other reasons presents the federal government with the challenge and opportunity to replace over half of its career SES corps. 
The response to this challenge and opportunity will have enormous implications for the government's ability to transform itself to carry out its current and future responsibilities rather than simply to recreate the existing organizational structures. With respect to the challenge, the federal government and governments around the world are faced with losses that have a direct impact on leadership continuity, institutional knowledge, and expertise. Focusing on succession planning, especially at the senior levels, and developing strategies that will help ensure that the SES corps reflects diversity will be important. We have gained insights about selected succession planning and management practices used by other countries that may be instrumental for U.S. agencies as they adopt succession planning and management strategies. We found that leading organizations engage in broad, integrated succession planning and management efforts that focus on strengthening both current and future organizational capacity. As part of this approach, these organizations identify, develop, and select their people to ensure an ongoing supply of successors who are the right people, with the right skills, at the right time for leadership and other key positions. Succession planning is also tied to the federal government's opportunity to change the diversity of the SES corps through new appointments. Leading organizations recognize that diversity can be an organizational strength that contributes to achieving results. By incorporating diversity program activities and objectives into agency succession planning, agencies can help ensure that the SES corps is staffed with the best and brightest talent available regardless of gender, race, or ethnicity. As stated earlier, the succession pool of candidates from the GS-15 and GS-14 levels should have significant numbers of minority candidates to fill new appointments to the SES. 
It will be important to identify and nurture talent from this workforce and other levels in agencies early in their careers. Development programs that identify and prepare individuals for increased leadership and managerial responsibilities will be critical in allowing these individuals to successfully compete for admission to the candidate pool for the next level in the organization. Succession planning and management is starting to receive increased attention from the Office of Management and Budget (OMB) and OPM, and we have also seen a positive response from these leadership agencies in developing and implementing programs that promote diversity. In commenting on our January 2003 report, OPM concurred with our findings on SES attrition and diversity and said it welcomed the attention the report brings to a critical opportunity facing the federal workforce and federal hiring officials. The Director said that increasing diversity in the executive ranks continues to be a top priority for OPM and that the agency has been proactive in its efforts to help federal agencies obtain and retain a diverse workforce, particularly in the senior ranks. Both OPM and EEOC said that our analysis was an accurate reflection of the likely future composition of the career SES if recent patterns of selection and attrition continue. EEOC expressed concern about the trends suggested by our analyses to the extent that they may point to the presence of arbitrary barriers that limit qualified members of any group from advancing into the SES. EEOC also stated that in the years ahead, federal agencies will need to continue their vigilance in ensuring a level playing field for all federal workers and should explore proactive strategies, such as succession planning and SES development and mentoring programs for midlevel employees, to ensure a diverse group of highly qualified candidates for SES positions.
Other federal agencies told us that they also have leadership development programs in place or are establishing agencywide human capital planning and executive succession programs, which include diversity as an element. They also told us that holding executives accountable for building a diverse workforce was an element in their performance evaluation for agency executives. Continued leadership from these agencies, coupled with a strong commitment from agency management, will go a long way toward ensuring the diversity of senior leadership. Chairwoman Davis and Members of the Subcommittee, this concludes my prepared statement. I would be pleased to answer any questions you may have. For further information, please contact George H. Stalcup on (202) 512-9490 or at [email protected]. Individuals making key contributions to this testimony include Steven Berke, Anthony Lofaro, Belva Martin, and Walter Reed. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.

The federal government faces large losses in its Senior Executive Service (SES), primarily through retirement but also because of other normal attrition. This presents the government with substantial challenges to ensuring an able management cadre and also provides opportunities to affect the composition of the SES. In a January 2003 report, GAO-03-34, GAO estimated the number of SES members who would actually leave service through fiscal year 2007 and reviewed the implications of the estimated losses for diversity, as defined by gender, race, and ethnicity.
Specifically, GAO estimated by gender, race, and ethnicity the number of members of the career SES who will leave government service from October 1, 2000, through September 30, 2007, and what the profile of the SES will be if appointment trends do not change. GAO made the same estimates for the pool of GS-15s and GS-14s, from whose ranks the vast majority of replacements for departing SES members come, to ascertain the likely composition of that pool. More than half of the 6,100 career SES members employed on October 1, 2000, will have left service by October 1, 2007. If recent SES appointment trends continue, the only significant changes in diversity would be an increase in the number of white women and an essentially equal decrease in white men. The percentage of GS-15s and GS-14s projected to leave would be lower (47 percent and 34 percent, respectively), and GAO projects that the number of minorities still in the GS-15 and GS-14 workforce would provide agencies sufficient opportunity to select minority members for the SES. Estimates showed substantial variation among 24 large agencies in the proportion of SES minorities leaving and in the effect on those agencies' gender, racial, and ethnic profiles. Minority representation would decrease at 10 agencies and increase at 12. Agencies have an opportunity to affect SES replacement trends by developing succession strategies that help achieve a diverse workforce. Along with constructive agency leadership, these strategies could generate a pool of well-prepared women and minorities to boost the diversity of the SES ranks.
ICE has not developed and implemented a process to identify and analyze program risks since assuming responsibility for SEVP in 2003, making it difficult for ICE to determine the potential security and fraud risks across the more than 10,000 SEVP-certified schools and to identify actions that could help mitigate these risks. SEVP officials and officials from ICE's Counterterrorism and Criminal Exploitation Unit (CTCEU) expressed concerns about the security and fraud risks posed by schools that do not comply with program requirements. Furthermore, various cases of school fraud have demonstrated vulnerabilities in the management and oversight of SEVP-certified schools. We reported that SEVP faces two primary challenges to identifying and assessing risks posed by schools: (1) it does not evaluate program data on prior and suspected instances of school fraud and noncompliance, and (2) it does not obtain and assess information from CTCEU and ICE field office school investigations and outreach events. Evaluating SEVP information on prior and suspected cases of school noncompliance and fraud. SEVP does not have a process to evaluate prior and suspected cases of school fraud and noncompliance to identify lessons learned from such cases, which could help it better identify and assess program risks. SEVP has maintained a compliance case log since 2005--a list of approximately 172 schools (as of December 2011) that officials have determined to be potentially noncompliant with program requirements. The compliance case log represents those schools that SEVP, on the basis of leads and out-of-cycle reviews, is monitoring for potential noncompliance. According to SEVP officials, SEVP has not used this list to identify and evaluate possible trends in schools' noncompliance, although this list could provide useful insights to SEVP to assess programwide risks.
Further, SEVP officials said that they have not looked across previous cases of school fraud and school withdrawals to identify lessons learned on program vulnerabilities and opportunities to strengthen internal controls. Our analysis indicates that there are patterns in the noncompliant schools, such as the type of school. For example, of the 172 postsecondary institutions on SEVP's December 2011 compliance case log, about 83 percent (or 142) offer language, religious, or flight studies, with language schools representing the highest proportion. Without an evaluation of prior and suspected cases of school fraud and noncompliance, ICE is not well positioned to identify and apply lessons learned from prior school fraud cases, which could help it identify and mitigate program risks going forward. Obtaining information from CTCEU and ICE field offices' investigations and outreach efforts. Based on our interviews with SEVP's Director and other senior officials, we reported that SEVP had not established a process to obtain lessons learned information from CTCEU's criminal investigators. Investigators may have valuable knowledge from working cases of school fraud in identifying and assessing program risks, including information such as characteristics of schools that commit fraud, how school officials exploited weaknesses in the school certification process, and what actions ICE could take to strengthen internal controls. For example, according to investigators in one ICE field office, CTCEU was hampered in pursuing a criminal investigation because SEVP officials did not obtain a signed attestation statement within the I-17 application from a school official stating that the official agreed to comply with rules and regulations. Another risk area we reported on is designated school officials' access to SEVIS. 
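The proportion cited above follows directly from the compliance case log counts. A brief arithmetic sketch (added here as a consistency check; the counts are those reported from the December 2011 log, not new data):

```python
# Reported counts from SEVP's December 2011 compliance case log (cited above).
total_flagged = 172    # schools being monitored for potential noncompliance
lang_rel_flight = 142  # of those, schools offering language, religious, or flight studies

share = lang_rel_flight / total_flagged
print(f"{lang_rel_flight} of {total_flagged} schools = {share:.1%}")  # ~83 percent, as reported
```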
In 2011, CTCEU provided SEVP officials with a position paper expressing concerns that designated school officials, who are not required to undergo security background checks, are responsible for maintaining updated information on foreign students in SEVIS. Investigators at three of the eight field offices we interviewed said that SEVP allowed designated school officials to maintain SEVIS access and the ability to modify records in the system while being the subject of an ongoing criminal investigation, despite requests from CTCEU to terminate SEVIS access for these officials. In addition, CTCEU collects data on its outreach efforts with schools through its Campus Sentinel program; however, the SEVP Director stated that his office had not obtained and analyzed reports on the results of these visits. CTCEU initiated Campus Sentinel in 2011, and ICE operates the program across all of its field offices nationwide. From October 1, 2011, through March 6, 2012, CTCEU conducted 314 outreach visits to schools. According to CTCEU investigators, these visits provide an opportunity to identify potential risks, including whether schools have the capacity and resources to support programs for foreign students. Obtaining information on lessons learned from CTCEU investigators could help provide SEVP with additional insights on such issues as characteristics of schools that have committed fraud and the nature of those schools' fraudulent activities. To address these issues, we recommended that ICE develop and implement a process to identify and assess risks in SEVP, including evaluating prior and suspected cases of school noncompliance and fraud to identify potential trends, and obtaining and assessing information from CTCEU and ICE field office investigative and outreach efforts. DHS concurred and stated that ICE will develop and implement such a process later this year.
ICE has not consistently implemented existing internal control procedures for SEVP in four areas: (1) initial verification of evidence submitted in lieu of accreditation, (2) recordkeeping to ensure schools' continued eligibility, (3) ongoing compliance monitoring of school licensing and accreditation status, and (4) certification of schools offering flight training. Regulations require schools to establish that they are legitimate and meet other eligibility criteria for their programs to obtain certification from ICE. In addition, weaknesses in managing and sharing key information with CTCEU impede SEVP's prevention and detection of school fraud. The following summarizes these key findings and recommendations we made to address these issues. Initial verification of evidence submitted in lieu of accreditation. ICE requires schools to present evidence demonstrating that the school is legitimate and is an established institution of learning or other recognized place of study, among other things. Non-accredited, post-secondary schools, in particular, must provide "in lieu of" letters, which are evidence provided by petitioning schools in lieu of accreditation by a Department of Education-recognized accrediting agency. ICE policy and guidance require that SEVP adjudicators render an approval or denial of schools' petitions based on such evidence and supporting documentation. This includes verifying that schools' claims in the Form I-17, such as accreditation status and "in lieu of" letters, are accurate. However, SEVP adjudicators have not verified all "in lieu of" letters submitted to ICE by the approximately 1,250 non-accredited, post-secondary schools, as required by ICE's policy. Rather, adjudicators decide whether to verify a letter's source and the signatory authority of the signee based on any suspicions of the letters' validity. 
Investigators at one of the eight ICE field offices we interviewed stated that SEVP officials certified at least one illegitimate school--Tri-Valley University in California--because the program had not verified the evidence provided in the initial petition. In March 2012, CTCEU officials stated that several of their ongoing investigations involve schools that provided fraudulent evidence of accreditation or evidence in lieu of accreditation to ICE. Consistent verification of these letters could help ICE ensure that schools are legitimate and detect potential fraud early in the certification process. We recommended that ICE consistently implement procedures for ensuring schools' eligibility, including consistently verifying "in lieu of" letters. DHS agreed and stated that SEVP personnel have initiated mandatory verification of all "in lieu of" letters. Recordkeeping to ensure continued eligibility of schools. ICE's standard operating procedures for recordkeeping require SEVP officials to maintain records to document ongoing compliance. We reported that ICE had not consistently maintained certain evidence of selected schools' eligibility for the program. According to our review of a stratified random sample of 50 SEVP-certified school case files, 30 files were missing at least one piece of evidence required by the program's policies and procedures. In addition, ICE was unable to produce two schools' case files that we requested as part of our randomly selected sample. Without the schools' information and evidence contained in these case files, including attestation statements and site visit reports, ICE does not have an institutional record to provide reasonable assurance that these schools were initially and continue to be legitimate and eligible for certification.
According to ICE officials, the school recertification process would help address issues with incomplete and missing school files because schools are required to resubmit all evidence required by regulation when going through recertification. The Border Security Act required recertification for all SEVP-certified schools by May 2004 and every 2 years thereafter. However, ICE began the first recertification cycle in May 2010 and did not recertify all schools during this 2-year cycle, which ended in May 2012. As of March 31, 2012, ICE reported to have recertified 1,870 schools (approximately 19 percent of SEVP-certified schools). Given the delays in completing recertification, ICE is not positioned to address gaps in SEVP's case files and cannot provide reasonable assurance that schools that were initially certified to accept foreign students are still compliant with SEVP regulations. Thus, we recommended that ICE establish a process to identify and address all missing school case files, including obtaining required documentation for schools whose case files are missing evidence. DHS concurred and stated that SEVP plans to work with ICE Records Management to develop protocols and actions to strengthen records management. Ongoing compliance monitoring of school licensing and accreditation status. ICE does not have a process to monitor the ongoing eligibility of licensed and accredited, non-language schools enrolling foreign students. ICE regulations require all certified schools to maintain state licensing (or exemption) and provide various forms of evidence to ICE supporting schools' legitimacy and eligibility. If a school loses its state license, the school would be unable to operate legally as a school within that state. 
However, ICE does not have controls to ensure that SEVP compliance unit officials would be aware of this issue; therefore, a school without a proper business license may remain certified to enroll foreign students and its designated school officials may continue to access SEVIS. We recommended that ICE develop and implement a process to monitor state licensing and accreditation status of all SEVP-certified schools. DHS concurred and stated that SEVP personnel are developing procedures to ensure frequent validation of license or accreditation information. Certification of schools offering flight training. ICE's policies and procedures require flight schools to have Federal Aviation Administration (FAA) Part 141 or 142 certification to be eligible for SEVP certification; however, ICE has certified schools offering flight training without such FAA certifications. As the federal agency responsible for regulating safety of civil aviation in the United States, FAA administers pilot certification (licensing) and conducts safety oversight of pilot training. FAA's regulations for pilot training and certification are found in three parts--Parts 61, 141, and 142. ICE established a policy that requires Part 141 or 142 certification for eligibility in SEVP because FAA directly oversees these flight schools and training centers on an ongoing basis. We reported identifying 434 SEVP-certified schools that, as of December 2011, offer flight training to foreign students. However, 167 (38 percent) of these flight training providers do not have FAA Part 141 or 142 certification. SEVP senior officials acknowledged that not all SEVP-certified schools offering flight training have FAA Part 141 or 142 certification even though the program requires it. ICE indicated that in most of the cases, it may have initially certified flight schools with Part 141 or 142 certification but the schools allowed their FAA certification to expire, and ICE did not identify or take compliance action against them.
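The 38 percent figure is consistent with the counts of flight training providers reported above; a quick arithmetic check (added for verification, using only the reported numbers):

```python
# Reported counts as of December 2011 (cited above).
flight_schools = 434    # SEVP-certified schools offering flight training
without_faa_cert = 167  # of those, lacking FAA Part 141 or 142 certification

print(f"{without_faa_cert / flight_schools:.0%}")  # 38%, matching the figure above
```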
ICE is taking actions to address noncompliant flight schools as of May 2012, including notifying all SEVP-certified schools that do not have the required FAA certification that they must re-obtain the certification. Moreover, SEVP officials stated that they plan to coordinate with FAA to determine which schools have not met the requirements and will take withdrawal actions against them. While these are positive steps, we reported that SEVP had not yet established target time frames for implementing and completing these planned actions. Because ICE has certified or maintained certification of schools that provide flight training without the required FAA certification and oversight, the program is vulnerable to security and fraud risks. Thus, we recommended that ICE establish target time frames for notifying SEVP-certified flight schools that do not have the required FAA certification that they must re-obtain FAA certification. DHS concurred and stated that SEVP is consulting with FAA to develop target time frames. Coordination among SEVP, CTCEU, and ICE field offices. ICE has not consistently followed the standard operating procedures that govern the communication and coordination process among SEVP, CTCEU, and ICE field offices. Specifically, these procedures delineate roles and responsibilities for criminal investigations and establish protocols for SEVP taking administrative actions against schools during and following a criminal investigation. In some instances, SEVP management has not followed CTCEU requests to take or cease administrative actions and has not referred potentially criminal cases to CTCEU in accordance with ICE's procedures. 
By strengthening coordination and communication between SEVP and CTCEU, ICE could better ensure that SEVP, CTCEU, and ICE field offices understand which information to share regarding whether to take administrative actions during criminal investigations and that clear criteria exist for referring cases to CTCEU based upon potentially criminal behavior. Thus, we recommended that ICE revise its standard operating procedure to specify which information to share among stakeholders during criminal investigations. DHS concurred and stated that SEVP will work with CTCEU and ICE field personnel to make the necessary revisions. We also recommended that ICE establish criteria for referring cases of a potentially criminal nature from SEVP to CTCEU. ICE agreed and stated that SEVP will work with CTCEU to improve this process. Chairman Schumer, Ranking Member Cornyn, and members of the subcommittee, this concludes my prepared statement. I would be pleased to answer any questions that you may have at this time. For further information regarding this testimony, please contact Rebecca Gambler at (202) 512-8777, or [email protected]. In addition, contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this statement. Individuals who made key contributions to this testimony are Kathryn Bernet, Assistant Director; Frances Cook; Elizabeth Dunn; Anthony C. Fernandez; David Greyer; and Lara Miklozek. This is a work of the U.S. government and is not subject to copyright protection in the United States. The published product may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.

This testimony discusses the findings from our June 2012 report assessing U.S.
Immigration and Customs Enforcement's (ICE) oversight of the Student and Exchange Visitor Program (SEVP). ICE, within the Department of Homeland Security (DHS), is responsible for managing SEVP, including ensuring that foreign students studying in the United States comply with the terms of their admission into the country. ICE also certifies schools as authorized to accept foreign students in academic and vocational programs. As of January 2012, more than 850,000 active foreign students were enrolled at over 10,000 certified schools in the United States. In addition, ICE manages the Student and Exchange Visitor Information System (SEVIS), which assists the agency in tracking and monitoring certified schools, as well as approved students. We reported in April 2011 on the need for close monitoring and oversight of foreign students, and that some schools have attempted to exploit the immigration system by knowingly reporting that foreign students were fulfilling their visa requirements when they were not attending school or attending intermittently. Schools interested in accepting foreign students on F and M visas must petition for SEVP certification by submitting a Form I-17 to ICE. Once this certification is achieved, schools issue Forms I-20 for students, which enable them to apply for nonimmigrant student status. The Border Security Act requires DHS to confirm, every 2 years, SEVP-certified schools' continued eligibility and compliance with the program's requirements. During the initial petition and recertification processes, a school must provide ICE with evidence of its legitimacy and its eligibility, such as designated school officials' attestation statements that both the school and officials intend to comply with program rules and regulations. This testimony summarizes the key findings of our report on ICE's management of SEVP, which was publicly released last week. 
Like that report, this statement will address ICE's efforts to (1) identify and assess risks in SEVP, and (2) develop and implement procedures to prevent and detect fraud during the initial certification process and once schools begin accepting foreign students. In summary, we reported that ICE does not have a process to identify and assess risks posed by schools in SEVP. Specifically, SEVP (1) does not evaluate program data on prior and suspected instances of school fraud and noncompliance, and (2) does not obtain and assess information from CTCEU and ICE field office school investigations and outreach events. Moreover, weaknesses in ICE's monitoring and oversight of SEVP-certified schools contribute to security and fraud vulnerabilities. For example, ICE has not consistently implemented internal control procedures for SEVP in the initial verification of evidence submitted in lieu of accreditation. In addition, ICE has not consistently followed the standard operating procedures that govern the communication and coordination process among SEVP, CTCEU, and ICE field offices. We recommended that ICE, among other things, identify and assess program risks; consistently implement procedures for ensuring schools' eligibility; and revise its standard operating procedure to specify which information to share among stakeholders during criminal investigations. ICE concurred with all the recommendations we made to address these challenges and has actions planned or under way to address them.
Most of the military services' active and reserve components faced recruiting difficulties during the strong economic climate of the late 1990s. As a result, the services stepped up their recruiting to ensure that they would have enough recruits to fill their ranks. Recruiting efforts focus on three initiatives. First, a "sales force" of more than 15,000 recruiters, who are mostly located in the United States, recruit from the local population. Second, these recruiters have financial and other incentives that they can use to convince young adults to consider a military career. Such incentives include enlistment bonuses and college benefits. Finally, the services use advertising to raise the public's awareness of the military and help the sales force of recruiters reach the target recruiting population and generate potential leads for recruiters. This advertising can include television and radio commercials, Internet and printed advertisements, and special events. DOD believes that advertising is increasingly critical to its recruiting effort because convincing young adults to join the military is becoming more difficult. In 2001, over 70 percent of polled young adults said that they probably or definitely would not join the military, compared with 57 percent in 1976. The number of veterans is declining, which means that fewer young adults have influencers--a relative, coach, or teacher--who have past military experience. Compounding these difficulties, proportionally more high school graduates are attending college. Finally, the perception that service in the military is arduous--and possibly dangerous--can inhibit recruiting efforts. DOD believes that these factors together make the military an increasingly harder sell as a career choice and life-style option for young adults. 
The Office of the Secretary of Defense is responsible for establishing policy and providing oversight for the military recruiting and advertising programs of the active and reserve components. Within the Office of the Secretary of Defense, the Under Secretary for Personnel and Readiness is responsible for developing, reviewing, and analyzing recruiting policy, plans, and resource levels. The office provides policy oversight for advertising programs and coordinates them through the Joint Marketing and Advertising Committee. DOD's strategic plan for military personnel human resources emphasizes the need to recruit, motivate, and retain adequate and diverse numbers of quality recruits. DOD's recruiting and advertising programs are not centrally managed. All of the active components and some of the reserve components manage their separate advertising programs and work closely with their own contracted advertising agencies. DOD and the services believe that this decentralized approach better differentiates between the service "brands" (i.e., Army, Navy, Air Force, Marines). The Joint Advertising, Market Research, and Studies program, which is funded separately by DOD, exists to address common DOD requirements, such as conducting market research and obtaining and distributing lists of potential leads. The joint program has developed a DOD-wide advertising campaign to target the adult influencers of potential recruits, but this program had not been fully implemented at the time of our review. After most of the services experienced recruiting shortfalls in the late 1990s, DOD reviewed its advertising programs and identified opportunities for improvement. The services, except the Marine Corps, substantially revised their advertising campaigns and slogans and contracted with new advertising agencies. The services told us that their revised campaigns place them in a better position to recruit today's young adults. 
Currently, almost all of the services and reserve components are achieving their recruiting goals, and advertising funding has almost doubled since fiscal year 1998. The increases in funding have not been used to buy more national media, such as television commercials. Rather, the funding increases are being directed to other types of advertising, such as special events marketing and the Internet, that are intended to better reach today's young adults. Advertising funding for DOD increased from $299 million in fiscal year 1998 to $592 million in fiscal year 2003, an increase of 98 percent. Recruiting shortfalls in the late 1990s led to an examination and revision of DOD's advertising programs. The Army, Navy, and Air Force missed their recruiting quantity goals, while some of the reserve components fell short of both their quantity and quality goals. Following these recruiting shortfalls, Congress asked the Secretary of Defense to review DOD's advertising programs and make recommendations for improvements. DOD has revamped its advertising programs. The active-duty services, except for the Marine Corps, substantially revised their advertising campaigns and selected new advertising agencies as their contractors. They produced new advertising strategies and campaigns, complete with new slogans and revised television, print, and radio advertisements, along with new brand images defined by distinct logos, colors, and music. The services, in conjunction with their advertising agencies, conducted new research on young adults--their primary target market. During this period, the joint program developed an advertising campaign to target influencers of prospective recruits, as recommended in DOD's review. In addition to their overall campaigns, all of the services have specialized campaigns to target diverse segments of the young adult population. 
For instance, the Navy created a Web site, called El Navy, which is designed to better communicate with the Hispanic market, and the Army has specifically tailored radio advertisements to reach the African American market. The services also incorporated a greater variety of public relations and promotional activities, such as participating in job fairs and sponsoring sports car racing teams, as an integral part of their advertising programs. As shown in table 1, there are essentially nine advertising programs that are managed separately by the military services, reserve components, and the Office of the Secretary of Defense. The active services told us that they are pleased with their new advertising campaigns and agencies, and they believe that the revised and better- funded campaigns have placed them in a more competitive position to recruit young adults. The sluggish U.S. economy has also narrowed employment options and is considered to be an important factor in easing the recruiting challenge. Today, all of the active services are meeting or exceeding their overall recruiting goals. Most of the reserve components are also achieving their recruiting goals. As of June 2003, the Army National Guard was falling short of its recruiting goals because of extensive overseas deployments and the implementation of stop loss (restrictions on leaving the military). Army National Guard officials stated that they expect to meet their goals by the end of fiscal year 2003. Some reserve officers expressed concerns about the negative impact of the recent high deployment rates on future recruiting. The services, especially the reserve components, continue to face challenges in recruiting individuals with some types of specific training or skills, such as medical, legal, and construction, and they have developed some specialized advertising campaigns targeted to recruit them. 
Since fiscal year 1998, the services have changed how they allocate advertising funding, according to the figures provided by DOD. Grouped into three broad categories, advertising funding includes: (1) events marketing, public affairs and public relations, Internet, and other; (2) national media; and (3) direct mail and miscellaneous recruiting support. One of the categories--events marketing, public affairs and public relations, Internet, and other--has shown the greatest increase as a percentage of the total budget, nearly tripling from around 10 percent in fiscal year 1998 to 29 percent in fiscal year 2003. This increase was used partly to create and produce new advertising campaigns and strategies. Service officials told us that event marketing and public relations activities provide recruiters with greater opportunities to interact with potential recruits and supplement their national media campaigns and other methods of advertising. One example is the Army's sponsorship of a sports racing car. (See fig. 1.) Internet and Web-site recruiting have also increased significantly from fiscal year 1998 through fiscal year 2003. All of the active military services have increased the amount of advertising on the Internet and have used interactive Web sites to complement their traditional recruiting and advertising methods. The expenditures for the national media category, which includes paid television, radio, and magazine advertisements, have remained relatively constant. This means that this category's proportion of the growing total advertising budgets has actually declined. Specifically, expenditures for the national media category were more than half of the advertising budget in fiscal year 1998; currently, the category represents about 40 percent. Television advertising--which offers tremendous reach to target audiences--dominates this category.
Television advertising has remained the single largest advertising expenditure: paid television is still about a quarter of the total advertising budget for all of the military components. DOD's advertising funding has nearly doubled in the years since 1998 and most of these increases occurred in the earlier years. (See fig. 2.) Total advertising funding for all of the services increased 98 percent, from $299 million in fiscal year 1998 to $592 million in fiscal year 2003. The total DOD advertising budget request to Congress for fiscal year 2004 was $592.8 million. Since fiscal year 1998, DOD's advertising funding, which is included in DOD's operation and maintenance appropriations, has increased at a significantly higher rate than the total of all of DOD's operation and maintenance funding. DOD officials cite media inflation as one reason for increased advertising funding. Inflation for some types of media, especially for television commercials, has been higher than general inflation. However, this is not the reason for all of the increases in advertising funding during this period because not all of the advertising funding is used for media advertising. For example, only about a quarter of advertising funds are currently spent to buy time to run television commercials. Growing advertising costs are only part of a rapidly increasing total investment in recruiting. The rising advertising and overall recruiting costs can be seen in the investment per enlisted recruit--an important bottom- line measure that shows the amount of money spent to enlist each recruit. Today, the services are spending almost three times as much on advertising per recruit than in fiscal year 1990. We examined data collected by DOD from the services, and it showed that the total advertising investment per enlisted recruit rose from approximately $640 to $1,900 between fiscal year 1990 and fiscal year 2003. 
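The growth figures above are internally consistent; a short sketch verifying the arithmetic (added as a check, using only the dollar figures reported above):

```python
# Dollar figures as reported above (fiscal years 1998/2003 for the total budget,
# 1990/2003 for advertising investment per enlisted recruit).
budget_fy1998, budget_fy2003 = 299e6, 592e6   # total DOD advertising funding
ad_per_recruit_fy1990, ad_per_recruit_fy2003 = 640, 1_900

growth = (budget_fy2003 - budget_fy1998) / budget_fy1998
print(f"budget growth: {growth:.0%}")                 # 98%, as reported

ratio = ad_per_recruit_fy2003 / ad_per_recruit_fy1990
print(f"per-recruit advertising grew {ratio:.1f}x")   # ~3x ("almost three times")
```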
As a proportion of the total recruiting investment, advertising has increased from 8 percent in fiscal year 1990 to 14 percent in fiscal year 2003. Bonuses and incentives to enlist have also increased substantially during this same period. The total recruiting investment per recruit increased almost 65 percent, from approximately $8,100 in fiscal year 1990 to $13,300 in fiscal year 2003. Very steep growth occurred between fiscal year 1998 and fiscal year 2002. This is shown in figure 3. The increases are not evenly distributed across the services' advertising programs. (See table 2.) The Army has the largest advertising budget, and the Army active and reserve components account for nearly half (about $295 million) of the total advertising funding. The Marine Corps, at just under $50 million, has the smallest advertising budget. The Air Force has experienced the most significant increase in funding, in part owing to the creation of its first national television campaign. The Navy's advertising funding has also increased, but this is primarily due to the addition of costs related to the Blue Angels and a program to test recruiting kiosks at public locations. DOD's Joint Advertising, Market Research, and Studies Program is responsible for (1) providing market research and studies for recruiting and (2) developing an advertising campaign to target adult influencers, such as parents, coaches, and career counselors. Currently, the joint program is conducting market research and studies and providing other support for the services' advertising programs, such as purchasing lists of high school students and recent graduates for use in mailing advertisements. In addition, the program is implementing a limited print advertising campaign targeting influencers in fiscal year 2003. The joint advertising campaign has not had consistent funding. 
Program managers told us that the current funding level is insufficient to fully implement the influencer advertising campaign they have developed. In past years, DOD cut funding for the joint advertising program because of concerns that the program office was not adequately executing its advertising budget. For fiscal year 2003, Congress provided the joint advertising program with less funding than DOD requested, and DOD subsequently reallocated part of the remaining joint advertising funding to a program that it considered a higher priority. DOD does not have adequate outcome measures to evaluate the effectiveness of its advertising as part of its overall recruiting effort. Effective program management requires the establishment of clear objectives and outcome measures to evaluate the program, and DOD has established neither. This has been a long-standing problem for DOD primarily because measuring the impact of advertising is inherently difficult, especially for a major life decision such as joining the military. Owing to the absence of established advertising objectives and outcome measures, DOD has not consistently collected and disseminated key information that would allow it to better assess advertising's contribution to achieving recruiting goals. This information would include public awareness of military recruiting advertising and the willingness of young adults to join the military. Rather, the services report outcome measures that focus on achieving overall recruiting goals instead of isolating the specific contribution of advertising. Without adequate information and outcome measures, the Office of the Secretary of Defense cannot satisfactorily review the services' advertising budget justifications nor can it determine the return on their advertising dollars as part of their overall recruiting investment. 
The Secretary of Defense is required by law to enhance the effectiveness of DOD's recruitment programs through an aggressive program of advertising and market research targeted at prospective recruits and those who may influence them. DOD guidance requires the services, by active and reserve components, to report their resource inputs--how much they are spending on advertising. DOD guidance also requires the services to report on overall recruiting outcomes--their recruit quantity and quality. However, the guidance does not require active and reserve components to report information specifically about the advertising programs' recruiting effectiveness. Effective program management requires the establishment of clearly defined objectives and outcome measures to evaluate programs. The Government Performance and Results Act was intended to help federal program managers enhance the effectiveness of their programs. It requires agencies to establish strategic plans for program activities that include, among other things, a mission statement covering major functions and operations, outcome-related goals and objectives, and a description of how these goals and objectives are to be achieved. GPRA shifted the focus of accountability for federal programs from inputs, such as staffing and resource levels, to outcomes. This requires agencies to measure the outcomes of their programs and to summarize the findings of program evaluations in their performance reports. The Office of Management and Budget's guidance implementing GPRA requires agencies to establish meaningful program objectives and identify outcome measures that compare actual program results with established program objectives. DOD does not have adequate information to measure the effectiveness of its advertising as part of the overall recruiting effort. 
Measuring advertising's effectiveness has been a long-standing problem, partly because it is inherently difficult to measure the impact that advertising has on recruiting. DOD has not established advertising program objectives and it lacks adequate outcome measures of the impact that advertising programs have on recruiting. Outcome measures are used to evaluate how closely a program's achievements are aligned with program objectives, and to assess whether advertising is achieving its intended outcome. DOD currently requires the services and reserve components to report on inputs and outcomes related to overall recruiting. These measures are important in assessing DOD's overall recruiting success; however, they do not assess advertising's contribution to the recruiting process. In our 2000 report, we noted that the services do not know which of their recruiting initiatives--advertising, recruiters, or bonuses--work best. This prevented DOD from being able to effectively allocate its recruiting investment among the multiple recruiting resources. We recommended that DOD and the services assess the relative success of their recruiting strategies, including how the services can create the most cost-effective mix of recruiters, enlistment bonuses, college incentives, advertising, and other recruiting tools. In comments on that report, DOD stated that it intended to develop a joint-service model that would allow trade-off analyses to determine the relative cost-effectiveness of the various recruiting resources. This has not been done, and the current DOD cost performance trade-off model does not support analyses across the services' budgets. 
Similarly, a 2002 Office of Management and Budget assessment, known as the Program Assessment Rating Tool, found that DOD's recruiting program had met its goal of enlisting adequate numbers of recruits; however, since there were no measures of program efficiency, the overall rating for the recruiting program was only "moderately effective." In its assessment, the Office of Management and Budget noted the inability of the recruiting program to assess the impact of individual resources, such as advertising and recruiters. The services continually adjust the mix of funding between advertising and other recruiting resources to accomplish their program goals. They have generally increased spending on advertising, added recruiters, and increased or added bonuses at the same time, making it impossible to determine the relative value of each of these initiatives. Other studies have reached similar conclusions. In 2000, a review of DOD's advertising programs resulted in a recommendation that they be evaluated for program effectiveness. More recently, the National Academy of Sciences also cited the need to evaluate advertising's direct influence on actual enlistments. The academy is now doing additional work on evaluating DOD's advertising and recruiting. The lack of adequate information is partly attributable to the inherent difficulty in measuring advertising's affect on recruiting. Measuring advertising's effectiveness is a challenge for all businesses, according to advertising experts. Private-sector organizations cannot attribute increases in sales directly to advertising because there are many other factors influencing the sale of products, such as quality, price, and the availability of similar products. Many factors impact recruiting as well, such as employment and educational opportunities, making it especially difficult to isolate and measure the effectiveness of advertising. Enlisting in a military service is a profound life decision. 
Typically, an enlistment is at least a 4-year commitment and can be the start of a long military career. Another complicating factor in measuring advertising's effectiveness is that it consists of different types and is employed differently throughout the recruiting process to attract and enlist potential recruits. Figure 4 displays the recruiting process and demonstrates the role of advertising while a young adult may be considering enlisting in the military. As the figure shows, the use of multiple types of advertising at various stages in the recruiting continuum makes it difficult to assess the effectiveness of specific types of advertising. A single recruit may be exposed to some or all of these advertising types. Traditional advertising in the national media, such as television and magazines, is intended to disseminate information designed to influence consumer activity in the marketplace. The services typically use such national media to make young people aware of a military service, the career options available in a service, and other opportunities the services have to offer them. Direct mail, special events, and the services' Web sites are utilized to provide more detailed information about the services and the opportunities available for persons who enlist. These marketing resources give people the opportunity to let a recruiter know they are possibly interested in enlisting in a service. Another contributing factor to the absence of advertising objectives and outcome measures is the lack of DOD-wide guidance. Officials from the Office of the Secretary of Defense view their role as overseeing the decentralized programs managed by the individual services and reserve components. They scrutinize the quality and quantity of recruits and gather data about the uses of advertising funds. 
However, they told us they were reluctant to be more prescriptive because of a concern about appearing to micromanage the successful recruiting programs of the active and reserve components. On the basis of our work, their sensitivity is warranted. The active and reserve components tend to guard their independence, seeking to maintain their "brand" and arguing that the current decentralized structure allows them to be more responsive to their individual needs. The Office of the Secretary of Defense seeks to coordinate the active and reserve components' activities through joint committees and to centralize research that can be utilized by all. Defining exactly what to measure may be difficult, but it is not impossible. DOD and the services, as well as their contracted advertising agencies, generally agree that there are at least two key advertising outcomes that should be measured: (1) the awareness of recruiting advertising and (2) the willingness or "propensity" to consider joining the military. However, this is not clearly stated in any program guidance. Current DOD guidance requires only that the services provide information on funding for advertising, the quality and quantity of recruits, and the allocation of resources to the various advertising categories. Although this information is valuable--in fact, critical--it is not sufficient to evaluate and isolate the effectiveness of the services' advertising programs. DOD's efforts thus far to measure the awareness of recruiting advertising and willingness to join the military have met with problems. Inconsistent funding for the Joint Advertising, Market Research, and Studies program has hampered consistent collection of this information. DOD has sponsored an advertising tracking study designed to monitor the awareness of individual service campaigns since 2001. However, officials from the Army, Navy, and Marine Corps told us that they do not regularly use the research provided by this study. 
According to program officials, there were numerous problems with the advertising tracking study. DOD is implementing changes to the study that are intended to improve its usefulness to all of the active and reserve components. In the absence of reliable and timely advertising tracking, the Army implemented its own tracking study, and the Air Force is currently planning an experimental study to assess the effectiveness of its national television advertising campaign, according to program managers. To monitor the willingness to join the military, DOD sponsors youth and adult polls, which are designed to track changes in attitudes and young adults' aspirations. These polls replaced the Youth Attitude Tracking Survey, which had been in place for a number of years and provided long-term trend data about the propensity of young adults to consider the military. The services expressed concern that the current polls ask questions that are significantly different from those asked in the prior survey, which makes the analysis of trends difficult. DOD officials also pointed to research indicating that advertising is a cost- effective recruiting investment when compared with other recruiting initiatives. For example, a report that was done for DOD found that it was less expensive to enlist a recruit through increased investments in advertising than through increased investments in military pay for new recruits in the Army and Navy. Similarly, a study for DOD analyzed the marginal cost of different recruiting initiatives and concluded that, under certain conditions, it was more cost-effective to invest additional funds in advertising than in military pay for recruits or recruiters. DOD officials told us that these reports, which used data from the 1980s and early 1990s, provide the best research available on the topic. However, the situation has changed dramatically in recent years. 
DOD has altered its advertising and recruiting strategies and is spending much more on advertising. Advertising itself is also changing and is more fragmented with an expanding array of television channels and other media. Finally, media inflation, which has increased faster than general inflation even in the sluggish economy, has lessened buying power. Funding devoted to advertising has increased considerably since fiscal year 1998. Although the military services are now generally meeting their overall recruiting goals, the question of whether the significant increases in advertising budgets were a main contributor to the services' recruiting successes remains open. During the same period, DOD also greatly increased funding for bonuses and other incentives to enlist recruits. At the same time, the U.S. economy slowed dramatically, narrowing the other employment options available to young people. These factors make it difficult to disentangle the effects of the internal DOD investments made in recruiting from the changes in the external recruiting environment. Even though the effect of advertising is inherently difficult to measure, this issue needs to be addressed. This is crucial because DOD is now spending nearly $592 million annually on recruiting advertising, or about $1,900 per enlisted recruit. In addition, the total funding for all of DOD's recruiting efforts is now almost $4 billion. DOD needs better advertising outcome measures to allow it to oversee and manage the advertising investment as part of its overall recruiting effort. DOD and the services have an understandable focus on the most important program outcome--to ensure that the military has enough quality recruits to fill its ranks. Judged by this short-term measure, the recruiting programs are successful. But now that DOD is meeting its recruiting goals, should it reduce advertising funding or continue at its current funding levels? 
DOD believes that continued investments in advertising are critical to keeping awareness up in the young adult population and combating the declining propensity among today's young adults to join the military. However, DOD has neither stated these goals clearly in its guidance, nor consistently gathered information to ensure that these objectives are being met. Now that it is meeting its recruiting goals, DOD needs to turn its attention to program effectiveness and efficiency to ensure that the active and reserve components are getting the best return on their recruiting and advertising investments. To improve DOD's ability to adequately measure the impact of its advertising programs on its recruiting mission, we recommend that the Secretary of Defense direct the Under Secretary of Defense for Personnel and Readiness to issue guidance that would (1) set clear, measurable objectives for DOD's advertising programs; (2) develop outcome measures for each of DOD's advertising programs that clearly link advertising program performance with these objectives; and (3) use these outcome measures to monitor the advertising programs' performance and make fact-based choices about advertising funding as part of the overall recruiting investment in the future. DOD concurred with all of our recommendations. In commenting on this report, DOD stated that the Office of the Under Secretary of Defense for Personnel and Readiness, in concert with the services, will develop an advertising strategic framework to provide overall direction for DOD's advertising programs. The framework, with associated outcome measures, would allow the office to monitor advertising results regularly and make fact-based decisions at a strategic level. It would provide an overarching structure within which each service would develop its own advertising program strategy, program objectives, and outcome measures. 
The framework would also direct the activities of the DOD joint program to ensure support to the services. DOD also commented that current research has not advanced to the point where models exist that adequately account for the many factors that affect recruiting as well as for the differences in the services. DOD stated that it will address this research gap through several initiatives intended to advance the measurement of the performance of recruiting and advertising. The National Academy of Sciences is currently developing an evaluation framework for recruiting and advertising and expects to publish a report in early 2004. DOD's comments are provided in their entirety in appendix II. DOD officials also provided technical comments that we have incorporated as appropriate. We are sending copies of this report to interested congressional committees; the Secretaries of Defense, the Army, the Navy, and the Air Force; and the Commandant of the Marine Corps. We will send copies to other interested parties upon request. In addition, the report will be available at no charge on the GAO Web site at http://www.gao.gov. Please contact me at (202) 512-5559 if you or your staffs have any questions regarding this report. Key contributors to this report were John Pendleton, Lori Atkinson, Nancy Benco, Kurt Burgeson, Alan Byroade, Chris Currie, LaTonya Gist, Jim McGaughey, Charles Perdue, Barry Shillito, and John Smale. To describe the changes in the Department of Defense's (DOD) advertising programs and advertising funding trends since the late 1990s, we reviewed advertising exhibits in the operation and maintenance congressional justification books as well as budget information provided by the Office of the Secretary of Defense. Since our objective was to look at broad funding trends, we did not reconcile these requested amounts with actual obligations or expenditures by the active and reserve components. 
We interviewed active and reserve component officials to understand program changes since the late 1990s. We obtained recruiting mission goals and actual accessions back to fiscal year 1990 from the Office of the Secretary of Defense and the services. We obtained information on the quality of accessions of each of the active and reserve components back to fiscal year 1990, as well as the investment per active enlisted accession back to fiscal year 1990. We reviewed information from the Defense Human Resources Activity and the Joint Marketing and Advertising Committee for discussions of advertising programs. The services provided additional information regarding the types of advertising media they use. To assess the adequacy of the measures used by DOD to evaluate the effectiveness of advertising, we reviewed information on outcome measures used to evaluate the effectiveness of advertising provided by each of the active and reserve components; the advertising agencies that are their contractors; and the DOD Joint Advertising, Market Research, and Studies program. We spoke with the advertising contractors to learn what measures of effectiveness they are aware of and use. We also reviewed the requirements for establishing program objectives and outcome measures in the Government Performance and Results Act and in Office of Management and Budget guidance. We interviewed DOD and advertising officials from each of the active and reserve components, as well as representatives from the services' advertising agencies. We also reviewed their programs, procedures, and oversight activities. 
These interviews were conducted with officials in the Office of the Under Secretary of Defense for Personnel and Readiness; Office of the Under Secretary of Defense (Comptroller/Chief Financial Officer); Defense Human Resources Activity, Joint Advertising, Market Research, and Studies Office; Army Accessions Command, Fort Knox, Kentucky; Air Force Recruiting Service, Randolph Air Force Base, Texas; Navy Recruiting Command, Millington, Tennessee; Marine Corps Recruiting Command, Quantico Marine Corps Base, Virginia; Army National Guard Recruiting and Retention Command, Arlington, Virginia; Naval Reserve Command, New Orleans, Louisiana; Air Force Reserve Command, Robins Air Force Base, Georgia; and the Air National Guard Office of Recruiting and Retention, Arlington, Virginia. We also interviewed officials at the contracted advertising agencies for the joint program, the Army, the Navy, the Marine Corps, and the Air Force. We reviewed reports on recruiting and advertising from DOD, the Congressional Research Service, the private sector, and others. We obtained recruiting advertising budget and funding data for types of advertising from the Office of the Secretary of Defense. We reviewed, but did not verify, the accuracy of the data provided by DOD. We conducted our review from October 2002 through July 2003 in accordance with generally accepted government auditing standards. Program Evaluation: Strategies for Assessing How Information Dissemination Contributes to Agency Goals. GAO-02-923. Washington, D.C.: September 30, 2002. Military Personnel: Services Need to Assess Efforts to Meet Recruiting Goals and Cut Attrition. GAO/NSIAD-00-146. Washington, D.C.: June 23, 2000. Military Personnel: First-Term Recruiting and Attrition Continue to Require Focused Attention. GAO/T-NSIAD-00-102. Washington, D.C.: February 24, 2000. Military Recruiting: DOD Could Improve Its Recruiter Selection and Incentive Systems. GAO/NSIAD-98-58. Washington, D.C.: January 30, 1998. 
Military Personnel: High Aggregate Personnel Levels Maintained Throughout Drawdown. GAO/NSIAD-95-97. Washington, D.C.: June 2, 1995. Military Recruiting: More Innovative Approaches Needed. GAO/NSIAD- 95-22. Washington, D.C.: December 22, 1994. Military Downsizing: Balancing Accessions and Losses Is Key to Shaping the Future Force. GAO/NSIAD-93-241. Washington, D.C.: September 30, 1993. | The Department of Defense (DOD) must convince more than 200,000 people each year to join the military. To assist in recruiting, the military services advertise on television, on radio, and in print and participate in other promotional activities. In the late 1990s, some of the services missed their overall recruiting goals. In response, DOD added recruiting resources by increasing its advertising, number of recruiters, and financial incentives. By fiscal year 2003, DOD's total recruiting budget was approaching $4 billion annually. At the request of Congress, GAO determined the changes in DOD's advertising programs and funding trends since the late 1990s and assessed the adequacy of measures used by DOD to evaluate the effectiveness of its advertising. GAO recommends that DOD set clear, measurable advertising Since the late 1990s, DOD has revamped its recruiting advertising programs and nearly doubled the funding for recruiting advertising. The military services have revised many of their advertising campaigns and focused on complementing traditional advertising, such as by increasing the use of the Internet, and participating in more promotional activities, such as sports car racing events. DOD's total advertising funding increased 98 percent in constant dollars from fiscal year 1998 through fiscal year 2003--from $299 million to $592 million. The advertising cost per enlisted recruit has nearly tripled and is now almost $1,900. 
The military services agree that the revised strategies and increased investments have energized their advertising campaigns and better positioned them to recruit in an increasingly competitive marketplace. Today, almost all of the active and reserve components are meeting their overall recruiting goals in terms of the quality and quantity of new recruits. DOD does not have clear program objectives and adequate outcome measures to evaluate the effectiveness of its advertising as part of its overall recruiting effort. Thus, DOD cannot show that its increased advertising efforts have been a key reason for its overall recruiting success. Isolating the impact of advertising on recruiting efforts is inherently difficult because joining the military is a profound life decision. Moreover, DOD has not consistently tracked key information, such as public awareness of military recruiting advertising and the willingness of young adults to join the military. Such data could be used to help evaluate the effectiveness of advertising. Without sufficient information on advertising's effectiveness, DOD cannot determine the return on its advertising funding or make fact-based choices on how its overall recruiting investments should be allocated. | 6,783 | 499 |
Health care providers submit claims to the Medicare program in order to receive payment for services provided to beneficiaries. Financial limits known as therapy caps are one tool used to better manage spending on outpatient therapy services. Congress directed CMS, beginning in 2006, to establish an exceptions process for beneficiaries in need of services above the therapy caps. Since the program was created in 1965, CMS has administered Medicare through private contractors, currently known as MACs. The MACs are responsible for reviewing and paying claims in accordance with Medicare policy, and conducting provider outreach and education on correct billing practices. The MACs process more than 1.2 billion claims per year (the equivalent of 4.5 million claims per work day). The MACs use electronic payment systems, and they transfer any claims submitted on paper into electronic format for processing. The computer systems that the MACs use for processing and paying claims execute automated prepayment "edits," which are instructions programmed into the system software to identify errors on individual claims and to prevent payment of incomplete or incorrect claims. The system edits also help ensure that payments are made only for claims submitted by appropriate providers for medically necessary goods or services covered by Medicare for eligible individuals. Edits may result in automatic rejection of claims due to missing information or data errors, or in payment denial for ineligible services. In addition to this automated process, the MACs may conduct MMRs when they are unable to determine whether the services provided were medically necessary on the basis of the information on the claim. The MACs solicit documentation of medical necessity from the provider by issuing an additional documentation request (ADR) for the medical records associated with a service; providers are required to submit the records to the MACs within 45 days. 
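The automated prepayment edit process described above can be sketched as a simple rule pipeline. This is a hypothetical illustration only; the field names, codes, and rules are invented for the example and are not CMS's actual edit definitions.

```python
# Hypothetical sketch of automated prepayment "edits": each edit inspects a
# claim and either rejects it (missing or invalid data), denies payment
# (ineligible service), or passes it along for payment or manual review.
# Field names and rule contents are illustrative assumptions.

REQUIRED_FIELDS = {"provider_id", "beneficiary_id", "service_code", "billed_amount"}
COVERED_SERVICES = {"97110", "97112", "92507"}  # example therapy procedure codes

def run_prepayment_edits(claim: dict) -> str:
    # Edit 1: automatically reject claims with missing information.
    missing = REQUIRED_FIELDS - claim.keys()
    if missing:
        return f"REJECTED: missing fields {sorted(missing)}"
    # Edit 2: deny payment for services outside the covered set.
    if claim["service_code"] not in COVERED_SERVICES:
        return "DENIED: service not covered"
    # Claims passing all automated edits continue to payment,
    # or may be selected for manual medical review (MMR).
    return "PASSED"

claim = {"provider_id": "P1", "beneficiary_id": "B1",
         "service_code": "97110", "billed_amount": 120.0}
print(run_prepayment_edits(claim))  # PASSED
```

In practice such edits run inside the MACs' claims processing systems over millions of claims per day; the sketch shows only the reject/deny/pass distinction the report describes.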
Upon receipt of the records, licensed health care professionals perform the MMRs within 60 days. Providers and beneficiaries may appeal denials of services that are based on these reviews. Manual reviews can be conducted either before or after a claim is paid and are referred to as prepayment or postpayment reviews, respectively. CMS reports that although the MACs have the authority to review any claim at any time, the volume of claims prohibits manual review of most claims. In general, CMS directs the MACs to focus their MMRs on program integrity efforts targeting payment errors for services and items that pose the greatest financial risk to the Medicare program. We have previously reported that, overall, less than 1 percent of Medicare's claims are subject to medical record review by trained personnel. Medicare spending for outpatient therapy has increased from $1.3 billion in 1999 to $5.7 billion in 2011. (See fig. 1.) During this 12-year period, mean per-user spending on outpatient therapy grew threefold from about $400 to almost $1,200. In 2011, about 80 percent of the 4.9 million Medicare beneficiaries who used OT and PT/SLP did not exceed the annual cap of $1,870. Twenty percent of the Medicare beneficiaries using outpatient therapy (about 980,000 individuals) exceeded the cap that year and spent, on average, $3,000 on outpatient therapy. Therapy provided in nursing homes and private practice offices accounted for over 70 percent of outpatient therapy services in 2011, with the remaining services being provided in hospital outpatient departments and outpatient rehabilitation centers, and by home health agencies. In addition, studies have found that utilization of outpatient therapy services is not evenly distributed across the country. For example, in 2010, the HHS OIG reported on 20 counties with spending per beneficiary 72 percent higher than the national average.
MedPAC's analysis of outpatient therapy claims data from 2011 showed that average per-beneficiary spending varied widely by county, ranging from $406 to $3,582. The therapy caps that were first imposed in 1999 to control spending growth raised concern that patients with extensive need of outpatient therapy services would be affected adversely, and the caps were only in effect in 1999 and for part of 2003 due to a series of temporary congressional moratoria. When the moratoria expired in 2006, Congress required CMS to implement a process to allow exceptions to the caps for certain medically necessary services. The process allowed for two types of exceptions. The first was an automatic exception for certain conditions or complexities, such as hip and knee replacements. The second--called a manual exceptions process by CMS--was a preapproval process whereby a provider could submit a letter and supporting documentation requesting an exception--called a preapproval request--for up to 15 days of treatment above the annual cap, which would be manually reviewed by the MAC. If the services qualified for either an automatic or a manual exception, CMS guidance instructed the provider to include a "KX" modifier on each line of the resulting claim that contained a service above the cap. This modifier represented the provider's attestation that the services rendered were medically necessary, and it triggered an exception in the Medicare claims processing system, which ensured payment for those outpatient therapy services above the cap. An automatic exceptions process for claims with a KX modifier was extended through 2012 for claims over the annual cap of $1,880, with manual reviews required for claims above the threshold of $3,700. The American Taxpayer Relief Act of 2012 extended the Medicare therapy caps exceptions process, including the requirement for the manual review of claims over $3,700, through December 31, 2013.
According to CMS, in 2012, claims for services above the $1,880 cap without a KX modifier or above the $3,700 threshold were considered a benefit category denial, making the beneficiary liable for payment. To protect beneficiaries from unexpected liability for payment of denied claims above the threshold, CMS gave providers the option to send beneficiaries an Advance Beneficiary Notice of Noncoverage (ABN) informing them that Medicare might not pay for an item or service and that they might be liable for payment. An ABN enables the beneficiary to make an informed decision about whether to get services and accept financial responsibility for those services if Medicare does not pay. CMS implemented two types of MMRs during the last 3 months of 2012-- reviews of preapproval requests and reviews of claims submitted without preapproval. CMS did not issue complete guidance at the start of the MMR process, causing implementation challenges for the MACs, and the MACs were unable to fully automate systems for tracking the reviews of preapproval requests in the time allotted. CMS implemented two types of MMRs during the last 3 months of 2012-- reviews of preapproval requests and reviews of claims submitted without preapproval. First, CMS directed the MACs to manually review preapproval requests for outpatient therapy services above the $3,700 threshold--one for OT and one for PT/SLP combined--before the services were provided. Providers were permitted to request up to 20 days of treatment up to 15 days before providing medically necessary outpatient therapy services above $3,700. In contrast to the MMR process as implemented in 2006, CMS guidance did not allow any automatic exceptions for certain conditions; the MACs had to manually review preapproval requests for any services above $3,700. 
CMS officials told us that they included the preapproval request process in 2012 in order to help protect beneficiaries from being held liable for payment of claims not affirmed by the MMR, as the process would give the provider and beneficiary guidance as to whether the MACs would affirm or not affirm payment for the requested outpatient therapy services. In order to manage the expected volume of preapproval requests submitted to MACs at the start of the MMR process, CMS divided all outpatient therapy providers among three phases, based primarily on their past billing practices. Providers were instructed to submit preapproval requests during their assigned phase. CMS assigned providers with the highest average billing per patient for outpatient therapy services in 2011 to the first phase, which began on October 1, 2012. According to CMS, these high billers accounted for approximately 25 percent of all outpatient therapy providers and were subject to MMR for the full 3 months of the MMR process during 2012. The second phase began on November 1 and included providers with the next highest billing (also about 25 percent of the total number of providers). The third phase, which included the remaining 50 percent of outpatient therapy providers, generally the lowest billers, began on December 1. CMS officials explained that providers with historically low billing were less likely to have patients who would reach the threshold. CMS also included providers identified by law enforcement or the HHS OIG as being involved in active fraud investigations in the third phase. CMS officials stated that they did not include these providers until the third phase to avoid conflicts with ongoing investigations. As of December 1, all outpatient therapy providers were included in the MMR process. 
CMS notified providers about the preapproval request process and assignment of phases by letter and provided further information through three conference calls and additional agency communications. CMS instructed providers to submit preapproval requests by mail or fax, including key information such as provider and beneficiary identification numbers as well as supporting documentation including treatment notes and progress reports. CMS also instructed the MACs to post guidelines on their websites to educate providers about these requirements. In addition, CMS sent letters in mid-September 2012 to all Medicare beneficiaries who had received therapy services totaling over $1,700 by that date informing them that they might have to pay for services over the cap should the MACs determine that the services were not medically necessary. To expedite the preapproval process, CMS instructed the MACs to review preapproval requests within 10 business days of receipt of all requested documentation to determine whether the services were medically necessary. After reviewing the requests, the MACs were required to notify providers and beneficiaries of the number of treatment days affirmed or provide detailed reasons for not affirming a request. In addition, CMS instructed the MACs to automatically approve any requests they were unable to review within 10 business days. The MACs had to inform providers of their decisions by telephone, fax, or letter, and postmark all letters by the 10th day after receipt of all requested documentation. Providers were allowed to resubmit nonaffirmed requests with additional documentation for consideration by the MAC, at which point the MAC would have another 10 days within which to review the new request. (See fig. 2.) Second, CMS instructed the MACs to develop a mechanism for tracking preapproval requests in order to match the requests with submitted claims. 
Because preapproval requests were received by fax or mail, not through the automated claims payment systems, the MACs had to manually match the claim with the corresponding preapproval request. If the services included on the claim matched those affirmed during the preapproval process, the MAC would pay the claim; if not, the MAC would issue an ADR for the medical records associated with the services and conduct further manual review, which could extend the review process more than 3 months. The MACs were also required to manually review submitted claims before providing payment for therapy services provided above $3,700 without a preapproval request. Effective for dates of service on or after October 1, 2012, CMS required the MACs to implement an edit in part of the claims processing system to stop claims that reached the $3,700 threshold and to trigger MMRs by the MACs. To manually review claims without preapprovals, the MACs requested and reviewed supporting documentation from providers to determine whether the services were medically necessary. As with typical prepayment manual reviews, providers had 45 days to provide documentation of medical necessity, and the MACs had 60 days to review the supporting materials and notify providers and beneficiaries of their decisions. (See fig. 2.) If a MAC requested additional documentation, the review time frames would begin again. In contrast to preapproval request decisions, the MACs' claims payment systems automatically send letters notifying providers and beneficiaries of payment determinations. The MACs did not receive complete CMS guidance before the start of the 3-month MMR process regarding how the MACs should manage incomplete preapproval requests, how they should count the 10-day review time frame, and how they should handle preapproval requests received in the wrong phase. 
In addition, the MACs did not have enough time to fully automate systems for tracking and processing preapproval requests before the start of the MMR process. CMS did not issue complete guidance at the start of the MMR effort and changed the process throughout the 3-month period, which created implementation challenges. CMS provided instruction to the MACs through various forms of written guidance, as well as twice-weekly conference calls beginning in August 2012. However, CMS did not issue instructions on how the MACs should conduct MMRs of preapproval requests until August 31, 2012. The MACs we interviewed stated that receiving this guidance 1 month before the October 1st start of the MMR process made it difficult for them to adequately prepare and establish systems for reviews of preapproval requests. For example, one MAC said that because of the short turnaround time for implementation, it was not prepared for the high volume of preapproval requests received in the early weeks of the process, which caused it to approve requests without reviewing them. Another stated that it could have better managed the volume of preapproval requests received if it had more time to develop needed support systems. This late guidance also made it difficult for the MACs to train temporary staff assigned to the MMR process in a timely way; two MACs noted that they were still training temporary staff in October, after the start of the process, and one added that this made it difficult to manage the volume of preapproval requests received in October. Further, CMS did not provide guidance on how the MACs should process incomplete preapproval requests, which accounted for approximately 23 percent of the total requests submitted, until November 7, 2012. CMS officials told us they did not initially issue such guidance because they did not anticipate receiving a high volume of incomplete submissions. As a result, the MACs handled incomplete requests in different ways. 
For example, one MAC held incomplete requests--as many as several thousand--as pending without making a determination or providing a response to providers and beneficiaries within 10 business days. Another initially determined that incomplete requests would be rejected and returned to the provider for additional information. In addition, CMS did not initially issue clear instruction about how the MACs were to count the 10-day time frame for provider and beneficiary notification, which may have caused notification delays. CMS initially instructed the MACs to make decisions on preapproval requests and inform providers and beneficiaries of their decisions within 10 business days of receipt of all requested documentation, and to automatically approve requests they were unable to review within 10 days. The MACs stated that they were unclear, however, about how to count the 10-day time frame. On November 7, 2012, CMS clarified that the count was to begin on the day the MAC received the preapproval request in its mailroom, not in its MMR department. The MACs we interviewed stated that they received a large volume of requests per day--at times several hundred. In addition, two noted that providers often sent in additional supporting documentation for prior requests, which added to the volume of paper files the MACs had to manage and may have created a further lag between when the complete requests were received and when the paperwork was given to MMR staff for review. Before CMS issued this clarifying guidance, providers and beneficiaries may have experienced a longer wait time than expected if a MAC counted the 10 days beginning when the MMR department, rather than the mailroom, received the completed requests. Finally, CMS did not initially provide the MACs with instructions about how to handle preapproval requests and claims submitted in the wrong phase. 
In its written guidance issued on August 31, 2012, CMS instructed the MACs that they should not review preapproval requests any sooner than 15 days before the start of each phase for providers within that phase, but did not clarify whether requests received out of phase should be rejected and returned to providers, not affirmed, or held as pending until the start of the phase. As a result, one of the MACs we spoke with stated that it initially held requests received out of phase to be processed in the correct phase, but later in the process began rejecting such requests. CMS and the three MACs we interviewed reported challenges with processing preapproval requests because they were not able to fully automate systems to receive and track them in the time allotted. MACs typically conduct either prepayment or postpayment reviews after claims have been submitted; they do not typically receive or conduct MMRs of preapproval requests before the provision of services. All three MACs interviewed told us that MMRs of preapproval requests were more time- consuming and cumbersome because they had to process them outside of their claims processing systems. In addition, all three MACs told us they suspended some of their other medical review efforts in order to implement the mandated outpatient therapy MMRs. For example, the MACs we interviewed explained that they typically use automated edits in their claims processing systems to flag claims for prepayment review in areas identified to be at higher risk for improper payments, such as certain billing codes or service areas, but told us they turned off some other outpatient therapy edits while conducting the mandated MMRs. All three MACs interviewed said that it was difficult to develop fully automated systems for processing preapproval requests at the start of the 3-month process. 
Two noted that they would have required several months to develop the type of automated systems that, integrated with their regular claims processing systems, would have enhanced the efficiency and accuracy of their MMR efforts. However, CMS did not issue written guidance until August 31, 2012, instructing the MACs to develop processes for receiving and tracking preapproval requests. The MACs we interviewed adapted their systems to manage the preapproval process in different ways with varying degrees of automation. Two of the MACs received requests by fax, scanned the requests and supporting documents, and saved them electronically by date or other identification numbers for tracking. One of these MACs also developed a database in which it manually entered and tracked its MMR decisions, which MAC staff then manually searched to match with submitted claims. The other, however, stated that it did not have time to establish such a database, and conducted reviews without any automation. A third MAC received all requests by mail and developed a database in which it entered its preapproval decisions. This MAC also developed an electronic edit in its claims processing system that tracked incoming therapy claims so they could be processed according to the preapproval decision. Though this MAC was able to automate this step in the preapproval process, staff explained that they were still in the process of testing the edit after the start of the MMR process, and continued to address system errors until December. CMS officials estimate that preapproval requests and claims for over 115,000 Medicare beneficiaries were subject to approximately 167,000 MMRs conducted by the MACs as of March 1, 2013. Delays in claim submissions and pending appeals create uncertainty about the final outcomes of the 2012 MMR process. 
CMS staff estimated that the MACs manually reviewed more than 167,000 preapproval requests and claims without preapprovals for outpatient therapy from October 1, 2012, through December 31, 2012, affecting more than 115,000 Medicare beneficiaries. Of these MMRs, an and 57,000 were for estimated 110,000 were for preapproval requestsclaims for services that were not preapproved. Of the estimated 110,000 preapproval requests reviewed, the MACs affirmed 80,500 (73 percent) and did not affirm 29,500 (27 percent). As of March 1, 2013, providers who did not request preapprovals submitted an estimated 57,000 claims for outpatient therapy services provided during the last quarter of 2012. The results of the MMRs of claims without preapprovals resulted in 19,500 (34 percent) claims affirmed for payment and 37,000 claims (66 percent) not affirmed for payment. These estimates indicate that MMRs of both preapproval requests and claims resulted in a number of nonaffirmed outpatient therapy services during the last quarter of 2012 (see fig. 3). Both CMS officials and MAC staff acknowledged that the MACs were not able to process all the preapprovals submitted in a timely manner. The MACs do not usually conduct preapprovals of services, and the MACs stated that the high volume of preapproval requests outpaced the capacity of the MACs to review them. For example, the MACs we interviewed reported receiving thousands of preapproval requests by mail or fax prior to the start of the MMRs. By mid-October 2012, the MACs estimated they had received 46,000 preapproval requests for outpatient therapy services above the $3,700 threshold. In addition, the MACs rejected about 23 percent of all preapproval requests because they were incomplete. Incomplete requests could be resubmitted. In November 2012, on average more than 24,000 preapproval requests were categorized as having not been reviewed at the end of each of the 4 weeks. 
Overall, the MACs estimated they completed MMRs for about 52 percent of the total preapproval requests received within the 10 days required by CMS. (See fig. 4.) By the end of December 2012, the MACs had conducted MMRs on about 15,000 claims submitted without preapproval requests. However, the MACs were not under the same time constraints when reviewing claims because, unlike the preapproval requests, CMS guidance permits the MACs 2 months to conduct MMRs after they receive the supporting documentation. In addition, claims for therapy provided during the last quarter of 2012 were submitted incrementally, increasing from about 15,000 at the end of December to almost 57,000 by March 1, 2013. As a result, the MMRs of these claims are staggered over time. CMS officials indicated that the number of claims submitted and beneficiaries affected by these prepayment MMRs would continue to increase in 2013. Although CMS was able to estimate the results of the MMRs conducted, the final outcomes of the 2012 MMRs remain uncertain due to inconsistencies among the MACs in how the data were collected, and errors in the calculation of the number of preapproval requests received and the MMR decisions made. In addition, the time lag for submitting claims and finalizing the appeals process means that the final outcome of the MMR process will not be known for months. CMS officials told us that MACs did the "best they could" and that the final numbers provided in the MMR weekly workload report were obtained outside the MACs' computerized systems and should be considered approximate or an estimate of the results of the reviews at the time of this report. The manual processes CMS and the MACs used to complete the weekly MMR workload reports resulted in inconsistencies in the data. Both the CMS and MAC officials interviewed acknowledged that human error may have contributed to discrepancies in the reported numbers because the reports were assembled manually. 
In addition, due to the timing of CMS guidance throughout the MMR, the MACs reported collecting key data elements differently. For example, one MAC included the number of requests rejected in the total number of requests completed while two others did not. CMS officials also reported that they identified gaps or errors in MACs' weekly workload reports, but the agency did not require the MACs to go back to revise prior weeks' data. As a result, the running totals included errors from prior weeks and the final numbers do not total correctly. For example, the total number of treatment days that CMS estimates were requested (2.4 million) is significantly greater than the estimated total number of treatment days affirmed plus days nonaffirmed (1.9 million). The combination of potential delays in billing, the prepayment review of claims, and the appeals process also creates uncertainty about the final outcomes of the mandated MMRs associated with outpatient therapy services provided in 2012. Because claims for services provided from October 1, 2012, through December 31, 2012, may be submitted to the MACs as late as December 31, 2013, the total number of claims reviewed will not be known until 2014. In addition, CMS officials, some MAC staff, and outpatient therapy provider association representatives reported the filing of appeals for denials of payment for therapy provided during this period. The appeals process--which may involve five levels of review-- could take more than 2 years to reach a conclusion, and any reversals of prior therapy coverage denials will affect the final outcomes of the 2012 MMR process. HHS provided written comments on a draft of this report. HHS highlighted CMS's 2012 efforts to review the medical records associated with requests for exceptions for outpatient therapy services in excess of the annual $3,700 threshold. 
The department noted that CMS managed the new workload without additional funding and within a short time frame, and that the MACs shifted staff from other responsibilities to the MMR process. Outpatient therapy manual reviews were extended for 2013 and, according to HHS, CMS streamlined the MMRs of therapy services by transitioning the responsibility for these reviews from the MACs to the agency's RACs as of April 1, 2013. The RACs are conducting prepayment review of claims at the $3,700 threshold in California, Florida, Illinois, Louisiana, Michigan, Missouri, New York, North Carolina, Ohio, Pennsylvania, and Texas, and are conducting immediate postpayment reviews in all other states. HHS's comments are printed in appendix I. We are sending copies of this report to the Secretary of Health and Human Services, interested congressional committees, and others. In addition, the report is available at no charge on the GAO website at http://www.gao.gov. If you or your staff have any questions about this report, please contact me at (202) 512-7114 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made key contributions to this report are listed in appendix II. In addition to the contact named above, Martin T. Gahart, Assistant Director; George Bogart; Anne Hopewell; and Sara Rudow made key contributions to this report. | In 2011, Medicare paid about $5.7 billion to provide outpatient therapy services for 48 million beneficiaries. Rising Medicare spending for outpatient therapy services--physical therapy, occupational therapy, and speech-language pathology--has long been of concern. Congress established per person spending limits, or "therapy caps," for nonhospital outpatient therapy, which took effect in 1999. 
In response to concerns that some beneficiaries needing extensive services might be affected adversely, Congress imposed temporary moratoria on the caps several times until 2006, when it required CMS to implement an exceptions process. The Middle Class Tax Relief and Job Creation Act of 2012, in addition to extending the exceptions process, required CMS to conduct MMRs of requests for exceptions for outpatient services provided on or after October 1, 2012, over an annual threshold of $3,700. The act also mandated that GAO report on the implementation of the MMR process. This report describes (1) CMS's implementation of the 2012 MMR process, and (2) the number of individuals and claims subject to MMRs and the outcomes of these reviews. GAO reviewed relevant statutes, CMS policies and guidance, and CMS data on these reviews. GAO also interviewed CMS staff and officials from three MACs that accounted for almost 50 percent of the MMR workload and that processed claims for states previously determined to be at a higher risk for outpatient therapy improper payments. The Centers for Medicare & Medicaid Services (CMS) implemented two types of manual medical reviews (MMR)--reviews of preapproval requests and reviews of claims submitted without preapproval--for all outpatient therapy services that were above a $3,700 per-beneficiary threshold provided during the last 3 months of 2012. However, CMS did not issue complete guidance on how to process preapproval requests before the implementation of the MMR process in October 2012, and the Medicare Administrative Contractors (MAC) that conducted the MMRs were unable to fully automate systems for tracking preapproval requests in the time allotted. CMS required the MACs to manually review preapproval requests within 10 business days of receipt of all supporting documentation to determine whether the services were medically necessary, and to automatically approve any requests they were unable to review within that time frame. 
CMS officials told GAO that the purpose of the preapproval process was to protect beneficiaries from being liable for payment for nonaffirmed services by giving the provider and beneficiary guidance as to whether Medicare would pay for the requested services. If a provider delivered services without submitting a preapproval request, the MACs were required to manually review submitted claims above the $3,700 threshold prior to payment within 60 days of receiving the needed documentation. The MACs faced particular challenges with implementing reviews of preapproval requests because CMS continued to issue new guidance on how to manage preapproval requests after the MMR process started. For example, CMS did not inform the MACs how to process incomplete requests or count the 10-day preapproval request review time frame until November 7, 2012, and the MACs initially handled requests differently. In addition, all three MACs GAO interviewed told GAO that MMRs of preapproval requests were especially challenging because they did not have time to fully automate systems for tracking and processing the requests before the start of the MMR process, although they adapted their systems to manage the requests in different ways. CMS officials estimated that the MACs reviewed an estimated total of 167,000 preapproval requests and claims for outpatient therapy service above the $3,700 threshold provided from October 1, 2012, through December 31, 2012. Of these reviews, CMS estimated that 110,000 were for preapproval requests and 57,000 were for claims submitted without prior approval. However, due in part to the lack of automation, CMS officials reported that the total number of reviews should be considered estimates of the results of the MMR process at the time of this report. CMS estimated that the MACs affirmed about two-thirds of the preapproval requests and about one-third of the claims submitted without preapproval. 
Because providers can appeal denials of payment, the final outcome of the MMRs remains uncertain. CMS also estimated that by December 31, 2012, over 115,000 beneficiaries were affected by the reviews in 2012, a number that will rise as more claims subject to review are submitted throughout 2013. In its comments on a draft of this report, HHS emphasized that CMS managed the 2012 MMR process without additional funding and within a short time frame. HHS noted that the MMR process was extended for 2013 and CMS transitioned the responsibility for these reviews to other contractors as of April 1, 2013. | 5,635 | 979 |
The safety and quality of the U.S. food supply are governed by a highly complex system that is based on more than 30 laws and administered by 12 agencies. In addition, there are over 50 interagency agreements to govern the combined food safety oversight responsibilities of the various agencies. The federal system is supplemented by the states, which have their own statutes, regulations, and agencies for regulating and inspecting the safety and quality of food products. The United States Department of Agriculture (USDA) and the Food and Drug Administration (FDA), within the Department of Health and Human Services (HHS), have most of the regulatory responsibilities for ensuring the safety of the nation's food supply and account for most federal food safety spending. Under the Federal Meat Inspection Act, the Poultry Products Inspection Act, and the Egg Products Inspection Act, USDA is responsible for the safety of meat, poultry, and certain egg products. FDA, under the Federal Food, Drug, and Cosmetic Act and the Public Health Service Act, regulates all other foods, including whole (or shell) eggs, seafood, milk, grain products, and fruits and vegetables. Appendix I summarizes the agencies' responsibilities. Existing statutes give the agencies different regulatory and enforcement authorities. For example, food products under FDA's jurisdiction may be marketed without the agency's prior approval. On the other hand, food products under USDA's jurisdiction must generally be inspected and approved as meeting federal standards before being sold to the public. Although recent legislative changes have strengthened FDA's enforcement authorities, the division of inspection authorities and other food safety responsibilities has not changed.
As we have reported, USDA traditionally had more comprehensive enforcement authority than FDA; however, the Public Health Security and Bioterrorism Preparedness and Response Act of 2002 has granted FDA additional enforcement authorities that are similar to USDA's. For example, FDA can now require all food processors to register with the agency so that they can be inspected. FDA can also temporarily detain food products when there is credible evidence that the products present a threat of serious adverse health consequences, and FDA can require that entities such as the manufacturers, processors, and receivers of imported foods keep records to allow FDA to identify the immediate previous source and the immediate subsequent recipients of food, including its packaging. This record keeping authority is designed to help FDA track foods in the event of future health emergencies, such as terrorism-related contamination. In addition, FDA now has the authority to require advance notice of imported food shipments under its jurisdiction. Despite the additional enforcement authorities recently granted to FDA, important differences between the agencies' inspection and enforcement authorities remain. Finally, in addition to their established food safety and quality responsibilities, following the events of September 11, 2001, the federal agencies began to address the potential for deliberate contamination of agriculture and food products. In 2001, by Executive Order, the President added the food industries to the list of critical infrastructure sectors that need protection from possible terrorist attack. As a result of this Executive Order, the Homeland Security Act of 2002 establishing the Department of Homeland Security, and subsequent Presidential Directives, the Department of Homeland Security provides overall direction on how to protect the U.S. food supply from deliberate contamination. 
The Public Health Security and Bioterrorism Preparedness and Response Act also included numerous provisions to strengthen and enhance food safety and security. As we have stated in numerous reports and testimonies, the fragmented federal food safety system is not the product of strategic design. Rather, it emerged piecemeal, over many decades, typically in response to particular health threats or economic crises. In short, what authorities agencies have to enforce food safety regulations, which agency has jurisdiction to regulate which food products, and how frequently agencies inspect food facilities are determined by the legislation that governs each agency, or by administrative agreement between the two agencies, without strategic design as to how best to protect public health. It is important to understand that the origin of this problem is historical and, for the most part, grounded in the federal laws governing food safety. We and other organizations, including the National Academies, have issued many reports detailing problems with the federal food safety system and have made numerous recommendations for change. While many of these recommendations have been acted upon, problems in the food safety system persist, largely because food safety responsibilities are still divided among agencies that continue to operate under different laws and regulations. As a result, there is fragmentation, inconsistency, and overlap in the federal food safety system. These problems are manifested in numerous ways, as discussed below. Federal agencies have overlapping oversight responsibilities. Agency jurisdictions, either assigned by law over time or determined by agency agreements, result in overlapping oversight of single food products. For example, which agency is responsible for ensuring the safety of frozen pizzas depends on whether or not pepperoni is used as a topping. Figure 1 shows the agencies involved in regulating the safety of frozen pizza.
In other instances, such as canned soups, it is the amount of a particular ingredient contained in the food product that governs whether it is subject to FDA or USDA inspection. As a result, canned soup producers are also subject to overlapping jurisdiction by the two food safety agencies. Overlap and duplication result in inefficient use of inspection resources. Food processing establishments may be inspected by more than one federal agency because they process foods that are regulated under different federal laws or because they participate in voluntary inspection programs. As of February 2004, FDA's records show that there are about 2,000 food processing facilities in the United States that may handle foods regulated by both FDA and USDA because their products include a variety of ingredients. Multi-ingredient products that are regulated by both FDA and USDA include pizza, canned soups, and sandwiches. GAO found that 514 of the 8,653 FDA inspections conducted in six states between October 1987 and March 1991 duplicated those of other federal agencies. For example, FSIS had five inspectors assigned full time to a plant that processed soups containing meat or poultry, yet FDA inspected the same plant because it also processed soups that did not contain meat or poultry. Thus, rather than having the full-time inspectors assigned to the plant conduct inspections for all the plant's products, additional inspectors from another agency were required to conduct separate inspections of products as a result of the different ingredients contained in the product. Moreover, there is also inefficient use of federal inspection resources dedicated to overseeing the safety of seafood products. FDA has responsibility for ensuring the safety of domestic and imported seafood products. 
However, as we reported in January 2004, the NOAA Seafood Inspection Program also provides fee-for-service safety, sanitation, and/or product inspections for approximately 2,500 foreign and domestic firms annually. Thus, both FDA and NOAA's programs duplicate inspections of seafood firms. To make more efficient use of federal inspection resources, we have recommended that FDA work toward developing a memorandum of understanding that leverages NOAA's Seafood Inspection Program resources to augment FDA's inspection capabilities. Federal agencies' different authorities result in inconsistent inspection and enforcement. Despite the additional enforcement authorities granted to FDA by the Public Health Security and Bioterrorism Preparedness and Response Act of 2002, differences between the agencies' inspection and enforcement authorities remain. For example, when FSIS inspectors observe serious noncompliance with USDA's food safety regulations, they have the authority to immediately withdraw their inspection services. This effectively stops plant operations because a USDA inspector must be present and food products under USDA's jurisdiction generally must be inspected and approved as meeting federal standards before being sold to the public. This ensures more timely correction of problems that could affect the safety of meat and poultry products. In contrast, food products under FDA's jurisdiction may be marketed without the agency's prior approval. Thus, while FDA may temporarily detain food products when there is credible evidence that the products present a threat of serious adverse health consequences, FDA currently has no authority comparable with USDA's allowing it to stop plant operations. As a result, problems identified during FDA inspections may take longer to correct. Federal agencies' different authorities to oversee imported foods also result in inconsistent efforts to ensure safety. 
A significant amount of the food we consume is imported; yet, as we have testified in the past, the same fragmented structure and inconsistent regulatory approach is being used to ensure the safety of imported foods. For example, more than three-quarters of the seafood Americans consume is imported from an estimated 13,000 foreign suppliers in about 160 different countries. As we have reported, however, FDA's system for ensuring the safety of imported seafood does not sufficiently protect consumers. For example, the agency inspected about 100 of roughly 13,000 foreign firms in 2002 and tested slightly over 1 percent of imported seafood products. In January 2004, we reported that despite some improvements, FDA is still able to inspect only a small proportion of U.S. seafood importers and visit few seafood firms overseas yearly. As we have previously recommended, a better alternative would be to strengthen FDA's ability to ensure the safety of imported foods by requiring that all food eligible for importation to the United States be produced under equivalent food safety systems. USDA has such authority. In fact, USDA is legally required to review certifications made by other countries that their meat and poultry food safety systems ensure compliance with U.S. standards and USDA must also conduct on-site inspections before those products can be exported to the United States. At this time, 37 countries are approved to export meat and poultry products to the United States. Frequency of inspections is not based on risk. Under current law, USDA inspectors maintain continuous inspection at slaughter facilities and examine each slaughtered meat and poultry carcass. They also visit each processing plant at least once during each operating day. For foods under FDA jurisdiction, however, federal law does not mandate the frequency of inspections. The differences in inspection frequencies are, at times, quite arbitrary, as in the case of jointly regulated food products. 
For example, as we testified in 2001, federal responsibilities for regulating the production and processing of a packaged ham and cheese sandwich depend on whether the sandwich is made with one or two slices of bread, not on the risk associated with its ingredients. As a result, facilities that produce closed-faced sandwiches are inspected on average once every 5 years by FDA, whereas facilities that produce open-faced sandwiches are inspected daily by FSIS. Federal expenditures are not based on the volume of foods regulated, consumed, or their risk of foodborne illness. FDA and FSIS food safety efforts are based on the respective legislation governing their operation. As a result, expenditures for food safety activities are disproportionate to the amount of food products each agency regulates and to the level of public consumption of those food products. FDA is responsible for ensuring the safety of approximately 79 percent of the foods Americans consume annually, while its budget represented only 40 percent ($508 million) of the approximately $1.3 billion spent on food safety oversight during fiscal year 2003. In contrast, FSIS inspects approximately 21 percent of the foods Americans consume annually, while its food safety budget represented 60 percent ($756 million) of the federal expenditures for food safety in 2003. Figure 2 shows the imbalance between the dollar amounts that the agencies spend on food safety activities and the volume of foods Americans consume annually. Perhaps more importantly, the agencies' food safety expenditures are disproportionate to the percentage of foodborne illnesses linked to the food products they regulate. For example, according to foodborne illness data compiled by the CDC, USDA-regulated foods account for about 32 percent of reported foodborne outbreaks with known sources. Conversely, FDA-regulated foods account for about 68 percent of these outbreaks. (See fig. 3.) 
Yet, USDA's food safety expenditures are about 49 percent more than FDA's. Finally, as figure 4 shows, FSIS has 9,170 employees that are, by law, responsible for daily oversight of approximately 6,464 meat, poultry, and egg product plants. FDA has roughly 1,900 food inspection employees who, among other things, inspect about 57,000 food establishments. Overlaps in egg safety responsibility compromise safety. Overlapping responsibilities have resulted in extensive delays in the development of a comprehensive regulatory strategy to ensure egg safety. As we have reported, no single federal agency has overall responsibility for the policies and activities needed to ensure the safety and quality of eggs and egg products. Figure 5 shows the overlapping responsibilities of multiple agencies involved in overseeing the production, processing, and transportation of eggs and egg products. As shown in figure 5, FDA has the primary responsibility for the safe production and processing of eggs still in the shell (known by industry as shell eggs), whereas FSIS has the responsibility for food safety at the processing plants where eggs are broken to create egg products. Despite FSIS and FDA attempts to coordinate their efforts on egg safety, more than 10 years have passed since the problem of bacterial contamination of intact shell eggs was first identified, and a comprehensive safety strategy has yet to be implemented. Agency representatives serving on the President's Council on Food Safety developed an Egg Safety Action Plan in 2000 and identified egg safety as one component of food safety that warranted immediate federal, interagency action. As of March 2004, comprehensive regulations to implement the actions the agencies identified in the Action Plan have not been published. Claims of health benefits for foods may be treated inconsistently by different federal agencies. 
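The expenditure and staffing imbalances described above can be verified directly from the figures cited in this testimony. The short Python sketch below is illustrative only; it uses the fiscal year 2003 budget numbers, the consumption shares, and the staffing counts quoted in the statement to compute each agency's share of federal food safety spending, the spending premium, and the rough per-employee inspection workload:

```python
# Back-of-the-envelope check of the figures cited in this testimony.
# All inputs come from the statement itself (FY2003); the derived
# ratios are illustrative, not independently sourced.

fda_budget = 508   # FDA food safety budget, $ millions, FY2003
fsis_budget = 756  # FSIS food safety budget, $ millions, FY2003
total = fda_budget + fsis_budget  # the "approximately $1.3 billion" total

fda_food_share = 79   # percent of foods Americans consume, regulated by FDA
fsis_food_share = 21  # percent regulated by FSIS (USDA)

# Budget shares: FDA ~40 percent, FSIS ~60 percent of federal spending
fda_budget_share = fda_budget / total    # ~0.40
fsis_budget_share = fsis_budget / total  # ~0.60

# USDA spends "about 49 percent more" than FDA
usda_premium = (fsis_budget - fda_budget) / fda_budget  # ~0.49

# Spending per percentage point of the food supply regulated ($ millions)
fda_per_point = fda_budget / fda_food_share    # ~6.4
fsis_per_point = fsis_budget / fsis_food_share  # ~36.0

# Inspection workload: establishments per inspection employee
fda_workload = 57000 / 1900   # ~30 establishments per FDA employee
fsis_workload = 6464 / 9170   # ~0.7 plants per FSIS employee

print(f"Budget shares -- FDA: {fda_budget_share:.0%}, FSIS: {fsis_budget_share:.0%}")
print(f"USDA spends {usda_premium:.0%} more than FDA")
print(f"$M per point of food supply -- FDA: {fda_per_point:.1f}, FSIS: {fsis_per_point:.1f}")
print(f"Establishments per employee -- FDA: {fda_workload:.1f}, FSIS: {fsis_workload:.2f}")
```

The computed ratios match the percentages quoted in the testimony and make the core point concrete: per percentage point of the food supply regulated, FSIS spends roughly five to six times what FDA does, while each FDA inspection employee is responsible for dozens of establishments.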
Overlaps also exist in the area of health benefit claims associated with certain foods and dietary supplements. FDA, USDA, and the Federal Trade Commission (FTC) share responsibility for determining what types of health benefit claims are allowed on product labels and in advertisements. The varying statutory requirements among the agencies can lead to inconsistencies in labeling and advertisements. As a result, the use of certain health benefit claims on a product might be denied by one agency but allowed by another. For example, the FTC may allow a health claim in an advertisement as long as it meets the requirements of the Federal Trade Commission Act, even if FDA has not approved it for use on a label. Similarly, USDA reviews requests to use health claims on a case-by-case basis, regardless of whether or not FDA has approved them. Thus, consumers face a confusing array of claims, which may lead them to make inappropriate dietary choices. Multiple agencies must respond when serious food safety challenges emerge. Inconsistent food safety authorities result in the need for multiple agencies to respond to emerging food safety challenges. This was illustrated recently with regard to ensuring that animal feed is free of diseases, such as bovine spongiform encephalopathy (BSE), or mad cow disease. A fatal human variant of the disease is linked to eating beef from cattle infected with BSE. As we reported in 2002, four federal agencies are responsible for overseeing the many imported and domestic products that pose a risk of BSE. One, the U.S. Customs and Border Protection, screens all goods entering the United States to enforce its laws and the laws of 40 other agencies. The second, USDA's Animal and Plant Health Inspection Service (APHIS), protects livestock from animal diseases by monitoring the health of domestic and imported livestock. 
The third, USDA's FSIS, monitors the safety of imported and domestically produced meat and, at slaughterhouses, tests animals prior to slaughter to determine if they are free of disease and safe for human consumption. Finally, FDA monitors the safety of animal feed--animals contract BSE through feed that contains protein derived from the remains of diseased animals. During the recent discovery of an infected cow in Washington state, FDA investigated facilities that might have handled byproducts from the infected animal to make animal feed. Figure 6 illustrates the fragmentation in the agencies' authorities. When we issued our report in 2002, BSE had not been found in U.S. cattle. However, we found a number of weaknesses in import controls. Because of those weaknesses and the disease's long incubation period--up to 8 years--we concluded that BSE might be silently incubating somewhere in the United States. Then, in May 2003, an infected cow was found in Canada, and in December 2003, another was found in the state of Washington. USDA's Animal and Plant Health Inspection Service operates the surveillance program that found the infected U.S. cow, while FDA must ensure that the disease cannot spread by enforcing an animal feed ban that prohibits the use of cattle brains and spinal tissue, among other things, in cattle feed. With regard to the meat from the BSE-infected animal found in Washington state, FSIS conducted a recall of meat distributed in markets in six states. Both USDA and FDA have reported that meat from the cow was not used in FDA-regulated foods. However, had the meat been used, for example, in canned soups that contained less than 2 percent meat, FDA--not FSIS--would have been responsible for working with companies to recall those foods. (As app. II shows, the agencies' oversight responsibilities for food products vary depending on the amount of beef or poultry content.) 
Neither FDA nor USDA has authority under existing food safety laws to require a company to recall food products. Both agencies work informally with companies to encourage them to initiate a recall, but our ongoing work shows that each agency has different approaches and procedures. This can be confusing to food processors involved in a recall. Overlapping responsibilities in responding to mad cow disease highlight the challenges that government and industry face when responding to the need to remove contaminated food products from the market. As part of work currently underway, we are looking at USDA and FDA food recalls--including USDA's oversight of the BSE-related recall and FDA's oversight of the feed ban. We are also monitoring both USDA's and FDA's BSE-response activities. There are undoubtedly other federal food safety activities where overlap and duplication may occur. For example, in the areas of food safety research, public outreach, or both, FDA and USDA's Economic Research Service, FSIS, and the Cooperative State Research, Education and Extension Service have all received funding to develop food safety-related educational materials for the public. In addition, responsibility for regulating genetically modified foods is shared among FDA, USDA, and the Environmental Protection Agency (EPA). However, we have not yet examined the extent to which these and other areas of overlap and duplication impact the efficiency of the food safety system. The fragmented legal and organizational structures of the federal food safety system are now further challenged by the realization that American farms and food are vulnerable to potential attack and deliberate contamination. 
As we recently reported in a statement for the record before the Senate Committee on Governmental Affairs, bioterrorist attacks could be directed at many different targets in the farm-to-table continuum, including crops, livestock, food products in the processing and distribution chain, wholesale and retail facilities, storage facilities, transportation, and food and agriculture research laboratories. Experts believe that terrorists would attack livestock and crops if their primary intent were to cause severe economic dislocation. Terrorists could decide to contaminate finished food products if their motive were to harm humans. Both FDA and USDA have taken steps to protect the food supply against a terrorist attack, but it is, for the most part, the current food safety system that the nation must depend on to prevent and respond to bioterrorist acts against our food supply. For example, in February 2003, we reported that FDA and USDA determined that their existing statutes empower them to enforce food safety, but do not provide them with clear authority to regulate all aspects of security at food-processing facilities. Neither agency feels that it has authority to require processors to adopt physical facility security measures such as installing fences, alarms, or outside lighting. Each agency, independently of one another, developed and published guidelines that food processors may voluntarily adopt to help them identify security measures and mitigate the risk of deliberate contamination at their production facilities. However, while food inspectors were instructed to be vigilant, they have not been asked to enforce, monitor, or document their actions regarding the extent to which security measures are being adopted. As a result, neither FDA nor USDA can fully assess the extent to which food processors are following the security guidelines that the agencies developed. 
Officials note, however, that they have taken many steps to address deliberate food contamination. Both agencies have distributed food security information to food processors under their jurisdictions and are cochairing the Food Emergency Response Network, which integrates the nation's laboratory infrastructure for the detection of threat agents in food at the local, state, and federal levels. Among other things, USDA established the Office of Food Security and Emergency Preparedness, enhanced security at food safety laboratories, and trained employees in preparedness activities. Similarly, FDA revised emergency response plans and conducted training for all staff, as well as participated in various emergency response exercises at FDA's Center for Food Safety and Applied Nutrition. Another GAO report documented vulnerabilities in federal efforts to prevent dangerous animal diseases from entering the United States. Our 2002 report on foot-and-mouth disease concluded that because of the sheer magnitude of international passengers and cargo that enters this country daily, completely preventing the entry of foot-and-mouth disease may not be feasible. During the 2001 outbreak of foot-and-mouth disease in Europe, poor communication between USDA and Customs officials caused delays in carrying out inspections of international passengers and cargo arriving from disease-affected countries. To address the problems I have just outlined, a fundamental transformation of the current food safety system is necessary. As the Comptroller General has testified, there are no easy answers to the challenges federal departments and agencies face in transforming themselves. Changes, such as revamping the U.S. food safety system, will require a process that involves key congressional stakeholders and administration officials as well as others, ranging from food processors to consumers. 
There are different opinions about the best organizational model for food safety, but there is widespread national and international recognition of the need for uniform laws and the consolidation of food safety activities. Establishing a single food safety agency responsible for administering a uniform set of laws would offer the most logical approach to resolving long-standing problems with the current system, addressing emerging threats to food safety, and ensuring a safer food supply. This would ensure that food safety issues are addressed comprehensively by better preventing contamination throughout the entire food cycle--from the production and transportation of foods through their processing and sale until their eventual consumption by consumers. In our view, integrating the overlapping and duplicative responsibilities for food safety into a single agency or department can create synergy and economies of scale that would provide for more focused and efficient efforts to protect the nation's food supply. A second option would be to consolidate all food safety inspection activities, but not other activities, under an existing department, such as USDA or HHS. Other measures have not proven successful. For example, the Farm Security and Rural Investment Act of 2002 mandated the creation of a 15-member Food Safety Commission charged with making specific recommendations to improve the U.S. food safety system and delivering a report to the President and the Congress within a year. The Congress has thus far not provided funding for the commission. Simply choosing an organizational structure will not be sufficient, however. For the nation's food safety system to be successful, it will also be necessary to reform the current patchwork of food safety legislation and make it uniform, consistent, and risk-based. As table 1 shows, five of eight former senior food safety officials with whom we discussed the matter in preparation for this testimony concur with this view. 
Three officials had different views on the best approach to address problems with the current food safety system. Joseph Levitt, director of the FDA's Center for Food Safety and Applied Nutrition from 1998 to 2003, recommends that the existing agencies be fully funded. Thomas Billy, administrator of USDA's FSIS from 1996 to 2001 and director of FDA's Office of Seafood between 1990 and 1994, believes that no changes should take place until a presidential commission evaluates the problems, identifies the alternatives, and recommends a specific approach and strategy for consolidating food safety programs. However, Mr. Billy supports incremental legislative steps to fix current shortcomings. Finally, Caren Wilcox, USDA's deputy under secretary for Food Safety from 1997 to 2001, believes that creating a single food safety agency would be advisable, but only under certain circumstances. In 1998, the National Academies similarly recommended modifying the federal statutory framework for food safety to avoid fragmentation and to enable the creation and enforcement of risk-based standards. Moreover, our 1999 report on the experiences of countries that were then consolidating their food safety systems indicated that foreign officials are expecting long-term benefits in terms of savings and food safety. Five countries--Canada, Denmark, Great Britain, Ireland, and New Zealand--have each consolidated their food safety responsibilities under a single agency. For example, New Zealand's Food Safety Authority was created in July 2002 to reduce inconsistencies and lack of coordination in food safety management by two separate agencies--the Ministry of Health and the Ministry of Agriculture and Forestry. The new authority anticipates an effective use of scarce resources and a reduction in duplication of effort. 
In conclusion, given the risks posed by new threats to the food supply, be they inadvertent or deliberate, we can no longer afford inefficient, inconsistent, and overlapping programs and operations in the food safety system. It is time to ask whether a system that developed in a piecemeal fashion in response to specific problems as they arose over the course of several decades can efficiently and effectively respond to today's challenges. We believe that creating a single food safety agency to administer a uniform, risk-based inspection system is the most effective way for the federal government to resolve long-standing problems, address emerging food safety issues, and better ensure the safety of the nation's food supply. This integration can create synergy and economies of scale, and provide more focused and efficient efforts to protect the nation's food supply. The National Academies and the President's Council on Food Safety have reported that comprehensive, uniform, and risk-based food safety legislation is needed to provide the foundation for a consolidated food safety system. We recognize that consolidating federal responsibilities for food safety into a single agency or department is a complex process. Numerous details, of course, would have to be worked out. However, it is essential that the fundamental decision to create more uniform standards and a single food safety agency to uphold them is made and the process for resolving outstanding technical issues is initiated. To provide more efficient, consistent, and effective federal oversight of the nation's food supply, we suggest that the Congress consider enacting comprehensive, uniform, and risk-based food safety legislation establishing a single, independent food safety agency at the Cabinet level. 
If the Congress does not opt for an entire reorganization of the food safety system, we suggest that as an alternative interim option it consider modifying existing laws to designate one current agency as the lead agency for all food safety inspection matters. Madam Chairwoman, this completes my prepared statement. I would be pleased to respond to any questions that you or other Members of the Committee may have at this time. For further information about this testimony, please contact Lawrence J. Dyckman, Director, Natural Resources and Environment, (202) 512-3841. Maria Cristina Gobin, Katheryn Summers Hubbell, Kelli Ann Walther, Amy Webbink, and John Delicath made key contributions to this statement.

[Appendix tables not reproduced here: a listing of agencies with food safety roles--the Food and Drug Administration (FDA), Centers for Disease Control and Prevention (CDC), Animal and Plant Health Inspection Service (APHIS), Agricultural Marketing Service (AMS), Agricultural Research Service (ARS), National Oceanic and Atmospheric Administration (NOAA), and U.S. Customs (collecting revenues and enforcing various Customs laws)--and appendix II examples of jurisdiction determined by ingredient content, such as beans with bacon (2 percent or more bacon) and pork and beans (no limit on amount of pork).]

Food Safety: FDA's Imported Seafood Safety Program Shows Some Progress, but Further Improvements Are Needed. GAO-04-246. Washington, D.C.: January 30, 2004.
Bioterrorism: A Threat to Agriculture and the Food Supply. GAO-04-259T. Washington, D.C.: November 19, 2003.
Combating Bioterrorism: Actions Needed to Improve Security at Plum Island Animal Disease Center. GAO-03-847. Washington, D.C.: September 19, 2003.
Results-Oriented Government: Shaping the Government to Meet 21st Century Challenges. GAO-03-1168T. Washington, D.C.: September 17, 2003.
School Meal Programs: Few Instances of Foodborne Outbreaks Reported, but Opportunities Exist to Enhance Outbreak Data and Food Safety Practices. GAO-03-530. Washington, D.C.: May 9, 2003.
Agricultural Conservation: Survey Results on USDA's Implementation of Food Security Act Compliance Provisions. GAO-03-492SP. 
Washington, D.C.: April 21, 2003.
Food-Processing Security: Voluntary Efforts Are Under Way, but Federal Agencies Cannot Fully Assess Their Implementation. GAO-03-342. Washington, D.C.: February 14, 2003.
Meat and Poultry: Better USDA Oversight and Enforcement of Safety Rules Needed to Reduce Risk of Foodborne Illnesses. GAO-02-902. Washington, D.C.: August 30, 2002.
Foot and Mouth Disease: To Protect U.S. Livestock, USDA Must Remain Vigilant and Resolve Outstanding Issues. GAO-02-808. Washington, D.C.: July 26, 2002.
Genetically Modified Foods: Experts View Regimen of Safety Tests as Adequate, but FDA's Evaluation Process Could Be Enhanced. GAO-02-566. Washington, D.C.: May 23, 2002.
Food Safety: Continued Vigilance Needed to Ensure Safety of School Meals. GAO-02-669T. Washington, D.C.: April 30, 2002.
Mad Cow Disease: Improvements in the Animal Feed Ban and Other Regulatory Areas Would Strengthen U.S. Prevention Efforts. GAO-02-183. Washington, D.C.: January 25, 2002.
Food Safety: Weaknesses in Meat and Poultry Inspection Pilot Should Be Addressed Before Implementation. GAO-02-59. Washington, D.C.: December 17, 2001.
Food Safety and Security: Fundamental Changes Needed to Ensure Safe Food. GAO-02-47T. Washington, D.C.: October 10, 2001.
Food Safety: CDC Is Working to Address Limitations in Several of Its Foodborne Disease Surveillance Systems. GAO-01-973. Washington, D.C.: September 7, 2001.
Food Safety: Overview of Federal and State Expenditures. GAO-01-177. Washington, D.C.: February 20, 2001.
Food Safety: Federal Oversight of Seafood Does Not Sufficiently Protect Consumers. GAO-01-204. Washington, D.C.: January 31, 2001.
Food Safety: Actions Needed by USDA and FDA to Ensure That Companies Promptly Carry Out Recalls. GAO/RCED-00-195. Washington, D.C.: August 17, 2000.
Food Safety: Improvements Needed in Overseeing the Safety of Dietary Supplements and "Functional Foods." GAO/RCED-00-156. Washington, D.C.: July 11, 2000. 
School Meal Programs: Few Outbreaks of Foodborne Illness Reported. GAO/RCED-00-53. Washington, D.C.: February 22, 2000.
Meat and Poultry: Improved Oversight and Training Will Strengthen New Food Safety System. GAO/RCED-00-16. Washington, D.C.: December 8, 1999.
Food Safety: Agencies Should Further Test Plans for Responding to Deliberate Contamination. GAO/RCED-00-3. Washington, D.C.: October 27, 1999.
Food Safety: U.S. Needs a Single Agency to Administer a Unified, Risk-Based Inspection System. GAO/T-RCED-99-256. Washington, D.C.: August 4, 1999.
Food Safety: U.S. Lacks a Consistent Farm-to-Table Approach to Egg Safety. GAO/RCED-99-184. Washington, D.C.: July 1, 1999.
Food Safety: Experiences of Four Countries in Consolidating Their Food Safety Systems. GAO/RCED-99-80. Washington, D.C.: April 20, 1999.
Food Safety: Opportunities to Redirect Federal Resources and Funds Can Enhance Effectiveness. GAO/RCED-98-224. Washington, D.C.: August 6, 1998.
Food Safety: Federal Efforts to Ensure Imported Food Safety Are Inconsistent and Unreliable. GAO/T-RCED-98-191. Washington, D.C.: May 14, 1998.
Food Safety: Federal Efforts to Ensure the Safety of Imported Foods Are Inconsistent and Unreliable. GAO/RCED-98-103. Washington, D.C.: April 30, 1998.
Food Safety: Agencies' Handling of a Dioxin Incident Caused Hardships for Some Producers and Processors. GAO/RCED-98-104. Washington, D.C.: April 10, 1998.
Food Safety: Fundamental Changes Needed to Improve Food Safety. GAO/RCED-97-249R. Washington, D.C.: September 9, 1997.
Food Safety: Information on Foodborne Illnesses. GAO/RCED-96-96. Washington, D.C.: May 8, 1996.
Food Safety: Changes Needed to Minimize Unsafe Chemicals in Food. GAO/RCED-94-192. Washington, D.C.: September 26, 1994.
Food Safety: A Unified, Risk-Based Food Safety System Needed. GAO/T-RCED-94-223. Washington, D.C.: May 25, 1994.
Food Safety: Risk-Based Inspections and Microbial Monitoring Needed for Meat and Poultry. GAO/RCED-94-110. 
Washington, D.C.: May 19, 1994.
Food Safety and Quality: Uniform, Risk-Based Inspection System Needed to Ensure Safe Food Supply. GAO/RCED-92-152. Washington, D.C.: June 26, 1992.
Food Safety and Quality: Salmonella Control Efforts Show Need for More Coordination. GAO/RCED-92-69. Washington, D.C.: April 21, 1992.

This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.

The safety of the U.S. food supply is governed by a highly complex system of more than 30 laws administered by 12 agencies. In light of the recent focus on government reorganization, it is time to ask whether the current system can effectively and efficiently respond to today's challenges. At the request of the Subcommittee on Civil Service and Agency Organization, we reviewed and summarized our work on the safety and security of the food supply regarding (1) the fragmented legal and organizational structure of the federal food safety system, (2) the consequences of overlapping and inconsistent inspection and enforcement, and (3) options for consolidating food safety functions. As we have stated in numerous reports and testimonies, the federal food safety system is not the product of strategic design. Rather, it emerged piecemeal, over many decades, typically in response to particular health threats or economic crises. The result is a fragmented legal and organizational structure that gives responsibility for specific food commodities to different agencies and provides them with significantly different authorities and responsibilities. 
The existing food safety statutes create fragmented jurisdictions between the two principal food safety agencies, the Food and Drug Administration (FDA) and the U.S. Department of Agriculture (USDA). As a result, there are inconsistencies in the frequency of the agencies' inspections of food facilities and the enforcement authorities available to these agencies. In short, which agency has jurisdiction to regulate various food products, the regulatory authorities they have available to them, and how frequently they inspect food facilities is determined by disparate statutes or by administrative agreement between the two agencies, without strategic design as to how to best protect public health. In many instances, food processing facilities are inspected by both FDA and USDA. Furthermore, federal food safety efforts are based on statutory requirements, not risk. For example, funding for USDA and FDA is not proportionate to the amount of food products each agency regulates, to the level of public consumption of those foods, or to the frequency of foodborne illnesses associated with food products. A federal food safety system with diffused and overlapping lines of authority and responsibility cannot effectively and efficiently accomplish its mission and meet new food safety challenges. These challenges are more pressing today as we face emerging threats such as mad cow disease and the potential for deliberate contamination of our food supply through bioterrorism. Therefore, fundamental changes are needed. First, there is a need to overhaul existing food safety legislation to make it uniform, consistent, and risk based. Second, consolidation of food safety agencies under a single independent agency or a single department is needed to improve the effectiveness and efficiency of the current federal food safety system. 
Integrating the overlapping responsibilities for food safety into a single agency or department can create synergy and economies of scale, as well as provide more focused and efficient efforts to protect the nation's food supply.
NRC is an independent federal agency that (1) establishes standards and regulations for commercial nuclear power plants and non-power research, test, and training reactors; fuel cycle facilities; medical, academic, and industrial uses of nuclear materials; and the transport, storage, and disposal of nuclear materials and wastes, (2) issues licenses for nuclear facilities and uses of nuclear materials, such as industrial applications, nuclear medicine, academic activities, and research work, and (3) inspects facilities and the uses of nuclear materials to ensure compliance with regulatory requirements. While safety is a paramount goal, a reassessment in 2001 added three subordinate performance goals to NRC's strategic plan: (1) to make NRC activities and decisions more effective, efficient, and realistic, (2) to reduce unnecessary regulatory burden on industry without affecting safety, and (3) to increase public confidence in NRC actions. Figure 1 shows NRC's organization. NRC is governed by a five-member commission with one member designated by the President to serve as Chairman. The Chairman serves as the principal executive officer and official spokesperson of the commission. Reporting to the Commission Chairman is the Executive Director for Operations (EDO). The EDO is the chief operational and administrative officer of NRC, and is generally responsible for executing the program policies and decisions made by the NRC. Also reporting to the Commission Chairman is the Chief Financial Officer (CFO), who is responsible for the agency's PBPM and all of NRC's financial management activities. NRC is organized into seven program offices under the EDO. The Office of Nuclear Reactor Regulation (NRR), the Office of Nuclear Material Safety and Safeguards (NMSS), the Office of Nuclear Regulatory Research (RES), and the newly created Office of Nuclear Security and Incident Response (NSIR) are NRC's four largest offices. 
It also has three smaller program offices, various other management and mission support offices, and four regional offices. While strategic planning, budgeting, and program implementation involve headquarters offices and regional operations, we focused our work on those offices that NRC officials said had more experience in PBPM implementation. The Office of the CFO (OCFO), which includes the Division of Planning, Budget, and Analysis, is responsible for NRC's financial management and reporting under GPRA. NRR licenses and inspects nuclear power reactors and non-power reactors. NMSS directs and oversees licensing, inspection, and environmental activities for nuclear fuel cycle facilities and safeguards nuclear materials, including the management and disposal of high- and low-level radioactive wastes. RES provides technical support to the frontline regulatory activities involving licensing and inspection, oversight, and development of regulatory products. NSIR combines NMSS responsibilities for protection of fuel cycle facilities and materials with NRR responsibilities for physical security at nuclear power plants and other facilities. The four regions execute NRC policies and various programs relating to inspection, licensing, enforcement, investigation, and governmental liaison, as well as emergency response within their regional boundaries. NRC employed approximately 2,900 people and had a total budget of approximately $559 million in fiscal year 2002. Of that amount, the Congress transferred about $23.7 million from the Nuclear Waste Fund. The remainder was to be financed by a mix of revenues from licensing, inspection services, and other services and collections, and amounts from the general fund of the Treasury. These amounts were made available in NRC's annual appropriations and in an emergency supplemental appropriation to support homeland-security-related activities. Over half of NRC's annual budget is used to pay staff salaries and benefits.
The remaining funds are used to support other operating expenses, purchase technical assistance for regulatory programs, and conduct safety research. During the 1990s, various concerns were raised about NRC's performance, particularly the way NRC conducted inspections and promulgated regulations. Agency officials told us that NRC's former Commission Chairman, Shirley Jackson, was concerned that NRC's practices were narrowly focused on ensuring that its activities and processes were consistent with regulatory law without adequate attention to the results of its activities. Both the nuclear industry and public interest groups criticized NRC's plant assessment and enforcement processes as lacking objectivity, consistency, and predictability. An NRC report also described its former regulatory approach as punitive and reactive. According to a senior agency official, the agency was concerned that the Congress would cut about one-third of the agency's staff from the NRC budget for fiscal year 1999 unless the agency changed the way it conducted business. NRC took various steps to improve regulatory oversight and agency management. These changes included a comprehensive strategic planning effort from 1995 to 1997 to reassess and establish new baselines for its programs, led by then-Chairman Jackson. NRC also charged the OCFO and the former Executive Council with developing a new planning, budgeting, and performance management process. NRC staff said that PBPM changes also supported the agency's efforts to implement GPRA. NRC established PBPM in the fall of 1997 and implemented a pilot project in NRR. In 1999, NRC extended PBPM to NMSS and RES for the fiscal year 2000 budget. NRC plans to further develop PBPM to include more detailed procedures, the products involved, and the roles of various management levels. 
To achieve our objectives, we interviewed selected NRC staff members from the offices of the EDO, the CFO, and the Chief Information Officer; from three headquarters offices in Rockville, Maryland (NRR, NMSS, and RES); and from the Region II (Atlanta) office for their perspectives on PBPM and how it supports resource decisions. The Region II office was selected because, according to NRC officials, this region had been instrumental in developing a cohesive operating plan--one of the PBPM techniques used by NRC to enhance coordination among program offices and regions. Within these organizations, we interviewed officials at various levels of management involved in the budget decision-making process, including office directors, division directors, and unit managers. In total, we interviewed more than 30 NRC officials on the various aspects of planning and budgeting practices. We reviewed NRC planning, budget, and program documents that support PBPM, including strategic plans, annual performance plans, budget requests, operating plans, and performance reports. This report presents NRC's budget and planning practices as described by the NRC officials we interviewed and in the NRC documents we reviewed. The views of those individuals and the information in these documents, which we have summarized for reporting purposes, may not necessarily be generalized across NRC. We did not observe or evaluate the processes in operation, nor did we assess the program or financial information contained in documents provided by NRC. We also did not evaluate the completeness or accuracy of NRC performance goals and measures or the effectiveness of NRC rule making, licensing, inspection, and oversight programs. Our work was conducted from February through May of 2002 in accordance with generally accepted government auditing standards. Implementation of PBPM is a work in progress.
PBPM was created by NRC to improve program and service performance by integrating NRC's strategic planning and budgeting processes. This section describes how components of the process were designed to operate, while the next section ("Planning and Performance Information Influences Resource Allocation Decisions in Various Ways") explains how performance information informs resource decisions in those offices that have implemented PBPM and its techniques. NRC has gradually introduced PBPM techniques across the agency and has allowed offices some flexibility during implementation of the process. NRC began implementation in its larger program and mission support offices. As NRC has gained experience, it is examining ways to extend the process to the smaller program and mission support offices and to more fully standardize PBPM techniques across the agency. NRC designed PBPM as an integrated process that functions most effectively when information from one component is used to inform decisions in other components. Figure 2 shows how the four components interact over a budget cycle. For example, the strategic direction setting in Component 1 relies in part on the assessment elements in Component 4. The effectiveness review element in Component 2 relies on performance goals developed during strategic direction setting. Finally, the assessment elements in Component 4 incorporate information gathered from Component 3, performance monitoring, to identify topics for program evaluations and self-assessments. In Component 1, NRC establishes agencywide strategic direction by formulating the strategic plan and by issuing Commission guidance throughout the year. The plan includes NRC's strategic and performance goals and corresponding measures and identifies general strategies on how best to achieve the agency's mission. 
The plan is developed with Commission and stakeholder involvement by a senior management group with a broad perspective of the agency, and is approved by the Commission. Although the plan covers 5 years and is reexamined every 3 years as required by GPRA, if circumstances warrant, the plan can be changed more often. The plan also establishes a framework called "strategic arenas," each of which is composed of related programs with a common purpose. NRC's strategic arenas correspond to program activities in the President's budget. In addition, the Commission provides direction to its managers on programs and operations through various written directives. In Component 2, managers in offices using PBPM employ a set of interrelated tools to translate agency goals and strategies into individual office work activities, performance targets, and resource needs. To determine how work activities contribute to achieving NRC's four performance goals, individual offices conduct what are called effectiveness reviews. These reviews are not comprehensive assessments of programs but rather a structured way for managers to evaluate the contribution of work activities to achieving performance goals prior to budget formulation. For example, an office will examine each of its work activities and ask how a given activity achieves each of the performance goals. Effectiveness reviews also assist offices in identifying where there are gaps in activities or where new initiatives are needed. Agency officials said that offices that conduct these reviews have used various methodologies to rank office activities relative to agency performance goals. According to agency officials, if an office determines through an effectiveness review that activities are not critical to achieving NRC performance goals, the office will likely propose reducing or eliminating resources for the activity in the upcoming budget year. 
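Conceptually, an effectiveness review reduces to scoring each work activity against the agency's performance goals and flagging low contributors before budget formulation. The sketch below is purely illustrative: the four goal names come from the report, but the scoring scale, ranking method, threshold, and example activities are hypothetical assumptions, not NRC's actual methodology (which, as noted above, varies by office).

```python
# Illustrative effectiveness-review sketch. Scores (0 = no contribution,
# 3 = critical contribution) and the reduction threshold are invented.

PERFORMANCE_GOALS = ["safety", "effectiveness", "reduced burden", "public confidence"]

# Hypothetical ratings of how each work activity contributes to each goal.
activities = {
    "reactor inspections":   {"safety": 3, "effectiveness": 2, "reduced burden": 1, "public confidence": 2},
    "lighting-level checks": {"safety": 0, "effectiveness": 1, "reduced burden": 0, "public confidence": 0},
}

def rank_activities(acts):
    """Order activities by their total contribution across all goals."""
    return sorted(acts, key=lambda a: sum(acts[a].values()), reverse=True)

def candidates_for_reduction(acts, threshold=2):
    """Flag activities whose total contribution falls below the threshold,
    i.e., candidates for reduced or eliminated funding next budget year."""
    return [a for a in acts if sum(acts[a].values()) < threshold]

print(rank_activities(activities))
print(candidates_for_reduction(activities))
```

Ranking all activities this way also exposes gaps: a goal that receives low scores from every activity suggests a new initiative may be needed.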
Effectiveness review discussions may begin prior to the start of the annual budget process, concurrent with Component 1 activities establishing strategic direction. These discussions enable senior management to provide guidance on expectations for work priorities (targets). The budget assumptions document is a tool used to plan work activities based on workload and set performance targets. This document identifies external and internal factors, such as the anticipated number of license reviews, that will affect the agency's workload over the next 2 fiscal years. These assumptions are developed by the offices and approved by NRC executive-level managers. The assumptions then become key inputs for offices when formulating their resource needs for the upcoming budget year. Each budget assumption is supported by a summary of the factors that were evaluated to produce it and an indication of the likelihood that it will materialize. For example, the fiscal year 2003-2004 budget assumptions document estimates approximately 1,500 enforcement actions for each year. This estimate is based on historical trends and anticipated results from implementation of the revised reactor oversight process. In addition, the budget assumptions document includes related information that may affect the assumptions. In the above example, NRC is attempting to integrate Alternative Dispute Resolution techniques into the enforcement program, a decision that may require additional resources to implement. Finally, through its annual budget call NRC provides instructions to individual offices for developing office budget priorities. Individual offices submit budgets to the NRC executive level by program. These submissions address resources needed by each office to accomplish NRC strategic and performance goals. A group of senior managers then reviews office budget submissions by strategic arena and submits the proposed office budget to the CFO and EDO.
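The role a budget assumption plays in formulation can be shown with a minimal sketch. The 1,500-enforcement-actions estimate and its basis come from the report; the record fields, likelihood scale, and staffing arithmetic are hypothetical illustrations, not NRC's actual document format.

```python
# Hypothetical budget-assumption record; field names and the
# actions-per-staff-year figure are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class BudgetAssumption:
    activity: str
    fiscal_years: tuple
    estimated_workload: int
    basis: str        # summary of factors evaluated to produce the estimate
    likelihood: str   # indication that the assumption will materialize

enforcement = BudgetAssumption(
    activity="enforcement actions",
    fiscal_years=(2003, 2004),
    estimated_workload=1500,  # approximate figure from the assumptions document
    basis="historical trends; revised reactor oversight process",
    likelihood="high",
)

def staff_years_needed(assumption, actions_per_staff_year):
    """Translate an assumed workload into a rough resource request."""
    return assumption.estimated_workload / actions_per_staff_year

print(staff_years_needed(enforcement, actions_per_staff_year=100))  # 15.0
```

A record like this makes explicit why a change in the underlying factor (for instance, adding Alternative Dispute Resolution to the enforcement program) forces a revision of the resource request derived from it.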
The CFO and EDO then submit their proposed budget to the Chairman for Commission approval. After Commission approval, NRC submits a combined annual budget and performance plan to OMB for inclusion in the President's budget. The combined budget and performance plan also serves as the agency's budget justification to the Congress. Figure 3 shows how NRC's performance plan links program activities and funding allocations by goal. In Component 3, NRC executes the approved budget through office operating plans based on appropriations, congressional guidance, and Commission priorities. Each office prepares operating plans to reflect the allocation of staff years and funds available following appropriations action and OMB apportionment. The operating plans, tailored by each office implementing PBPM, tie allocated staff and other resources to each work activity and to performance goals and define how success is measured for each activity. As the budget is executed, operating plans also are used to compare actual office resources to budget estimates and actual performance to targeted performance, and to identify necessary programmatic and fiscal actions. Based on targets established in the operating plans, individual offices develop quarterly reports on the status of resources and performance. Any performance issues identified in the quarterly reports are discussed with the deputy executive director responsible for that particular office. Generally, when an office meets with its cognizant deputy executive director, it has prepared a course of corrective action it intends to take. However, if an issue is significant, senior staff members will meet with their deputy when they become aware of the issue rather than wait for the quarterly operating plan update. Follow-up actions are incorporated into the next scheduled operating plan meeting as appropriate. 
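The quarterly comparison of actual resources and performance against operating-plan targets amounts to simple variance monitoring. The following sketch is an assumption-laden illustration: the activities, figures, and 10 percent tolerance are invented for the example and are not drawn from NRC practice.

```python
# Illustrative operating-plan variance check; the tolerance and all
# figures below are hypothetical.

def quarterly_variances(plan, actuals, tolerance=0.10):
    """Return activities whose actuals deviate from plan targets by more
    than the tolerance fraction, i.e., items needing corrective action."""
    flagged = []
    for activity, target in plan.items():
        actual = actuals.get(activity, 0)
        if target and abs(actual - target) / target > tolerance:
            flagged.append((activity, target, actual))
    return flagged

plan    = {"license renewals": 20, "routine inspections": 120}
actuals = {"license renewals": 12, "routine inspections": 118}

for activity, target, actual in quarterly_variances(plan, actuals):
    print(f"{activity}: target {target}, actual {actual} -> discuss with deputy executive director")
```

In this toy data, renewals run 40 percent below target and would be flagged for the quarterly discussion, while the small inspection shortfall would not; this mirrors the report's point that only significant issues trigger early meetings.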
The Office of the EDO does not prepare quarterly reports summarizing its review of office operating plans for the Commission. Instead, the Commission is kept informed of operating plan issues throughout the year by various means including Commission meetings, staff papers, the Budget Execution Report, and individual briefings. Finally, performance results are reported annually through a publicly available agency performance report. In Component 4, NRC assesses agency performance. This component is designed to use information from and feed information to other components. Although this component is the least developed of the four components, products are intended to both inform future planning and budget deliberations and further improve performance. (A later section of this report, "Challenges to Improving the NRC Budget and Planning Process," more fully discusses challenges to improving the assessment component). When fully operational, this component should help NRC to determine whether a program should be continued, restructured, or curtailed and, as designed, may influence planning and budget decisions in Components 1 and 2. In July 2002, NRC proposed that this component include performance reviews conducted for the four major strategic arenas as well as selected management and support offices. However, no decision has been made on who in NRC will conduct these reviews. In addition, individual offices can identify issues during the performance monitoring component that they may select for internal self-assessments during Component 4. PBPM provides NRC with a framework through which it can use performance information to influence planning and resource allocation decisions and is consistent in key respects with our framework for budget practices. NRC informs its resource allocation decisions by providing strategic direction to operating units prior to budget formulation and by monitoring actual performance against performance targets during budget execution. 
PBPM also promotes agencywide coordination of budget formulation and execution decisions by providing a common language and common goals. A key principle driving PBPM is that the agency's strategic direction influences internal policy and resource decisions. NRC seeks to use PBPM to identify general strategies to achieve goals, identify programs to implement these strategies, and determine resources to fund and staff programs. NRC practices are similar to those proposed in our framework for budget practices. Under the framework for budget practices, agency management should provide context during budget formulation in the form of general guidance to program managers on proposed agency goals, existing performance issues, and resource constraints--consistent with Components 1 and 2 of PBPM. The following are examples of operation and program decisions that link NRC's strategic direction with corresponding resource decisions made through PBPM. One of the strategies used to implement the four performance goals in the strategic plan is risk-informed regulation and oversight. This strategy uses risk assessment findings, engineering analysis, and performance history to focus attention on the most important safety-related activities; establishes objective criteria to evaluate performance; develops measures to assess licensee performance; and uses performance results as the primary basis for making regulatory decisions. As part of its risk-informed regulation and oversight strategy, NRC modified its reactor oversight program to help achieve its three subordinate performance goals--developed through Component 1--while maintaining its primary safety goal. The Commission provided guidance throughout the development and implementation of the revised reactor oversight program. This guidance included requirements for staff reporting to the Commission, approval of a pilot program, and instructions for future program development.
In one modification to the inspection process, NRC stopped inspecting some elements affecting the plant operators' work environments (e.g., how well lights in the plant illuminate the operating panel). NRC determined that these factors did not critically contribute to safety and created unnecessary regulatory burdens on industry. Regional officials told us that NRC could now focus on the significant work activities that maintain safety. The reactor oversight program's procedure for assessing nuclear plants was also changed to increase public confidence in NRC operations by increasing the predictability, consistency, objectivity, and transparency of the oversight process. Each quarter, NRC posts the performance of each nuclear plant on its Web site to provide more information to the public. Regional officials told us that the overall level of resources required to implement the revised reactor oversight program is similar to that of the prior oversight program but that significant changes have occurred in how they manage their inspection program. Specifically, the new inspection procedure includes baseline inspections of all plants but focuses more of the agency's resources on plants that demonstrate performance problems. Whether the revised reactor oversight program will reduce costs is unknown, but regional officials said that fewer resources may be needed in the future under this approach. NRC established a focus group to identify where or how possible resource savings could occur. As part of its risk-informed regulation and oversight strategy, NRC developed the Risk-Informed Regulation Implementation Plan (RIRIP), which is updated periodically. The first RIRIP, issued in October 2000, examined a range of staff activities including rule making to achieve NRC performance goals. The Commission provided guidance throughout the development and implementation of the new plan, including instructions for future program development as NRC updates the plan.
To facilitate its use, the plan is organized around the strategic arenas. Organizing the plan around arenas helps offices to establish priorities and identify resources as part of PBPM. For example, the plan describes activities designed to improve fire protection for nuclear power plants. In this area, NRC plans to develop less prescriptive, more performance-based risk-informed regulations to support its primary goal of safety. NRC is working with industry to study alternatives to existing fire protection standards and emergency postfire shutdown procedures. A senior NRC official gave additional examples of changes NRC has made to its regulations to reduce unnecessary regulatory burden on licensees without compromising safety. He cited the decision to have NRC oversee, but no longer perform, examinations to qualify power plant operators since the industry conducts its own examinations. In addition, this official said NRC eliminated its regulation requiring all nuclear power plants to install state-of-the-art equipment (for example, plants could continue to use analog rather than digital equipment), focusing instead on whether use of the current equipment adversely affected safety. NRC also changed its licensing regulations to support its performance goals of reducing unnecessary regulatory burden on licensees and becoming more effective and efficient. One official said NRC changed its regulation governing the length of a power plant license from 40 years to 60 years in some circumstances. Before this change, NRC would only license a power plant for 40 years. At the end of the 40-year license period, the licensee would be required to shut down and decommission the plant. The change in regulation means that NRC will extend the term of a license from 40 to 60 years if it determines through licensing review that existing plant design will support a longer term.
According to NRC officials, these license extensions can eliminate extremely large costs to licensees while reducing NRC costs because it is less costly to renew a plant operating license than to review a request for a license for a new power plant. The Commission directed the reorganization of NRC's three major program offices so that they could become more effective and efficient. For example, in NRR the reorganization established reporting lines consistent with major NRR program functions--inspection, performance assessment, license renewal, and licensing. An NRR official said the previous organizational structure in NRC had contributed to inconsistent processes for inspecting power plants and duplication of work. To address the overall safety goal, NRC developed a program to measure trends in industry nuclear power reactor performance. One part of the safety goal is that there should be no statistically significant adverse industry trends in safety performance. Performance indicators are included in the NRC performance plan and are reported to the Congress through the NRC annual performance report. Resources for this new program are determined through PBPM. NRC uses performance information to inform resource allocation decisions during budget execution by monitoring current year work performance and by adjusting resource allocations as necessary. This practice is consistent with our proposed framework for budget practices. As noted previously, office operating plans track performance against established targets for each planned work activity to call attention to significant performance issues needing corrective action. For example, shortly after September 11, 2001, NRC conducted a comprehensive review of its security program. As part of this review, NRC examined lists of prioritized work activities prepared during the effectiveness review process in Component 2.
These lists helped NRC determine which activities to delete or modify as it prepared to use existing resources to respond to security threats in the post-September 11 environment. For example, NRC staffed around-the-clock emergency response centers for significantly longer than originally anticipated. As part of this comprehensive review of its security program, NRC began research on the structural integrity of power plants if they were attacked by large aircraft. NRC also delayed routine inspections at non-power reactors for 3 months to help fund these new activities. In addition, in April 2002, NRC established NSIR to streamline selected NRC security, safeguards, and incident response responsibilities and related resources. Operating plans are also used to monitor performance and make necessary adjustments. For example, NRR discovered that the May 2000 operating plan report showed plant license renewal applications and associated staff years well below expected annual target levels for that year. NRR was thus able to shift resources to other priorities. An NRR official said this example showed NRR the importance of monthly monitoring of the budget assumptions prior to the beginning of the fiscal year. In another example, NRR management officials reviewed the fiscal year 2002 first quarter operating plan report and found that the workload impact from the September 11 attacks would prevent NRR from achieving annual licensing action targets. These officials redirected additional staff resources to complete these licensing actions. As a result, the third quarter projection is that NRR will slightly exceed its annual target for these actions. PBPM is designed to enhance cooperation and coordination among offices. This practice matches our proposed framework for budget practices, which states that agency managers should share information on policy and programs among offices during budget decision making.
Sharing information during budgeting is important because many offices share responsibilities for achieving NRC goals. NRC office managers said they coordinate their work with others to determine if necessary skills are already available elsewhere in the agency. For example, one official said he relies on another unit's expertise in conducting environmental studies. In another example, regional officials reported that they occasionally share specialized staff with other regions to perform nonroutine inspections. PBPM provides NRC with reference points such as common goals, performance measures, and strategies that help offices communicate and reach agreement on budget priorities. For example, NRR, which depends upon research studies conducted by RES, meets regularly with that office to discuss program and budget priorities for risk analysis, structural integrity, and new reactor designs. NRR also meets with other offices as it develops its budget proposal to coordinate its resource requests for mutually agreed-upon priorities. For instance, NRR shares information with NMSS to ensure that crosscutting activities, such as rule making, have adequate resources. In addition, the NRC crosswalk of all program activities into strategic arenas allows NRC to clarify the relationship between budget requests and agency goals. Our report on federal agency efforts to link performance plans with budgets found that NRC's budget presentation linked its program activities to performance goals, showing the funding needed to achieve those goals. NRC uses the arena reporting structure to communicate its budget needs to audiences outside the agency, including OMB and the Congress. When it introduced PBPM, NRC recognized that continued development of the process would be necessary. After gaining experience for several years, NRC is now in the process of addressing several challenges to PBPM implementation.
Agency officials noted challenges in (1) creating performance measures that balance competing goals and keeping those measures current, (2) associating resource requests with outcomes, (3) standardizing PBPM practices and techniques while still allowing individual offices to tailor the process to their needs, (4) developing the assessment component, and (5) committing significant effort to maintaining PBPM. In addition, NRC must continue developing a cost accounting system to support PBPM. As NRC officials create new performance measures or redesign existing measures, they find it a challenge to refine performance measures so that they balance performance goals. While safety is a paramount goal, NRC also seeks to make progress in reducing unnecessary regulatory burden on the industry and improving public confidence in NRC's operations. One official said it is a balancing act to minimize the time and steps it takes to license a facility while at the same time being sure that the agency is licensing a safe operation. Several NRC officials also said current performance measures track office efficiency well but capture the quality of license review poorly. NRC officials said they are beginning to develop performance measures that better capture quality. For example, NRR is now using a template to assess the quality of its evaluation of safety issues during review of licensing actions. Officials believe that when measures of quality are in place, they can be used to determine whether adjusting budget resources will have an effect on the quality of their activities. New strategies, such as risk-based regulation and oversight programs, can dictate changes in performance measures. NRC must also keep its performance measures relevant as the industry changes. Several examples illustrate these points.
NRC plans to develop new performance measures for reviewing applications to upgrade power output from existing plants because of concern that existing measures did not accurately measure NRC performance in this area. In another example, NRC is studying new performance measures to determine if it can predict, and thus avoid, emergent problems in the Reactor Oversight Program. NRC and industry representatives jointly developed a new set of performance indicators to measure availability of nuclear plant safety systems. NRC believes the new performance indicators will provide more accurate risk assessments. NRC officials said that linking outcomes to resources is challenging for several reasons. First, the budget process focuses on performance targets and budget decisions for the short term while achieving some outcomes may take many years. Therefore, it is difficult to know the incremental effect of adjusting resources annually for longer-term outcomes. For example, one official noted that research leading to safer reactor design takes many years to bear fruit. Agency officials said linking outcomes to resources is also difficult because achieving many agency goals depends on the actions of others not directly under NRC's control. NRC's strategic plan states that achieving its strategic goals requires the collective efforts of NRC, licensees, and the agreement states. Yet, as one NRC official noted, neither NRC nor stakeholder representatives could identify how much each contributes to achieving NRC strategic goals. Nonetheless, this official said that both NRC and stakeholders strongly believe in establishing quantifiable outcome measures so that all stakeholders understand NRC's goals. While the particular links and interdependencies are specific to NRC, many of these challenges permeate federal agencies. Many federal programs depend on other actors. 
For many federal activities ultimate outcomes are years away, but ways must be found to evaluate progress and make resource decisions annually. A continuing challenge during PBPM implementation is to determine which process techniques and information should be standardized across offices. For example, NRC officials said the major program offices use different procedures and methodologies to rank the contribution of their work activities to achieving NRC performance goals. Nonstandard weighing of priorities has made cross-office comparisons of activities and related resource allocation decisions more challenging for NRC officials. NRC officials said they established a task force to develop a common methodology to prioritize the contributions of the major program offices to NRC goals. They said their goal is to have aspects of a common ranking process among the major program offices for the fiscal year 2005 budget. In addition, NRC is in the process of further defining the roles and responsibilities of participants in PBPM through a management directive. In a related example, an NRC official said the agency faces a challenge to improve comparison of performance measures across both major program and mission support offices. Major NRC program offices are required to include agency strategic goals and performance goal measures in their annual operating plans. These measures are reported in the annual performance report by strategic arena. However, mission support offices are not required to report on these strategic performance goals. In addition, each office has been permitted to develop additional, office-specific, detailed performance measures to provide supplemental management information. NRC officials describe NRC's current assessment process as the weakest component of PBPM. These officials said existing guidance does not adequately describe what an assessment is or how to select programs for evaluation.
Since there is not a clear definition of what qualifies as an assessment within Component 4, NRC performance reports vary and may not capture the full range of assessments that occurred or are planned at NRC. Because information contained in assessments is intended to inform the other PBPM components, NRC officials see the performance assessment component as a critical element of its process. For example, performance assessments can capture key information on how the agency is performing that can be used for setting the agency's strategic direction. This practice, consistent with our framework for budget practices, can help NRC to seek continual improvement by evaluating current program performance and identifying alternative approaches to better achieve agency goals. NRC is taking steps to improve its assessment process by developing a new procedure for selecting programs and activities for evaluation. In July 2002, NRC established annual performance reviews for the four major strategic arenas and an annual assessment plan that identifies subjects for evaluation during the upcoming fiscal year. Programs will be selected for evaluation where a strong potential exists for performance improvement, cost reduction, or both. Results of the program evaluations will inform the next strategic direction phase of PBPM and may also result in changes during the performance monitoring process. Agency officials describe the introduction of PBPM as a culture shift requiring a commitment of time and effort by NRC employees. NRC officials said the agency sought to facilitate this cultural change by holding staff meetings at all levels and by using task force working groups to introduce PBPM. The introduction and evolution of PBPM also presents a continuing workload challenge to NRC. For example, one official said the detailed work associated with PBPM had been added to reporting requirements already in place. 
Nevertheless, key officials reported that implementing PBPM has been worth the time and effort because it provides a framework for more informed and focused resource allocation decisions. According to one official, PBPM has resulted in agency officials asking the key questions about why and how they conduct an activity. NRC faces the challenge of developing a cost accounting system that can support budget decision making. Developing a cost accounting system is important to budget decision making because it can help managers track direct, indirect, and unit costs of activities and compare the cost of activities to appropriate benchmarks. The October 2001 NRC Managerial Cost Accounting Remediation Plan noted that the prior accounting system supported general financial reporting but did not include a managerial cost accounting system. An example in the remediation plan states that labor hour tracking systems were not integrated with payroll systems. NRC officials said the agency has since developed a cost accounting system to help in resource allocation decisions. They said the new system will integrate payroll and nonpayroll costs at a level that will enable NRC to compare total direct costs of work activities with appropriate benchmarks. However, officials told us that they only started using the cost accounting system in the first two quarters of fiscal year 2002 and plan to refine the information collected based on what is the most useful and relevant. Agency officials estimate that fully implementing the system will take 4 to 5 years. We requested comments on a draft of this report from NRC. NRC expressed appreciation for our recognition of its efforts and progress and the fact that we note consistencies with our framework for budget practices. NRC expressed some concern that our report underrecognized how far beyond the conceptual stage PBPM is, about our statement that a good cost accounting system was necessary, and about our reference to operating plans.
We modified our language to clarify our views on the implementation of PBPM. The agency's letter and our response are contained in appendix I. NRC officials also provided clarifying comments, which we have incorporated in the report as appropriate. We are sending copies of this report to the Chairman of the Nuclear Regulatory Commission and will make copies available to other interested parties upon request. In addition, the report will be available at no charge on the GAO Web site at http://www.gao.gov. Please contact me on (202) 512-9573 or Denise Fantone, Assistant Director, on (202) 512-4997 if you or your staff has any questions about this report. Major contributors to this report are Robert Hadley, James Whitcomb, and Robert Yetvin.

The following are GAO's comments on the Nuclear Regulatory Commission's (NRC) letter dated November 22, 2002.

1. Our point is not that the Planning, Budgeting, and Performance Management Process is still at a conceptual stage but rather that implementation is in various stages throughout NRC, and that refinement of agencywide implementation is still necessary. This is consistent with what we were told and saw at NRC. We modified wording to clarify this point. (See pp. 4 and 9.)

2. We consistently have said that good cost accounting is critical to linking resources to results/outcomes. For example, in our recent testimony on performance budgeting we said that the integration of reliable cost accounting data into budget debates needs to become a key part of the performance budgeting agenda.

3. NRC uses operating plans to set milestones, track progress, and make adjustments to improve program outcomes. This is--and was so described in our interviews at NRC--an important part of PBPM.

4. The footnote was modified to clarify that this report neither observed nor evaluated reported safety problems in the Davis-Besse power plant. (See p. 9.)
The General Accounting Office, the investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability. The fastest and easiest way to obtain copies of GAO documents at no cost is through the Internet. GAO's Web site (www.gao.gov) contains abstracts and full-text files of current reports and testimony and an expanding archive of older products. The Web site features a search engine to help you locate documents using key words and phrases. You can print these documents in their entirety, including charts and other graphics. Each day, GAO issues a list of newly released reports, testimony, and correspondence. GAO posts this list, known as "Today's Reports," on its Web site daily. The list contains links to the full-text document files. To have GAO e-mail this list to you every afternoon, go to www.gao.gov and select "Subscribe to GAO Mailing Lists" under the "Order GAO Products" heading.

Encouraging a clearer and closer link between budgeting and planning is essential to improving federal management and instilling a greater focus on results. Through work at various levels within the organization, this report on the Nuclear Regulatory Commission (NRC)--and its two companion studies on the Administration for Children and Families (GAO-03-09) and the Veterans Health Administration (GAO-03-10)--documents (1) what managers considered successful efforts at creating linkages between planning and performance information to influence resource choices and (2) the challenges managers face in creating these linkages.
Although in differing stages of implementation throughout NRC, NRC designed the Planning, Budgeting, and Performance Management Process (PBPM) to better integrate its strategic planning, budgeting, and performance management processes. PBPM links four individual components: (1) setting the agency's strategic direction, (2) determining activities and performance targets of component offices and related resources, (3) executing the budget and monitoring performance targets and taking corrective actions, if needed, to achieve those targets, and (4) assessing agency progress toward achieving its goals. GAO's report provides examples of how the PBPM framework can influence budget formulation and execution decisions. These examples show (1) how NRC informs its resource allocation decisions by providing strategic direction to operating units prior to budget formulation, (2) how operating units that have implemented these processes link strategic direction to budgets through tools that set priorities and assign resources to office activities to accomplish these priorities, and (3) how operating units monitor performance targets and make adjustments as necessary during budget execution. In addition, agency managers have told GAO that PBPM also promotes agencywide coordination of budget formulation and execution decisions by providing a common language and common goals. Integrating budget and planning processes and improving performance management in NRC is an ongoing effort that includes addressing a series of challenges. They are (1) creating performance measures that balance competing goals and keep performance measures current, (2) associating resource requests with outcomes, (3) standardizing PBPM practices and techniques but still allowing some flexibility among offices to tailor the process to their needs, (4) developing the assessment component, and (5) committing significant effort to maintain PBPM. 
In addition, NRC must continue developing a cost accounting system to support PBPM.
In 1998, we reported that difficulties in comparing EPA's fiscal year 1999 and 1998 budget justifications arose because the 1999 budget justification was organized according to the agency's strategic goals and objectives, whereas the 1998 justification was organized according to EPA's program offices and components. Funds for EPA's Science and Technology account were requested throughout the fiscal year 1999 budget justification for all 10 of the agency's strategic goals and for 25 of its 45 strategic objectives. As shown in table 1, two strategic goals--Sound Science and Clean Air--accounted for 71 percent of the funds requested for Science and Technology. In its fiscal year 1999 budget justification, EPA did not show how the funds requested for each goal and objective would be allocated among its program offices or components. To be able to compare EPA's requested fiscal year 1999 funds for Science and Technology to the previous fiscal year's enacted funds, EPA would have had to maintain financial records in two different formats--by program components and by strategic goals and objectives--and to develop crosswalks to link information between the two. EPA maintained these two formats for some of the Science and Technology funds but not for others. Guidance from the Office of Management and Budget (OMB) does not require agencies to develop or provide crosswalks in their justifications when a budget format changes. However, OMB examiners or congressional committee staff may request crosswalks during their analyses of a budget request. Two of EPA's program offices--Research and Development and Air and Radiation--accounted for over 97 percent of the Science and Technology funds that were requested for fiscal year 1999. The offices maintained their financial records differently. 
The Office of Research and Development maintained the enacted budget for fiscal year 1998 by program components (the old format) and also by EPA's strategic goals and objectives (the new format). With these two formats of financial data, the Office of Research and Development could readily crosswalk, or provide links, to help compare the 1998 enacted funds, organized by program components, to the fiscal year 1999 budget justification, organized according to EPA's strategic goals and objectives. In contrast, the Office of Air and Radiation maintained its financial records for fiscal year 1998 under EPA's new strategic goals and objectives format but did not also maintain this information under the old format. Therefore, the Office of Air and Radiation could only estimate how the fiscal year 1998 enacted funds would have been allocated under the old format. For example, EPA estimated that the Office of Air and Radiation's program component for radiation had an enacted fiscal year 1998 budget of $4.6 million. While the activities of this program component continued in fiscal year 1999, they were subsumed in the presentation of the budget for EPA's strategic goals and objectives. Therefore, because the radiation program could not be readily identified in the fiscal year 1999 budget justification, congressional decisionmakers could not easily compare funds for it with the amount that had been enacted for fiscal year 1998. At our request, the Office of Air and Radiation estimated its enacted budget for fiscal year 1998 by program components and then developed a crosswalk to link those amounts with EPA's strategic goals and objectives. The remaining 3 percent of the requested funds for Science and Technology is administered by the Office of Water; the Office of Administration and Resources Management; the Office of Prevention, Pesticides, and Toxic Substances; and the Office of Enforcement and Compliance Assurance. 
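The crosswalk described above is, in essence, a mapping that reallocates amounts recorded under one budget format (program components) across a second format (strategic goals and objectives). The sketch below illustrates the idea only; the $4.6 million radiation figure comes from the text, but the second component, the allocation fractions, and the goal names used as keys are invented for illustration:

```python
# Hypothetical crosswalk: FY 1998 enacted amounts kept by program component
# are redistributed across strategic goals/objectives. Only the $4.6 million
# radiation amount is from the report; everything else is illustrative.
enacted_by_component = {"radiation": 4.6, "indoor air": 10.0}  # $ millions

# Assumed fraction of each component's funds attributable to each goal.
crosswalk = {
    "radiation": {"Clean Air": 0.8, "Sound Science": 0.2},
    "indoor air": {"Clean Air": 1.0},
}

# Accumulate each component's dollars into the goals it maps to.
by_goal = {}
for component, amount in enacted_by_component.items():
    for goal, share in crosswalk[component].items():
        by_goal[goal] = by_goal.get(goal, 0.0) + amount * share

print({goal: round(total, 2) for goal, total in by_goal.items()})
```

Maintaining financial records in both formats, as the Office of Research and Development did, amounts to keeping both `enacted_by_component` and `by_goal` up to date so that either view can be produced on request.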
Two of these offices--the Office of Prevention, Pesticides, and Toxic Substances and the Office of Enforcement and Compliance Assurance--did not format financial information by program components. These offices estimated how the 1998 enacted funds would be classified under their various program components. For fiscal year 2000, EPA made several changes to improve the clarity of its budget justification. According to EPA officials, they planned to provide tables for each goal and objective to show the amounts of funds requested for key programs, starting with the agency's fiscal year 2000 budget justification. The justification for fiscal year 2000 does contain additional information, in the form of tables for each objective, that details some of the requested amounts by key programs. For example, under the objective Research for Human Health Risk, part of the Sound Science goal, the $56 million requested for the objective is divided into two key programs: Human Health Research and Endocrine Disruptor Research. According to EPA officials, they did not plan to identify in the fiscal year 2000 budget justification the program offices that would be administering the requested funds. However, they intended to make available backup information to show the program offices that would be administering the requested funds. Such information is available for the fiscal year 2000 budget request and was provided to this Committee. According to EPA officials and an EPA draft policy on budget execution, the agency's Planning, Budgeting, Analysis, and Accountability System would record budget data by goals, objectives, subobjectives, program offices, and program components. EPA expected that this system would be fully implemented on October 1, 1998. 
According to EPA officials, the new Planning, Budgeting, Analysis, and Accountability System was implemented on this date; accordingly, EPA can provide information showing how the agency's requested funds would be allocated according to any combination of goals, objectives, subobjectives, program offices, and key programs. EPA also planned to submit future budget justifications in the format of its strategic goals and objectives, as it had done for fiscal year 1999. That way, the formats for fiscal year 2000 and beyond would have been similar to those for the fiscal year 1999 justification, facilitating comparisons in future years. According to EPA officials, the strategic goals and objectives in EPA's fiscal year 2000 justification for Science and Technology would be the same as those in its fiscal year 1999 justification. However, beginning in fiscal year 1999, the agency has begun to reassess its strategic goals and objectives, as required by the Government Performance and Results Act. This assessment was meant to involve EPA's working with state governments, tribal organizations, and congressional committees to evaluate its goals and objectives to determine if any of them should be modified. Upon completion of this assessment, if any of EPA's goals or objectives change, the structure of the agency's budget justification would change correspondingly. Changes to the strategic goals and objectives in the budget justifications could also require crosswalks and additional information to enable consistent year-to-year comparisons. EPA did maintain, as planned, the strategic goals and objectives format for its fiscal year 2000 budget justification. However, for the objectives that rely on Science and Technology funds, EPA made several changes without explanations or documentation to link the changes to the fiscal year 1999 budget justification. 
EPA (1) acknowledged that funds from one objective were allocated to several other objectives but did not identify the objectives or amounts, (2) did not identify funds in Science and Technology amounts that were transferred from Hazardous Substances Superfund, and (3) made other changes to the number or wording of objectives that rely on Science and Technology funds. In the fiscal year 1999 budget justification, under the strategic goal Sound Science, Improved Understanding of Environmental Risk, and Greater Innovation to Address Environmental Problems, EPA requested $86.6 million for the fifth objective: Enable Research on Innovative Approaches to Current and Future Environmental Problems; and the 1998 fiscal year enacted amount was listed as $85.0 million. In the fiscal year 2000 budget justification, EPA marked this objective as "Not in Use." The justification stated that the fiscal year 1999 request included the amounts for operating expenses and working capital for the Office of Research and Development under the same objective in the Sound Science goal. In the fiscal year 2000 budget justification, EPA allocated the amounts requested for this objective among the other goals and objectives to more properly reflect costs of the agency's objectives. However, the fiscal year 2000 justification did not identify the specific objectives for either the $85.0 million enacted for fiscal year 1998 or the $86.6 million requested for fiscal year 1999. The allocation of funds was not specifically identified in the justification because EPA does not prepare crosswalks unless asked to by OMB or congressional committees. Therefore, a clear comparison of 1999 and 2000 budget justifications cannot be made. Another aspect that made year-to-year comparisons difficult was EPA's treatment of funds transferred to Science and Technology from the agency's Superfund account.
In the fiscal year 2000 justification, the Science and Technology amounts shown as enacted for fiscal year 1999 include $40 million transferred from the Hazardous Substances Superfund. In contrast, the requested amounts for fiscal year 2000 do not include the transfer from the Superfund. As a result, amounts enacted for fiscal year 1999 cannot be accurately compared to the amounts requested for fiscal year 2000. This discrepancy is particularly evident in the objective Reduce or Control Risks to Human Health, under the goal Better Waste Management, Restoration of Contaminated Waste Sites, and Emergency Response. The amounts for Science and Technology as shown in the budget justification for the objective are shown in table 2. The $49.8 million shown as enacted for fiscal year 1999 includes a significant amount of the $40 million transferred from the Superfund account, according to an EPA official. However, because the specific amount is not shown, an objective-by-objective comparison of the Science and Technology budget authority for fiscal years 1999 and 2000 cannot be accurately made, and it appears that EPA is requesting a significant decrease for this objective. An EPA official stated that the $40 million was not separately identified because the congressional guidance on transferring the funds did not specifically state which objectives these funds were to support. In the fiscal year 1999 budget justification, the strategic goal Better Waste Management, Restoration of Contaminated Waste Sites, and Emergency Response had three objectives: (1) Reduce or Control Risks to Human Health, (2) Prevent Releases by Proper Facility Management, and (3) Respond to All Known Emergencies. In the fiscal year 1999 budget request, EPA indicated $6.3 million was enacted for Prevent Releases by Proper Facility Management in fiscal year 1998 and requested $6.6 million for fiscal year 1999. 
EPA indicated $1.6 million was enacted for Respond to All Known Emergencies in fiscal year 1998 and requested $1.6 million for fiscal year 1999. The fiscal year 2000 budget justification omits these two--the second and third objectives--and does not indicate where the funds previously directed to those objectives appear. Therefore, a clear comparison of budget requests year to year cannot be made. In the fiscal year 2000 budget justification, EPA added the second objective--Prevent, Reduce and Respond to Releases, Spills, Accidents, and Emergencies--to the strategic goal Better Waste Management, Restoration of Contaminated Waste Sites, and Emergency Response. EPA indicated that $8.8 million had been enacted for this objective in fiscal year 1999 and requested $9.4 million for this objective for fiscal year 2000. EPA did not identify which objectives in the fiscal year 1999 budget included the enacted $8.8 million and therefore a comparison to the prior budget justification was difficult. The other changes to the objectives were made as a result of the program offices' reassessment of and modifications to subobjectives, which in turn led to changes in the agency's objectives. While we do not question EPA's revisions of its goals or objectives, the absence of a crosswalk or explanation does not enable a clear comparison of budget requests year to year. Mr. Chairman, this concludes my prepared statement. I will be pleased to respond to any questions that you or the Members of the Subcommittee may have. The first copy of each GAO report and testimony is free. Additional copies are $2 each. Orders should be sent to the following address, accompanied by a check or money order made out to the Superintendent of Documents, when necessary. VISA and MasterCard credit cards are accepted, also. Orders for 100 or more copies to be mailed to a single address are discounted 25 percent. U.S. General Accounting Office P.O. Box 37050 Washington, DC 20013 Room 1100 700 4th St.
NW (corner of 4th and G Sts. NW) U.S. General Accounting Office Washington, DC Orders may also be placed by calling (202) 512-6000 or by using fax number (202) 512-6061, or TDD (202) 512-2537. Each day, GAO issues a list of newly available reports and testimony. To receive facsimile copies of the daily list or any list from the past 30 days, please call (202) 512-6000 using a touchtone phone. A recorded menu will provide information on how to obtain these lists.

Pursuant to a congressional request, GAO discussed the Environmental Protection Agency's (EPA) budget justification for its Science and Technology account, and changes among the justifications for fiscal years (FY) 1998, 1999, and 2000, focusing on: (1) difficulties experienced in comparing EPA's Science and Technology budget justification for FY 1999 with those of previous years; and (2) actions that EPA planned and implemented in order to improve the clarity and comparability of the FY 2000 justification, and items that need further clarification.
GAO noted that: (1) EPA's budget justification for FY 1999 could not be readily compared to amounts requested or enacted for FY 1998 and prior years because the justification did not show how the budget would be distributed among program offices or program components--information needed to link to the prior years' justifications; (2) the Office of Management and Budget does not require EPA to provide information to compare the justifications when the format changes; (3) to facilitate such comparisons, agency officials provided supplemental information to congressional committees; (4) because EPA did not maintain financial records by both program components and strategic goals and objectives for all enacted Science and Technology funds for FY 1998, it could not readily provide information for all amounts; (5) at GAO's request, EPA estimated the 1998 enacted amounts so that the 1998 budget could be compared with the FY 1999 request; (6) EPA implemented several changes to its FY 2000 justification to solve problems experienced in comparing the 1998 and 1999 budget justifications; (7) to improve the clarity of its budget justification for FY 2000, EPA included tables that detail, for each objective, how requested amounts are allocated among key programs; (8) backup information is also available that shows the program offices that will be administering the requested funds; (9) the agency also implemented a new accounting system that records budget data by goals and objectives, which enhances reporting financial data by goals and objectives; (10) while the budget justification followed the basic format reflecting the agency's strategic goals and objectives, EPA made changes to the objectives without explanations or documentation to link the changes to the FY 1999 budget justification; (11) for example, funds were allocated from one objective to other objectives without identifying the objectives or amounts, funds that included money transferred from another account were 
shown as Science and Technology funds, and changes were made to the number or wording of objectives without explanations; and (12) as a result, the FY 2000 budget justification cannot be completely compared with the FY 1999 justification without supplemental information.
Since NASA was established in 1958, its civil service workforce has fluctuated widely. In 1967, during the Apollo program, the workforce was at about 35,900. In the 1970s, due to unfunded programs, the workforce shrank, with several thousand employees involuntarily separated during the middle of the decade. By 1980, the workforce had stabilized near 21,000. It remained close to that level until 1986, when the space shuttle Challenger accident forced a reexamination of NASA. In the mid- and late 1980s, NASA began some ambitious new programs and its workforce began to grow again in the latter part of the decade and into the early 1990s--peaking in 1992 at more than 25,000. When the current administration took office in 1993, it initiated steps to reduce the size of the overall federal workforce. An executive order in February 1993 directed that the workforce be reduced by 4 percent (100,000 employees) by the end of fiscal year 1995. Then, in September 1993, the National Performance Review (NPR) recommended a reduction of 252,000 federal employees by 1999. By the time Congress passed the Federal Workforce Restructuring Act in March 1994, which legislated an overall reduction of 272,900 federal employees by 1999, NASA was already cutting its workforce, which was more than 24,000 in fiscal year 1993, in response to the executive order and the NPR recommendation. NASA currently plans to achieve an FTE level of about 17,500 employees by fiscal year 2000, an overall reduction of about 8,000 from its previously planned level for that year. In discussing these reductions, the NASA Administrator stated: "As Administrator, I have decided not to take any precipitous action in FY 1996 to work toward these figures because to do so would involve a major disruption to our employees. It would not be fair to put them through this process to reach projections that are not hard and fast."
Through fiscal year 1995, NASA reduced its previously planned fiscal year 2000 FTE goal by over 3,000 FTEs, and it was planning to increase the aggregate reduction to about 4,000 FTEs in 1996. As shown in table 1, NASA had just over 24,700 FTE personnel in fiscal year 1993. This number dropped below 23,100 in fiscal year 1995, and it is expected to decrease to about 21,500 in fiscal year 1996. A key feature of the Federal Workforce Restructuring Act of 1994 was the authorization for agencies to pay up to $25,000 to separating workers--a buyout. Initially, NASA planned to offer this buyout to no more than 825 personnel. However, after nearly 2,000 employees indicated interest, NASA decided to offer 1,252 buyouts in 1994. This buyout was accepted by 1,178 employees. The buyout allocations focused on Headquarters, Marshall Space Flight Center, Lewis Research Center, and Kennedy Space Center--the installations most affected by the space station's redesign and program management restructuring. No occupational categories were targeted in the 1994 buyout, but members of the Senior Executive Service, attorneys at Kennedy Space Center and Marshall Space Flight Center, and astronauts were not permitted buyouts, in part, because NASA felt that critical skills would be lost if these employees separated. After the 1994 buyout, NASA was confronted with an even larger downsizing challenge when the President's fiscal year 1996 budget request reduced NASA's budgets through fiscal year 2000 by $4.6 billion. NASA announced its intention to cover this reduction by cutting its infrastructure, including personnel, rather than canceling or cutting back program initiatives. The NASA Administrator tasked the agency to conduct a zero base review (ZBR), which included examining every civil service and support contractor position in NASA to find and eliminate overlap and overstaffing.
One of the review's conclusions was that NASA's civil service workforce could be reduced to about 17,500 by the end of the decade without eliminating core programs. In anticipation of lower numbers of personnel, NASA offered another buyout in 1995. All employees were eligible, and 1,482 accepted it. The 2,660 buyouts represented about 66 percent of the more than 4,000 employees who left NASA during fiscal years 1994 and 1995, as shown in table 2. NASA's scientists and engineers had the largest reductions in numbers, but the smallest proportionate reductions, as shown in table 3. Consequently, as of September 30, 1995, scientists and engineers made up almost 58 percent of NASA's full-time permanent (FTP) employees--slightly higher than a few years ago when they were about 56 percent of NASA's workforce. NASA personnel managers consider the two buyouts a success. Given the rate of employee turnover experienced in the 2 years preceding the buyouts, they estimate that as many as 2,000 workers left the agency sooner than they would have without a buyout. As previously noted, buyouts accounted for about two-thirds of the employees leaving NASA in fiscal years 1994 and 1995. However, the buyout authority has expired. Without buyout authority, NASA personnel projections as of March 1996 showed that voluntary retirements and other separations should enable the agency to continue to meet its downsizing goals through fiscal year 1998, but attrition would not be sufficient in fiscal year 1999 to meet the proposed budgets of about half of NASA's centers or for the agency as a whole. As a result, NASA personnel officials said a reduction-in-force would be required by late fiscal year 1998. One element of the expected difficulty in 1999 is that about 70 percent of NASA's planned personnel reductions in the 1996-2000 period are scheduled in 1999 and 2000, with most of those--1,730 out of 2,822--scheduled for 1999.
A NASA personnel official explained that reductions were being scheduled for late in the period, in part, to allow sufficient time to work out the details of the conversion to a space shuttle single prime contract at Kennedy Space Center. With the difficult launch schedule associated with the space station, NASA officials were concerned about mission performance if they lowered personnel levels too quickly at Kennedy. One of NASA's major concerns is ensuring a proper skill mix throughout the agency. Currently, NASA's strategy to deal with this concern is to rely on normal attrition, limited hiring focused on the most critical areas, and redeploying employees. NASA officials intend to refine their workforce planning efforts later this year. They stated that these refinements will include developing more detailed demographic information and turnover predictions, identifying specific skill-mix requirements, determining skill excesses and shortages, developing cross-training and relocation opportunities, and implementing specific programs and policies to help achieve an appropriate skill mix for the 17,500 FTE level. NASA's efforts to meet its planned FTE level while avoiding involuntary separations will be affected by the results of several management and operational changes, including the shifting of program management from headquarters to field centers and the use of a single prime contractor for managing the space shuttle at Kennedy Space Center. NASA is in the process of shifting program management control from its headquarters program offices to the field centers. Prior to the ZBR, the NPR recommended several management changes at NASA, including reducing its headquarters workforce by 50 percent, eliminating duplication of functions at headquarters and the centers, and reducing management layers. The ZBR, which was undertaken to develop strategies to meet funding reductions, proposed giving the centers increased management control. 
The ZBR defined the centers' missions and designated each as a Center of Excellence; that is, having preeminence within the agency for a recognized area of technical competence. A center's mission denotes its role or responsibility in supporting NASA's five major enterprises: Mission to Planet Earth, Aeronautics, Human Exploration and Development of Space, Space Science, and Space Technology. All program implementation responsibilities previously performed by headquarters offices are being reassigned to the field centers. In essence, it is intended that headquarters focus on what the agency does and why, while centers focus on executing programs. Table 4 shows the proposed ZBR reductions for program and staff offices in headquarters, and table 5 shows proposed reductions by NASA installation as of March 1996. In November 1995, NASA selected United Space Alliance--a Rockwell International and Lockheed Martin partnership--as the prime contractor for space flight operations. Although NASA will retain responsibility for launch decisions, NASA personnel will be less involved in day-to-day operations. Thus, fewer civil servants will be required to manage the program. However, conversion efforts are still underway and have not reached the point where NASA officials are able to judge the full extent to which NASA personnel will be involved in overseeing the contractor's operations. Despite this uncertainty, NASA estimates that it should be able to make personnel reductions in the range of 700 to 1,100 FTEs at the Kennedy Space Center. Because the length of the transition period is uncertain, NASA personnel officials show these reductions occurring in 1999 and 2000. However, NASA officials believe the personnel reductions at this center will not be precipitous, but will occur more gradually over the transition period. 
During the course of the ZBR, the concept of institutes was identified as a potentially beneficial approach to maintain or improve the quality of national science in the face of organizational streamlining. The recommendation was made to reshape NASA's science program under a reinvention strategy to bind NASA's science program more closely to the larger community that it serves. The strategy involved "privatization" of a portion of NASA's science program into a number of science institutes. The purpose for establishing science institutes was to preserve and improve the quality of NASA's contributions to national science in the face of reductions in the size of the federal workforce. Under its Science Institute Plan, NASA intended to select universities, not-for-profit organizations, or consortia to operate 11 institutes under competitively awarded contracts or cooperative agreements to conduct research supporting the specific missions of selected NASA field centers, among other purposes. NASA was working with OMB to identify ways to make the transition to institutes attractive to NASA personnel. Proposed legislation for the agency's fiscal year 1997 authorization bill was sent to OMB. The legislation would have facilitated the institutes' employing of NASA personnel by relaxing current laws that restrict the employment of former federal workers by the private sector and enabling NASA employees to retain the bulk of their federal retirement benefits should they accept an offer of institute employment. Each institute would make its own decisions on hiring NASA employees. This proposal was not favorably reviewed in the executive branch, in part because of concern that covering former NASA personnel with federal benefits after they became private-sector employees would set a precedent to do the same for other federal employees whose jobs are privatized. 
As shown in table 6, the potential loss of civil service work years as a result of creating science institutes would vary greatly from center to center. According to NASA officials, the extent to which NASA personnel would voluntarily leave to accept the institutes' offers of employment would depend largely on the enactment of the proposed legislation designed to ease such transfers. Without such legislation, NASA officials believe that the number of employees voluntarily leaving NASA would likely be negligible. On June 7, 1996, the NASA Administrator announced that, due to objections to the proposed legislation from the Office of Government Ethics, the Office of Personnel Management, and OMB, efforts to establish new science institutes other than the Biomedical Research Institute at Johnson Space Center would be discontinued. The Administrator stated that NASA did not intend to migrate civil service functions and positions to institutes absent legislative relief. However, NASA will continue to consider alternative options to the proposed institutes. NASA recently requested buyout authority from Congress. We have previously reported that savings from buyouts generally exceed those from reductions-in-force and that savings from downsizing largely depend, among other things, on whether the workforce restructuring has been effectively planned. As previously noted, NASA is currently involved in developing future workforce plans to help ensure a proper skill mix to support its programs and activities. In commenting on a draft of this report, NASA said it had a human resource planning activity underway in support of its fiscal year 1998 budget request. We believe that the results of this effort would provide useful information to Congress in reviewing both NASA's request for buyout authority and its fiscal year 1998 budget request. Therefore, Congress may wish to consider requiring NASA to submit a workforce restructuring plan for achieving its fiscal year 2000 FTE goal. 
NASA officials concurred with our report and stated that it is a good synopsis of the progress made and the problems remaining. NASA said that civil service staffing at the Kennedy Space Center may not be able to go below 1,360 FTEs. NASA indicated that it would reassess the size of the reduction in preparing its fiscal year 1998 budget request. NASA also summarized its reasons for wanting new buyout authority. NASA's comments are included in appendix I. We researched NASA's workforce history, reviewed NASA workforce statistics and centers' and headquarters' downsizing plans, examined workforce reviews and studies prepared by NASA discussing its downsizing activities, and discussed with NASA officials how the most recent reductions were achieved. We also examined projected workforce statistics through fiscal year 2000 and obtained information on NASA's approach to achieving future downsizing goals. We reviewed workforce statistics from three field centers--Goddard Space Flight Center, Marshall Space Flight Center, and Lewis Research Center--and we reviewed the centers' strategies for meeting future reductions. We relied primarily on information contained in NASA's Civil Service Workforce Report for most of our statistical data. We did not independently verify NASA's statistics. The civil service workforce totals discussed in this report reflect NASA's planning at the time of our review. The likelihood they will continue to be revised to reflect changes is high. We conducted our review principally at NASA headquarters, Washington, D.C., and the Goddard Space Flight Center, Greenbelt, Maryland. We also discussed personnel-related issues with NASA officials at Marshall Space Flight Center, Huntsville, Alabama, and Lewis Research Center, Cleveland, Ohio. We performed our work from June 1995 to June 1996 in accordance with generally accepted government auditing standards. 
Unless you publicly announce its contents earlier, we plan no further distribution of this report until 14 days from its issue date. At that time, we will send copies of this report to appropriate congressional committees, the NASA Administrator, the Director of OMB, and other interested parties upon request. If you or your staff have any questions concerning this report, please contact me on (202) 512-4841. The major contributors to this report were Frank Degnan, Lawrence Kiser, and Roberta Gaston. The first copy of each GAO report and testimony is free. Additional copies are $2 each. Orders should be sent to the following address, accompanied by a check or money order made out to the Superintendent of Documents, when necessary. VISA and MasterCard credit cards are also accepted. Orders for 100 or more copies to be mailed to a single address are discounted 25 percent. U.S. General Accounting Office, P.O. Box 6015, Gaithersburg, MD 20884-6015; Room 1100, 700 4th St. NW (corner of 4th and G Sts. NW), U.S. General Accounting Office, Washington, DC. Orders may also be placed by calling (202) 512-6000, by using fax number (301) 258-4066, or by TDD (301) 413-0006. Each day, GAO issues a list of newly available reports and testimony. To receive facsimile copies of the daily list or any list from the past 30 days, please call (202) 512-6000 using a touchtone phone. A recorded menu will provide information on how to obtain these lists.

Pursuant to a congressional request, GAO examined the National Aeronautics and Space Administration's (NASA) efforts to downsize its staff.
GAO found that: (1) NASA has reduced its fiscal year (FY) 2000 full-time equivalent (FTE) goal by more than 3,000 personnel; (2) NASA has provided eligible employees with voluntary separation incentive payments in exchange for their voluntary retirement or resignation; (3) two-thirds of the employees who left NASA in 1994 and 1995 took buyouts; (4) NASA will not be able to reduce its personnel levels by FY 2000 without invoking involuntary separation measures; (5) NASA is relying on normal attrition, limited hiring, and redeployment to ensure a proper mix of skills throughout the agency; (6) NASA is shifting its program management control from headquarters to field centers and is using a single prime contractor to manage its space shuttle program at Kennedy Space Center; and (7) NASA would like to develop space science institutes to improve the quality of its science programs, but these efforts have been largely abandoned due to concerns regarding the transfer of NASA employees to institute positions.
Most federal civilian employees are covered by the Civil Service Retirement System (CSRS) or the Federal Employees' Retirement System. Both of these retirement plans include survivor benefit provisions. Three separate retirement plans apply to various groups of judges in the federal judiciary, with JSAS being available to participants in all three retirement plans to provide annuities to their surviving spouses and children. Appendix I provides additional information regarding retirement plans that are available to federal judges. JSAS was created in 1956 to help provide financial security for the families of deceased federal judges. It provides benefits to surviving eligible spouses and dependent children of judges who participate in the plan. Judges may elect coverage within 6 months of taking office, 6 months after getting married, or 6 months after being elevated to a higher court, or during an open season authorized by statute. Active and senior judges currently contribute 2.2 percent of their salaries to JSAS, and retired judges contribute 3.5 percent of their retirement salaries to JSAS. Upon a judge's death, the surviving spouse is to receive an annual annuity that equals 1.5 percent of the judge's average annual salary during the 3 highest consecutive paid years (commonly known as the high-3) times the judge's years of creditable service. The annuity may not exceed 50 percent of the high-3 and is guaranteed to be no less than 25 percent. Separately, an unmarried dependent child under age 18, or 22 if a full-time student, receives a survivor annuity that is equal to 10 percent of the judge's high-3 or 20 percent of the judges' high-3 divided by the number of eligible children, whichever is smaller. JSAS annuitants receive an annual adjustment in their annuities at the same time, and by the same percentage, as any cost-of-living adjustment (COLA) received by CSRS annuitants. Spouses and children are also eligible for Social Security survivor benefits. 
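The survivor annuity rules above reduce to a short calculation: 1.5 percent of the high-3 per year of creditable service for a spouse, bounded between 25 and 50 percent of the high-3, and for each child the smaller of 10 percent of the high-3 or 20 percent of the high-3 divided by the number of eligible children. A minimal sketch of that arithmetic (the function names and the salary figure are illustrative, and eligibility conditions are omitted):

```python
def spouse_annuity(high3: float, years_of_service: float) -> float:
    """Annual survivor annuity for an eligible spouse: 1.5 percent of the
    judge's high-3 average salary per year of creditable service, no less
    than 25 percent and no more than 50 percent of the high-3."""
    annuity = 0.015 * high3 * years_of_service
    return min(max(annuity, 0.25 * high3), 0.50 * high3)


def child_annuity(high3: float, eligible_children: int) -> float:
    """Annual annuity per dependent child: the smaller of 10 percent of
    the high-3 or 20 percent of the high-3 divided by the number of
    eligible children."""
    return min(0.10 * high3, 0.20 * high3 / eligible_children)


# Illustrative only: a judge with a $160,000 high-3 and 20 years of service.
print(spouse_annuity(160_000, 20))  # 1.5% x 160,000 x 20 = 48,000 (within bounds)
print(child_annuity(160_000, 3))    # min(16,000, 32,000 / 3)
```

Note how the floor binds for short service (5 years would yield $12,000 before the 25 percent minimum raises it to $40,000) and the cap binds for long service.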
Since its inception in 1956, JSAS has changed several times. Because of concern that too few judges were participating in the plan (74 percent of federal judges participated in 1985, which was down from 90 percent in 1976), Congress made broad reforms effective in 1986 with the Judicial Improvements Act of 1985. The 1985 act (1) increased the annuity formula for surviving spouses from 1.25 percent to the current 1.5 percent of the high-3 for each year of creditable service and (2) changed the provisions for surviving children's benefits to relate benefit amounts to judges' high-3 rather than the specific dollar amounts provided in 1976 by the Judicial Survivors' Annuities Reform Act. In recognition of the significant benefit improvements that were made, the 1985 act increased the amounts that judges were required to contribute from 4.5 percent to 5 percent of their salaries, including retirement salaries. The 1985 act also changed the requirements for government contributions to the plan. Under the 1976 Judicial Survivors' Annuities Reform Act, the government matched the judges' contributions of 4.5 percent of salaries and retirement salaries. The 1985 act modified this by specifying that the government would contribute the amounts necessary to fund any remaining cost over the future lifetime of current participants. That amount is limited to 9 percent of total covered salary each year. Despite the benefit improvements in the 1985 act, the rate of participation in JSAS continued to decline. In 1991, the rate of participation was about 40 percent overall and 25 percent for newly appointed judges. In response to concerns that required contributions of 5 percent may have created a disincentive to participate, Congress enacted the Federal Courts Administration Act of 1992. Under this act, participants' contribution requirements were reduced to 2.2 percent of salaries for active and senior judges and 3.5 percent of retirement salaries for retired judges. 
The 1992 act also significantly increased benefits for survivors of retired judges. This increase was accomplished by including years spent in retirement in the calculation of creditable service and the high-3 salary averages. Additionally, the 1992 act allowed judges to stop contributing to the plan if they ceased to be married and granted benefits to survivors of any judge who died in the interim between leaving office and the commencement of a deferred annuity. As of September 30, 2004, there were 1,329 active and senior judges, 207 retired judges, and 304 survivor annuitants covered under JSAS, compared with 1,265 active and senior judges, 193 retired judges, and 283 survivor annuitants as of September 30, 2002. AOUSC is responsible for administering and maintaining reliable information on JSAS. JSAS is financed by judges' contributions and direct appropriations in an amount estimated to be sufficient to fund future benefits paid to survivors of current and deceased participants. The federal government's contribution is approved through an annual appropriation and is not based on a rate or percentage of the judges' salaries. To determine the annual contribution of the federal government, AOUSC engages an enrolled actuary to perform the calculation of funding needed based on the difference between the present value of the expected future benefit payments to participants and the value of net assets in the plan. Appendix II provides more details on the formulas used to determine participants' and the federal government's contributions and lump sum payments. The cost of a retirement or survivor benefit plan is typically not measured by annual expenditures for benefits. Such expenditures are not an indicator of the overall long-term cost of a plan. The more complete calculation of a plan's cost is the present value of projected future outlays to retirees or survivors, based on the current pool of participants, with such costs allocated annually. 
This annual cost allocation is referred to as the normal cost. Normal cost calculations, prepared by an actuary, are estimates and require that many actuarial assumptions be made about the future, including mortality rates, turnover rates, returns on investment, salary increases, and COLA increases over the life spans of current participants and beneficiaries. The plan's actuary, using the plan's funding method--in this case, the aggregate cost method--determines the plan's normal cost. Under the aggregate cost method, the normal cost is the level percentage of future salaries that will be sufficient, along with investment earnings and the plan's assets, to pay the plan's benefits for current participants and beneficiaries. There are many acceptable actuarial methods for calculating normal cost. Regardless of which cost method is chosen, the expected total long-term cost of the plan should be the same; however, year-to-year costs may differ, depending on the cost method used. Our objectives were to determine whether participating judges' contributions for the 3 plan years ending on September 30, 2004, funded at least 50 percent of the JSAS costs and, if not, what adjustments in the contribution rates would be needed to achieve the 50 percent ratio. To satisfy our objectives, we examined the normal costs reported in the JSAS annual report submitted by AOUSC to the Comptroller General for plan years 2002 through 2004. We also examined participants' contributions, the federal government's contribution, and other relevant information in each annual report. An independent accounting firm hired by AOUSC audited the JSAS financial and actuarial information included in the JSAS annual reports, with input from an enrolled actuary regarding relevant data, such as actuarial present value of accumulated plan benefits. An enrolled actuary certified those amounts that are included in the JSAS annual reports. 
We discussed the contents of the JSAS reports with officials from AOUSC for the 3 plan years (2002 through 2004). In addition, we discussed with the enrolled actuary the actuarial assumptions made to project future benefits of the plan. We did not independently audit the JSAS annual report or the actuarially calculated cost figures. We performed our review in Washington, D.C., from May 2005 through July 2005, in accordance with U.S. generally accepted government auditing standards. We made a draft of this report available to the Director of AOUSC for review and comment. The Director's comments are reprinted in appendix III. For each of the JSAS plan years 2002 through 2004, participating judges funded more than 50 percent of the JSAS normal costs. In plan year 2002, participating judges paid approximately 75 percent of JSAS normal costs, and in plan years 2003 and 2004, they paid approximately 64 and 78 percent of JSAS normal costs, respectively. On the basis of data from plan years 2002, 2003, and 2004, participating judges paid, on average, approximately 72 percent of JSAS normal costs while the federal government's share amounted to approximately 28 percent. Table 1 shows judges' and the federal government's contribution rates and shares of JSAS normal costs (using the aggregate cost method, which is discussed in app. II) for the period covered in our review. The judges' and the federal government's contribution rates for each of the 3 years, shown in table 1, were based on the actuarial valuation that occurred at the end of the prior year. For example, the judges' contribution rate of 2.39 percent and the federal government's contribution rate of 0.80 percent in plan year 2002 were based on the September 30, 2001, valuation contained in the plan year 2002 JSAS report. 
The judges' contribution of JSAS normal costs shown in table 1 fluctuated from approximately 75 percent in plan year 2002, to approximately 64 percent in plan year 2003, and to 78 percent in plan year 2004. The federal government's contribution of JSAS normal costs also varied, from approximately 25 percent in plan year 2002, to approximately 36 percent in plan year 2003, and to approximately 22 percent in plan year 2004. During those same years, judges' contribution rates remained almost constant, while the federal government's contribution rate increased from 0.80 percent of salaries in plan year 2002 to 1.34 percent of salaries in plan year 2003, and then decreased to 0.65 percent in plan year 2004. The variance in the federal government's contribution rates was a result of the fluctuation in normal costs resulting from several combined factors, such as changes in assumptions; lower-than-expected return on plan assets; demographic changes--retirement, death, disability, new members, and pay increases--as well as an increase in plan benefit obligations. Specifically, the value of total plan assets increased from $473.8 million in plan year 2002 to $484.0 million in plan year 2003, and then decreased to $479.8 million in plan year 2004. However, accumulated plan benefit obligations increased steadily, from $385.4 million in plan year 2002, to $388.5 million in plan year 2003, and to $393.9 million in plan year 2004. Although the judges' contribution rate remained fairly constant, their contribution of normal costs rose to approximately 78 percent in plan year 2004 because total normal costs decreased. During the 2004 plan year, contributions from the federal government and judges totaled almost $5.1 million, somewhat less than the actuarial cost of $6.9 million.
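The shares above follow directly from the contribution rates. A quick check, assuming each party's share is its rate as a fraction of the combined rate, and assuming the judges' rate held at its reported 2002 value of 2.39 percent for all three years (the text states it remained almost constant, but only the 2002 figure is given):

```python
# Approximate contribution shares implied by the table 1 rates.
# (judges_rate, government_rate), in percent of salaries; the judges' rate
# for 2003 and 2004 is an assumption, carried over from plan year 2002.
rates = {2002: (2.39, 0.80), 2003: (2.39, 1.34), 2004: (2.39, 0.65)}

for year, (judges, government) in sorted(rates.items()):
    total = judges + government
    print(f"{year}: judges {100 * judges / total:.1f}%, "
          f"government {100 * government / total:.1f}%")
```

The result reproduces the reported shares of approximately 75, 64, and 78 percent for the judges in plan years 2002, 2003, and 2004, respectively.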
A primary reason for the difference between total contributions and the plan's actuarial cost was that the approximately 1.3 percent return on the market value of plan assets was lower than the 6.25 percent assumed rate of investment return on plan assets. The resulting actuarial loss increased the required contribution level for the plan by 0.82 percent of total payroll for participating judges. Based on information in JSAS actuarial reports for the 3 years under review, we have determined that participating judges' future contributions do not have to increase in order to cover the minimum 50 percent of JSAS costs required by the Federal Courts Administration Act. We found that the current contribution rates of 2.2 percent of salaries for active and senior judges and 3.5 percent of retirement salaries for retired judges are sufficient to cover at least 50 percent of JSAS costs. As shown in table 1, the judges' average contribution for JSAS costs for this review period was approximately 72 percent, which exceeded the 50 percent contribution goal for judges. Because future normal costs are estimates that may change in any given year, adjusting judges' contribution rates whenever they are found to be generating more or less than 50 percent of JSAS costs is not practical. Future normal costs may change because of certain events that occur during the course of a year, such as the number of survivors or judges who die, the number of new judges electing to participate in JSAS, and the number of judges who retire, and because the values of, and rates of return on, plan assets could create normal statistical variances that would affect the annual normal costs of the plan. Because the plan has only 1,536 participants and 304 survivor annuitants, such variances can have a significant effect on expected normal costs and lead to short-term variability. 
Therefore, it is important to take a long-term view when evaluating whether contribution rates for judges are appropriate to achieve a 50 percent JSAS contribution share for judges. For example, as shown in table 2, although the judges' contribution share for plan year 2004 was approximately 78 percent, the judges' average contribution share for plan years 1996 through 2004 was approximately 55 percent--significantly closer to the 50 percent contribution goal. Another drawback to making frequent changes to the judges' contribution rate in response to short-term fluctuations in their contribution share could be a decline in JSAS participation. Increasing participation was a major reason for the changes made to JSAS in 1992. From plan years 1998 through 2004, the number of judges participating in JSAS increased 8 percent, from 1,420 to 1,536. We requested comments on a draft of this report from the Director of AOUSC or his designee. In a letter dated August 23, 2005, the Director provided written comments on the report, which we have reprinted in appendix III. AOUSC also provided technical comments, which we have incorporated as appropriate. In its comments, AOUSC stated that our report showed that judges' contributions to JSAS have become disproportionately high, but that we were not suggesting a change in the contribution rate for judges. Specifically, AOUSC stated that we did not present in our report the adjustment that would be needed to the participating judges' contribution rates to achieve the 50 percent funding of the program's costs by the judges. In AOUSC's view, this omission is not consistent with Congress's intent in enacting the Federal Courts Administration Act of 1992. We disagree with AOUSC's view as to the purpose of section 201(i) of the act. 
Since enactment, we have interpreted this section as providing a minimum percentage of the costs of the program to be borne by its participants because the statute requires us to recommend adjustments when the judges' contributions have not achieved 50 percent of the costs of the fund. We do not view the section as calling for parity between the participants and the federal government with respect to funding the program. Thus, for the 3 years covered by this review, we determined and reported that judges' contributions funded approximately 72 percent of normal costs of JSAS, and therefore, an adjustment to the judges' contribution rates was not needed under the existing legislation because the judges' contributions achieved 50 percent of JSAS costs. We have consistently applied this interpretation of the act's requirement in all of our previous mandated reviews. However, if one were to interpret the act as calling for an equal sharing of the program's costs between participants and the government, then, on the basis of the information contained in the JSAS actuarial report as of September 30, 2004, participating judges' future contributions would have had to decrease a total of 0.86 percentage points below the current 2.2 percent of salaries for active judges and senior judges and 3.5 percent of retirement salaries for retired judges in order to fund 50 percent of JSAS costs over the past 3 years. If the decrease were distributed equally among the judges, those currently contributing 2.2 percent of salaries would have had to contribute 1.34 percent, and those currently contributing 3.5 percent of retirement salaries would have had to contribute 2.64 percent. As we have noted both in this report and prior reports, because of the yearly fluctuations that are experienced by JSAS, short-term trends are not sufficient for use in making informed decisions. 
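The hypothetical adjustment described above is simple arithmetic: a uniform 0.86 percentage point decrease, taken from the September 30, 2004, valuation, applied to both current contribution rates. Sketched out:

```python
# Uniform decrease that would have brought the judges' share down to
# 50 percent of JSAS costs, per the September 30, 2004, actuarial
# valuation cited in the report.
DECREASE = 0.86  # percentage points

active_senior = 2.2 - DECREASE  # percent of salaries
retired = 3.5 - DECREASE        # percent of retirement salaries

print(f"active/senior judges: {active_senior:.2f}% (currently 2.20%)")
print(f"retired judges: {retired:.2f}% (currently 3.50%)")
```

This yields the 1.34 percent and 2.64 percent figures given in the text.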
As we stated in our report, future normal costs may change because of certain events that occur during the course of a year, such as the number of survivors or judges who die, the number of new judges electing to participate in JSAS, and the number of judges who retire. Also, the values of, and rates of return on, plan assets could create normal statistical variances that would affect the annual normal costs of the plan. Therefore, it is important to take a long- term view when evaluating whether rates for judges are appropriate to achieve a 50 percent minimum JSAS contribution share for judges. We are sending copies of this report to the Director of AOUSC. Copies of this report will be made available to others upon request. This report is also available at no charge on the GAO Web site at http://www.gao.gov. Please contact Steven J. Sebastian at (202) 512-3406 or [email protected] if you or your staff have any questions concerning this report. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. Key contributors to this report were Hodge Herry, Assistant Director; Joseph Applebaum; Jacquelyn Hamilton; Amy Bowser; and Kwabena Ansong. The Administrative Office of the United States Courts (AOUSC) administers three retirement plans for judges in the federal judiciary. The Judicial Retirement System automatically covers United States Supreme Court justices, federal circuit and district court judges, and territorial district court judges and is available, at their option, to the Administrative Assistant to the Chief Justice, the Director of AOUSC, and the Director of the Federal Judicial Center. The Judicial Officers' Retirement Fund is available to bankruptcy and full- time magistrate judges. The United States Court of Federal Claims Judges' Retirement System is available to the United States Court of Federal Claims judges. 
Also, except for judges who are automatically covered under the Judicial Retirement System, judges and judicial officials may opt to participate in the Federal Employees' Retirement System (FERS) or elect to participate in the Judicial Retirement System for bankruptcy judges, magistrate judges, or United States Court of Federal Claims judges. Judges who retire under the judicial retirement plans generally continue to receive the full salary amounts that were paid immediately before retirement, assuming the judges met the age and service requirements. Retired territorial district court judges generally receive the same cost-of- living adjustment that Civil Service Retirement System retirees receive, except that their annuities cannot exceed 95 percent of an active district court judge's salary. United States Court of Federal Claims judge retirees continue to receive the same salary payable to active United States Court of Federal Claims judges. Those in the Judicial Retirement System and the United States Court of Federal Claims Judges' Retirement System are eligible to retire when the number of years of service and the judge's age total at least 80, with a minimum retirement age of 65, and service ranging from 10 to 15 years. Those in the Judicial Officers' Retirement Fund are eligible to retire at age 65 with at least 14 years of service or may retire at age 65 with 8 years of service, on a less than full salary retirement. Participants in all three judicial retirement plans are required to contribute to and receive Social Security benefits. Aggregate funding method. This method, as used by the Judicial Survivors' Annuities System (JSAS) plan, defines the normal cost as the level percentage of future salaries that will be sufficient, along with investment earnings and the plan's assets, to pay the plan's benefits for current participants and beneficiaries. 
The formula is as follows: The present value of future normal costs (PVFNC) equals the present value of future benefits less net asset value. PVFNC is the amount that remains to be financed by judges and the federal government. The normal cost (NC) percentage equals PVFNC divided by present value of future salaries. Federal government contribution. The following formula is used to determine the federal government's contribution amount: The federal government contribution represents the portion of NC not covered by participants' contributions. Lump sum payout. Under JSAS, a lump sum payout may occur upon the dissolution of marriage either through divorce or death of spouse. Payroll contributions cease, but previous contributions remain in JSAS. Also, if there is no eligible surviving spouse or child upon the death of a participating judge, the lump sum payout to the judge's designated beneficiaries is computed as follows: Lump sum payout equals total amount paid into the plan by the judge plus 3 percent annual interest accrued less 2.2 percent of salaries for each participating year (forfeited amount). In effect, the interest plus any amount contributed in excess of 2.2 percent of judges' salaries will be refunded. | The Judicial Survivors' Annuities System (JSAS) was created in 1956 to provide financial security for the families of deceased federal judges. It provides benefits to eligible spouses and dependent children of judges who elect coverage within 6 months of taking office, 6 months after getting married, or 6 months after being elevated to a higher court, or during an open season authorized by statute. Active and senior judges currently contribute 2.2 percent of their salaries to JSAS, and retired judges contribute 3.5 percent of their retirement salaries to JSAS. Pursuant to the Federal Courts Administration Act of 1992 (Pub. L. No. 
102-572), GAO is required to review JSAS costs every 3 years and determine whether the judges' contributions fund 50 percent of the plan's costs. If the contributions fund less than 50 percent of these costs, GAO is to determine what adjustments to the contribution rates would be needed to achieve the 50 percent ratio. GAO is not making any recommendations in this report. The Administrative Office of the United States Courts (AOUSC) believes that GAO should be recommending a reduction in the judges' contribution rate. GAO disagrees with AOUSC's interpretation of the act's requirements. During plan years 2002 through 2004, the participating judges' contributions funded more than 50 percent of the JSAS normal costs. The participating judges funded approximately 75 percent of JSAS normal costs during plan year 2002, 64 percent during plan year 2003, and 78 percent during plan year 2004. On average over the 3-year period, the participating judges funded approximately 72 percent of JSAS normal costs, while the federal government funded approximately 28 percent. The variance in the government's contribution rates was a result of the fluctuation in normal costs resulting from several combined factors, such as changes in assumptions; lower-than-expected rates of return on plan assets; demographic changes--retirement, death, disability, new members, and pay increases; as well as an increase in plan benefit obligations. For the 3 years covered by the review, GAO determined that an adjustment to the judges' contribution rate was not needed because their average contribution share for the review period was approximately 72 percent, which exceeded the minimum 50 percent contribution goal specified by law. In addition, GAO examined the annual share of normal costs covered by judges' contributions over a 9-year period and found that on average the participating judges funded approximately 55 percent of JSAS's normal costs. | 4,553 | 525 |
We found that Interior continues to experience problems hiring and retaining sufficient staff to provide oversight and management of oil and gas activities on federal lands and waters. BLM, BOEM, and BSEE office managers we surveyed reported that they continue to find it difficult to fill vacancies for key oil and gas oversight positions, such as petroleum engineers, inspectors, geologists, natural resource specialists, and geophysicists. These managers reported that it was difficult to retain staff to oversee oil and gas activities because staff leave for higher salaries in the private sector. They also reported that high rates of attrition are a concern because some Interior offices have just one or two employees per position, so a single retirement or resignation can significantly affect office operations and oversight. Nearly half of the petroleum engineers that left BLM in fiscal year 2012 resigned rather than retired, suggesting that they sought employment outside the bureau. According to Office of Personnel Management (OPM) data, the fiscal year 2012 attrition rate for petroleum engineers at BLM was over 20 percent, or more than double the average federal attrition rate of 9.1 percent. We found hiring and retention problems were most acute in areas where industry activity is greatest, such as in the Bakken shale play in western North Dakota, because the government is competing there with industry for the same group of geologists and petroleum engineers. Interior officials cited two major factors that affect the agency's ability to hire and retain sufficient staff to oversee oil and gas activities on federal leases: Higher industry salaries. BLM, BOEM, and BSEE office managers surveyed reported that they have lost potential applicants and staff to industry because it can pay higher salaries. 
Bureau of Labor Statistics data confirm that there is a wide and growing gap between industry and federal salaries for some positions, particularly petroleum engineers and geologists. For example, from 2002 through 2012, mean federal salaries for petroleum engineers have remained fairly constant at about $90,000 to $100,000 per year whereas private sector salaries have steadily increased from about $120,000 to over $160,000 during this same time period. The lengthy federal hiring process. BLM, BOEM, and BSEE officials surveyed reported that the federal hiring process has affected their ability to fill key oil and gas positions because it is lengthy, with multiple required steps, and that many applicants find other employment before the federal hiring process ends. We analyzed Interior's hiring data and found that the average hiring time for petroleum engineers was 197 days, or more than 6 months, at BOEM and BSEE. BLM fared a little better; its average hiring time for petroleum engineers was 126 days, or a little more than 4 months. However, all hiring times were much longer than 80 calendar days-- OPM's target. According to BLM, BOEM, and BSEE officials, other factors have contributed to difficulties hiring and retaining key oil and gas oversight personnel, such as few qualified applicants in remote areas, or areas with a high cost of living. Interior and its three bureaus--BLM, BOEM, and BSEE--have taken some steps to address hiring and retention challenges but could do more. Interior has used special salary rates and incentives to increase hiring and retention for key oil and gas positions, but use of these incentives has been limited. Interior has taken some steps to reduce the time it takes to hire oil and gas oversight staff but does not collect data to identify the causes of delays in the hiring process and opportunities for reducing them. 
Finally, Interior has taken some actions to improve recruiting, such as developing workforce plans to coordinate hiring and retention efforts, but this work is ongoing, and the extent to which these plans will help is uncertain. Special salary rates. For fiscal years 2012 and 2013, Congress approved a special 25 percent base pay increase for geologists, geophysicists, and petroleum engineers at BOEM and BSEE in the Gulf of Mexico. According to Interior officials in the Gulf of Mexico, this special pay authority helped retain some geologists, geophysicists, and petroleum engineers, at least in the near term. BOEM and BSEE requested an extension of this special pay authority though fiscal year 2014. In 2012, BLM met with OPM officials to discuss special salary rates for petroleum engineers and petroleum engineering technicians in western North Dakota and eastern Montana, where the disparity between federal and industry salaries is most acute, according to a BLM official. A BLM official told us that OPM requested that BLM provide more data to support its request. The official also told us that BLM submitted draft language to Congress requesting special salary rates through a congressional appropriation. According to Interior officials, all three bureaus are preparing a department-wide request for special salary rates to submit to OPM. Incentives. BLM, BOEM and BSEE have the authority to pay incentives in the form of recruitment, relocation, and retention awards of up to 25 percent of basic pay, in most circumstances, and for as long as the use of these incentives is justified, in accordance with OPM guidance, such as in the event an employee is likely to leave federal service. However, we found that the bureaus' use of these incentives has been limited. For example, during fiscal years 2010 through 2012, the three bureaus hired 66 petroleum engineers but awarded just four recruitment incentives, five relocation incentives, and four retention incentives. 
BLM awarded two of the four retention incentives in 2012 to help retain petroleum engineers in its North Dakota Field Office. OPM data showed that, in 2011, Interior paid about one-third less in incentive awards than it did in 2010. BLM officials cited various factors that contributed to the limited use of incentives, such as limited funds available for incentives. A BLM official also told us that there was confusion about an OPM and Office of Management and Budget (OMB) requirement to limit incentive awards to 2010 levels and that some field office managers were uncertain about the extent to which office managers were allowed to use incentive awards. Without clear guidance outlining when these incentives should be used, and a means to measure their effectiveness, we concluded that Interior will not be able to determine whether it has fully used its authority to offer incentives to hire and retain key oil and gas oversight staff. Hiring times. To improve its hiring times, Interior participated in an OPM- led, government-wide initiative to streamline the federal hiring process. In 2009, a team of hiring managers and human resources specialists from Interior reviewed the department's hiring process and compared it with OPM's 80 calendar-day hiring target. The team identified 27 action items to reduce hiring times, such as standardizing position descriptions and reducing the number of managers involved in the process. Interior and its bureaus implemented many of the action items over the past few years and made significant progress to reduce hiring times, according to Interior officials and agency records. For example, BSEE reduced the time to select eligible applicants from 90 to 30 days by limiting the amount of time allowed for managers to review and select applicants. A BLM official told us that the bureau is working to automate vacancy announcements to improve the efficiency of its hiring process. 
However, neither the department nor the three bureaus have complete and accurate data on hiring times that could help them identify and address the causes of delays in the hiring process. Beginning in 2011, Interior provided quarterly data on hiring times to OPM, calculated based on Interior's personnel and payroll databases. However, we identified discrepancies in some of the data--for example, in some cases, hiring times were erroneously recorded as 0 or 1 day. In addition, none of the bureaus systematically analyze the data collected. For instance, BSEE and BOEM collect hiring data on a biweekly basis, but officials told us they use the data primarily to track the progress of individual applicants as they move through the hiring process. Likewise, a BLM official stated that the bureau does not systematically analyze data on hiring times. Without reliable data on hiring times, Interior's bureaus cannot identify how long it takes to complete individual stages in the hiring process or effectively implement changes to expedite the hiring process. Recruiting. BLM, BOEM, and BSEE have taken some steps to improve recruiting. In 2012, BOEM and BSEE contracted with a consulting firm to draft a marketing strategy highlighting the advantages of employment at the bureaus, such as flexible work hours and job security. BOEM and BSEE used this marketing strategy to revise the recruiting information on their external websites and develop recruiting materials such as brochures and job fair displays. According to a BLM workforce strategy planning document, the bureau is considering contracting with a consulting firm to review its recruiting strategy. All three bureaus are also visiting colleges and universities to recruit potential applicants for oil and gas positions, and each has had some success offering student intern positions that may be converted to full-time employment. Workforce planning. 
Interior is participating in a government-wide initiative led by OPM to identify and address critical skills gaps across the federal government. The effort aims to develop strategies to hire and retain staff possessing targeted skills and address government-wide and department-specific mission-critical occupations and skill gaps. In March 2012, Interior issued a plan providing an overview of workforce planning strategies that it can use to meet emerging workforce needs and skills gaps within constrained budgets. As part of the next phase of this effort, Interior asked its bureaus to develop detailed workforce plans using a standardized model based on best practices used at Interior. Both planning efforts are ongoing, however, so it is too early to assess the effect on Interior's hiring and retention challenges for key oil and gas positions at this time. BLM, BOEM, and BSEE are developing or implementing workforce plans as well. As we reported in July 2012, BOEM and BSEE did not have strategic workforce plans, and we recommended that the bureaus develop plans to address their hiring and retention challenges.workforce plan, and BOEM officials told us that they expect to complete one in 2014. BLM issued a workforce planning strategy in March 2012 that outlined strategic objectives to address some of its key human capital challenges; however, this strategy does not include implementation; address challenges with the hiring process; or outline mechanisms to monitor, evaluate, or improve the hiring process; so it is too soon to tell whether BLM's planning strategy will help the bureau address its human capital challenges. Moreover, we found that the bureaus' efforts do not appear to have been conducted as part of an overarching workforce plan, or in a coordinated and consistent manner, therefore the bureaus do not have a basis to assess the success of these efforts or determine whether and how these efforts should be adjusted over time. 
The BLM, BOEM, and BSEE officials that we interviewed and surveyed reported that hiring and retention challenges have made it more difficult to carry out their oversight activities. These officials stated that position vacancies have resulted in less time for oversight, and vacancies directly affect the number of oversight activities they can carry out--including the number of inspections conducted and the time for reviewing applications to drill. Officials at some BLM field offices told us that they have not been able to meet their annual inspection and enforcement goals because of vacancies. Of the 20 offices with inspector vacancies that we surveyed, 13 responded that they conducted fewer inspections in 2012 compared with what they would have done if fully staffed, and 9 responded that the thoroughness of inspections was reduced because of vacancies. Of the 21 BLM and BSEE offices with petroleum engineer vacancies, 8 reported that they reviewed fewer applications to drill in 2012 compared with what they would have done if fully staffed. BSEE officials told us that fewer or less-thorough inspections may mean that some offices are less able to ensure operator compliance with applicable laws and regulations and, as a result, there is an increased risk to human health and safety due to a spill or accident. According to a BSEE official, the longer federal inspectors are away from a site, the more likely operators are to deviate from operating in accordance with laws and regulations. Officials at each of the three bureaus cited steps they have taken to address vacancies in key oil and gas positions; specifically, reassigning staff from lower-priority to higher-priority tasks, borrowing staff from other offices, or increasing overtime. However, each of these steps comes at a cost to the agency and is not a sustainable solution. 
Interior officials told us that moving staff from lower to higher priority work means that the lower priority tasks--many of which are still critical to the bureaus' missions--are deferred or not conducted, such as processing permits. Likewise, offices that borrow staff from other offices gain the ability to carry out activities, but this comes at a cost to the office that loaned the staff. With regard to overtime, BOEM officials reported that a heavy reliance on overtime was exhausting their staff. BLM and BSEE are developing and implementing risk-based inspection strategies--long recommended by GAO and others--as they work to ensure oversight resources are efficiently and effectively allocated; however, staffing shortfalls and turnover may adversely affect the bureaus' ability to carry out these new strategies. In 2010, we reported that BLM routinely did not meet its goals for conducting key oil and gas facility inspections, and we recommended that the bureau consider an alternative inspection strategy that allows for the inspection of all wells within a reasonable time frame, given available resources. In response to this recommendation, in fiscal year 2011, BLM implemented a risk- based inspection strategy whereby each field office inspects the highest risk wells first. Similarly, BSEE officials told us that they have contracted with Argonne National Laboratory to help develop a risk-based inspection strategy. In our January 2014 report, to address the hiring challenges we identified, we recommended that Interior explore its bureaus' expanded use of recruitment, relocation, retention, and other incentives and systematically collect and analyze hiring data. Interior generally agreed with our recommendations. Chairman Lamborn, Ranking Member Holt, and Members of the Subcommittee, this concludes my prepared statement. I would be pleased to answer any questions that you may have at this time. 
If you or your staff members have any questions concerning this testimony, please contact me at (202) 512-3841 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this statement. Other individuals who made key contributions include Christine Kehr, Assistant Director; Mark Braza, Glenn Fischer, Michael Kendix, Michael Krafve, Alison O'Neill, Kiki Theodoropoulos, Barbara Timmerman, and Arvin Wu. This is a work of the U.S. government and is not subject to copyright protection in the United States. The published product may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately. | Interior employs a wide range of highly trained specialists and scientists with key skills to oversee oil and gas operations on leased federal lands and waters. GAO and others have reported that Interior has faced challenges hiring and retaining sufficient staff to carry out these responsibilities. In February 2011, GAO added Interior's management of federal oil and gas resources to its list of programs at high risk of fraud, waste, abuse, and mismanagement in part because of Interior's long-standing human capital challenges. This testimony and the January 2014 report on which it is based address (1) the extent to which Interior continues to face challenges hiring and retaining key oil and gas staff and the causes of these challenges, (2) Interior's efforts to address its hiring and retention challenges, and (3) the effects of hiring and retention challenges on Interior's oversight of oil and gas activities. 
To do this work, GAO surveyed all 44 Interior offices that oversee oil and gas operations, of which 40 responded; analyzed offshore inspection records and other documents; and interviewed agency officials. The Department of the Interior continues to face challenges hiring and retaining staff with key skills needed to manage and oversee oil and gas operations on federal leases. Interior officials noted two major factors that contribute to challenges in hiring and retaining staff: lower salaries and a slow hiring process. In response to GAO's survey, officials from a majority of the offices in the three Interior bureaus that manage oil and gas activities--the Bureau of Land Management (BLM), the Bureau of Ocean Energy Management (BOEM), and the Bureau of Safety and Environmental Enforcement (BSEE)--reported ongoing difficulties filling vacancies, particularly for petroleum engineers and geologists. Many of these officials also reported that retention is an ongoing concern as staff leave for positions in industry. Bureau of Labor Statistics data confirm a wide gap between industry and federal salaries for petroleum engineers and geologists. According to Office of Personnel Management (OPM) data, the fiscal year 2012 attrition rate for petroleum engineers at BLM was over 20 percent, or more than double the average federal attrition rate of 9.1 percent. Field office officials stated that attrition is of concern because some field offices have only a few employees in any given position, and a single separation can significantly affect operations. Additionally, Interior records show that the average time required to hire petroleum engineers and inspectors in recent months generally exceeded 120 calendar days--much longer than OPM's target of 80 calendar days. 
Interior and the three bureaus--BLM, BOEM, and BSEE--have taken some actions to address their hiring and retention challenges, but they have not fully used their existing authorities to supplement salaries or collect and analyze hiring data to identify the causes of delays in the hiring process. For instance, BLM, BOEM, and BSEE officials said that recruitment, relocation, and retention incentives are key options to help hire and retain staff, but the bureaus' use of these incentives to attract and retain petroleum engineers and inspectors has been limited for various reasons. Moreover, Interior and its bureaus have taken some steps to reduce hiring times, but they do not have complete and accurate data on hiring times. For instance, while BSEE and BOEM collect hiring data on a biweekly basis, the data are used primarily to track the progress of individual applicants as they move through the hiring process. Likewise, a BLM official stated that the bureau does not systematically analyze data on hiring times. Without reliable data on hiring times, Interior's bureaus cannot identify how long it takes to complete individual stages in the hiring process or effectively implement changes to expedite the hiring process. According to BLM, BOEM, and BSEE officials, hiring and retention challenges have made it more difficult to carry out oversight activities in some field offices. For example, many BLM and BSEE officials GAO surveyed reported that vacancies have resulted in a reduction in the number of inspections conducted. As a result of these challenges, bureau officials cited steps they have taken to address vacancies in key positions, such as borrowing staff from other offices or using overtime, but these are not sustainable, long-term solutions. In its January 2014 report, GAO recommended that Interior explore its oil and gas management bureaus' expanded use of recruitment, relocation, retention, and other incentives and systematically collect and analyze hiring data. 
Interior generally agreed with GAO's recommendations. GAO is not making any new recommendations in this testimony. | 3,260 | 949 |
Advances in the use of IT and the Internet are continuing to change the way that federal agencies communicate, use, and disseminate information; deliver services; and conduct business. For example, electronic government (e-government) has the potential to help build better relationships between government and the public by facilitating timely and efficient interaction with citizens. To help agencies more effectively manage IT, the Congress has established a statutory framework of requirements and roles and responsibilities relating to information and technology management. In particular, the Paperwork Reduction Act of 1995 and the Clinger-Cohen Act of 1996 require agency heads, acting through agency CIOs, to, among other things:

- better link their IT planning and investment decisions to program missions;
- develop and maintain a strategic information resources management (IRM) plan that describes how IRM activities help to accomplish agency missions;
- develop and maintain an ongoing process to establish goals for improving IRM's contribution to program productivity, efficiency, and effectiveness; methods for measuring progress toward these goals; and clear roles and responsibilities for achieving these goals;
- develop and implement a sound IT architecture;
- implement and enforce IT management policies, procedures, standards, and guidelines;
- establish policies and procedures for ensuring that IT systems provide reliable, consistent, and timely financial or program performance data; and
- implement and enforce applicable policies, procedures, standards, and guidelines on privacy, security, disclosure, and information sharing.

Nevertheless, the agencies face significant challenges in effectively planning for and managing their IT. Such challenges can be overcome through the use of a systematic and robust management approach that addresses critical elements such as IT strategic planning and investment management.
Federal agencies did not always have in place important practices associated with IT laws, policies, and guidance related to strategic planning/performance measurement and investment management (see fig. 1). A well-defined strategic planning process helps to ensure that an agency's IT goals are aligned with its strategic goals. Moreover, establishing performance measures and monitoring actual-versus-expected performance using those measures can help to determine whether IT is making a difference in improving performance. Finally, an IT investment management process is an integrated approach to managing investments that provides for the continuous identification, selection, control, life-cycle management, and evaluation of IT investments. Agency IT officials could not always identify why practices were not in place, but in those instances in which reasons were identified, a variety of explanations were provided; for example, that the CIO position had been vacant, that not including a requirement in the agency's guidance was an oversight, or that the process was being revised. Nevertheless, these practices are based on law, executive orders, Office of Management and Budget (OMB) policies, and our guidance, and are also important ingredients in ensuring effective strategic planning, performance measurement, and investment management that, in turn, make it more likely that the billions of dollars in government IT investments will be wisely spent. Critical aspects of the strategic planning/performance measurement area include documenting the agency's IT strategic planning processes, developing IRM plans, establishing goals, and measuring performance to evaluate whether goals are being met. Although the agencies often had these practices, or elements of these practices, in place, additional work remains, as demonstrated by the following examples: Strategic planning process.
Strategic planning defines what an organization seeks to accomplish and identifies the strategies it will use to achieve desired results. A defined strategic planning process allows an agency to clearly articulate its strategic direction and to establish linkages among planning elements such as goals, objectives, and strategies. About half of the agencies had fully documented their strategic planning processes. Such processes are an essential foundation for ensuring that IT resources are effectively managed. Strategic IRM plans. The Paperwork Reduction Act requires that agencies indicate in strategic IRM plans how they are applying information resources to improve the productivity, efficiency, and effectiveness of government programs. An important element of a strategic plan is that it presents an integrated system of high-level decisions that are reached through a formal, visible process. The Paperwork Reduction Act also requires agencies to develop IRM plans in accordance with OMB's guidance. However, OMB does not provide cohesive guidance on the specific contents of IRM plans. Accordingly, although agencies generally provided OMB with a variety of planning documents to meet its requirement that they submit an IRM plan, these plans were generally limited to IT strategic or e-government issues and did not address other elements of IRM, as defined by the Paperwork Reduction Act. In particular, these plans generally include individual IT projects and initiatives, security, and enterprise architecture elements but do not often address other information functions--such as information collection, records management, and privacy--or the coordinated management of all information functions. OMB IT staff agreed that the agency has not set forth guidance on the contents of agency IRM plans in a single place, stating that its focus has been on looking at agencies' cumulative results and not on planning documents. 
These staff also noted that agencies account for their IRM activities through multiple documents (e.g., Information Collection Budgets and Government Paperwork Elimination Act plans). Nevertheless, half the agencies indicated a need for OMB to provide additional guidance on the development and content of IRM plans. Accordingly, we recommended that OMB develop and disseminate to agencies guidance on developing IRM plans. IT goals. The Paperwork Reduction Act and the Clinger-Cohen Act require agencies to establish goals that address how IT contributes to program productivity, efficiency, effectiveness, and service delivery to the public. We have previously reported that leading organizations define specific goals, objectives, and measures, use a diversity of measure types, and describe how IT outputs and outcomes impact operational customer and agency program delivery requirements. The agencies generally had the types of goals outlined in the Paperwork Reduction Act and the Clinger- Cohen Act. However, five agencies did not have one or more of the goals required by the Paperwork Reduction Act and the Clinger-Cohen Act. It is important that agencies specify clear goals and objectives to set the focus and direction for IT performance. IT performance measures. The Paperwork Reduction Act, the Clinger- Cohen Act, and an executive order require agencies to establish a variety of IT performance measures--such as those related to how IT contributes to program productivity, efficiency, and effectiveness--and to monitor the actual-versus-expected performance using those measures. Although the agencies largely had one or more of the required performance measures in place, these measures were not always linked to the agencies' enterprisewide IT goals. Moreover, few agencies monitored actual-versus- expected performance for all of their enterprisewide IT goals. 
Specifically, although some agencies tracked actual-versus-expected outcomes for the IT performance measures in their performance plans or accountability reports and/or for specific IT projects, they generally did not track the performance measures that were specified in their IRM plans. As we have previously reported, an effective IT performance management system offers a variety of benefits, including serving as an early warning indicator of problems and the effectiveness of corrective actions; providing input to resource allocation and planning; and providing periodic feedback to employees, customers, stakeholders, and the general public about the quality, quantity, cost, and timeliness of products and services. Moreover, without enterprisewide performance measures that are tracked against actual results, agencies lack critical information about whether their overall IT activities are achieving expected goals. Benchmarking. The Clinger-Cohen Act requires agencies to quantitatively benchmark agency process performance against public- and private-sector organizations, where comparable processes and organizations exist. Benchmarking is used because there may be external organizations that have more innovative or more efficient processes than their own processes. Seven agencies in our review had mechanisms in place--such as policies and strategies--related to benchmarking their IT processes. In general, however, agencies' benchmarking decisions were ad hoc. Few agencies had developed a mechanism to identify comparable external private- or public-sector organizations and processes and/or had policies related to benchmarking, although all but 10 of the agencies provided examples of benchmarking that they had performed. Our previous study of IT performance measurement at leading organizations found that they had spent considerable time and effort comparing their performance information with that of other organizations. 
Agency IT officials could not identify why strategic planning/performance measurement practices were not in place in all cases, but in those instances in which reasons were identified, a variety of explanations were provided. For example, reasons cited by agency IT officials included that they lacked the support from agency leadership, that the agency had not been developing IRM plans until recently and recognized that the plan needed further refinement, that the process was being revised, and that requirements were evolving. Without strong strategic management practices, it is less likely that IT is being used to maximize improvement in mission performance. Moreover, without enterprisewide performance measures that are being tracked against actual results, agencies lack critical information about whether their overall IT activities, at a governmentwide cost of billions of dollars annually, are achieving expected goals. Critical aspects of IT investment management include developing well- supported proposals, establishing investment management boards, and selecting and controlling IT investments. The agencies' use of practices associated with these aspects of investment management was wide- ranging, as follows: IT investment proposals. Various legislative requirements, an executive order, and OMB policies provide minimum standards that govern agencies' consideration of IT investments. In addition, we have issued guidance to agencies for selecting, controlling, and evaluating IT investments. Such processes help ensure, for example, that investments are cost-beneficial and meet mission needs and that the most appropriate development or acquisition approach is chosen. The agencies in our review had mixed results when evaluated against these various criteria. For example, the agencies almost always required that proposed investments demonstrate that they support the agency's business needs, are cost-beneficial, address security issues, and consider alternatives. 
However, they were not as likely to have fully in place the Clinger-Cohen Act requirement that agencies follow, to the maximum extent practicable, a modular, or incremental, approach when investing in IT projects. Incremental investment helps to mitigate the risks inherent in large IT acquisitions/developments by breaking apart a single large project into smaller, independently useful components with known and defined relationships and dependencies. Investment management boards. Our investment management guide states that establishing one or more IT investment board(s) is a key component of the investment management process. Such executive-level boards, made up of business-unit executives, concentrate management's attention on assessing and managing risks and regulating the trade-offs between continuing to fund existing operations and developing new performance capabilities. Almost all of the agencies in our review had one or more enterprise-level investment management board. However, the investment management boards for six agencies were not involved, or the agency did not document the boards' involvement, in the control phase. Maintaining responsibility for oversight with the same body that selected the investment is crucial to fostering a culture of accountability by holding the investment board that initially selected an investment responsible for its ongoing success. Selection of IT investments. During the selection phase of an IT investment management process, the organization (1) selects projects that will best support its mission needs and (2) identifies and analyzes each project's risks and returns before committing significant funds. To achieve desired results, it is important that agencies have a selection process that, for example, uses selection criteria to choose the IT investments that best support the organization's mission and that prioritizes proposals. Twenty- two agencies used selection criteria in choosing their IT investments. 
In addition, about half the agencies used scoring models to help choose their investments. Control over IT investments. During the control phase of the IT investment management process, the organization ensures that, as projects develop and as funds are spent, the project is continuing to meet mission needs at the expected levels of cost and risk. If the project is not meeting expectations or if problems have arisen, steps are quickly taken to address the deficiencies. In general, the agencies were weaker in the practices pertaining to the control phase of the investment management process than to the selection phase and no agency had the practices associated with the control phase fully in place. In particular, the agencies did not always have important mechanisms in place for agencywide investment management boards to effectively control investments, including decision-making rules for project oversight, early warning mechanisms, and/or requirements that corrective actions for under- performing projects be agreed upon and tracked. Executive level oversight of project-level management activities provides an organization with increased assurance that each investment will achieve the desired cost, benefit, and schedule results. Among the variety of reasons that agencies cited for not having IT investment management practices fully in place were that the CIO position had been vacant, that not including a requirement in the IT investment management guide was an oversight, and that the process was being revised. However, in some cases agencies could not identify why certain practices were not in place. It is important that agencies address their shortcomings, because only by effectively and efficiently managing their IT resources through a robust investment management process can they gain opportunities to make better allocation decisions among many investment alternatives and to further leverage their IT investments. 
To help agencies improve their IT strategic planning/performance measurement and investment management, we have made numerous recommendations to agencies and issued guidance. Specifically, in our January 2004 report we made recommendations to the 26 agencies in our review regarding practices that were not fully in place. These recommendations addressed issues such as IT strategic planning; establishing and linking enterprisewide goals and performance measures and tracking progress against these measures; and selecting, controlling, and evaluating investments. By implementing these recommendations, agencies can better ensure that they are using strategic planning, performance measurement, and investment management practices that are consistent with IT legislation, executive orders, OMB policies, and our guidance. Another mechanism that agencies can use to improve their IT management is to apply the management frameworks and guides that we have issued, which are based on our research into IT management best practices and our evaluations of agency IT management performance. In this vein, today we are releasing the latest version of our ITIM framework. This framework identifies and organizes critical processes for selecting, controlling, and evaluating IT investments into a framework of increasingly mature stages (see fig. 2). First issued as an exposure draft in May 2000, this new version of the ITIM includes lessons learned from our use of the framework in our agency reviews and from lessons conveyed to us by users of the framework. In addition, in order to validate the appropriateness of our changes and to gain the advantage of their experience, we had the new version reviewed by several outside experts who are familiar with the ITIM exposure draft and with investment management in a broad array of public and private organizations. ITIM can be used to analyze an organization's investment management processes and to determine its level of maturity. 
The framework is useful to many federal agencies because it provides: (1) a rigorous, standardized tool for internal and external evaluations of an agency's IT investment management process; (2) a consistent and understandable mechanism for reporting the results of these assessments to agency executives, Congress, and other interested parties; and (3) a road map that agencies can use for improving their investment management processes. Regarding the first two points, we and selected agency Inspectors General have used the ITIM to evaluate and report on the investment management processes of several agencies. Concerning the third point, a number of agencies have recognized the usefulness of the ITIM framework and have used it to develop and enhance their investment management strategies. For example, one agency uses the framework to periodically review its IT investment management capabilities and has developed an action plan to move through the stages of maturity. In summary, our January 2004 report indicates that the federal government can significantly improve its IT strategic planning, performance measurement, and investment management. Such improvement would better ensure that agencies are being responsible stewards of the billions of dollars for IT with which they have been entrusted, by helping them to invest these monies wisely. This can be accomplished, in part, through the expeditious implementation of our recommendations and the adoption of best practices, which we have incorporated into our IT management frameworks and guides such as the ITIM. Mr. Chairman, this completes my prepared statement. I would be happy to respond to any questions that you or other Members of the Subcommittee may have at this time. If you have any questions regarding this statement, please contact me at (202) 512-9286 or by e-mail at [email protected]. 
Specific questions related to our January 2004 report may also be directed to Linda Lambert at (202) 512-9556 or via e-mail at [email protected] or Mark Shaw at (202) 512-6251 or via e-mail at [email protected]. Questions related to the ITIM framework can be directed to Lester Diamond at (202) 512-7957 or via e- mail at [email protected]. Table 1 describes the 12 IT strategic planning/performance measurement and the 18 IT investment management practices that we used in our January 2004 report on the government's performance in these areas. We identified these 30 practices after reviewing major legislative requirements (e.g., the Paperwork Reduction Act of 1995 and the Clinger-Cohen Act of 1996), executive orders, Office of Management and Budget policies, and our own guidance. | The federal government spends billions of dollars annually on information technology (IT) investments that are critical to the effective implementation of major government programs. To help agencies effectively manage their substantial IT investments, the Congress has established a statutory framework of requirements and roles and responsibilities relating to information and technology management, that addresses, for example, (1) IT strategic planning/performance measurement (which defines what an organization seeks to accomplish, identifies the strategies it will use to achieve desired results, and then determines how well it is succeeding in reaching resultsoriented goals and achieving objectives) and (2) IT investment management (which involves selecting, controlling, and evaluating investments). GAO was asked to summarize its January 2004 report on IT strategic planning/performance measurement and investment management (Information Technology Management: Governmentwide Strategic Planning, Performance Measurement, and Investment Management Can Be Further Improved, GAO-04-49 , January 12, 2004) and to discuss how agencies can improve their performance in these areas. 
GAO recently reported that the use of important IT strategic planning/performance measurement and investment management practices by 26 major federal agencies was mixed. For example, agencies generally had IT strategic plans and goals, but these goals were not always linked to specific performance measures that were tracked. Agencies also largely had IT investment management boards, but no agency had the practices associated with the oversight of IT investments fully in place. Although they could not always provide an explanation, agencies cited a variety of reasons for not having practices fully in place, including that the chief information officer position had been vacant and that the process was being revised. By improving their IT strategic planning, performance measurement, and investment management, agencies can better ensure that they are being responsible stewards of the billions of dollars for IT that they have been entrusted with through the wise investment of these monies. To help agencies improve in these areas, GAO has made numerous recommendations to agencies and issued guidance. For example, in the January 2004 report, GAO made recommendations to the 26 agencies regarding practices that were not fully in place. In addition, today GAO is releasing the latest version of its Information Technology Investment Management (ITIM) framework, which identifies critical processes for selecting, controlling, and evaluating IT investments and organizes them into a framework of increasingly mature stages; thereby providing agencies a road map for improving IT investment management processes in a systematic and organized manner. | 3,694 | 505 |
The Fair Housing Act, title VIII of the Civil Rights Act of 1968, prohibited discrimination in the sale, rental, and financing of housing based on race, color, religion, or national origin. The act allowed the Department of Housing and Urban Development (HUD) to investigate and conciliate complaints of housing discrimination and authorized the Department of Justice to file suits in cases of a pattern or practice of discrimination or in cases of public importance. HUD was not given any authority to administratively remedy acts of discrimination against an individual, however. The Fair Housing Act also required HUD to refer housing discrimination complaints to state and local agencies where the state or local law provided rights and remedies substantially equivalent to those provided by the federal law. In 1980, HUD established the Fair Housing Assistance Program to provide financial assistance to state and local agencies to encourage them to assume a greater share of the enforcement of their fair housing laws. The Fair Housing Initiatives Program (FHIP), administered by HUD, is designed to provide a coordinated and comprehensive approach to fair housing activities in order to strengthen enforcement of the Fair Housing Act. During the 1986 Senate hearings on its proposal to establish the FHIP, HUD testified that enforcement activity, particularly testing, by private nonprofit and other private entities would be the principal focus and motivation of the program. In February 1988, the program was created as a 2-year demonstration program by the Housing and Community Development Act of 1987. About 7 months later, the Fair Housing Amendments Act of 1988 was signed into law, and it became effective in March 1989. The 1988 act attempted to remedy the enforcement shortcomings of the original legislation. 
It significantly strengthened federal fair housing enforcement by, among other things, establishing an administrative enforcement mechanism that allows HUD to pursue cases filed by individuals before an administrative law judge for disposition, and by providing for civil penalties. In November 1990, FHIP was extended for 2 additional years, and with the enactment of the Housing and Community Development Act of 1992, it became a permanent program, effective fiscal year 1993. The 1992 act also expanded the program to reflect significant legislative changes in fair housing and lending that had taken place after the program's creation in 1988. It authorized FHIP to implement testing programs whenever there was a reasonable basis for doing so; establish new fair housing organizations or expand the capacity of existing ones; conduct special projects to, for example, respond to new or sophisticated forms of housing discrimination; undertake larger, long-term enforcement activities through multiyear funding agreements; and pay for litigation. For fiscal years 1989 through 1997, the Congress appropriated $113 million for FHIP. The permanent program grew from an appropriation of $10.6 million in fiscal year 1993 to $26 million in fiscal year 1995 (see fig. 1). Funds for the program are distributed through competitive grants under four program initiatives. These initiatives or funding categories generally define who is eligible to receive funds and/or the focus of activities to be funded.
The initiatives are as follows: (1) the private enforcement initiative--funding for private nonprofit organizations to undertake testing and other enforcement-related activities; (2) the fair housing organizations initiative--funding for private nonprofit organizations to create new fair housing enforcement organizations in areas of the country that were unserved or underserved by such organizations or to expand the capacity of existing private nonprofit fair housing organizations; (3) the education and outreach initiative--funding for private and public entities to educate the general public and housing industry groups about fair housing rights and responsibilities; and (4) the administrative enforcement initiative--funding for state and local government agencies that administer fair housing laws certified by HUD as substantially equivalent to federal law, to help such agencies broaden their range of enforcement and compliance activities. Private organizations that receive grants generally are nonprofit entities and have experience in investigating complaints, testing for fair housing violations, and enforcing legal claims or outcomes. The program provides considerable flexibility in the types of activities that can be funded under each initiative. Eligible activities include education and outreach programs, testing based on complaints and other reasonable bases, the recruitment of testers and attorneys, special projects to respond to new or sophisticated forms of discrimination, litigation expenses, and the creation of new fair housing organizations in areas of the country underserved by fair housing enforcement organizations. The program is restricted from funding two types of activities: (1) settlements, judgments, or court orders in any litigation action involving HUD or HUD-funded housing providers and (2) expenses associated with litigation against the federal government.
Appendix I provides additional details on the types of activities eligible for funding under the program. FHIP is an integral part of HUD's fair housing enforcement and education efforts, which are concentrated within the Office of Fair Housing and Equal Opportunity. In addition to FHIP, this office is responsible for the oversight of the Fair Housing Assistance Program, investigation and processing of fair housing complaints, and referral of complaints to Justice when appropriate. FHIP links and extends fair housing enforcement and education and outreach activities to many state and local governments and communities across the country. The program makes it possible for HUD to look comprehensively at fair housing problems and to work with the whole spectrum of agencies that are involved in fighting housing discrimination. Taken together, FHIP and the Fair Housing Assistance Program form a national fair housing strategy through greater cooperation between the private and public sectors. In fiscal year 1996, FHIP accounted for about 22 percent of the Office of Fair Housing and Equal Opportunity's $76.3 million budget (see fig. 2). HUD uses discretion in deciding how FHIP funds are allocated among the four program initiatives. Reflecting the program's principal focus, HUD's budget requests to the Congress set forth how it plans to divide the total amount of dollars requested for FHIP among the four initiatives. Notices of funding availability in the Federal Register indicate the dollar amounts HUD makes available for competition under each program initiative. According to the Acting FHIP Division Director, the Assistant Secretary for Fair Housing and Equal Opportunity determines how funds are allocated on the basis of legislation, administration and agency priorities, and input from the housing industry and fair housing groups. HUD's allocations for FHIP have consistently reflected that enforcement activities are the principal focus of the program.
In annual budget justifications to the Congress, HUD discusses its emphasis for the year and indicates how much of FHIP's total budget request it plans to allocate to each FHIP initiative. Table II.1 in appendix II shows by fiscal year the dollar amounts HUD anticipated it would allocate to each initiative. The Congress has appropriated amounts equal to or greater than the amounts HUD requested each fiscal year until 1996. In accordance with its budget plans, HUD has made the largest portion of FHIP dollars available for the private enforcement initiative. In 2 fiscal years (1993 and 1994) in which HUD received appropriated amounts higher than its budget requests, the additional dollars available resulted in the private enforcement initiative's receiving significantly more money than initially planned. Overall, HUD made about 48 percent of FHIP funds available for the private enforcement initiative (see table II.2). The relationship between HUD's proposed allocations for each initiative and the funds made available indicates that the dollar amounts were basically the same in 4 of the 8 years (fiscal years 1989 through 1991 and 1995). For the remaining years, allocations varied considerably from HUD's initial budget plans primarily because appropriated amounts for FHIP overall were either higher or lower than the budget requests. The variations were as follows: In fiscal year 1992, the amount appropriated for FHIP was the same as the budget request. The private enforcement initiative's allocation was $1.3 million less than HUD initially anticipated; the administrative enforcement initiative's was $0.9 million more, and the education and outreach initiative's was $0.4 million more. In fiscal year 1993, FHIP's appropriation was $3 million higher than the budget request. The private enforcement initiative's allocation was $1 million more; the education and outreach initiative's, $0.5 million more. 
The fair housing organizations initiative, which was authorized in late 1992, received a $2.6 million allocation. The administrative enforcement initiative's allocation was $1.1 million less than anticipated, however. In fiscal year 1994, FHIP's appropriation was $3.6 million higher than the budget request. Of this, HUD allocated $3 million to the private enforcement initiative and $0.6 million to the fair housing organizations initiative. In fiscal year 1996, FHIP's appropriation was 43 percent lower than the budget request. While the budget request included funds for all initiatives, owing to the reduced appropriation, HUD did not allocate any funds to the administrative enforcement initiative. Allocations to the other three initiatives ranged from 30 to 120 percent of the amount initially requested. From fiscal year 1989 through fiscal year 1996, HUD received 2,090 applications for FHIP grants and approved about one-quarter of these applications for funding. Historically, the demand for education and outreach grants has exceeded that for the other three initiatives each fiscal year except for 1996. For the 3 most recent years (fiscal years 1994 through 1996), the greatest demand, as measured by the amounts requested on applications, has been for the private enforcement initiative. In fiscal year 1996, the number of applications for grants decreased from 300 in each of the 3 previous fiscal years to 91. The most significant decrease was for education and outreach grants, dropping to 19 applications from over 200 the prior year (see table II.3). HUD told us that the significant drop in education and outreach applications is primarily attributable to language in the 1996 appropriations law requiring applicants to meet the definition of a qualified fair housing enforcement organization in order to be eligible for FHIP funds. 
According to FHIP legislation, a qualified fair housing enforcement organization is a private nonprofit organization that has at least 2 years of experience in complaint intake, complaint investigations, testing, and enforcement of legal claims. HUD told us that the legislative requirement precluded many previously eligible organizations from applying for an education and outreach initiative grant. Also, according to HUD, a one-third reduction in FHIP's appropriation for that fiscal year discouraged many organizations from applying for FHIP funding. On the basis of the dollar value of grant applications submitted to HUD, the greatest demand has been for private enforcement initiative grants. Our analysis of the dollar amount of applications is based on fiscal years 1994 through 1996, for which complete information is readily available (see table II.4). Of the total $175 million in applications received for the 3-year period, $76 million, or about 43 percent, was for private enforcement initiative grants, and about 36 percent was for education and outreach initiative grants. From the program's inception through September 1996, a total of 220 different organizations received FHIP grants in 44 states and the District of Columbia; 26 organizations received about half of all FHIP funds awarded. These 26 organizations are located in 15 states and the District of Columbia. FHIP-funded activities have reflected the program's purpose as described in the legislation. That is, grantees have used FHIP dollars to fund the kinds of activities intended, namely, implementing fair housing testing programs and testing-related activities; establishing new fair housing organizations; and educating the public and housing providers about fair housing requirements. Through fiscal year 1996, HUD awarded 483 grants totaling $86 million to support fair housing enforcement and education. Of the 220 different organizations that received grants, 26 received about half of the funds awarded.
These 26 organizations, located in 15 states and the District of Columbia, received 179 of the 483 grants. They include state governments; national membership organizations; legal aid organizations; and civil rights and advocacy groups. Some have grants that are national in scope, and some are involved in establishing new fair housing organizations in states that were unserved or underserved by fair housing enforcement organizations. Also, some organizations represent all protected classes, while others focus on a specific target population, such as persons with disabilities. Table 1 identifies the 26 organizations and the number and dollar value of grants received through fiscal year 1996. (See app. III for a complete list of the grants awarded and the dollar amount of each.) Many of these organizations received grants in consecutive years as well as grants under more than one FHIP initiative. For example, the National Fair Housing Alliance received at least one grant during each fiscal year of FHIP funding, including two education and outreach grants from 1991 funds, two private enforcement grants and one fair housing organizations grant from 1994 funds, and a fair housing organizations and an education and outreach grant from 1995 funds. The Metropolitan Milwaukee Fair Housing Council also received one grant each fiscal year and two grants in each of two fiscal years--a private enforcement grant and an education and outreach grant in 1990 and two private enforcement grants in 1994. The Open Housing Center, Inc., received three grants in 1994 and two in 1995, but none in 1993. Some of the 26 organizations received grants that were awarded for multiyear projects, and these grants were generally much larger than single-year grants. FHIP grant awards reflect the program's emphasis on private enforcement-related activities. 
From fiscal year 1989 through 1996, the largest percentage of FHIP dollars funded activities under the private enforcement initiative--$40.5 million, or 47 percent. Another $15.8 million, or 18 percent, was awarded for the fair housing organizations initiative (see fig. 3). Overall, FHIP-funded activities consist predominately of testing (complaint-based, systemic, or both) and other enforcement-related activities. Under the private enforcement initiative, in particular, funded activities include, among others, testing to confirm allegations of discrimination in the rental and sale of property, litigating cases, organizing new fair housing offices, and developing computer databases on complaints. Seventy-nine different organizations received 202 private enforcement initiative grants ranging from $10,000 to $1 million and averaging about $200,500. Of the 202 grants we reviewed, 181 were funded to carry out testing and testing-related activities. The remaining 21 grants were funded to engage in other enforcement-related activities, such as litigating cases; recruiting and/or training attorneys; developing fair housing databases; establishing a statewide attorney network to handle complaints from member offices; and training volunteers and community residents. In addition, private enforcement initiative grants funded special projects that focus on high-priority issues such as mortgage lending discrimination and insurance redlining. Included among those awards was a fiscal year 1992 grant for $1 million to support a large-scale national testing program to assess mortgage lending discrimination. Information obtained from FHIP-funded projects can be used by either public or private nonprofit organizations, or HUD, as the basis for a formal complaint against individuals or lending institutions. 
Several FHIP-funded projects involving testing mortgage lenders and insurance companies were completed in 1995, and as a result, complaints have been filed with HUD against three of the largest home insurance companies and five of the largest independent mortgage companies in the country. Under FHIP's fair housing organizations initiative, 47 different groups received 56 grants ranging from $30,000 to $1,859,000 and averaging about $282,500. While organizations with grants under the fair housing organizations initiative may engage in many of the same activities as the private enforcement initiative grantees, the fair housing organizations initiative was established to create new fair housing enforcement organizations in those areas of the country that were unserved or underserved by these organizations or expand the capacity of existing private nonprofit fair housing organizations. Of the 56 fair housing organizations initiative grants, 19 were used to establish new organizations. According to HUD, some grants funded more than one new fair housing organization, and in total, 23 new organizations have been established with FHIP grants. The new organizations are located primarily in the southern and western United States--areas historically underserved by fair housing enforcement programs, according to HUD. Fair housing organizations initiative grantees were also funded to recruit and train testers, implement testing programs, and conduct community outreach to inform the public about the services provided by newly established fair housing organizations. One hundred and twenty-eight different organizations received 188 education and outreach initiative grants ranging from $6,500 to $1,182,900 and averaging about $119,300. A wide range of activities were funded to provide education and outreach under this initiative's three components--national, regional and local, and community-based. 
Overall, the principal activities for the 188 education and outreach grants were developing pamphlets and brochures; preparing print, television, and radio advertisements; producing video and audio tapes; and providing conferences and seminars for other interested parties, including the housing industry, consumers, and community organizations. Twenty-two different organizations received 37 administrative enforcement initiative grants ranging from $55,300 to $439,300 and averaging about $197,200. About two-thirds of those grants funded at least one type of testing, that is, complaint-based or systemic. Other FHIP-funded activities include staff training, community training, tester recruitment, and conciliation/settlement activities. To determine whether grantees used FHIP funds to sue the government, we asked HUD's Office of General Counsel to identify FHIP grantees involved in litigation with the government. The General Counsel identified 10 cases involving 7 grantees who had filed lawsuits against the government since the inception of the program. Of the 10 lawsuits, 4 (involving 3 grantees) were filed and resolved before a FHIP grant was awarded to the fair housing organization. For the remaining six lawsuits (involving four grantees), pro bono legal services or other resources were used to pursue the cases against the U.S. government, according to HUD. HUD has generally been satisfied with grantees' use of funds. During the grant performance period and before closing out a grant, HUD reviews quarterly reports and products provided by the grantee to ensure that the organization's performance is consistent with the grant agreement. At the end of the grant period and after receipt of the final performance reports and products, HUD completes a closeout review. For this final assessment, HUD determines whether the grantee performed all grant requirements, indicates whether all work is acceptable, and rates the grantee's performance. 
Our analysis of the available assessments of 206 grants that had been closed out as of November 1996 indicates that HUD believes that the grantees generally carried out the activities as agreed. HUD rated 21 grantees as excellent, 150 as good, 27 as fair, and 6 as unsatisfactory. For the six grantees rated unsatisfactory, the primary reason cited was a failure to complete all the expected work requirements usually because of personnel changes within the organization. According to HUD, these 206 grants did not represent the total of all grants that should have been closed out and evaluated. An additional 118 grants for which the work has been completed and final payments have been made have yet to be closed out. The Acting FHIP Division Director told us that performing closeout reviews is an administrative process and, as such, is a low-priority item. According to HUD's Office of Procurement and Contracts, neither federal regulations nor HUD's guidelines include a specific time frame for completing the reviews. We provided a draft of this report to HUD for review and comment. We discussed the draft report with HUD officials, including the Acting FHIP Division Director. In commenting, HUD said that the report presents an accurate description of how FHIP funds are used. HUD also provided other comments consisting primarily of suggested changes to technical information, and we incorporated these in the report where appropriate. We conducted our work between August 1996 and February 1997 in accordance with generally accepted government auditing standards. Appendix IV describes our objectives, scope, and methodology. We will send copies of this report to congressional committees and subcommittees interested in housing matters; the Secretary of Housing and Urban Development; the Director, Office of Management and Budget; and other interested parties. We will also make copies available to others upon request. 
If you would like additional information on this report, please call me at (202) 512-7631. Major contributors to this report are listed in appendix V. testing and other investigative activities to identify housing discrimination; remedies for discrimination in real estate markets; special projects, including the development of prototypes to respond to new or sophisticated forms of discrimination; technical assistance to local fair housing organizations; the formation and development of new fair housing organizations; capacity building to investigate housing discrimination complaints for all protected classes; regional enforcement activities to address broader housing discrimination practices; and litigation costs and expenses, including expert witness fees. staff training; education and outreach to promote awareness of services provided by new organizations; technical assistance and mentoring services for new organizations; and activities listed above under the private enforcement initiative. projects that help establish, organize, and build the capacity of fair housing enforcement organizations in targeted unserved and underserved areas of the country. media campaigns, including public service announcements, television, radio and print advertisements, posters, pamphlets and brochures; seminars, conferences, workshops and community presentations; guidance to housing providers on meeting their Fair Housing Act obligations; meetings with housing industry and civic or fair housing groups to identify and correct illegal real estate practices; activities to meet state and local government fair housing planning requirements; and projects related to observance of National Fair Housing Month. 
fair housing testing programs and other related enforcement activities; systemic discrimination investigations; remedies for discrimination in real estate markets; technical assistance to government agencies administering housing and community development programs concerning applicable fair housing laws and regulations; and computerized complaint processing and the monitoring of system improvements. The following four tables provide details on the Department of Housing and Urban Development's (HUD) allocation of funds among the Fair Housing Initiatives Program's (FHIP) four funding initiatives or categories, the dollar amounts made available under each category, and the level of demand for funds under each category. The demand is indicated by both the number of applicants and the dollars requested. Table II.1: HUD-Proposed Allocations, by Initiative and Fiscal Year Not applicable. Not applicable. Not applicable. Not applicable. (continued) Toledo Community Housing Resource Board Housing Opportunities Made Equal of Richmond Metropolitan Milwaukee Fair Housing Council Old Pueblo Community Housing Resource Board Housing for All, Metro Denver Fair Housing Center Fair Housing Council of Greater Washington Caldwell Community Housing Resource Board Leadership Council for Metropolitan Open Communities NAACP-Illinois State Conference of Branches Wyandotte County Community Housing Resource Board Community Housing Resource Board of Lake Charles York County Community Action Corporation Camden County Community Housing Resource Board (continued) City of Tulsa, Department of Human Rights Multnomah County Community Development Division Chattanooga Community Housing Resource Board Virginia Polytechnic Institute and University Metropolitan Milwaukee Fair Housing Council Metropolitan Phoenix Fair Housing Center Fair Housing Council of Greater Washington Housing Opportunities Project for Excellence, Inc. 
Interfaith Housing Center of the Northern Suburbs Leadership Council for Metropolitan Open Communities Lawyers' Committee for Civil Rights Under Law of the Boston Bar Association Fair Housing Center of Metropolitan Detroit Fair Housing Council of Northern New Jersey Medger Evers College, Center for Law and Social Justice Monroe County Legal Assistance Corporation Westchester Residential Opportunities, Inc. Metropolitan Fair Housing Council of Greater Oklahoma City (continued) Housing Opportunities Made Equal of Richmond Metropolitan Milwaukee Fair Housing Council Alaska State Commission for Human Rights Arkansas Delta Housing Development Corporation Fair Housing Congress of Southern California Housing for All, Metro Denver Fair Housing Center International Association of Official Human Rights Agencies Fair Housing Council of Greater Washington Neighborhood Federation for Neighborhood Diversity Leadership Council for Metropolitan Open Communities Northern Bergen County Community Housing Resource Board State of South Carolina Human Affairs Commission Fair Housing Congress of Southern California Housing for All, Metro Denver Fair Housing Center Fair Housing Council of Greater Washington (continued) Housing Opportunities Project for Excellence, Inc. Interfaith Housing Center of the Northern Suburbs Chicago Lawyers' Committee for Civil Rights Under Law, Inc. Leadership Council for Metropolitan Open Communities Lawyers' Committee for Civil Rights Under Law of the Boston Bar Association Lawyers' Committee for Civil Rights Under Law of the Boston Bar Association Lawyers' Committee for Civil Rights Under Law of the Boston Bar Association Fair Housing Center of Metropolitan Detroit Fair Housing Council of Northern New Jersey Westchester Residential Opportunities, Inc. 
Toledo Community Housing Resource Board Fair Housing Council of Suburban Philadelphia Housing Opportunities Made Equal of Richmond Metropolitan Milwaukee Fair Housing Council Connecticut Commission on Human Rights and Opportunities North Carolina Human Relations Commission King County Office of Civil Rights and Compliance (continued) Fair Housing Council of Greater Washington Interfaith Housing Center of the Northern Suburbs Leadership Council for Metropolitan Open Communities Housing Coalition of the Southern Suburbs Lawyers' Committee for Civil Rights Under Law of the Boston Bar Association Portland West Neighborhood Planning Council North Carolina State University, Office of Research, Outreach and Extension State of New Jersey, Department of Public Advocacy Housing Consortium for Disabled Individuals Housing Opportunities Made Equal of Richmond Housing for All, Metro Denver Fair Housing Center Fair Housing Council of Greater Washington (continued) Housing Opportunities Project for Excellence, Inc. Leadership Council for Metropolitan Open Communities Fair Housing Center of Metropolitan Detroit Fair Housing Council of Northern New Jersey Westchester Residential Opportunities, Inc. Metropolitan Milwaukee Fair Housing Council King County Office of Civil Rights and Compliance National Association of Protection and Advocacy Systems City of Boston, Boston Fair Housing Commission Metropolitan St. Louis Equal Housing Opportunity Council West Jackson Community Development Corporation State University of New York Research Foundation (continued) Arkansas Delta Housing Development Corporation Chicago Lawyers' Committee for Civil Rights Under Law, Inc. Lawyers' Committee for Civil Rights Under Law of the Boston Bar Association Metropolitan St. 
Louis Equal Housing Opportunity Council Fair Housing Partnership of Greater Pittsburgh Fair Housing Congress of Southern California Lawyers' Committee for Civil Rights Under Law of the Boston Bar Association Fair Housing Council of Northern New Jersey (continued) Westchester Residential Opportunities, Inc. Catholic Community Services of Southern Arizona (dba the Direct Independent Living Center) Independent Living Resource Center of San Francisco Conference of Mayors, Research and Education Foundation Iowa Citizens for Community Improvement Leadership Council for Metropolitan Open Communities Lawyers' Committee for Civil Rights Under Law of the Boston Bar Association Legal Aid Bureau of Southwestern Michigan North Carolina State University, Center for Accessible Living Housing Opportunities Made Equal of Richmond (continued) Champlain Valley Office of Economic Opportunity Center for Legal Advocacy (dba the Legal Center Serving Persons With Disabilities) Iowa Protection and Advocacy Services, Inc. Medger Evers College, Center for Law and Social Justice Protection and Advocacy for People With Disabilities North East Wisconsin Fair Housing Council, Inc. Fair Housing Congress of Southern California Housing for All, Metro Denver Fair Housing Center Fair Housing Council of Greater Washington (continued) Housing Opportunities Project for Excellence, Inc. Leadership Council for Metropolitan Open Communities Lawyer's Committee for Better Housing, Inc. 
Leadership Council for Metropolitan Open Communities Interfaith Housing Center of the Northern Suburbs New Orleans Legal Assistance Corporation Lawyers' Committee for Civil Rights Under Law of the Boston Bar Association Fair Housing Center of Metropolitan Detroit Fair Housing Center of Metropolitan Detroit Legal Aid Bureau of Southwestern Michigan Fair Housing Council of Northern New Jersey Housing Opportunities Made Equal Committee of Cincinnati Fair Housing Council of Suburban Philadelphia Housing Opportunities Made Equal of Richmond Housing Opportunities Made Equal of Richmond (continued) Metropolitan Milwaukee Fair Housing Council Metropolitan Milwaukee Fair Housing Council Maryland Commission on Human Relations Rhode Island Commission for Human Rights Washington State Human Rights Commission Iowa Citizens for Community Improvement Leadership Council for Metropolitan Open Communities Mayor's Office for People With Disabilities West Jackson Community Development Corporation North Carolina State University, Center for Universal Design (continued) State University of New York Research Foundation Westchester Residential Opportunities, Inc. Eugene/Springfield/Cottage Grove (et al.) Community Housing Resources Board Golden Triangle Radio Information Center Tennessee Association of Legal Services, Legal Aid Projects Fair Housing Council of Greater Washington Chicago Lawyers' Committee for Civil Rights Under Law, Inc. 
Lawyers' Committee for Civil Rights Under Law of the Boston Bar Association Fair Housing Partnership of Greater Pittsburgh Housing for All, Metro Denver Fair Housing Center Lawyers' Committee for Civil Rights Under Law of the Boston Bar Association Housing Opportunities Made Equal Committee of Cincinnati Metropolitan Milwaukee Fair Housing Council (continued) As requested, we reviewed (1) how funds are allocated among the four FHIP initiatives, the dollar amounts made available for each initiative, and the level of demand for funds under each initiative and (2) who receives FHIP funds and how the funds are being used. We are also providing background information, as you requested, on the history of FHIP and activities that can be funded under the program. To obtain information on FHIP, its funding, and eligible activities, we reviewed the program's legislative history, regulations, policies, procedures, and Federal Register notices that solicited applications from eligible fair housing agencies and organizations. We also reviewed HUD's annual reports to the Congress on fair housing programs for 1993 and 1994 and obtained descriptions and budgets for other HUD-administered fair housing activities. We interviewed the Director, Office of Fair Housing Initiatives and Voluntary Programs (who also is the Acting FHIP Division Director); FHIP's government technical representatives; the Deputy Assistant Secretary, Enforcement and Investigations; and the Director, Office of Investigations, Fair Housing and Equal Opportunity. We also interviewed FHIP officials at the HUD's Southwest and Midwest Regions in Fort Worth, Texas and Chicago, Illinois, respectively, as well as officials of six organizations that received FHIP grants. In addition, we held discussions with the National Association of Realtors and the Mortgage Bankers Association and attended the 1996 New England and Mid-Atlantic Fair Housing Conference. 
To determine how HUD allocates funds among the four program initiatives, we reviewed and analyzed FHIP congressional budget justifications for fiscal years 1989 to 1997. We also reviewed memorandums and correspondence regarding funding allocations and HUD's priorities for FHIP since its inception. To determine the amounts available for award, we reviewed FHIP's notices of funding availability as published in the Federal Register for fiscal years 1989 through 1996. To determine the demand for funds, we reviewed and analyzed the available selection results, including technical evaluation panels' reports, which contained lists of grant applicants and the panels' recommendations to the Assistant Secretary for Fair Housing and Equal Opportunity. We reviewed technical evaluation reports to compile data on the number of applications by fiscal year and by program initiative. We also analyzed the dollar value of applications for those years for which complete information was readily available--fiscal years 1994 to 1996. Additionally, we reviewed program guidance on the selection process and interviewed HUD government technical representatives involved in the selection process. To identify the recipients of FHIP funds and the amount of dollars received, we obtained a copy of the FHIP funding and contract tracking system's database, which contained 486 grant listings as of October 1996. Many grant numbers were not accompanied by the grantee organizations' names and locations. To develop a more complete list, we compared the listed grant numbers to other HUD-provided reports and added names and locations to the database where possible. We used this database as a control for our review of the FHIP grant files. During our review of the files, we filled in the missing names and locations and verified all other grantees' names and locations, as well as the grant amounts and year of appropriation. 
To determine how FHIP dollars are being used, we developed a data collection instrument to record data from grant files on the activities organizations agreed to carry out under the program. In developing the instrument, we interviewed program officials, reviewed FHIP legislation and regulations, notices of funding availability, and a sample of FHIP grant files. HUD program officials reviewed and commented on the data collection instrument, and we incorporated their suggested changes. For grants awarded through fiscal year 1996, we reviewed the available grant files (483) and recorded on the data collection instrument the activities each grantee agreed to carry out. We used the information to develop a database from which we analyzed the number and dollar value of the grants awarded to organizations and the kinds of activities funded under each FHIP initiative. We also reviewed the available final performance assessments (206) to determine whether grantees completed work as agreed and how HUD rated their overall performance. We did not independently verify the accuracy of the final performance assessments. In addition, we interviewed HUD Inspector General officials in each HUD region regarding their reviews of FHIP grantees. To determine whether any grantees have used FHIP funds to pay expenses associated with litigation against the U.S. government, we interviewed officials in HUD's Office of General Counsel, namely, the Assistant General Counsel, Fair Housing Enforcement Division, and Managing Attorney, Litigation Division. At our request, HUD's General Counsel contacted agency attorneys in each region to determine whether they had knowledge of any lawsuits filed by FHIP grantees against the government. We interviewed the Acting FHIP Division Director, responsible government technical representatives, and government technical monitors about their knowledge of the cases identified. 
We also reviewed correspondence from grantees concerning whether FHIP funds were used to pursue litigation. Patricia D. Moore Jeannie B. Davis Michael L. Mgebroff Vondalee R. Hunt Alice G. Feldesman John T. McGrail The first copy of each GAO report and testimony is free. Additional copies are $2 each. Orders should be sent to the following address, accompanied by a check or money order made out to the Superintendent of Documents, when necessary. VISA and MasterCard credit cards are accepted, also. Orders for 100 or more copies to be mailed to a single address are discounted 25 percent. U.S. General Accounting Office P.O. Box 6015 Gaithersburg, MD 20884-6015 Room 1100 700 4th St. NW (corner of 4th and G Sts. NW) U.S. General Accounting Office Washington, DC Orders may also be placed by calling (202) 512-6000 or by using fax number (301) 258-4066, or TDD (301) 413-0006. Each day, GAO issues a list of newly available reports and testimony. To receive facsimile copies of the daily list or any list from the past 30 days, please call (202) 512-6000 using a touchtone phone. A recorded menu will provide information on how to obtain these lists. | Pursuant to a congressional request, GAO reviewed the Department of Housing and Urban Development's Fair Housing Initiatives Program, focusing on: (1) how funds are allocated among the program's four initiatives or funding categories, what dollar amounts are made available under each category, and what level of demand exists for funds under each category; and (2) who receives program funds and how the funds are being used. 
GAO noted that: (1) from the program's inception through fiscal year (FY) 1997, the Congress has appropriated $113 million to carry out the Fair Housing Initiatives Program; (2) the Assistant Secretary for Fair Housing and Equal Opportunity, the Department of Housing and Urban Development, judgmentally determines how funds are allocated among the four initiatives on the basis of the program legislation, the administration's and the agency's priorities, and input from the housing industry and fair housing groups; (3) the agency's budget requests to the Congress set forth how it plans to divide the total program dollars among the four initiatives; (4) the largest portion, more than $40 million, has been budgeted and made available for the private enforcement initiative; (5) as measured by the amounts requested on applications, for the 3 most recent years, fiscal years 1994 through 1996, there is also great demand for the private enforcement initiative; (6) through FY 1996, 220 different organizations in 44 states and the District of Columbia received program grants; (7) of all the funds awarded, 26 organizations received about half; (8) the largest portion of funds, about $41 million, was spent on the private enforcement initiative for activities aimed at determining the existence of discrimination in renting, sales, and lending, primarily testing to investigate individual complaints and testing to investigate industry practices; (9) grantees have used funds for a variety of other fair housing activities, such as litigation, new fair housing organizations and capacity building for existing organizations, pamphlets and brochures, print, television, and radio advertisements, and conferences and seminars for housing industry professionals; and (10) other funded activities also have included special projects on mortgage lending and insurance redlining. | 7,411 | 440 |
The Randolph-Sheppard Act created a vending facility program in 1936 to provide blind individuals with more job opportunities and to encourage their self-support. The program trains and employs blind individuals to operate vending facilities on federal property. While Randolph-Sheppard is under the authority of the Department of Education, the states participating in this program are primarily responsible for program operations. State licensing agencies, under the auspices of the state vocational rehabilitation programs, operate the programs in each state. Federal law gives blind vendors under the program a priority to operate cafeterias on federal property. Current DOD guidance implementing this priority directs that a state licensing agency be awarded a contract if its contract proposal is in the competitive range. In fiscal year 2006, all of the activities of the Randolph-Sheppard program generated $692.2 million in total gross income and had a total of 2,575 vendors operating in every state except for Wyoming. In 1938 the Wagner-O'Day Act established a program designed to increase employment opportunities for persons who are blind so they could manufacture and sell certain goods to the federal government. In 1971, the Javits-Wagner-O'Day Act amended the program to include people with other severe disabilities and allowed the program to provide services as well as goods. The JWOD Act established the Committee for Purchase, which administers the program. The Committee for Purchase is required by law to designate one or more national nonprofit agencies to facilitate the distribution of federal contracts among qualified local nonprofit agencies. The designated national agencies are the National Industries for the Blind and NISH, which represent local nonprofit agencies employing individuals who are blind or have severe disabilities. These designated national agencies charge fees for the services provided to local nonprofit agencies. 
Effective on October 1, 2006, the maximum fee is 3.83 percent of the revenue of the contract for the National Industries for the Blind, and 3.75 percent for NISH. The purpose of these fees is to provide operating funds for these two agencies. In fiscal year 2006, more than 600 JWOD nonprofit agencies provided the federal government with goods and services worth about $2.3 billion. The JWOD program provided employment for about 48,000 people who are blind or have severe disabilities. Military dining contracts under the Randolph-Sheppard and JWOD programs provide varying levels of service, ranging from support services to full-food services. Support services include activities such as food preparation and food serving. Full-food service contracts provide for the complete operation of facilities, including day-to-day decision making for the operation of the facility. As of October 17, 2006, DOD had 39 Randolph-Sheppard contracts in 24 different states. These contracts had an annual value of approximately $253 million and were all for full-food services. At the same time, DOD had 53 JWOD contracts valued at $212 million annually. Of these, 39 contracts were for support services and 15 were for full-food service. Figure 1 shows the distribution of Randolph-Sheppard and JWOD contracts with DOD dining facilities across the country. In 1974, amendments to the Randolph-Sheppard Act expanded the scope of the program to include cafeterias on federal property. According to a DOD official, when DOD began turning increasingly to private contractors rather than using its own military staff to fulfill food service functions in the 1990s, state licensing agencies under the Randolph-Sheppard program began to compete for the same full-food services contracts for which JWOD traditionally qualified. This development led to litigation, brought by NISH, over whether the Randolph-Sheppard Act applied to DOD dining facilities.
Two decisions by federal appeals courts held that the Randolph- Sheppard Act applied because the term "cafeteria" included DOD dining facilities. The courts also decided that if both programs pursued the full- food service contracts for DOD dining facilities, Randolph-Sheppard had priority. Congress enacted section 848 of the National Defense Authorization Act for Fiscal Year 2006 requiring the key players involved in each program to issue a joint policy statement about how DOD food services contracts were to be allocated between the two programs. In August 2006, DOD, Education, and the Committee for Purchase issued a policy statement that established certain guidelines, including the following: The Randolph-Sheppard program will not seek contracts for dining support services that are on the JWOD procurement list, and Randolph- Sheppard will not seek contracts for operation of a dining facility if the work is currently being performed under the JWOD program; JWOD will not pursue prime contracts for operation of dining facilities at locations where an existing contract was awarded under the Randolph- Sheppard program (commonly known as the "no-poaching" provision). For contracts not covered under the no-poaching provision, the Randolph-Sheppard program may compete for contracts from DOD for full-food services; and the JWOD program will receive contracts for support services. If the needed support services are on the JWOD procurement list, the Randolph-Sheppard contractor is obligated to subcontract for those services from JWOD. In affording a priority to a state licensing agency when contracts are competed and the Randolph-Sheppard Act applies, the price of the state licensing agency's offer will be considered to be fair and reasonable if it does not exceed the best value offer from other competitors by more than 5 percent or $1 million, whichever is less. 
Congress enacted the no-poaching provision in section 856 of the National Defense Authorization Act for Fiscal Year 2007. A recent GAO bid protest decision determined that adherence to the other provisions of the policy statement was not mandatory until DOD and the Department of Education change their existing regulations. As of July 2007, neither agency had completed updating its regulations. The Randolph Sheppard and JWOD programs utilize different operating procedures to provide dining services to DOD. For the Randolph-Sheppard program, state licensing agencies act as prime contractors, and train and license blind vendors to operate dining facilities. For the JWOD program, the Committee for Purchase utilizes NISH to act as a central nonprofit agency and match DOD needs for dining services with local nonprofit agencies able to provide the service. JWOD employees generally fill less skilled jobs such as cleaning dining facilities or serving food. Education is responsible for overseeing the Randolph-Sheppard program, but relies on state licensing agencies to place blind vendors as dining facility managers. The Department of Education certifies state licensing agencies and is responsible for ensuring that their procedures are consistent with Randolph-Sheppard regulations. According to our survey, state licensing agencies act as prime contractors on Randolph-Sheppard contracts, meaning that they hold the actual contract with DOD. The state licensing agencies are responsible for training blind vendors to serve as dining facility managers and placing them in facilities as new contracting opportunities become available. According to our survey, the state issues the vendor a license to operate the facility upon the successful completion of the training program. Furthermore, many states said this process often includes both classroom training and on-the-job training at a facility. Figure 2 depicts how the Randolph-Sheppard program is generally structured. 
Responding to our survey, state licensing agencies reported that all blind vendors have some level of managerial responsibility for each of the 39 Randolph-Sheppard contracts. Specific responsibilities may include managing personnel, coordinating with military officials, budgeting and accounting, and managing inventory. An official representing state licensing agencies likened the vendor's role to that of an executive and said the vendor is responsible for meeting the needs of his or her military customer. At one facility we visited, the vendor was responsible for general operations, ensuring the quality of food, and helped develop new menu selections. Of the 37 contracts where the state licensing agencies provided information regarding whether the blind vendor visits his or her facility, all stated that their blind vendors visit their facilities, and in most cases are on site every day. Additionally, most state licensing agencies told us that they have an agreement with the blind vendor that lays out the state licensing agency's expectations of the blind vendor and defines the vendor's job responsibilities. Most state licensing agencies rely on private food service companies to provide the expertise to help operate dining facilities. According to our survey, 33 of the 39 Randolph-Sheppard contracts relied on a food service company--known as a teaming partner--to provide assistance in operating dining facilities. The survey showed that in many cases, the blind vendor and teaming partner form a joint venture company to operate the facility with the vendor as the head of the company. The teaming partner can provide technical expertise, ongoing training, and often extends the vendor a line of credit and insurance for the operation of the facility. 
Officials representing state licensing agencies told us that states are often unable to provide these resources, and for large contracts these start-up costs may be beyond the means of the blind vendor and the state licensing agency. According to our survey, the teaming partner may assist the state in negotiating and administering the contract with DOD. Additionally, state licensing agencies told us that they often enter into a teaming agreement that defines the responsibilities of the teaming partner. For 6 of the 39 contracts, the state licensing agencies reported that the blind vendor operates the dining facility without a teaming partner. We visited one of these locations and learned that the vendor has his own business that he uses to operate the facility. This particular vendor had participated in the Randolph-Sheppard program for almost 20 years and operated various other dining facilities. In our survey, state licensing agencies reported that vendors in about half (20 of 39) of the contracts are required to employ individuals who are blind or have other disabilities, while others have self-imposed goals. In other cases there may be no formal hiring requirements, but the state licensing agency encourages the blind vendor to hire individuals with disabilities. Based on survey responses we received for 30 contracts, we calculated that the percentage of persons with disabilities working at Randolph-Sheppard dining facilities ranged from 3 percent to 72 percent, with an average of 18 percent. The Committee for Purchase works with NISH to match DOD's need for services with nonprofit agencies able to provide food services. For military food service contracts, NISH acts as a central nonprofit agency and administers the program on behalf of the Committee for Purchase. In this role, NISH works with DOD to determine if it has any new requirements for dining services. 
When it identifies a need, NISH will search for a nonprofit agency that is able to perform the required service. NISH then facilitates negotiations between DOD and the nonprofit agency, and submits a proposal to the Committee for Purchase requesting that the specific service be added to the JWOD procurement list. If the Committee for Purchase approves the addition, DOD is required by the Federal Acquisition Regulation (FAR) to obtain the food service from the entity on the procurement list. In some instances, a private food service company is awarded a military dining facility contract and then subcontracts with a JWOD nonprofit agency to provide either full or support food services. For example, the Marine Corps awarded two regional contracts to Sodexho--a large food service company--to operate its dining facilities on the East and West Coasts. Sodexho is required by its contracts to utilize JWOD nonprofit agencies and uses these nonprofit agencies to provide food services and/or support services at selected Marine Corps bases. Figure 3 depicts the JWOD program structure. Most JWOD employees at military dining facilities perform less skilled jobs as opposed to having managerial roles. At the facilities we visited, we observed that employees with disabilities (both mental and physical) performed tasks such as mopping floors, serving food, and cleaning pots and pans after meals. Officials from NISH said this is generally true at JWOD dining facilities, including facilities where the nonprofit agency provides full-food service. Additionally, we observed--and NISH confirmed--that most supervisors are persons without disabilities. At one facility we visited, for example, the nonprofit supervisor oversees employees with disabilities who are responsible for keeping the facility clean and serving food. The Committee for Purchase requires that agencies associated with NISH perform at least 75 percent of their direct labor hours with people who have severe disabilities. 
For nonprofit agencies with multiple JWOD contracts, the 75 percent direct labor requirement is based on the total for all of these contracts. Therefore one contract may be less than 75 percent but another contract must be greater than 75 percent in order for the total of these contracts to meet the 75 percent requirement. NISH is responsible for ensuring that nonprofit agencies comply with this requirement, and we previously reported that it performs site visits to all local nonprofit agencies every three years, in order to ensure compliance with relevant JWOD regulations. At the three JWOD facilities we visited, officials reported that the actual percentage of disabled individuals employed was 80 percent or higher. Table 1 provides a comparison of the Randolph-Sheppard and JWOD programs' operating procedures. The Randolph-Sheppard and JWOD programs have significant differences in terms of how contracts are awarded and priced, and in the compensation provided to beneficiaries who are blind or have other disabilities. Under the Randolph-Sheppard program, federal law provides for priority for blind vendors and state licensing agencies in the operation of a cafeteria. This priority may come into play when contracts are awarded either by direct noncompetitive negotiations or through competition with other food service companies. Regardless of how the contract is awarded, the prices are negotiated between the state licensing agency and DOD. Under the JWOD program, competition is not a factor because DOD is required to purchase food services from a list maintained by the Committee for Purchase. Contracts are awarded at fair market prices established by the Committee for Purchase. The two programs also differ in terms of how program beneficiaries are compensated. 
Under the Randolph-Sheppard program, blind vendors generally receive a share of the profits, while JWOD beneficiaries receive hourly wages and fringe benefits under federal law or any applicable collective bargaining agreement. Randolph-Sheppard blind vendors received, on the average, pretax compensation of about $276,500 annually, while JWOD workers at the three sites visited earned on average $13.15 per hour, including fringe benefits. Although contracts for food services awarded under the Randolph- Sheppard and JWOD programs use the terms and conditions generally required for contracts by the FAR, the procedures for awarding and pricing contracts under the two programs differ considerably. Under the Randolph-Sheppard program, Education's regulations provide for giving priority to blind vendors in the operation of cafeterias on federal property, provided that the costs are reasonable and the quality of the food is comparable to that currently provided. The regulations provide for two procedures to implement this priority. First, federal agencies, such as the military departments, may engage in direct, noncompetitive negotiations with a state licensing agency. Of the eight Randolph-Sheppard contracts we reviewed in detail, six had been awarded through direct negotiations with the state licensing agency. In most of the eight cases, the contract was a follow-on to an expiring food service contract. The second award procedure involves the issuance of a competitive solicitation inviting proposals from all potential food service providers, including the relevant state licensing agency. The solicitation will specify the criteria for evaluating proposals, such as management capability, past performance, and price, and DOD will use these criteria to evaluate the proposals received. When the competitive process is used, DOD policy provides for selecting the state licensing agency for award if its proposal is in the "competitive range." 
Of the eight Randolph-Sheppard contracts we reviewed, only two involved a solicitation open to other food service providers, and there was no case in which more than one acceptable proposal was received such that DOD was required to determine a competitive range. The prices of contracts under the Randolph-Sheppard program are negotiated between DOD and the state licensing agency, regardless of whether DOD uses direct negotiations or seeks competitive proposals. Negotiations in either case typically begin with a pricing proposal submitted by the state licensing agency, and will then involve a comparison of the proposed price with the prices in previous contracts, an independent government estimate, or the prices offered by other competitors, if any. In some cases, DOD will seek the assistance of the Defense Contract Audit Agency (DCAA) in assessing various cost aspects of a proposal. All of the Randolph-Sheppard contracts we reviewed were generally firm, fixed price. Some had individual line items that provided for reimbursing the food service provider for certain costs incurred, such as equipment maintenance or replacing items. In most cases, the contract was for a base year, and provided for annual options (usually four) that may be exercised at the discretion of DOD. Of the 39 Randolph-Sheppard contracts within the scope of our review, the average price for the current year of the contract was about $6.5 million. Table 2 shows the 8 Randolph- Sheppard contracts in our sample with selected contract information. Under Part 8 of the FAR, the JWOD program is a mandatory source of supply, requiring DOD to award contracts to the listed nonprofit entity at fair market prices established by the Committee for Purchase. There is no further competition. Table 3 shows the 6 JWOD contracts in our sample with selected contract information. Compensation for Randolph-Sheppard blind vendors is computed differently from compensation paid to JWOD disabled workers. 
For the Randolph-Sheppard program, blind vendors' compensation is generally based on a percentage of the profits generated by the dining facilities' operations. Based on the 37 survey responses where we could determine the basis of how blind vendors' compensation was computed, 34 reported that that the vendor's compensation was computed either entirely, or in part, based on the profits generated by the dining facility contract. For compensation based entirely on the facilities' profits, the blind vendor received from 51 to 65 percent of the profits. For those blind vendors that were compensated partially based on profits, their compensation was based on fixed fees, administrative fees or salaries, and a percentage of the profits. Where compensation was not based on profits, these three blind vendors received either a percentage of the contract value or a fixed base fee. Figure 4 shows the annual compensation received by blind vendors for military food services contracts, within specified ranges, and the average compensation for each range. As shown in figure 4, 15 of 38 Randolph-Sheppard blind vendors' annual compensation was between $100,000 and $200,000. Overall, blind vendors working at DOD dining facilities received average annual compensation of about $276,500 per vendor. These figures are based on pretax earnings. We did not collect compensation information for employees of the blind vendors or employees of the teaming partners. For the JWOD program, for most workers--including those with and without a disability--the compensation is determined by either federal law or collective bargaining agreements. The Service Contract Act (SCA) was enacted to give employees of contractors and subcontractors labor standards protection when providing services to federal agencies. 
The SCA requires that, for contracts exceeding $2,500, contactors pay their employees, at a minimum, the wage rates and fringe benefits that have been determined by the Department of Labor to be prevailing in the locality where the contracted work is performed. However, the SCA hourly rate would not be used if there is a collective bargaining agreement that sets a higher hourly wage for selected workers. According to NISH, the collective bargaining hourly rates are, in general, 5 to 10 percent higher than the SCA's wage rates. Of the six JWOD contracts in our sample, Holloman Air Force Base and the Marine Corps' eastern and western regional contracts had collective bargaining agreements. For the three JWOD sites visited, we obtained an estimate of the average hourly wages, average hourly fringe benefits rates, and average number of hours worked and computed their annual wages. The average hourly wage for the three JWOD sites was $13.15 including fringe benefits. Table 4 shows the average annual wages that an employee earned. Another law that can affect the disabled worker's wages is section 14(c) of the Fair Labor Standards Act, which allows employers to pay individuals less than the minimum wage (called special minimum wage rates) if they have a physical or mental disability that impairs their earning or productive capacity. For example, if a 14(c) worker's productivity for a specific job is 50 percent of that of experienced workers who do not have disabilities that affect their work, and the prevailing wage paid for that job is $10 dollars per hour, the special minimum wage rate for the 14(c) worker would be $5 dollars per hour. None of the three JWOD sites we visited applied the special minimum wage for any of their disabled workers. The Randolph-Sheppard and JWOD programs have a common goal of serving individuals who are blind or have severe disabilities, and who are generally underrepresented in the workforce. 
However, these programs operate differently regarding how contracts are awarded and priced, and are designed to serve distinct populations through different means-- particularly with respect to compensation for program participants. This is true for contracts with military dining facilities. The blind vendors who participate in the Randolph-Sheppard program seek to become entrepreneurs by gaining experience managing DOD dining facilities. In this respect, although most of these vendors require the assistance of a private food service teaming partner, they are compensated for managing what can be large, complicated food service operations. By contrast, because the participants of the JWOD program perform work activities that require less skill and experience, and who might otherwise not be able to secure competitive employment, they are compensated at a much lower rate than the Randolph-Sheppard vendors. In this regard, it is apparent that the two programs are designed to provide very different populations with different types of assistance, and thus, it is difficult to directly compare them, particularly with respect to compensation. We provided a draft of this report to the Committee for Purchase, the Department of Defense, and the Department of Education for review and comment. The Committee for Purchase had no comments. DOD concurred with the draft and also provided technical comments for our consideration. We considered all of DOD's technical comments and revised the draft as appropriate. The DOD comment letter is attached as appendix II. The Department of Education provided clarifications and suggestions in a number of areas. First, Education was concerned about comparing the earnings of the blind vendors under the Randolph-Sheppard program and the compensation provided to the food service workers under the JWOD program. 
The agency suggested we compare the earnings of the blind vendors with the earnings of employees of the JWOD nonprofit agencies who perform similar management functions. We agree that there are significant differences in their responsibilities, but we were required to report on the compensation of the "beneficiaries" of the two programs, which are blind managers for the Randolph-Sheppard program and hourly workers for the JWOD program. Our report highlights these differences. Our report also highlights in a number of places the difficulty in comparing the compensation of the two groups of beneficiaries. We were not required to report on the earnings of the management personnel of the nonprofit agencies, and we did not collect this information. Second, Education urged that we fully describe the permitted uses of the set-aside fees charged by the state licensing agencies, and that we recognize that there is a similar assessment under the JWOD program. We have revised the report to point out that the Randolph-Sheppard set-aside may be used to fund the operation of the state licensing agencies. We also added language to a footnote to table 3 to recognize that the JWOD contract amounts include a fee that is used to fund the operations of the central nonprofit agency. Third, Education questions our description of the price negotiations that occur between DOD and the state licensing agencies. We believe our report is both clear and accurate on this point as written. In addition, DOD did not have any comments or questions about how we described price negotiations for the Randolph-Sheppard program. Fourth, Education questioned our discussion of the numbers of persons with disabilities employed under the two programs. 
Specifically, Education pointed out that the requirement under the JWOD program that at least 75 percent of the direct labor hours be performed by persons with disabilities applies in the aggregate to all work performed by a nonprofit entity, not at the contract level. We have revised the report to reflect this. And finally, Education sought clarification concerning the extent commercial food service companies are used as teaming partners under the Randolph-Sheppard program or as subcontractors under the JWOD program. We have revised figures 2 and 3 of the report to more accurately reflect the use of these companies. The comment letter from Education is attached as Appendix III. We will send copies of this report to interested congressional committees, the Secretary of Defense, the Secretary of Education, and the Chairperson of the Committee for Purchase, as well as other interested parties. We will also make copies available to others upon request. In addition, the report will be available at no charge on GAO's Web site at http://www.gao.gov. If you or your staffs have any questions about this report, please contact George Scott at (202) 512-7215 or [email protected] or William Woods at (202) 512-8214 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. Key contributors to this report are listed in appendix IV. To accomplish our research objectives, we interviewed officials from the Department of Defense (DOD), the Department of Education, the Committee for Purchase, and organizations representing both the Randolph-Sheppard and Javits-Wagner-O'Day (JWOD) programs. We also reviewed pertinent documents and regulations governing both programs. We reviewed a sample of 14 contracts--8 Randolph-Sheppard contracts and 6 JWOD contracts. For these contracts, we requested the source selection memorandum, the acquisition plan, the basic contract, and the statement of work. 
For two of these contracts, the Randolph-Sheppard prime contractor for full-food services subcontracted with a JWOD nonprofit agency for support services. We determined that it was not feasible to review a representative sample of contracts based on our preliminary work, which indicated wide variations in how the two programs are structured and how the Randolph-Sheppard program is administered from state to state. For these reasons, we selected a number of contracts to review in order to ensure representation of both programs, as well as ensure a balance of contracts based on dollar value, size of military facility, branch of the military, and geographic location. As the sample was not representative, results of our review cannot be projected to the entire universe of contracts. In addition, we visited the military installation for 5 of the 14 contracts in our sample in order to observe dining facilities and their operations, as well as interview pertinent officials and staff, including the blind vendor or JWOD agency management whenever possible. Again, these five locations were selected to ensure representation of both programs, as well as variation in geographic location, contract size, and military branch. In terms of beneficiary compensation, we limited our review to Randolph-Sheppard blind vendors and JWOD workers. For the JWOD program, we obtained average hourly wages, average hourly fringe benefits, and average total hours worked during the year for JWOD employees at selected sites. We did not obtain compensation amounts for the managerial employees for any JWOD nonprofit agencies. To obtain information on the relationships between state licensing agencies and blind vendors, we conducted a survey of the 24 state licensing agencies we determined to have Randolph-Sheppard military dining contracts. 
We asked questions regarding the roles and responsibilities of blind vendors, the vendor's relationship with the state licensing agencies, and the role played by teaming partners. We administered this survey between April and July 2007. We pretested this survey with program directors and modified the survey to take their comments into account. All 24 state licensing agencies responded to our survey for a response rate of 100 percent and provided information for 39 military dining facilities contracts. Additionally, we requested information for the 40 blind vendors with military dining contracts to determine their annual compensation. For the 39 contracts, there were 40 blind vendors as one contract utilized two vendors. We received compensation information for 38 of the 40 blind vendors. Jeremy D. Cox (Assistant Director), Richard Harada (Analyst-in-Charge), Daniel Concepcion, Rosa Johnson, and Sigurd Nilsen made significant contributions to all aspects of this report. In addition, Susannah Compton and Lily Chin assisted in writing the report and developing graphics. John Mingus provided additional assistance with graphics. Walter Vance assisted in all aspects of our survey of state licensing agencies as well as providing methodological support. Doreen Feldman, Daniel Schwimer, and Alyssa Weir provided legal support. Federal Disability Assistance: Stronger Federal Oversight Could Help Assure Multiple Programs' Accountability. GAO-07-236. Washington, D.C.: January 26, 2007. | Randolph-Sheppard and Javits-Wagner-O'Day (JWOD) are two federal programs that provide employment for persons with disabilities through federal contracts. In 2006, participants in the two programs had contracts with the Department of Defense (DOD) worth $465 million annually to provide dining services at military dining facilities. The 2007 National Defense Authorization Act directed GAO to study the two programs. 
This report examines (1) differences in how the Randolph-Sheppard and JWOD programs provide food services for DOD and (2) differences in how contracts are awarded, prices are set, and program beneficiaries (i.e. persons with disabilities) are compensated. GAO interviewed program officials, conducted a survey of states with Randolph-Sheppard programs, and reviewed eight Randolph-Sheppard and six JWOD contracts. The Randolph-Sheppard and JWOD programs use different procedures to provide food services to DOD. In Randolph-Sheppard, states act as prime contractors, and train and license blind individuals to act as managers of dining facilities. In most cases, the blind vendor relies on a food service company--known as a teaming partner--to assist in operations, provide expertise, and help with start-up costs. About half of the blind vendors are required to employ other persons with disabilities. JWOD is administered by an independent federal agency called the Committee for Purchase from People Who are Blind or Severely Disabled (Committee for Purchase). The Committee for Purchase engages a central nonprofit agency to match DOD's needs with services provided by local nonprofit agencies. Most of the individuals working for these local nonprofit agencies are employed in less skilled jobs such as serving food or washing dishes. The Randolph-Sheppard and JWOD programs differ significantly in the way DOD dining contracts are awarded, how prices are set, and how participants are compensated. For Randolph-Sheppard, DOD awards contracts to the states either through direct negotiations or competition with other food service companies. In either case, DOD and the states negotiate the prices based on factors such as historical prices and independent government estimates. Under JWOD, competition is not a factor because DOD is required to purchase services it needs from a list maintained by the Committee for Purchase, which establishes fair market prices for these contracts. 
In terms of compensation, Randolph-Sheppard blind vendors generally received a percentage of contract profits, averaging about $276,500 per vendor annually. JWOD beneficiaries are generally paid hourly wages according to rules set by the federal government. For the three sites we visited, we estimate that beneficiaries received an average wage of $13.15 per hour, including fringe benefits. Given the differences in the roles of the beneficiaries of these two programs, comparisons of their compensation have limited value. | 6,322 | 607 |
To assess how agencies are using the results of single audits, we conducted a survey of the 24 agencies subject to the CFO Act. We pretested our survey with one federal agency, solicited comments from OMB, and modified the survey based on the comments we received. The survey included two sections. The first section captured background information on agency federal awards programs, the single audit process from an agencywide perspective, and the offices within the agency that are responsible for implementing the various single audit responsibilities defined under OMB Circular A-133. The second section captured information on how agency CFO, IG, and program offices use the results of single audits in each agency's largest grant program.

We distributed the surveys to the agencies for completion and then performed follow-up interviews with representatives from CFO, IG, and program offices to obtain, discuss, and clarify their survey responses. Our survey results reflect the information provided by, and the opinions of, the agency officials who participated in our survey; we did not independently verify the responses to our questions. We received responses from all 24 CFO Act agencies. One agency returned the survey without completing it because it does not have grant-making authority and, therefore, has no experience with single audits. As a result, our survey results are based on responses from 23 agencies.

We conducted our work from July 2001 through December 2001, in accordance with generally accepted government auditing standards. We discussed a draft of this report with representatives from OMB and have incorporated their comments and views where appropriate. According to OMB, federal awards for fiscal year 2001 totaled about $325 billion of the $1.8 trillion federal budget.
The Departments of Agriculture, Education, Health and Human Services, Housing and Urban Development, and Transportation were responsible for managing about 86 percent of the federal awards in fiscal year 2001. The Single Audit Act, as amended, established the concept of the single audit to replace multiple grant audits with one audit of the recipient as a whole. As such, a single audit is an organizationwide audit that focuses on the recipient's internal controls and compliance with laws and regulations governing federal awards and should be viewed as a tool that raises relevant or pertinent questions rather than as a document that answers all questions. Federal awards include grants, loans, loan guarantees, property, cooperative agreements, interest subsidies, insurance, food commodities, and direct appropriations and federal cost reimbursement contracts. The objectives of the Single Audit Act, as amended, are to (1) promote sound financial management, including effective internal controls, with respect to federal awards administered by nonfederal entities; (2) establish uniform requirements for audits of federal awards administered by nonfederal entities; (3) promote the efficient and effective use of audit resources; (4) reduce burdens on state and local governments, Indian tribes, and nonprofit organizations; and (5) ensure that federal departments and agencies, to the maximum extent practicable, rely upon and use audit work done pursuant to the act. Recipients of federal awards who expend $300,000 or more in a year are required to comply with the Single Audit Act's requirements. In general, they must (1) maintain internal control over federal programs, (2) comply with laws, regulations, and the provisions of contracts or grant agreements, (3) prepare appropriate financial statements, including the Schedule of Expenditures of Federal Awards, (4) ensure that the required audits are properly performed and submitted when due, and (5) follow up and take corrective actions on audit findings. 
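The applicability rule above is a simple expenditure test, sketched below for illustration (the function name is ours, and the $300,000 figure is the threshold in effect for the period covered by this report; it has since been raised):

```python
def requires_single_audit(federal_expenditures):
    """Single Audit Act applicability test: recipients expending
    $300,000 or more in federal awards in a year must obtain a
    single audit (threshold in effect for the period of this report)."""
    return federal_expenditures >= 300_000

print(requires_single_audit(450_000))  # True
print(requires_single_audit(150_000))  # False
```

Recipients at or above the threshold must then meet the five requirements listed above, from maintaining internal control over federal programs to following up on audit findings.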
OMB Circular A-133 establishes policies for federal agency use in implementing the Single Audit Act, as amended, and provides an administrative foundation for consistent and uniform audit requirements for nonfederal entities that administer federal awards. It details federal responsibilities with respect to informing grantees of their responsibilities under the act. A significant part of OMB Circular A-133 is the Compliance Supplement. This document serves as a source of information to aid auditors in understanding federal program objectives, procedures, and compliance requirements relevant to the audit, and it identifies audit objectives and suggested procedures for auditors' use in determining compliance with the requirements. For example, it includes guidance on audit procedures applicable to 14 areas, including allowable activities, allowable costs, cash management, eligibility, and reporting. (Appendix III lists and briefly describes the 14 areas.) Organizations that must comply with the Single Audit Act, as amended, are required to submit a reporting package to the FAC. The FAC serves as the central collection point, repository, and distribution center for single audit reports. Its primary functions are to (1) receive the SF-SAC Form--a data collection form that contains summary information on the auditor, the auditee and its federal programs, and the audit results--and the audit report from the auditee, (2) archive copies of the SF-SAC Form and audit report, (3) forward a copy of the audit report to each federal awarding agency that has provided direct funding to the auditee when the report identifies a finding relating to that agency's awards, and (4) maintain an electronic database that is accessible through the Internet. 
In our June 1994 report, Single Audit: Refinements Can Improve Usefulness (GAO/AIMD-94-133), nearly two-thirds of the program managers we interviewed said that a database of single audit information would be a significant help in comparing information about entities operating their programs. Eighty percent of the managers said they would like to use the database to identify all entities operating their programs that had serious internal control or noncompliance problems disclosed in single audit reports. The Single Audit Act Amendments of 1996 led to the establishment of an automated database of single audit information--the FAC database. OMB Circular A-133 requires all entities that must submit single audit reports to the FAC to prepare and submit a data collection form (SF-SAC Form) with the audit report. The FAC uses this form as the source of the information for its automated, Internet-accessible database of information contained in single audit reports. The database contains about 4 years of information on over 30,000 annual single audit reports. The various data query options available provide potential users, including program managers, auditors, and other interested parties, with significant amounts of readily available information on grant recipient financial management and internal control systems and on compliance with federal laws and regulations. Our survey results indicated that the CFO Act agencies have generally developed processes and assigned responsibilities to meet their requirements under the Single Audit Act, as amended. The CFO, IG, and program offices perform these activities either individually or in coordination with each other. Federal agencies indicated that they use single audit results for many purposes. The most common reported use was as a tool to monitor auditee compliance with administrative and program requirements and to monitor the adequacy of internal controls. 
Although agencies have identified many uses for the single audit results, our survey results show that they are generally not using the FAC automated database to obtain summary information on the audit results or the entities that are receiving funds under their programs. Rather, they reported developing their own systems or methods to obtain information from the reports. According to our survey results, agency program offices are primarily responsible for ensuring the application of the provisions set forth in OMB Circular A-133. For example, in completing the survey, program office officials indicated that they (1) ensure that award recipients are given information that describes the federal award, (2) advise recipients of other applicable award requirements, (3) advise recipients of the requirement to obtain a single audit when they expend $300,000 or more in federal awards in a year, (4) ensure that single audits are completed and the reports are received in a timely manner, and (5) follow up on issues identified in the reports that require corrective action. Specifically, 20 agency program offices responded that they ensure recipients are given the information necessary to describe the federal award and advise recipients of other applicable award information, 19 responded that they advise recipients of the requirements to obtain a single audit when they expend $300,000 or more in federal awards in a year, 19 responded that they follow up on issues that are identified in the reports that require corrective action, 17 responded that they provide information to auditors about the federal program, and 10 responded that they ensure that single audits are completed and the reports are received in a timely manner. Additionally, at some agencies more than one office responded that they are responsible for the application of the provisions of OMB Circular A-133. 
The FAC distributes single audit reports to each federal awarding agency that has provided direct funding and for which the report identifies an audit finding related to an award managed by that agency. Based on our survey, receipt of single audit reports from the FAC and distribution of the reports to the applicable agency office is predominately the responsibility of the OIG. Our results show that 18 OIGs responded that they receive the single audit reports directly from the FAC and that they distribute them to applicable agency offices. Audits provide important information on recipient performance and are a critical control that agencies can use to help ensure that entities that receive federal funds use those funds in accordance with program rules and regulations. Agency OIGs play a key role in this area by performing quality control reviews (QCR) to ensure that the audit work performed complies with auditing standards. Our survey results show that 10 of the CFO Act agency OIGs performed 109 QCRs during fiscal year 2001, although this total may be overstated since OIGs occasionally perform joint QCRs and our survey did not capture information on the number of times this occurred. Although the number of QCRs performed is small compared to the approximately 30,000 single audits performed annually, several OIGs conducting QCRs have identified problems with the audit work performed. For example, 7 OIGs noted problems with the internal control and/or compliance testing performed by the auditors, and 3 OIGs reported problems relating to auditor compliance with generally accepted government auditing standards. Audit follow-up is an integral part of good management and is a shared responsibility of agency management officials and auditors. Corrective action taken by the recipient on audit findings and recommendations is essential to improving the effectiveness and efficiency of government operations. 
In addition, federal agencies need to ensure that recipients take timely and effective corrective action. OMB Circular A-133 notes that audit follow-up is the responsibility of the federal awarding agency. The Circular requires agencies to issue a management decision on audit findings within 6 months after receipt of the recipient's audit report and to ensure that the recipient takes appropriate and timely corrective action. Analysis of our survey results indicates that both the IG and program offices have a role in the audit follow-up process. For example, 15 IG and 9 program offices responded that they are responsible for reviewing reports to verify that the report contains agency program-specific information. When single audit reports do not have enough information, both IG and program offices indicated that they follow up with either the recipient or the auditor. Thirteen IG and 14 program offices stated that they follow up with the recipient, and 13 IG and 10 program offices stated they follow up with the auditor. Program offices, on the other hand, are responsible for evaluating the corrective action plans filed by recipients to determine whether they address the audit findings. Sixteen program offices responded that they are responsible for evaluating the corrective action plans to determine whether the issues are valid and what corrective action is necessary. Furthermore, the program offices at 10 agencies stated that they rely on subsequent audits to determine whether corrective actions have been taken. At 22 of the agencies, officials in at least one of the CFO, IG, and/or program offices responded that they use single audits as a tool to monitor compliance with administrative and program requirements addressed in the OMB Circular A-133 Compliance Supplement and to monitor the adequacy of internal controls. Six agencies reported that the CFO, IG, and program offices all perform this function. 
Six agencies reported that some combination of CFO, IG, and program offices perform this function. Ten agencies reported that one office performs the function, and that office varies across the 10 agencies. The next most frequent uses reported were for identifying leads for additional audits (18 agencies) and as a preaward check for determining how recipients managed previous awards (14 agencies). Further, they reported that single audit reports are used in preaward checks to identify findings that may affect the program area of operations, questioned or unallowable costs incurred by the recipient, and findings that may affect future awards. Additionally, the survey results indicated that between 6 and 12 agencies use single audit results to identify leads for program office site visits (12 agencies), as support for closeout of the award (12 agencies), to hold agency program offices accountable for administrative and program compliance (12 agencies), to support the agency's financial statements (10 agencies), and as a source of program information for the agency's performance plan or annual accountability report (6 agencies). As can be seen, agencies report using single audits for a number of purposes. However, between 1 and 8 agencies indicated that, for several reasons, they did not use the reports for some or all of these purposes. When asked why they did not use single audit reports, several agencies noted that their programs were too small to be covered in the scope of an audit performed under the Single Audit Act. For example, the Single Audit Act requires auditors to use combined expenditure and risk-based criteria to determine which programs to include in the scope of a single audit. Since the expenditure portion of the criteria identifies awards with large-dollar expenditures, agencies whose programs do not meet these criteria are less likely to have their programs audited during a single audit. 
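The expenditure portion of the criteria can be illustrated with the dollar threshold OMB Circular A-133 used during this period to designate larger "Type A" programs, which receive more audit attention. This is a hedged sketch of one tier of that formula, from the circular as it stood at the time (for auditees expending $100 million or less in total federal awards, a program was Type A if its expenditures were at least the larger of $300,000 or 3 percent of total federal awards expended); the function name is ours:

```python
def type_a_threshold(total_federal_expenditures):
    """One tier of the A-133 'Type A' program threshold (auditees
    expending $100 million or less in total federal awards): the
    larger of $300,000 or 3 percent of total expenditures."""
    return max(300_000, 0.03 * total_federal_expenditures)

# For an auditee expending $20 million, programs below $600,000 in
# expenditures fall under the threshold and are less likely to be
# selected for audit coverage.
print(type_a_threshold(20_000_000))  # 600000.0
```

This is why agencies with many small programs told us their programs are rarely covered in the scope of a single audit.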
Additionally, agencies said the single audit reports did not provide relevant information for specific purposes, such as support for the agency financial statements or holding federal program offices accountable for administrative and program compliance. Other reasons provided for not using single audit reports include limited staff resources and competing priorities. Our survey results indicate that 11 agencies routinely use the FAC database and that usage is distributed among the CFO, IG, and program offices. For example, the 11 agencies indicated that they use the database to identify recipients that have incurred questioned costs, have made improper payments, or both. In addition, 8 agencies noted that they use the database to determine whether large-dollar or complex programs have significant findings such as adverse opinions on recipient compliance with program laws and regulations. Survey respondents also indicated that they use the FAC database to perform other tasks, such as tracking the status of audit-finding resolution, determining whether the recipient has filed its single audit report, identifying audit leads, identifying trends between recipients, and verifying the accuracy of the Schedule of Expenditures of Federal Awards. Those agencies that do not use the database reported that they rely on the FAC to send them the single audit reports and that they review the hard copy reports to obtain information on the agency's programs instead of the database. In discussions with personnel at 4 agencies, we learned that they were unfamiliar with the FAC database and how it could be used. These officials did express interest in using the database and inquired about the availability of training. 
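The queries agencies described--for example, isolating recipients with questioned costs or with adverse compliance opinions--amount to simple filters over summary audit records. The sketch below is hypothetical: the field names are illustrative and are not the FAC's actual SF-SAC data dictionary.

```python
# Hypothetical flat records summarizing single audit results; the
# field names are illustrative, not the FAC's actual schema.
records = [
    {"auditee": "City A", "questioned_costs": 125_000, "adverse_opinion": False},
    {"auditee": "County B", "questioned_costs": 0, "adverse_opinion": False},
    {"auditee": "State C", "questioned_costs": 40_000, "adverse_opinion": True},
]

# Recipients that have incurred questioned costs -- one of the uses
# agencies reported for the database.
with_questioned_costs = [r["auditee"] for r in records if r["questioned_costs"] > 0]

# Recipients with adverse opinions on compliance -- another reported use.
with_adverse_opinions = [r["auditee"] for r in records if r["adverse_opinion"]]

print(with_questioned_costs)   # ['City A', 'State C']
print(with_adverse_opinions)   # ['State C']
```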
We are sending copies of this report to the ranking minority member, Subcommittee on Government Efficiency, Financial Management and Intergovernmental Relations, House Committee on Government Reform; the chairman and ranking minority member, Senate Committee on Appropriations; the chairman and ranking minority member, House Committee on Appropriations; the chairman and ranking minority member, Senate Committee on Governmental Affairs; the chairman and ranking minority member, House Committee on Government Reform; the chairman and ranking minority member, Senate Budget Committee; and the chairman and ranking minority member, House Budget Committee. We are also sending copies of this report to the director of the Office of Management and Budget and the agency CFOs and IGs. Copies of this report will be made available to others upon request. This report will also be available on GAO's home page (http://www.gao.gov). Please call me at (213) 830-1065 or Tom Broderick, Assistant Director, at (202) 512-8705 if you or your staff have any questions about the information in this report. Key contributors to this report were Cary Chappell, Mary Ellen Chervenic, Valerie Freeman, Stuart Kaufman, and Gloria Hernandez-Saunders.

According to Office of Management and Budget (OMB) figures, federal awards for fiscal year 2001 totaled $325 billion of the $1.8 trillion budget. This assistance includes grants, loans, loan guarantees, property, cooperative agreements, interest subsidies, insurance, food commodities, and direct appropriations and federal cost reimbursement contracts.

Fiscal Year 2001 Grants by Agency to State and Local Governments

According to OMB figures, the Department of Health and Human Services is responsible for managing 54 percent of the $325 billion in federal awards provided during fiscal year 2001. 
The Departments of Transportation, Housing and Urban Development, Education, and Agriculture are responsible for managing an additional 32 percent of federal awards. [Figure: Fiscal Year 2001 Grants by Agency to State and Local Governments, in billions]

Top Ten Programs for Fiscal Year 2001

According to OMB figures, the Department of Health and Human Services managed 5 of the top 10 federal awards programs in fiscal year 2001. These programs are Medicaid, Temporary Assistance for Needy Families, Head Start, Foster Care, and Child Support Enforcement. [Figure: Top Ten Programs for Fiscal Year 2001, in billions]

Briefing Section II--Single Audit Processes and Awarding Agency Responsibilities

Organizations Performing Selected A-133 Responsibilities

According to our survey results, agency program offices are primarily responsible for ensuring the application of the provisions set forth in OMB Circular A-133, Audits of States, Local Governments, and Non-Profit Organizations. For example, 20 agency program offices responded that they ensure that recipients are given the information necessary to describe the federal award and advise recipients of other applicable award information. Nineteen responded that they advise recipients of the requirement to obtain a single audit when they expend $300,000 or more in federal awards in a year, 19 responded that they follow up on issues that are identified in the reports that require corrective action, 17 responded that they provide information to auditors about the federal program, and 10 responded that they ensure that single audits are completed and that the reports are received in a timely manner. Additionally, at some agencies more than one office responded that they are responsible for the application of the provisions of OMB Circular A-133. For example, the chief financial officer (CFO) and inspector general (IG) offices are involved in providing information to auditors performing single audits and in addressing issues that require corrective action. 
While the majority of agencies hold program offices responsible for such tasks, 3 agencies established a separate function within the CFO's office to ensure proper oversight of federal awards. While these agencies award relatively small amounts of federal money, they felt it was important to maintain proper oversight. Agencies responded that the primary way they promote compliance with OMB Circular A-133 is by mandating it in regulations, agency policy directives, or guidance on grants administration, and by including it in the grant award document. The selected A-133 responsibilities covered by our survey were to

provide recipients the information necessary to describe the federal award;

advise recipients of other applicable award requirements and provide information as requested;

advise recipients of the requirement to obtain a single audit when they expend $300,000 or more in federal awards in a year;

address issues that are identified in single audit reports that require corrective action;

provide information to auditors on agency programs as requested; and

ensure single audits are completed and reports are received in a timely manner.

NOTE: Rows do not add across to total agencies because we received responses from multiple offices within an agency.

OMB Circular A-133 requires the Federal Audit Clearinghouse (FAC) to distribute single audit reports to the federal agencies. The FAC distributes reports to each federal agency that provides federal awards and for which the report identifies an audit finding related to an award managed by that agency. Based upon our survey, receipt of single audit reports from the FAC and distribution of the reports within the agency are predominately Office of Inspector General (OIG) responsibilities. Our results show that 18 OIGs receive the single audit reports directly from the FAC and distribute them to applicable agency offices. 
Under OMB Circular A-133, federal award recipients are assigned either a cognizant agency for audit or an oversight agency for audit, depending on the amount of federal awards they expend. The agency that provides the predominant amount of direct funding to a recipient is responsible for carrying out the functions of the cognizant or oversight agency, unless OMB makes a specific cognizant agency for audit assignment. The cognizant agency for audit is required to conduct quality control reviews (QCR) of selected audits made by nonfederal auditors. The selected OIG responsibilities covered by our survey were to

receive single audit reports from the FAC;

distribute single audit reports to the applicable agency office; and

obtain or conduct QCRs of selected audits made by nonfederal auditors, and provide the results, when appropriate, to other interested organizations.

Analysis of our survey results indicates that both the IG and program offices are responsible for the audit follow-up process. For example, 15 IG and 9 program offices responded that they are responsible for reviewing reports to verify that the report contains agency program-specific information. When single audit reports do not have enough information, both IG and program offices follow up with either the recipients or the auditor. Thirteen IG and 14 program offices stated they follow up with the recipient, and 13 IG and 10 program offices stated that they follow up with the auditor. Program offices, on the other hand, are responsible for evaluating the corrective action plans filed by recipients to determine whether they address the audit findings. As shown on the accompanying slide, 16 program offices responded that they are responsible for evaluating the corrective action plans to determine their validity. 
Furthermore, the program offices at 10 agencies stated that they rely on subsequent audits to determine if corrective actions have been taken. To facilitate follow-up procedures, automated or manual audit tracking systems are necessary. The results of our interviews show that most agencies use a tracking system to track single audit findings.

Briefing Section III--How Agencies Use Single Audits

Agency Uses of Single Audits

Review of the surveys indicated that one or more offices at 22 agencies use single audits as a tool to monitor compliance with administrative and program requirements and to monitor the adequacy of recipients' compliance with internal controls. Five agencies reported that the CFO, IG, and program offices all perform these functions. Six agencies reported that some combination of CFO, IG, and program offices perform them, and 11 agencies reported that one office performs this function. Our results also indicate that many agency personnel read all single audit reports they receive to identify noncompliance with program requirements or inadequacy of internal controls. Single-audit-report leads for follow-on work can come from a review of the entity's financial statements or the auditor's findings. Further, while single audit report findings are supposed to be corrected by the entities, some findings may indicate problems that need further investigation to be fully understood and effectively resolved. Thus, information from single audit reports may indicate the possible need for follow-on audits or additional review and analysis by program officials or both. 
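An audit-finding tracking system of the kind mentioned above can be as simple as a table of findings with a computed management decision due date--the 6-month deadline OMB Circular A-133 sets after receipt of the recipient's audit report. This is a minimal sketch under stated assumptions: the `Finding` record and the month arithmetic are illustrative, not any agency's actual system.

```python
import datetime
from dataclasses import dataclass

def add_months(d, months):
    """Date `months` months after `d`, clamped to the last valid day
    of the target month (e.g., Aug 31 + 6 months -> Feb 28)."""
    month_index = d.month - 1 + months
    year, month = d.year + month_index // 12, month_index % 12 + 1
    leap = year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
    days_in_month = [31, 29 if leap else 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]
    return datetime.date(year, month, min(d.day, days_in_month[month - 1]))

@dataclass
class Finding:
    finding_id: str                 # auditor's finding reference number
    recipient: str
    report_received: datetime.date
    resolved: bool = False

    @property
    def decision_due(self):
        # OMB Circular A-133: management decision within 6 months
        # of receipt of the recipient's audit report.
        return add_months(self.report_received, 6)

findings = [
    Finding("2001-01", "City A", datetime.date(2001, 8, 31)),
    Finding("2001-02", "County B", datetime.date(2001, 3, 15), resolved=True),
]
open_ids = [f.finding_id for f in findings if not f.resolved]
print(open_ids)                  # ['2001-01']
print(findings[0].decision_due)  # 2002-02-28
```

Real agency systems track far more, such as corrective action milestones and the results of subsequent audits.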
Eighteen agencies responded that they use single audits as a source of leads for additional audits. Fourteen agencies said they use single audits as a preaward check to determine how the recipient managed previous awards. These agencies responded that single audit reports are used in preaward checks to identify findings that may affect the program area of operations, questioned or unallowable costs incurred by the recipient, and findings that may affect future awards. Program officials at 12 agencies responded that single audits are used as a source of leads to select recipients for program site visits. Twelve agencies said they used single audit reports as support for award closeout. Survey results indicate that 12 of the 24 CFO agencies use single audit results to hold agency program offices accountable for administrative and program compliance. Ten agencies responded that they use single audit reports to support the agency's financial statements. Six agencies responded that they used the results of single audits as a source of program information for the agency's performance plan or annual accountability report.

Why Agencies Do Not Use Single Audit Reports

As indicated in the preceding slides, agencies use single audits for a number of purposes. However, between 1 and 8 agencies indicated that, for several reasons, they did not use the reports for these purposes. When asked why they did not use single audit reports for a particular purpose, between 4 and 8 agencies noted that their programs were too small to be covered by the Single Audit Act. For example, the Single Audit Act requires auditors to use combined expenditure and risk-based criteria to determine which programs to include in the scope of a single audit. 
Since the expenditure portion of the criteria identifies awards with large-dollar expenditures, agencies whose programs do not meet these criteria are less likely to have their programs audited during a single audit. Additionally, between 2 and 8 agencies said that the single audit reports did not provide relevant information for specific uses. Other reasons provided for not using single audit reports included limited staff resources (2 to 5 agencies) and competing priorities (1 to 3 agencies).

Briefing Section IV--Use of Federal Audit Clearinghouse Database

Uses of Federal Audit Clearinghouse Database

Our survey results indicate that 11 agencies routinely use the FAC database and that usage is distributed among the CFO, IG, and program offices. For example, 11 agencies indicated that they use the database to identify recipients that have incurred questioned costs, have made improper payments, or both. In addition, 8 agencies noted that they use the database to determine whether large-dollar or complex programs have significant findings such as adverse opinions on recipient compliance with program laws and regulations. Survey respondents also indicated that they use the FAC database to perform other tasks, such as tracking the status of audit-finding resolution, determining whether the recipient has filed its single audit report, identifying audit leads, identifying trends between recipients, and verifying the accuracy of the Schedule of Expenditures of Federal Awards. Those agencies that do not use the database rely on the FAC to send them the single audit reports and review the reports to obtain information on the agency's programs instead of using the database to obtain such information. In discussions with agency personnel at four agencies, we learned that they were unfamiliar with the FAC and how it could be used. These officials did express interest in using the database and inquired about the availability of training. 
Agencies reported that they use the database

to determine whether multiple agency programs have similar audit issues, called "finding categories";

to identify recipients that have incurred questioned costs, made improper payments, or both;

to determine how many recipients have recurring findings;

to determine whether large-dollar or complex programs have significant findings such as adverse opinions on recipient compliance with program laws and regulations; and

to study the findings of subrecipients. (A subrecipient is a nonfederal entity that expends federal awards received from a pass-through entity to carry out federal programs.)

Presented below are the 14 types of compliance requirements that the auditor shall consider in every audit conducted under OMB Circular A-133.

Activities allowed or unallowed: Activities allowed or unallowed are unique to each federal program and are found in the laws and regulations and the provisions of the contract or grant agreements pertaining to the program.

Allowable costs/cost principles: OMB Circulars A-87, Cost Principles for State, Local and Indian Tribal Governments; A-21, Cost Principles for Educational Institutions; and A-122, Cost Principles for Non-Profit Organizations, prescribe the cost accounting policies associated with the administration of federal awards managed by states, local governments, Indian tribal governments, educational institutions, and nonprofit organizations.

Cash management: Requires that recipients follow procedures to minimize the time elapsing between the transfer of funds from the U.S. Treasury and payment by the recipient.

Davis-Bacon Act: Requires that all laborers and mechanics employed to work on construction projects over $2,000 financed by federal assistance funds be paid prevailing wage rates.

Eligibility: The specific requirements for eligibility are unique to each federal program and are found in the laws and regulations and the provisions of the contract or grant agreements pertaining to the program. 
Equipment and real property management: Requires real property acquired by nonfederal entities with federal award funds to be used for the originally authorized purpose; it may not be disposed of without prior consent of the awarding agency.

Matching, level of effort, earmarking: The specific requirements for matching, level of effort, and earmarking are unique to each federal program and are found in the laws and regulations and the provisions of the contract or grant agreements pertaining to the program.

Period of availability of federal funds: Where applicable, federal awards may specify a time period during which the nonfederal entity may use the federal funds. A nonfederal entity may charge to the award only costs resulting from obligations incurred during the funding period and any preaward costs authorized by the awarding agency.

Procurement and suspension and debarment: Nonfederal entities are prohibited from contracting with or making subawards to parties that are suspended or debarred from contracting with the federal government.

Program income: Requires that program income be deducted from program outlays unless otherwise specified in agency regulations or the terms and conditions of the award.

Real property acquisition and relocation assistance: Requires that the provisions specified in the Uniform Relocation Assistance and Real Property Acquisition Policies Act of 1970, as amended, are adhered to when persons are displaced from their homes, businesses, or farms by federally assisted programs.

Reporting: Requires that each recipient report program outlays and program income on a cash or accrual basis, as prescribed by the awarding agency.

Subrecipient monitoring: Requires that pass-through entities monitor subrecipients. Monitoring activities may include reviewing reports submitted by subrecipients, performing site visits, reviewing the subrecipients' single audit results, and evaluating audit findings and the corrective action plan.

Special tests and provisions: Special tests and provisions are unique to each federal program and are found in the laws and regulations and the provisions of the contract or grant agreements pertaining to the program. 
The federal government awards $300 billion to state and local governments and nonprofit groups each year. The Single Audit Act promotes sound financial management, including effective internal controls, over these federal awards. Before the act, the government relied on audits of individual grants to determine if the money was spent properly. The act replaced these grant audits with a single audit--one audit of an entity as a whole.
GAO surveyed the 24 federal agencies subject to the Chief Financial Officers (CFO) Act and found that they have developed processes and assigned responsibilities to meet the requirements of the Single Audit Act. Agencies reported that they are using single audits to monitor compliance with administrative and program requirements and to determine the adequacy of recipients' internal controls. One or more offices at 22 of the 24 agencies used single audits to monitor compliance with the administrative and program requirements in the OMB Circular A-133 Compliance Supplement and to assess the adequacy of recipients' internal controls. Eleven agencies reported that they routinely use the Federal Audit Clearinghouse database to identify recipients that incurred questioned costs or programs that have significant findings, to identify recipients with recurring findings, or to study subrecipient findings. Individuals at four agencies were unaware of the database or how to use it. Agencies that do not use the database rely on the Federal Audit Clearinghouse to send them the single audit report, which they review for information on their programs.
In recent years, federal agencies have been making greater use of interagency contracting--a process by which agencies can use another agency's contracting services or existing contracts already awarded by other agencies to procure many goods and services. An agency can enter into an interagency agreement with a servicing agency and transfer funds to the servicing agency to conduct the acquisition on its behalf, or an agency can order directly from a servicing agency's contract, such as the General Services Administration (GSA) schedules or governmentwide acquisition contracts (GWAC). When funds are transferred to another agency, the contracting service can be provided through entrepreneurial, fee-for-service organizations, which are government-run but operate like businesses. Interagency contracts are designed to leverage the government's aggregate buying power and simplify procurement of commonly used goods and services. In this way, the contracts offer the benefits of improved efficiency and timeliness in the procurement process. Determining the value of a particular contracting method includes considering benefits such as timeliness and efficiency as well as cost--including price and fees. Although interagency contracts can provide the advantages of timeliness and efficiency, use of these types of vehicles can also pose risks if they are not properly managed. GAO designated management of interagency contracting a governmentwide high-risk area in 2005. A number of factors make these types of contracts high risk, including their rapid growth in popularity along with their administration and use by some agencies that have limited expertise with this contracting method, and their contribution to a much more complex procurement environment in which accountability has not always been clearly established. In an interagency contracting arrangement, both the agency that holds, and the agency that makes purchases against, the contract share responsibility for properly managing the use of the contract.
However, these shared responsibilities often have not been well-defined. As a result, our work and that of some inspectors general has found cases in which interagency contracting has not been well-managed to ensure that the government was getting good value. For example, in our review of the Department of Defense's (DOD) use of two franchise funds, we found that the organizations providing these services did not always obtain the full benefits of competitive procedures, did not otherwise ensure fair and reasonable prices, and may have missed opportunities to achieve savings on millions of dollars in purchases. In another review, we found task orders placed by DOD on a GSA schedule contract did not satisfy legal requirements for competition because the work was not within the scope of the underlying contract. Recent inspector general reviews have found similar cases. For example, the Inspector General for the Department of the Interior found that task orders for interrogators and other intelligence services in Iraq were improperly awarded under a GSA schedule contract for information technology services. The Federal Acquisition Regulation (FAR) is the primary regulation governing how most agencies acquire supplies and services with appropriated funds. The regulation provides general guidance for interagency agreements that fall under the authority of the Economy Act and for the GSA schedules and GWACs. The FAR precludes agency acquisition regulations that unnecessarily repeat, paraphrase, or otherwise restate the FAR, limits agency acquisition regulations to those necessary to implement FAR policies and procedures within an agency, and provides for coordination, simplicity, and uniformity in the federal acquisition process. There are several types of interagency contracting. For more information on those included in our review, see appendix II. 
DHS spends significant and increasing amounts through interagency contracting--a total of $6.5 billion in fiscal year 2005, including $5 billion through interagency agreements and about $1.5 billion by placing orders off other agencies' contracts (see fig. 1). DHS' total spending on interagency contracting increased by about 73 percent in just 1 year. DHS was established as of March 1, 2003, by merging the functions of 23 agencies and organizations that specialize in one or more aspects of homeland security. OCPO is responsible for creating departmentwide policies and processes to achieve integration and to manage and oversee the acquisition function but does not have enforcement authority to ensure that initiatives are carried out. There are seven acquisition offices within DHS that pre-date the formation of DHS and continue to operate at the components. OPO was formed with the new department to serve the newly established entities and those components that did not have a separate procurement operation. Of those that pre-date DHS, the Coast Guard and CBP provide different examples of the types of components that formed DHS. The Coast Guard, previously under the Department of Transportation, already had an extensive procurement operation, whereas CBP was created by combining the United States Customs Service, formerly part of the Department of the Treasury; the Border Patrol and the inspectional parts of the Immigration and Naturalization Service; and portions of the Department of Agriculture's Animal and Plant Health Inspection Service. Thus, CBP has been faced with the added challenge of creating a procurement organization to meet its new mission. Our prior work has found that an effective acquisition organization has in place knowledgeable personnel who work together to meet cost, quality, and timeliness goals while adhering to guidelines and standards for federal acquisition.
While DHS has developed guidance on the use of interagency agreements--the largest category of interagency contracting at DHS, which amounted to $5 billion in fiscal year 2005--it does not have specific guidance for other types of interagency contracting, including GSA schedules and GWACs, which accounted for almost $1.5 billion in fiscal year 2005. Moreover, we found that some DHS users may have lacked expertise in the proper use of interagency contracts. Although some DHS acquisition officials believe the FAR provides adequate guidance on the use of interagency contracts, such as the GSA schedules, our prior work and inspector general reviews have found numerous cases in which these contracting methods have not been properly used. For example, users have requested work that was not within the scope of the contract and administrators have not ensured fair and reasonable prices. Recognizing this concern, other large agencies, such as DOD and the Department of Energy, have identified the need to carefully manage the use of these contracts and have issued supplemental guidance and emphasized training programs to mitigate these risks. DHS departmentwide acquisition guidance covers interagency agreements but not other types of interagency contracting. In December 2003, DHS issued the Homeland Security Acquisition Regulation and the Homeland Security Acquisition Manual to provide departmentwide acquisition guidance. In addition, DHS issued a departmentwide directive on how to use interagency agreements by which funds are transferred to other agencies to award and administer contracts or to provide contracting services on behalf of DHS. However, as we reported in March 2005, the directive was not being followed for purchases made through these agreements. For example, there was little indication that required analyses of alternatives were performed or that required oversight was in place. 
Although DHS began revising the directive in fiscal year 2004, the revisions have yet to be issued. According to OCPO officials, its limited policy and oversight resources provide assistance to the components as needed, taking time away from acquisition policy efforts, such as developing guidance. For example, OCPO officials provided contracting assistance to the Federal Emergency Management Agency in the response to Hurricanes Katrina and Rita. To supplement departmentwide DHS guidance on interagency agreements, each of the components we reviewed has issued some implementing guidance. OPO issued guidance addressing the appropriate use of interagency agreements that requires program officials and contracting officers to research other available contract vehicles. In contrast, CBP guidance addresses the goals of an analysis of alternatives, but emphasizes the process and the documentation necessary to execute the interagency agreement. The Coast Guard's supplemental guidance focuses mainly on the ordering and billing procedures for interagency agreements. However, none of the components we reviewed had implementing guidance for other types of interagency contracts. While DHS acquisition officials acknowledge the need to manage the risks of interagency agreements, some do not see other types of interagency contracting, such as the GSA schedules and GWACs, as needing the same type of attention and believe sufficient guidance is available in the FAR. In fiscal year 2005, the three components we reviewed spent a total of $832 million through GSA schedules, GWACs, and other interagency contracts (see table 1). This is a 53 percent increase over the prior year. 
We have previously reported that use of interagency contracts demands a higher degree of business acumen and flexibility on the part of users and administrators than in the past, and acquisition officials need sufficient training and expertise to ensure the proper use of these types of contracts in an increasingly complex procurement environment. During our review, we identified several examples that showed that DHS may not have obtained a good value for millions of dollars in spending and indicated a need for improved training and expertise (see table 2). Several contracting officials stated that additional training is needed in the use of interagency contracts but that there was not much training available. In addition, other contracting officials told us that they were not aware of the range of available alternatives for interagency contracting. To ensure the proper use of all types of interagency contracts, other large procuring agencies, including DOD and the Department of Energy, have issued guidance to supplement the FAR and have emphasized specialized training. DOD is the largest user of other agencies' contracts and the Department of Energy reported that it spent about $1.7 billion on other agencies' contracts in fiscal year 2005--a substantial amount, but less than DHS. For example, DOD issued special guidance to ensure that proper procedures and management practices are used when using other agencies' contracts including GSA schedules. The guidance requires DOD acquisition personnel to evaluate, using specific criteria, whether using a non-DOD contract for a particular purchase is in the best interest of the department. The criteria include the contract's ability to satisfy the requirements in a timely manner and provide good value. DOD's guidance also emphasizes using market research to help identify the best acquisition approach to meet the requirement and states that the contracting officer should document this research. 
The Department of Energy also has issued guidance addressing the proper use of GSA schedules and GWACs. This guidance emphasizes that these contracts are not to be used to circumvent agency regulations and that the contracting officer should ensure that the original order and all future orders are within the scope of the contract. In the case of the GSA schedules, the contracting officer should seek and document advice from GSA's contracting officer on the proper use of the schedules whenever an issue is in doubt. In 2004, GSA took a step toward improving the management of GSA contracts and services by implementing the "Get It Right" program in part to secure the best value for federal agencies, improve education and training of the federal acquisition workforce on the proper use of GSA contracts and services, and ensure compliance with federal acquisition policies, regulations, and procedures. As part of the program, DOD and GSA have partnered to offer updated training on the proper use of GSA schedules. In addition, the Department of Energy has instituted training to emphasize the proper use and the need for planning when using the GSA schedules and GWACs. Interagency contracts are intended to offer a simplified procurement process whereby users commonly rely on planning that has already been conducted by the agency that established the contract to ensure that the prices are competitive. However, our recent work, as well as the work of others, has found that not all interagency contracts provide good value when considering both timeliness and cost. This suggests the need for evaluating the selection of an interagency contract. According to DHS contracting officials the benefits of speed and convenience--not total value including cost--have often driven decisions to choose interagency contracting vehicles. As of July 2005, DHS has required an analysis of alternatives for all purchases. 
Of the 17 cases in our review, this analysis was required only for the four interagency agreements. None of these interagency agreements indicated that the required analysis was conducted. Without an evaluation of interagency contracting alternatives, DHS users cannot be sure they are obtaining a good value. A sense of urgency has prevailed in DHS' acquisition decision-making process, according to officials from the Office of Inspector General. For example, one official said that expediting program schedules and contract awards limits time available for adequate procurement planning, which can lead to higher costs, schedule delays, and systems that do not meet mission objectives. Eight of the 16 contracting officers we interviewed at OPO, CBP, and Coast Guard told us that using interagency contracts was a quick and convenient way to acquire needed products and services. A few DHS contracting officers felt that interagency contracts--in particular, GSA schedules--were the only viable alternatives given time constraints. In some cases, officials told us that it could take 4 to 6 months to establish and obtain goods and services through an in-house contract. In other cases, officials stated that purchase requests were received too close to the end of the fiscal year to use anything other than an interagency contract. None of the contracting officials said they chose to use interagency contracts because they also provided good value to DHS in terms of total cost. Interagency contracts are designed to be convenient to use and require less planning than entering into a full and open competition for a new contract, and users commonly rely on planning that has already been conducted by the agency that established the contract. However, we found that GSA schedule prices may not always be the most competitive, and agencies do not always obtain the required competition when using the schedules; thus, there is no assurance that these contracts are providing good value.
In another review, we found that fees charged by the agency that provides the contracting service may not make these contracts cost-effective in some cases. Purchasing agencies also sometimes pay a fee on top of a fee for the use of another agency's contract because servicing agencies may be using other agencies' contracts--including GSA schedules--to make purchases. Fees charged for the use of GWACs also range between 0.65 and 5 percent. Given these concerns, evaluating the selection of an interagency contract is a sound management practice used by other large agencies. Pursuant to DHS acquisition policy, purchases made through interagency agreements require an analysis of alternatives to determine that the approach is in the government's best interest; however, in the four cases we reviewed that fell under this requirement, there was no indication that this analysis was performed. In one case, CBP used FedSim, one of GSA's contracting service providers, to place an order for $9 million for information technology support for systems security. In another case, CBP transferred $5 million to a franchise fund for the purchase of license plate readers. In the two remaining cases, OPO used FedSim to place orders totaling about $45 million against one contract to provide information technology support for the Homeland Secure Data Network. In these examples, there was little evidence that DHS users determined whether this was the best method for acquiring the needed services. These findings are consistent with our March 2005 review, in which we did not find an analysis of alternatives in 94 percent of the cases where it was required. Recent internal reviews at OPO and CBP cited similar findings in which evidence that a determination and findings or an analysis of alternatives was conducted was missing.
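The cost effect of layered servicing fees can be illustrated with a minimal sketch. The fee rates and purchase amount below are hypothetical, chosen only to fall within the ranges cited in this report; they are not actual DHS figures:

```python
def total_cost(base_price, fee_rates):
    """Apply each servicing agency's fee to the running total,
    showing how layered ("fee on top of fee") charges compound."""
    cost = base_price
    for rate in fee_rates:
        cost *= 1 + rate
    return cost

# Hypothetical $1,000,000 order routed through a fee-for-service
# provider charging 4% that itself buys off a GSA schedule carrying
# a 0.75% fee, versus ordering from the schedule directly.
layered = total_cost(1_000_000, [0.04, 0.0075])
direct = total_cost(1_000_000, [0.0075])
print(f"layered fees: ${layered:,.2f}")
print(f"single fee:   ${direct:,.2f}")
print(f"extra cost:   ${layered - direct:,.2f}")
```

Even at these modest rates, the second layer of fees adds roughly $40,000 to a $1 million purchase, which is why an analysis of alternatives that weighs total cost, including fees, matters.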
In our review of 17 cases, we also found several examples where contracting officers placed orders to fulfill what were perceived to be critical needs, for convenience without comparing alternatives, or to spend funds at the end of the fiscal year without obtaining competing proposals. While an analysis of alternatives was not required in most of these cases, performing such an analysis could have helped DHS users to address some of the known concerns about these types of contracts to ensure that they obtained good value for the department (see table 3). As of July 2005 DHS has required an analysis of alternatives for all acquisitions, including all types of interagency contracts. DHS policy now states that all acquisition plans must include an analysis of alternatives including a discussion of why the acquisition process was chosen and the processes considered. The guidance states that the plan must contain information about the type of contract selected. However, the guidance does not include factors to consider or specific criteria for making a good choice among alternative contracting options. We have found that some agencies have established factors to consider in making this decision. For example, DOD and the Department of Energy have established factors that incorporate considerations of value, policy and regulatory requirements, customer needs, and administrative responsibilities. Following are some of the factors these agencies use: Value: cost (including applicable fees or service charges); whether using an interagency contract is in the best interest of the department. Policy and regulatory requirements: departmental funding restrictions; departmental policies on small business, performance- based contracting, and competition. Customer needs: schedule; scope of work; unique terms, conditions and requirements. Contract administration: including oversight, monitoring, and reporting requirements. 
Although DHS' spending through interagency contracting totals billions of dollars annually and increased by 73 percent in the past year, the department does not systematically monitor its use of these contracts to assess whether this method for acquiring goods and services is being properly managed and provides good outcomes for the department. While OCPO has established a framework for an acquisition oversight program, the program is not designed to assess the outcomes of different contracting methods, including interagency contracting. According to officials, DHS' acquisition oversight program has been hindered by limited resources and authority. DHS does not systematically monitor spending on its interagency contracts, which totaled $6.5 billion in fiscal year 2005--37 percent of DHS' procurement spending for that year. This type of monitoring could provide DHS with useful information to assess its use of this contracting method. For example, as part of its strategic sourcing initiative, DHS officials said they reviewed the components' use of information technology and telecommunications contracts and determined that the department could achieve savings of $22.5 million to $45 million in fees and reduced prices by establishing its own departmentwide contracts. However, DHS does not have available information to make comparable assessments for interagency contracts. For example, DHS officials were not able to readily provide data on the amounts spent through different types of interagency contracts. To respond to our request for information, OCPO prepared a special report on the use of GSA schedules and GWACs. For information on interagency agreements, OCPO had to request data from components. Ultimately, however, we had to compile a summary and clarify information obtained from components.
DHS also does not collect data regarding interagency contracting, such as the amount of service fees paid to other agencies for the use of contracting services or vehicles; the components, which pay the fees, also do not collect these data. In prior work in this area, we have found that these fees can range from less than 1 percent to 8 percent. In March 2005, we found that OPO, the largest user of interagency contracts among the components, alone paid $12.9 million in service fees in fiscal year 2004. Given that the volume of DHS' interagency contracting has increased by $2.7 billion, or about 73 percent, since fiscal year 2004, it is likely that the fees paid also have increased substantially. This lack of data is not unique to DHS. Although the need to collect and track data on interagency contracting transactions has become increasingly important governmentwide, there is no governmentwide system to collect this data. In fact, the Office of Management and Budget has an effort underway to collect basic information on interagency contracting from all federal agencies. While each of the components we visited has established its own internal reviews to evaluate contracting practices, including the use of interagency contracts, these reviews are compliance-based and are not designed to evaluate the outcomes of interagency contracting. For example, OPO, which has taken a comprehensive approach, established procedures for reviewing and approving procurement actions. The review includes an assessment of the documentation for compliance with acquisition regulations or policies; soundness of the acquisition strategy; use of business judgment; and completeness, consistency, and clarity.
OPO also had a study completed to determine whether its contracts, task orders, interagency agreements, and other transactions were awarded and administered in compliance with procurement laws, regulations, and internal DHS and OPO operating policies and procedures. While the review found that much improvement was needed to comply with policies and procedures, it was not designed to address areas such as timeliness, total cost including price and fees paid, and customer service to determine whether a particular contract method resulted in the best outcome. In December 2005, OCPO issued a policy that provides a framework for a departmentwide acquisition oversight program. However, the framework does not evaluate the outcomes of different contracting methods, including interagency contracting, to determine whether the department obtained good value. Additionally, the Chief Procurement Officer lacks the authority needed to ensure the department's components comply with its procurement policies and procedures that would help to establish an integrated acquisition function. The framework includes four key reviews (see table 4). According to DHS officials, the acquisition planning review was operational as of August 2006, and an on-site review was ongoing at the Federal Emergency Management Agency. DHS plans to implement the full program in fiscal year 2007. According to OCPO officials, while DHS expects to track interagency contracting through this framework, it will not gather data to determine whether these contracts were used effectively. For example, through the operational status reviews, DHS plans to track the number and dollar value of orders placed using interagency agreements and GSA schedules and GWACs. However, these reviews will not collect data on cost including the price of goods and services and fees paid, timeliness, or customer service, that would help them to evaluate whether specific interagency contracts were a good value. 
In addition, the Chief Procurement Officer, who is held accountable for departmentwide management and oversight of the acquisition function, lacks the authority and has limited resources to ensure compliance with acquisition policies and processes. As of August 2006, according to OCPO officials, only five staff were assigned to departmentwide oversight responsibilities for $17.5 billion in acquisitions. According to OCPO officials, their small staff faces the competing demands of providing acquisition support for urgent needs at the component level. As a result, they have focused their efforts on procurement execution rather than oversight. Officials also noted that limited resources have delayed the oversight program's implementation. DHS' acquisition function was structured to rely on cooperation and collaboration among DHS components to accomplish the department's goals. While this structure was intended to make efficient use of resources departmentwide, it has limited the Chief Procurement Officer's ability to effectively oversee the department's acquisitions and manage risks, and has ultimately wasted time and other resources. In our prior work, we have found that in a highly functioning acquisition organization, the chief procurement officer is in a position to oversee compliance with acquisition policies and processes by implementing strong oversight mechanisms. In March 2005, we recommended that OCPO be provided sufficient enforcement authority and resources to provide effective oversight of DHS' acquisition policies and procedures. In a 2005 review of the department's organization, the Secretary focused on mission initiatives and, as of August 2006, has not changed the structure of the operational functions to provide additional authority to the Chief Procurement Officer.
One of the largest procuring agencies in the federal government, DHS relies on contracts for products and services worth billions of dollars to meet its complex homeland security mission. Effective acquisition management must include sound policies and practices for managing the risks of large and rapidly increasing use of other agencies' contracts. While the use of these types of contracts provides speed and convenience in the procurement process, the agencies that manage the contracts and DHS users have not always adhered to sound contracting practices. Guidance and training that could help DHS to address risks is not in place; planning was not always conducted; and adequate monitoring and oversight were not performed. While DHS has developed a framework for an oversight program, until such oversight is in place, DHS cannot be sure that taxpayers' dollars are being spent wisely and purchases are made in the best interest of the department. While the challenges to effective management of an acquisition function in any organization with a far-reaching mission are substantial, these challenges are further complicated at DHS by an organizational structure in which the Chief Procurement Officer lacks direct authority over the components. Without such authority, the department cannot be sure that necessary steps to implement improvements to its acquisition function will be taken.
To improve the department's ability to manage the risks of interagency contracting, we recommend that the Secretary of Homeland Security consider the adequacy of the Office of the Chief Procurement Officer's resources and implement the following three actions: develop consistent, comprehensive guidance, and related training to reinforce the proper use of all types of interagency contracts to be followed by all components; establish, as part of the department's planning requirement for an analysis of alternatives, criteria to consider in making the decision to use an interagency contract; and implement oversight procedures to evaluate the outcomes of using interagency contracts. Because the Secretary has not taken action to ensure departmentwide acquisition oversight, Congress should require the Secretary to report on efforts to provide the Chief Procurement Officer with sufficient authority over procurement activities at all components. We provided a draft of this report to DHS for review and comment. In written comments, DHS concurred with all of our recommendations and provided information on what action would be taken to address them. The department's comments are reprinted in appendix III. Regarding the recommendation for guidance and training to reinforce the proper use of all interagency contracts, DHS stated that it will issue a revised management directive in the near future. This directive will require the reporting of data on interagency agreements. DHS also will issue additional direction to the components on reporting the use of other types of interagency contracts. With regard to training, the OCPO will introduce specific training with respect to all types of interagency contracting for all contracting personnel during fiscal year 2007. With regard to establishing criteria to consider in making the decision to use an interagency contract, DHS will revise the acquisition planning guide to address this recommendation. 
With regard to implementing oversight procedures to evaluate the outcomes of using interagency contracts, DHS plans to incorporate oversight procedures assessing the proper use of interagency contracts and agreements into its acquisition oversight program. Concerning the overall use of interagency contracts, the department's comments stated that it is the goal of the OCPO to reduce the number and value of contracts awarded through the use of interagency contracts or agreements. This will be accomplished in part through the use of new departmentwide contracts for information technology equipment and services. We believe this is a positive step toward improving DHS' contract management. In responding to the Matter for Congressional Consideration that the Secretary report on efforts to provide the Chief Procurement Officer with sufficient authority over procurement activities, DHS noted some steps that the Secretary has taken to improve acquisition oversight. revised the investment review process, placing the Chief Procurement Officer in a key position to review and provide oversight of the Department's most critical programs; supported an increase of 25 OCPO positions to improve acquisition and directed the Chief Procurement Officer to work with all component heads to report on departmentwide progress in key acquisition areas. While these actions should help, they do not provide the Chief Procurement Officer with sufficient authority to ensure effective oversight of DHS' acquisition policies and procedures, and we continue to believe that the Congress should require the Secretary to report on efforts to address this lack of authority. We are sending copies of this report to the Secretary of the Department of Homeland Security, and to other interested agencies and congressional committees. We will also make copies available to others upon request. In addition, this report will be available at no charge on the GAO Web site at http://www.gao.gov. 
If you have any questions about this report or need additional information, please contact me at (202) 512-4841 ([email protected]). Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. Other staff making key contributions to this report were Amelia Shachoy, Assistant Director; Greg Campbell; Christopher Langford; Eric Mader; Bill McPhail; Russ Reiter; Karen Sloan; and Karen Thornton. To determine the level of interagency contracting at the Department of Homeland Security (DHS), we requested data from each component on fiscal year 2005 purchases made through all types of interagency contracts. We compiled a summary of purchases made through interagency agreements, the General Service Administration's schedules and governmentwide acquisition contracts (GWAC) from the individual reports we received from each component. We found that the Office of Procurement Operations (OPO), Customs and Border Protection (CBP), and Coast Guard were the largest users of interagency contracts in fiscal year 2005. Based on a review of this data, we selected 17 cases, totaling $245 million. Interagency contracting actions for these components represented a sample of GSA schedule, GWAC, and interagency transactions made through fee-for-service contracting providers. See table 5. The 17 cases were selected to represent procurement actions of $5 million or more at three DHS components. Because our findings included similar problems across these activities, we believe they represent common problems in DHS' procurement process. To assess the reliability of this data, we compared the data obtained from DHS to the data maintained in the Federal Procurement Data System-Next Generation (FPDS-NG). Based upon the comparison, we determined that the data were sufficiently reliable for our purposes. 
To assess the extent to which DHS manages the risks of interagency contracting, we reviewed guidance and oversight at the departmental level and at the three components in our sample--OPO, CBP and Coast Guard, and we interviewed officials in the Office of the Chief Procurement Officer (OCPO) and senior officials of the components under review. To determine how other large agencies address the management risks of interagency contracting, we reviewed relevant guidance and training at the Departments of Defense and Energy. We also reviewed relevant GAO and Inspector General reports. To assess DHS planning for the use of interagency contracts, we conducted fieldwork at CBP's National Acquisition Center in Indianapolis, Indiana; National Data Center in Springfield, Virginia; and at the Coast Guard's procurement office in Norfolk, Virginia, and reviewed contract files and completed a data collection instrument for each of the 17 cases we selected. We also interviewed the contracting officer, program manager and Contracting Officer's Technical Representative to discuss each case. In conducting our review, we identified the reasons for using interagency contracts and the reasons for choosing a particular interagency contract. We performed our review between February and August 2006 in accordance with generally accepted government auditing standards. | The Department of Homeland Security (DHS) has some of the most extensive acquisition needs within the federal government. In fiscal year 2005, DHS spent $17.5 billion on contracted purchases, $6.5 billion, or 37 percent, of which was through the use of other agencies' contracts and contracting services, a process known as interagency contracting. While these types of contracts offer the benefits of efficiency and convenience, in January 2005, GAO noted shortcomings and designated the management of interagency contracting as a governmentwide high-risk area. 
Given the department's critical national security mission and the results of our earlier work, GAO reviewed the extent to which DHS manages the risks of interagency contracting and assessed DHS' guidance, planning, and oversight of interagency contracting. DHS has developed guidance on how to manage the risks of some but not all types of interagency contracts. The department has guidance for interagency agreements--the largest category of interagency contracting at the department--but does not have specific guidance for using other types of contracts such as the General Services Administration (GSA) schedules and governmentwide acquisition contracts (GWAC), which amounted to almost $1.5 billion in fiscal year 2005. Moreover, in some cases we found users may have lacked expertise that could be addressed through guidance and training on the use of these types of contracts. DHS did not always consider alternatives to ensure good value when selecting among interagency contracts. While this contracting method is often chosen because it requires less planning than establishing a new contract, evaluating the selection of an interagency contract is important because not all interagency contracts provide good value when considering timeliness and cost. As of July 2005 DHS has required planning and analysis of alternatives for all acquisitions. In this review, we found that in all four cases for which an analysis of alternatives was required, it was not conducted. DHS officials said benefits of speed and convenience--not total value including cost--have often driven decisions to choose these types of contracts. DHS does not systematically monitor its total spending on interagency contacts and does not assess the outcomes of its use of this contracting method. According to officials, DHS' acquisition oversight program has been hindered by limited resources and authority. 
As of August 2006, the Office of the Chief Procurement Officer had five staff assigned to departmentwide oversight responsibilities for $17.5 billion in acquisitions. In March 2005, GAO recommended that the Chief Procurement Officer be provided sufficient authority to provide effective oversight of DHS' acquisition policies and procedures. Without this authority, DHS cannot be certain that acquisition improvements are made. | 6,694 | 566 |
OSHA is responsible for enforcing the provisions of the Occupational Safety and Health Act of 1970 for about half the states; the remaining 26 states have been granted authority to set and enforce their own safety and health standards under a state plan approved by OSHA. At present, 22 of these 26 states enforce occupational safety and health provisions under a state plan covering all worksites, and have their own VPP programs. The other 4 states have plans covering only public sector employer worksites; VPP sites in these 4 states are part of OSHA's federally managed VPP. To help ensure compliance with federal safety and health regulations and standards, OSHA conducts enforcement activities and provides compliance assistance to employers. Enforcement represents the preponderance of agency activity and includes safety and health inspections of employer worksites. Among its compliance assistance efforts, OSHA established the VPP in 1982 to recognize worksites with safety and health systems that exceed OSHA's standards. A key requirement for participation in the VPP is that worksites have low injury and illness rates compared with the average rates for their respective industries. The VPP is divided into three programs (see table 1): the Star, Merit, and Star Demonstration programs. The Star program has the most stringent requirements because it is for worksites with exemplary safety and health systems that successfully protect employees from fatality, injury, and illness. OSHA's Directorate of Cooperative and State Programs--the national office--oversees the VPP activities of each of its 10 regional and 80 area offices. Each regional office has a regional administrator, who coordinates all of the region's activities, including the VPP, and a VPP manager, who implements and manages the program. The VPP manager conducts outreach to potential VPP sites and encourages participants to continually improve their safety and health systems. 
In addition, the VPP manager coordinates the region's activities related to the program, such as reviews of applications submitted by potential sites and on-site reviews of VPP sites. Employer worksites apply to OSHA to participate in the VPP. They must meet a number of requirements, including having an active safety and health management system that takes a systems approach to preventing and controlling workplace hazards. As shown in figure 1, OSHA has defined four basic elements of a comprehensive safety and health management system. These requirements must be in place for at least 1 year. In addition, there must be no ongoing enforcement actions, such as inspections, at the worksites or willful violations cited by OSHA within the 3-year period prior to the site's initial application to participate in the VPP. VPP sites are also required to have injury and illness rates below the average rates for their industries published by the Bureau of Labor Statistics. These rates must be below the average industry rates for 1 of the most recent 3 years. VPP sites are required to report their injury and illness rates to OSHA's regional offices annually. The VPP managers review this information and send summary reports to the national office. For each calendar year, the national office compiles a summary report of injury and illness rates for VPP sites participating in the program. OSHA determines whether worksites are qualified to participate in the VPP through its approval process, which includes an on-site review of each worksite. According to OSHA guidance, the regional offices are required to conduct an on-site review of each potential VPP site to ensure that the four elements are in place and to determine how well the site's safety and health management system is working. As part of these reviews, the regions are required to verify the sites' injury and illness rates, interview employees and management, and walk through the facilities.
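The rate criterion described above, that a site's injury and illness rate must fall below the published industry average for at least 1 of the most recent 3 years, can be expressed as a simple check. The sketch below is purely illustrative; the function name and sample rates are assumptions for explanation, not OSHA data or software.

```python
# Illustrative sketch of the VPP rate criterion described in the report:
# a site qualifies on rates if its injury and illness rate was below the
# Bureau of Labor Statistics industry average for at least 1 of the most
# recent 3 years. Names and sample values are hypothetical.

def meets_rate_criterion(site_rates, industry_averages):
    """Both arguments list rates for the most recent 3 years, oldest first."""
    if len(site_rates) != 3 or len(industry_averages) != 3:
        raise ValueError("expected rates for exactly 3 years")
    # The criterion requires only one qualifying year out of three.
    return any(site < avg for site, avg in zip(site_rates, industry_averages))

# A site below the industry average in only its most recent year still qualifies.
print(meets_rate_criterion([5.1, 4.8, 2.0], [2.4, 2.4, 2.4]))  # True
print(meets_rate_criterion([5.1, 4.8, 3.0], [2.4, 2.4, 2.4]))  # False
```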
This initial on-site review usually lasts about 4 days and involves approximately three to five OSHA staff, according to OSHA's VPP policies. OSHA also uses volunteers from other VPP sites--Special Government Employees who have been trained by OSHA--to conduct some portions of these reviews. OSHA's national office is responsible for the initial approval of all new VPP sites. VPP sites in the Star program must also be reapproved every 3 to 5 years after an on-site review is conducted by the region. OSHA's approval process is outlined in table 2. Once they have been approved, VPP sites must commit to continuously improving the safety and health of their worksites, maintaining low injury and illness rates, and reporting annually to OSHA on the status of their safety and health systems. The VPP sites' annual reports describe their efforts to continuously improve and detail the sites' injury and illness rates. OSHA's regional offices review these reports to ensure that the VPP sites' injury and illness rates have not increased beyond the program's requirements. According to OSHA's VPP Policies and Procedures Manual, OSHA must request that a site withdraw from the VPP if it determines that the site no longer meets the requirements for VPP participation. OSHA may also terminate a site for failure to maintain the requirements of the program. The national office is responsible for collecting the injury and illness data reported annually by VPP sites to the regions. If a VPP site's 3-year average rates rise above the average rates for its industry published by the Bureau of Labor Statistics, the regions must place the site on a rate-reduction plan if an on-site review is not conducted that year or must place the site in a 1-year conditional status if an on-site review is conducted. The regions must also notify the national office of actions they take in response to incidents, such as fatalities and serious injuries, at VPP sites.
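The follow-up rule for rising rates, a rate-reduction plan when no on-site review is conducted that year versus 1-year conditional status when one is, amounts to a small decision procedure. The sketch below is an illustrative assumption, not OSHA software; the function name, labels, and sample rates are hypothetical.

```python
# Illustrative sketch (hypothetical names) of the follow-up rule described
# in the report: when a site's 3-year average rate rises above the BLS
# industry average, the region must place it on a rate-reduction plan if no
# on-site review is conducted that year, or in 1-year conditional status if
# an on-site review is conducted.

def follow_up_action(three_year_rates, industry_average, onsite_review_this_year):
    three_year_avg = sum(three_year_rates) / len(three_year_rates)
    if three_year_avg <= industry_average:
        return "no action required"
    if onsite_review_this_year:
        return "1-year conditional status"
    return "rate-reduction plan"

# Example with an illustrative industry average of 2.4 and no on-site review.
print(follow_up_action([3.0, 2.8, 3.2], 2.4, onsite_review_this_year=False))
# rate-reduction plan
```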
The regions are required to review sites' safety and health systems after such incidents to determine (1) whether systemic changes are needed to prevent similar incidents from occurring in the future and (2) whether the site should remain in the program. The regions may also conduct on-site reviews of VPP sites if they determine that the incidents were related to deficiencies in the sites' safety and health management systems. The decision to recommend whether a site at which a fatality has occurred should remain in the program is left to the discretion of the regional administrator. The VPP has grown steadily since its inception, with the number of employer worksites in the program more than doubling--from 1,039 sites in 2003 to 2,174 sites in 2008. During this period, the number of sites in the federally managed VPP, representing over two-thirds of all VPP sites, increased at a similar rate as the number of sites in the state managed programs. In 2003, there were 734 sites in the federal VPP and 305 in the state managed VPP. By the end of 2008, both the federal and the state programs had more than doubled to 1,543 and 631, respectively. (See fig. 2.) Although the industries represented in the VPP did not change significantly from 2003 to 2008, there were substantial increases in certain industries. The largest industry in the VPP was the chemical industry, which grew 43 percent, from 208 sites in 2003 to almost 300 in 2008. The motor freight transportation industry, which had only 20 sites in 2003, grew tenfold to just over 200 sites in 2008, due in part to the growth in the number of Postal Service sites. In addition, the number of sites in the electric, gas, and sanitary services industries increased from about 50 sites to more than 200 during the same period. See figure 3 for a comparison of the largest industries represented in the VPP in 2003 and 2008.
While 4 federal worksites--including the Tobyhanna Army Depot in Tobyhanna, Pennsylvania, and the National Aeronautics and Space Administration Langley Research Center in Hampton, Virginia--have participated in the VPP since the late 1990s, the number of federal worksites increased to almost 10 percent of all VPP sites in 2008. At the end of 2008, almost 200 VPP sites were federal agencies or Postal Service sites. The majority of these sites--157--were post offices, processing and distribution centers, and other postal facilities, while most of the remaining sites were Department of Defense facilities, such as naval shipyards, Army depots, and Air Force facilities. In addition, from 2005 to 2008, 7 OSHA area offices in 1 region were approved as new VPP sites as a result of OSHA's efforts to have all of its offices participate in the program so that they could be role models for the federal agencies. The average size of VPP sites--based on the number of employees--has decreased over the last 5 years. From 2003 to 2008, the average number of employees at VPP sites decreased from 501 to 408. In addition, the median size of a VPP site decreased from 210 to 145 employees. As shown in figure 4, the proportion of VPP sites with fewer than 100 workers increased from 28 percent in 2003 to 39 percent in 2008. Across all VPP sites, the number of employees covered by the VPP has grown to over 885,000 workers. A key factor influencing growth of the VPP has been OSHA's emphasis on expansion of the program. For example, in 2003, the Assistant Secretary of Labor for OSHA announced plans to expand eligibility for the VPP to reach a larger number of worksites. These plans included adding more federal sites, such as Department of Defense facilities and certain types of construction sites. OSHA's national office has given each of its 10 regions targets for the number of new sites to be approved each year.
While the regions did not always meet these targets from fiscal years 2003 to 2008, they generally increased the number of new sites each year, as shown in table 3. Several OSHA regional administrators told us that expanding the program beyond the current level of approved sites will be difficult, given their current resources. Another factor influencing the growth of the VPP is outreach efforts, including participants' outreach to other employers and employers seeking out the program after hearing about it from OSHA or other employers. According to OSHA officials and VPP participants, outreach efforts focus on the positive benefits of the program, including improved productivity of workers at VPP sites and decreased costs, such as reductions in sites' workers' compensation insurance premiums due to lower injury and illness rates. Some employers, such as the Postal Service, also cite avoidance of the costs of workplace injuries--which the National Safety Council estimated as approximately $39,000 per year, per incident in 2007--as one of the benefits of participation. In addition, the national association of VPP participants, the Voluntary Protection Programs Participants' Association, contributes to program growth through its mentoring program in which current participants help new sites meet the qualifications of the VPP. We interviewed employees from VPP sites, and their perspectives varied. Employees who supported the program told us that the benefits include having a more collaborative partnership between OSHA, management, and workers; establishing a "mindset of safety"; and addressing several safety problems at one worksite that workers had tried for several years to have corrected. Those who did not fully support the program included employees at VPP sites who told us that they recognized some of the benefits of the VPP, but that they had reservations about the program. 
For example, some employees were concerned that, after the application process and initial on-site review had been completed, sites may not maintain the high standards that qualified them for participation. Furthermore, some employees said that the injury and illness rates requirements of the VPP are used as a tool by management to pressure workers not to report injuries and illnesses. OSHA's internal controls are not sufficient to ensure that only qualified worksites participate in the VPP. First, OSHA's oversight is limited by the minimal documentation requirements of the program. Second, OSHA does not ensure that its regional offices consistently comply with its policies for the VPP. OSHA's lack of a policy requiring documentation in the VPP files of actions taken by the regions in response to incidents, such as fatalities and serious injuries, at VPP sites limits the national office's ability to ensure that regions have taken the required actions. OSHA's VPP Manual requires regions to review sites' safety and health systems after such incidents to determine whether systemic changes are needed to prevent similar incidents from occurring in the future and whether the site should remain in the program. However, the manual does not require the regions to document their decisions or actions taken in the VPP files, which would allow OSHA's national office to ensure that the regions took the appropriate actions. When fatalities, accidents, or other incidents involving serious safety and health hazards occur at any VPP site, OSHA's policy requires that enforcement staff conduct an inspection of the site. In these cases, the area director is required to notify the VPP manager and send a report of the inspection. The VPP manager is then required to report information on the incidents that occurred to the Assistant Secretary for Occupational Safety and Health, the Director of Cooperative and State Programs, and the regional administrator. 
The decision on whether to conduct an on-site review after such an incident is left to the discretion of the regional administrator based on the results of the enforcement inspection. These reports, however, are not required to be included in the VPP files maintained by the regions. OSHA has a draft policy that sets time frames for retention of documents in the VPP files, but the policy does not contain guidance regarding the types of actions that must be documented in the files. Some regional VPP officials told us that they have requested such guidance from OSHA's national office, but the national office has not issued a directive on what information should be documented in the files or on how long it should be retained. The OSHA official responsible for overseeing the program did not agree with regional VPP officials, and stated that the VPP Manual addresses the documentation requirements. However, the manual does not require actions taken by the regions in response to fatalities and serious injuries to be documented in the VPP files. From our review of OSHA's VPP files, we found that there was no documentation of actions taken by the regions' VPP staff to (1) assess the safety and health systems of the 30 VPP sites where 32 fatalities occurred from January 2003 to August 2008 or (2) determine whether these VPP sites should remain in the program. We obtained information on VPP sites at which fatalities occurred during this period from OSHA's national office. To determine what actions were taken in response to the fatalities, we interviewed regional VPP staff and reviewed the regions' inspection and VPP files for the sites with fatalities. Although the actions taken by the regional VPP staff were not documented in the VPP files, we reviewed the inspection files and interviewed the VPP staff to determine the actions they took in response to the fatalities. 
The VPP managers told us that they placed 5 of the 30 sites on 1-year conditional status, and that 5 sites voluntarily withdrew from the VPP. OSHA allowed 17 of the sites to remain in the VPP--some in the Star program and some in the Merit program--until their next regularly scheduled on-site reviews. One of these sites had 3 separate fatalities over the 5-year period. Another site received 10 violations related to a fatality, including 7 serious violations and 1 violation related to discrepancies in the site's injury and illness logs. OSHA allowed this site to continue to participate in the VPP as a Star site. Three sites had not been reviewed by the regional VPP staff because OSHA's enforcement staff had not completed their investigations of the sites. As a result, sites that did not meet the definition of the VPP's Star program to "successfully protect employees from fatality, injury, and illness" have remained in the program. OSHA's oversight of the VPP is limited because it does not have internal controls, such as management reviews by the national office, to ensure that its regions consistently comply with VPP policies for verifying sites' injury and illness rates and conducting on-site reviews. Although relatively low injury and illness rates are a key criterion for program participation, the regions do not always verify sites' rates according to OSHA's policies. For example, the VPP Manual requires that, prior to conducting an on-site review, the region must obtain written approval from the national office allowing access to medical information related to injuries and illnesses at the site. However, our review of the VPP files and information from OSHA's national office showed that, for almost 80 percent of the cases, regions did not obtain such written approval prior to conducting their on-site reviews.
As a result, the regions did not have access to workers' medical records needed to verify sites' injury and illness rates, and the national office had no assurance that the regions verified these rates as required. In addition, OSHA's national office did not review the actions taken by the regions to ensure that they followed up when VPP sites' injury and illness rates rose above the levels allowed by the program. From our review of OSHA's 2007 summary report of injury and illness rates for VPP sites, we found that, for 12 percent of the sites, at least one of their 3-year average injury and illness rates was higher than the average injury and illness rates for their industries. For example, one VPP site reported a 3-year average injury and illness rate of 10.0, which was 7.6 points higher than the industry average of 2.4. Similarly, another site's 3-year average injury and illness rate was 7.5 points higher than the industry average. We found that this site's injury and illness rate had also been above the industry averages for each of the previous 4 years, yet it remained in the VPP Star program. OSHA's national office does not require regions to report information on actions taken to ensure that sites lower their injury and illness rates when these rates rise above the industry averages. The national office, therefore, cannot ensure that the regions take action as required. As a result, some sites that have not met a key requirement of the VPP have remained in the program. Finally, some regions conducted less comprehensive reviews of VPP sites than those required by the VPP Manual. In an effort to leverage its limited resources, OSHA permitted two regions to conduct abbreviated on-site reviews as part of a pilot program in which the regions were allowed to evaluate only one or two elements of sites' safety and health management systems, rather than all four elements.
From our review of the VPP files, we estimated that, from 2000 to 2006, OSHA conducted abbreviated on-site reviews of almost 10 percent of its sites. As a result, some sites for which OSHA reviewed only two of the four elements may not have met all of the minimum requirements to participate in the program. According to the OSHA official responsible for managing the VPP, the agency discontinued its use of these abbreviated reviews after learning from the pilot that it is difficult to isolate certain program elements, and that evaluating only one or two elements leaves out key aspects of the program because the four elements are interrelated. OSHA's efforts to assess the performance of the VPP and evaluate its effectiveness are not adequate. First, OSHA has not developed performance goals or measures to assess the performance of the program. Second, OSHA contracted for a study of the VPP to evaluate its effectiveness, but the study was flawed. OSHA has not developed performance goals or measures for the VPP to assess the program's performance. The Government Performance and Results Act of 1993 requires agencies to set goals and report annually on program performance by measuring the degree to which the program achieves those goals. OSHA officials told us that, while they have not established specific goals for the VPP, the best measure of program performance is that VPP participants consistently report average injury and illness rates that are about 50 percent below their industries' average rates. However, these rates may not be the best measure of performance. First, our analysis of OSHA's annual summary reports of injury and illness rates for 2003 through 2007 showed that, for 35 percent of the sites in our sample for which data were available, there were discrepancies between the injury and illness rates reported by the sites and the rates noted in OSHA's regional on-site review reports for the same time periods. 
For example, OSHA's 2007 summary report showed that one VPP site reported an injury and illness rate of zero, but OSHA found during its on-site review that the rate was actually 1.7 for the same period. Second, OSHA has not evaluated the impact of the VPP on sites' injury and illness rates, such as comparing VPP sites' injury and illness rates with those of similar sites that do not participate in the program. OSHA also does not use information reported annually by VPP sites to develop goals or measures that could be used to assess program performance. VPP participants are required to conduct annual self assessments of their sites and to report this information to OSHA. The reports are to contain a review of the site's safety and health management system, including safety and health hazards identified and the steps taken to correct them; a description of any significant management changes that can affect safety and health at the site, such as changes in ownership; and information on benefits related to participation in the VPP, such as cost savings due to lower workers' compensation insurance premiums, decreased turnover and absenteeism, and increased productivity. However, OSHA's national office does not use the information from these reports because most of this information is maintained in the regional offices, and they are not required to send it to the VPP national office. In response to a recommendation in our 2004 report that the agency evaluate the effectiveness of the VPP, OSHA contracted with The Gallup Organization to study the effectiveness of the program--the results of which were reported in September 2005. As part of this study, OSHA identified two objectives that included (1) determining the impact of its outreach and mentoring programs on potential and new VPP sites' safety and health systems and (2) determining changes in the VPP sites' injury and illness rates due to their participation in the program. 
To obtain information for this study, The Gallup Organization sent a questionnaire to all VPP sites participating in the federally managed program. However, the study had significant design flaws. Specifically, the response rates by participants were low (46 percent of sites responded, and only 34 percent completed the questionnaire), and the data reported by participants were not validated. In addition, a review of the sites' mentoring and outreach efforts, which are not indicators of program performance, made up two-thirds of the report, and other factors that could have influenced the sites' injury and illness rates were not considered or measured. Because of these limitations, we concluded that the report's findings were not reliable or valid and could not be used to demonstrate the effectiveness of the VPP. In our discussions with OSHA officials, they acknowledged the limitations of the study, but said they have not conducted any additional evaluations of the VPP and have no plans to conduct future evaluations of the effectiveness of the program. Officials said they do not need to do so because the low injury and illness rates reported by VPP participants are the best measure of the program's effectiveness. However, without a more reliable evaluation of the program, OSHA does not know whether the program is effectively meeting its objective of recognizing worksites with exemplary safety and health management systems that exceed OSHA's standards. OSHA continues to expand the VPP, which adds to the responsibilities of staff who manage and maintain the integrity of the program and reduces the resources available to ensure that non-VPP sites comply with safety and health regulations and with OSHA's standards. In the absence of policies that require its regional offices to document information regarding actions taken in response to fatalities and serious injuries at VPP sites, OSHA cannot ensure that only qualified sites participate in the program.
In addition, some sites with serious safety and health deficiencies that contributed to fatalities have remained in the program, which has affected its integrity. Without sufficient oversight and internal controls, OSHA's national office cannot be assured that the regional offices are following VPP policies. Finally, because OSHA lacks performance goals and measures to use in assessing the performance of the VPP, it continues to expand the program without knowing its effect on employer worksites, such as whether participation in the VPP has improved workers' safety and health. To ensure proper controls and measurement of program performance, the Secretary of Labor should direct the Assistant Secretary for Occupational Safety and Health to take the following three actions: develop a documentation policy regarding information on follow-up actions taken by OSHA's regional offices in response to fatalities and serious injuries at VPP sites; establish internal controls that ensure consistent compliance by the regions with OSHA's VPP policies for conducting on-site reviews and monitoring injury and illness rates so that only qualified worksites participate in the program; and establish a system for monitoring the performance of the VPP by developing specific performance goals and measures for the program. We provided a draft of this report to the Secretary of Labor for comment. We received written comments from the Assistant Secretary for Occupational Safety and Health, which are reproduced in their entirety in appendix II. The agency also provided technical comments, which we incorporated in the report as appropriate. OSHA agreed with our recommendations to develop better documentation requirements and strengthen internal controls to ensure consistent compliance with VPP policies across its regions. 
Regarding our recommendation to develop performance goals and measures for the VPP to use in monitoring performance, OSHA stated that it would continue to identify and refine appropriate performance measures for the program. As agreed with your offices, unless you publicly announce the contents of this report earlier, we plan no further distribution until 30 days from the report date. At that time, we will send copies of this report to relevant congressional committees, the Secretary of Labor, and other interested parties. The report will also be available at no charge on GAO's Web site at http://www.gao.gov. If you or your staffs have any questions about this report, please contact me at (202) 512-7215 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. Key contributors to this report are listed in appendix III. To identify the number and characteristics of employer worksites in the Voluntary Protection Programs (VPP), we analyzed data in the Department of Labor's Occupational Safety and Health Administration (OSHA) VPP database. We reviewed data in OSHA's VPP database for all sites in the VPP--both those in the federally managed program and in the VPP programs managed by the states. We analyzed data on VPP participation activity from the inception of the program in 1982 through the end of calendar year 2008. Prior to our analysis, we assessed the reliability of the information in OSHA's VPP database by interviewing OSHA officials; reviewing related documentation, including the data system user manual; and conducting electronic testing of the data. On the basis of our review of the database, we found that the data were sufficiently reliable to report the number and characteristics of participants in the VPP. 
To determine the factors that contributed to growth in program participation, we obtained information about the VPP from officials at OSHA's national office and the 10 regional offices. To enhance our understanding of the VPP from the perspective of the participants, we interviewed employees, including union and nonunion employees at VPP sites as well as employees from sites that elected not to participate in the VPP. To determine the extent to which OSHA ensures that only qualified worksites participate in the VPP, we reviewed OSHA's internal controls for the program and limited our review to VPP sites in the federally managed program that were part of the Star program. We reviewed sites in the federally managed program because they represent over 70 percent of the sites in the program--1,543 of the 2,174 sites--and because the policies and practices for the state managed programs differ from state to state. We reviewed sites in the Star program because they represented more than 95 percent of sites in the federally managed VPP at the time of our review, and because the Star program has the most stringent requirements. To assess OSHA's internal controls, we compared OSHA's VPP Policies and Procedures Manual with GAO's Standards for Internal Control in the Federal Government. We also reviewed OSHA's policies and procedures for the federal VPP, including (1) procedures for on-site reviews of VPP sites, (2) annual reporting requirements for VPP sites to report data to the regions, and (3) requirements for regional offices to report information to OSHA's national office. To determine the extent to which OSHA complied with its procedures in approving initial and renewing VPP participants, we reviewed OSHA's VPP files for a randomly selected, representative sample of VPP sites in the program as of April 2008. Estimated percentages derived from this sample have confidence intervals of no more than +/- 7 percent. 
The files, maintained by OSHA's regional offices, contained reports of the regions' on-site reviews of VPP sites. We reviewed the reports of the reviews conducted prior to the sites' initial acceptance and, if they had been in the program long enough to be reapproved, the most recent review conducted. We reviewed the VPP files and interviewed officials at OSHA's regional offices in Atlanta, Boston, Dallas, New York, and Philadelphia. We selected these sites to obtain a geographic range of regional offices with small, medium, and large numbers of VPP sites. We interviewed officials in the five remaining regional offices in Chicago, Denver, Kansas City, San Francisco, and Seattle by telephone and had them send the VPP files for their sites to us for review. To determine what actions OSHA took in response to fatalities at VPP sites, we asked OSHA's national office for a list of all sites at which fatalities occurred from January 2003 to October 2008. The national office asked the regions to provide this information, and the national office provided this information to us. We reviewed the inspection and VPP files maintained by the regional offices for these sites and interviewed VPP managers about the actions taken by the regions in response to the fatalities. Finally, we reviewed other information provided by the regional offices to the national office, such as data on the injury and illness rates for each VPP site that are reported by the sites annually to OSHA and tracked by the national office on electronic spreadsheets. To assess the adequacy of OSHA's efforts to assess the performance and effectiveness of the VPP, we reviewed its policies and procedures, performance and accountability reports, operating plans, and The Gallup Organization's 2005 evaluation report of the VPP. We reviewed these documents relative to the guidelines in the Government Performance and Results Act of 1993. 
To verify the injury and illness rates reported by VPP sites to OSHA's regions in the sites' annual reports, we compared the data tracked by the national office on sites' injury and illness rates with the rates reported in OSHA's on-site reviews for the sites in our sample of 184 sites. We assessed the Gallup study on the basis of commonly accepted program evaluation standards. We conducted this performance audit from March 2008 through May 2009 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. Revae Moran, Acting Director, and Anna M. Kelley, Analyst in Charge, managed all aspects of the assignment. Kathleen Boggs, Richard Harada, Yumiko Jolly, and Summer Pachman made significant contributions to the report. In addition, Richard Brown, Doreen Feldman, Justin Fisher, Cindy Gilbert, Sheila R. McCoy, Kathleen van Gelder, Gabriele Tonsil, and Shana Wallace provided key technical and legal assistance. | The Department of Labor's Occupational Safety and Health Administration (OSHA) is responsible for ensuring workplace safety. OSHA has established a number of programs, including the Voluntary Protection Programs (VPP), that take a cooperative approach to obtaining compliance with safety and health regulations and OSHA's standards. OSHA established the VPP in 1982 to recognize worksites with exemplary safety and health programs. 
GAO was asked to review (1) the number and characteristics of employer worksites in the VPP and factors that have influenced growth, (2) the extent to which OSHA ensures that only qualified worksites participate in the VPP, and (3) the adequacy of OSHA's efforts to monitor performance and evaluate the effectiveness of the VPP. GAO analyzed OSHA's VPP data, reviewed a representative sample of VPP case files, and interviewed agency officials. The VPP has grown steadily since its inception in 1982, with the number of employer worksites in the program more than doubling--from 1,039 sites in 2003 to 2,174 sites in 2008. Although industries represented have not changed significantly, with the chemical industry having the largest number of sites in the VPP, the number of sites in the motor freight transportation industry--which includes U.S. Postal Service sites--increased tenfold from 2003 to 2008. The proportion of smaller VPP sites--those with fewer than 100 workers--increased from 28 percent in 2003 to 39 percent in 2008. Key factors influencing growth of the VPP have been OSHA's emphasis on expansion of the program and VPP participants' outreach to other employers. OSHA's internal controls are not sufficient to ensure that only qualified worksites participate in the VPP. The lack of a policy requiring documentation in VPP files regarding follow-up actions taken in response to incidents, such as fatalities and serious injuries, at VPP sites limits the national office's ability to ensure that its regions have taken the required actions. Such actions include reviewing sites' safety and health systems and determining whether sites should remain in the program. GAO reviewed OSHA's VPP files for the 30 sites that had fatalities from January 2003 to August 2008 and found that the files contained no documentation of actions taken by the regions' VPP staff. 
GAO interviewed regional officials and reviewed the inspection files for these sites and found that some sites had safety and health violations related to the fatalities, including one site with seven serious violations. As a result, some sites that no longer met the definition of an exemplary worksite remained in the VPP. In addition, OSHA's oversight is limited because it does not have internal controls, such as reviews by the national office, to ensure that regions consistently comply with VPP policies for monitoring sites' injury and illness rates and conducting on-site reviews. For example, the national office has not ensured that regions follow up as required when VPP sites' injury and illness rates rise above the minimum requirements for the program, including having sites develop plans for reducing their rates. Finally, OSHA has not developed goals or measures to assess the performance of the VPP, and the agency's efforts to evaluate the program's effectiveness have been inadequate. OSHA officials said that low injury and illness rates are effective measures of performance. These rates, however, may not be the best measures because GAO found discrepancies between the rates reported by worksites annually to OSHA and the rates OSHA noted during its on-site reviews. In addition, OSHA has not assessed the impact of the VPP on sites' injury and illness rates. In response to a recommendation in a GAO report issued in 2004, OSHA contracted with a consulting firm to conduct a study of the program's effectiveness. However, flaws in the design of the study and low response rates made it unreliable as a measure of effectiveness. OSHA officials acknowledged the study's limitations but had not conducted or planned other evaluations of the VPP. | 6,872 | 854 |
Early childhood is a key period of development in a child's life and an emphasized age group for which services are likely to have long-term benefits. Recent research has underscored the need to focus on this period to improve children's intellectual development, language development, and school readiness. Early childhood programs serve children from infancy through age 5. The range of services includes education and child development, child care, referral for health care or social services, and speech or hearing assessment as well as many other kinds of services or activities. $4 billion), administered by HHS, and Special Education programs (approximately $1 billion), administered by Education. Head Start provides education and developmental services to young children, and the Special Education-Preschool Grants and Infants and Families program provides preschool education and services to young children with disabilities. Although these programs target different populations, use different eligibility criteria, and provide a different mix of services to children and families, there are many similarities in the services they provide. Figure 1 illustrates the federal agencies responsible for federal early childhood funding. Early childhood programs were included in the list of more than 30 programs our governmentwide performance and accountability report cited to illustrate the problem of fragmentation and program overlap.Virtually all the results that the government strives to achieve require the concerted and coordinated efforts of two or more agencies. However, mission fragmentation and program overlap are widespread, and programs are not always well coordinated. This wastes scarce funds, frustrates taxpayers, and limits overall program effectiveness. The Results Act is intended to improve the management of federal programs by shifting the focus of decision-making and accountability from the number of grants and inspection made to the results of federal programs. 
The act requires executive agencies, in consultation with the Congress and other stakeholders, to prepare strategic plans that include mission statements and goals. Each strategic plan covers a period of at least 5 years forward from the fiscal year in which the plan is submitted. It must include the following six key elements: a comprehensive mission statement covering the major functions and operations of the agency, a description of general goals and objectives for the major functions and operations of the agency, a discussion of how these goals and objectives will be achieved and the resources that will be needed, a description of the relationship between performance goals in the annual performance plan and general goals and objectives in the strategic plan, a discussion of key factors external to the agency that could affect significantly the achievement of the general goals and objectives, and a description of program evaluations used to develop the plan and a schedule for future evaluations. describe the means the agency will use to verify and validate its performance data. The act also requires that each agency report annually on the extent to which it is meeting its annual performance goals and the actions needed to achieve or modify goals that have not been met. The first report, due by March 31, 2000, will describe the agencies' fiscal year 1999 performance. The Results Act provides a valuable tool to address mission fragmentation and program overlap. The act's emphasis on results implies that federal programs contributing to the same or similar outcomes are expected to be closely coordinated, consolidated, or streamlined, as appropriate, to ensure that goals are consistent and that program efforts are mutually reinforcing. As noted in OMB guidance and in our recent reports on the act, agencies should identify multiple programs within or outside the agency that contribute to the same or similar goals and describe their efforts to coordinate. 
Just as importantly, the Results Act's requirement that agencies define their mission and desired outcomes, measure performance, and use performance information provides multiple opportunities for the Congress to intervene in ways that could address mission fragmentation. As missions and desired outcomes are determined, instances of fragmentation and overlap can be identified and appropriate responses can be defined. For example, by emphasizing the intended outcomes of related federal programs, the plans might allow identification of legislative changes needed to clarify congressional intent and expectations or to address changing conditions. As performance measures are developed, the extent to which agency goals are complementary and the need for common performance measures to allow for crossagency evaluations can be considered. For example, common measures of outcomes from job training programs could permit comparisons of programs' results and the tools used to achieve those results. As continued budget pressures prompt decisionmakers to weigh trade-offs inherent in resource allocation and restructuring decisions, the Results Act can provide the framework to integrate and compare the performance of related programs to better inform choices among competing budgetary claims. The outcome of using the Results Act in these ways might be consolidation that would reduce the number of multiple programs, but it might also be a streamlining of program delivery or improved coordination among existing programs. Where multiple programs remain, coordination and streamlining would be especially important. Multiple programs might be appropriate because a certain amount of redundancy in providing services and targeting recipients is understandable and can be beneficial if it occurs by design as part of a management strategy. 
Such a strategy might be chosen, for example, because it fosters competition, provides better service delivery to customer groups, or provides emergency backup. Education and HHS's ACF--the two agencies that are responsible for the majority of early childhood program funds--addressed early childhood programs in their strategic and 1999 performance plans. Although both agencies' plans generally addressed the required elements for strategic and performance plans, Education's plans provided more detailed information about performance measures and coordination strategies. The agencies in their 2000 plans similarly addressed the required elements for performance plans. However, strategies and activities that relate to coordination were not well defined. Although agencies state that some coordination occurs, they have not yet fully described how they will coordinate their efforts. The Education plan provided a more detailed description of coordination strategies and activities for early childhood programs than the ACF plan, including some performance measures that may cut across programs. The ACF plan described in general terms the agency's plans to coordinate with external and internal programs dealing with early childhood goals. Yet the information presented in the plans did not provide the level of detail, definition, and identification of complementary measures that would facilitate comparisons of early childhood programs. research on early brain development reveals that if some learning experiences are not introduced to children at an early age, the children will find learning more difficult later; children who enter school ready to learn are more likely to achieve high standards than children who are inadequately prepared; and high-quality preschool and child care are integral in preparing children adequately for school. 
Early childhood issues were discussed in the plan's goal to "build a solid foundation for learning for all children" and in one objective and two performance indicators (see table 1). The 1999 performance plan, Education's first performance plan, followed from the strategic plan. It clearly identified programs contributing to Education's early childhood objective and set individual performance goals for each of its programs. Paralleling the strategic plan, the performance plan specified the core strategies Education intended to use to achieve its early childhood goal and objective. Among these were interagency coordination, particularly with HHS's Head Start program. According to Education's strategic plan, this coordination was intended to ensure that children's needs are met and that the burden on families and schools working with multiple providers is reduced. The performance plan also said that Education would work with HHS and other organizations to incorporate some common indicators of young children's school readiness into their programs. It would also work with HHS more closely to align indicators of progress and quality between HHS's Head Start program and its Even Start Family Literacy program--which has as part of its goal the integration of early childhood education, adult literacy or adult basic education, and parenting education. other federal agencies enables it to better serve program participants and reduce inefficiencies in service delivery. We said that although this first plan included a great deal of valuable information, it did not provide sufficient details, such as a more complete picture of intended performance across the department, a fuller portrayal of how its strategies and resources would help achieve the plan's performance goals, and better identification of significant data limitations and their implications for assessing the achievement of performance goals. These observations apply to the early childhood programs as well. 
Without this additional detail, policymakers are limited in their ability to make decisions about programs and resource allocation within the department and across agencies. Education's 2000 performance plan continues to demonstrate the department's commitment to the coordination of its early childhood programs. Like the 1999 performance plan, the sections on early childhood programs clearly identified programs contributing to its childhood program objectives. It also contained new material highlighting the importance of the coordination of early childhood programs as a crosscutting issue, particularly with HHS. To facilitate collaboration, the department added a strategy to work with the states to encourage interagency agreements at the state level. It also added using the Federal Interagency Coordinating Council to coordinate strategies for children with disabilities and their families. At the same time, the department still needs to better define its objectives and performance measures for crosscutting issues. Unless the purpose of coordination activities is clearly defined and results in measurable outcomes, it will be difficult to make progress in the coordination of programs across agencies. development, safety, and well-being of children and youth"--and three objectives (see table 2). The ACF plan, however, did not always give a clear picture of intended performance of its programs and often failed to identify the strategies the agency would use to achieve its performance goals. ACF programs that contribute to each early childhood objective were identified, and several of these programs had individual performance goals. However, without a clear picture of intended program goals and performance measures for crosscutting early childhood programs, it will be difficult to compare programs across agencies and assess the federal government's overall efficacy in fostering early childhood development. and external stakeholders in this area. 
However, it did not define how this coordination will be accomplished or the means by which the crosscutting results will be measured. Agency officials are able to describe numerous activities that demonstrate collaboration within the agency and with Education. The absence of that discussion in the plan, however, limits the value the Results Act could have to both improving agency management and assisting the Congress in its oversight role. Progress in coordinating crosscutting programs is still in its infancy, although agencies are recognizing its importance. Agency performance plans provide the building blocks for recognizing crosscutting efforts. Because of the iterative nature of performance-based management, however, more than one cycle of performance plans will probably be required in the difficult process of resolving program fragmentation and overlap. Mr. Chairman, this concludes my prepared statement. We would be happy to answer any questions that you or Members of the Subcommittee may have. Government Management: Addressing High Risks and Improving Performance and Accountability (GAO/T-OCG-99-23, Feb. 10, 1999). Head Start: Challenges Faced in Demonstrating Program Results and Responding to Societal Changes (GAO/T-HEHS-98-183, June 9, 1998). The Results Act: Observations on the Department of Education's Fiscal Year 1999 Annual Performance Plan (GAO/HEHS-98-172R, June 8, 1998). The Results Act: An Evaluator's Guide to Assessing Agency Annual Performance Plans (GAO/GGD-10.1.20, Apr. 1, 1998). Managing for Results: Observations on Agencies' Strategic Plans (GAO/T-GGD-98-66, Feb. 12, 1998). Managing for Results: Agencies' Annual Performance Plans Can Help Address Strategic Planning Challenges (GAO/GGD-98-44, Jan. 30, 1998). Child Care: Federal Funding for Fiscal Year 1997 (GAO/HEHS-98-70R, Jan. 23, 1998). Federal Education Funding: Multiple Programs and Lack of Data Raise Efficiency and Effectiveness Concerns (GAO/T-HEHS-98-46, Nov. 6, 1997). 
At-Risk and Delinquent Youth: Multiple Programs Lack Coordinated Federal Effort (GAO/T-HEHS-98-38, Nov. 5, 1997). Managing for Results: Using the Results Act to Address Mission Fragmentation and Program Overlap (GAO/AIMD-97-146, Aug. 29, 1997). The Results Act: Observations on the Department of Education's June 1997 Draft Strategic Plan (GAO/HEHS-97-176R, July 18, 1997). The Government Performance and Results Act: 1997 Governmentwide Implementation Will Be Uneven (GAO/GGD-97-109, June 2, 1997). Early Childhood Programs: Multiple Programs and Overlapping Target Groups (GAO/HEHS-95-4FS, Oct. 31, 1994). The first copy of each GAO report and testimony is free. Additional copies are $2 each. Orders should be sent to the following address, accompanied by a check or money order made out to the Superintendent of Documents, when necessary. VISA and MasterCard credit cards are accepted, also. Orders for 100 or more copies to be mailed to a single address are discounted 25 percent. U.S. General Accounting Office P.O. Box 37050 Washington, DC 20013 Room 1100 700 4th St. NW (corner of 4th and G Sts. NW) U.S. General Accounting Office Washington, DC Orders may also be placed by calling (202) 512-6000 or by using fax number (202) 512-6061, or TDD (202) 512-2537. Each day, GAO issues a list of newly available reports and testimony. To receive facsimile copies of the daily list or any list from the past 30 days, please call (202) 512-6000 using a touchtone phone. A recorded menu will provide information on how to obtain these lists. 
| Pursuant to a congressional request, GAO discussed how Congress can use the Government Performance and Results Act to facilitate agency performance plans to oversee early childhood programs, focusing on: (1) how the Results Act can assist in management and congressional oversight, especially in areas where there are multiple programs; (2) how the Departments of Education and Health and Human Services (HHS)--which together administer more than half of the federal early childhood program funds--addressed early childhood programs in their strategic and fiscal year 1999 and 2000 performance plans and the extent to which recent plans show progress in coordinating early childhood programs. GAO noted that: (1) Congress can use the Results Act to improve its oversight of crosscutting issues because the act requires agencies to develop strategic and annual performance plans that clearly specify goals, objectives, and measures for their programs; (2) the Office of Management and Budget has issued guidance saying that for crosscutting issues, agencies should describe efforts to coordinate federal programs contributing to the same or similar outcomes so that goals are consistent and program efforts are mutually reinforcing; (3) when GAO looked at the Education and HHS plans, it found that the plans are not living up to their potential as expected from the Results Act; (4) more specifically, while the fiscal year 1999 and 2000 plans to some extent addressed coordination, the departments have not yet described in detail how they will coordinate or consolidate their efforts; and (5) therefore, the potential for addressing fragmentation and duplication has not been realized, and GAO cannot assess whether the agencies are effectively working together on crosscutting issues. | 2,923 | 331 |
Carrier strike groups are typically centered around an aircraft carrier and its air wing, and also include a guided missile cruiser; two guided missile destroyers; a frigate; an attack submarine; and one or more supply ships with ammunition, fuel, and supplies (such as food and spare parts). These groups are formed and disestablished on an as needed basis, and their compositions may differ though they contain similar types of ships. Figure 1 shows a carrier strike group sailing in a group formation as it prepares to participate in an exercise. Prior to the September 11, 2001, terrorist attacks, only those Navy ships and air squadrons at peak readiness were deployed overseas, usually for 6 months at a time. Most of the Navy's remaining units were not available because they were in early stages of their maintenance or training cycles, or because the Navy did not have good visibility of the readiness of these units. This prompted the Chief of Naval Operations in March 2003 to task the Commander, Fleet Forces Command, to develop the Fleet Response Plan concept to enhance the Navy's surge capability. The Chief of Naval Operations approved the concept in May 2003 and further directed the Commander, Fleet Forces Command, to be responsible and accountable for effectively implementing the plan. The Fleet Response Plan emphasizes an increased level of readiness and the ability to quickly deploy naval forces to respond to crises, conflicts, or homeland defense needs. The plan applies broadly to the entire fleet; however, it only sets specific requirements for carrier strike groups. For example, the plan calls for eight carrier strike groups to be ready to deploy within 90 days of notification. Six of them would be available to deploy within 30 days and the other two within 90 days. This is commonly referred to as the 6 + 2 goal. 
Under the Fleet Response Plan, the Navy has developed a surge capability schedule that it uses to manage and identify the level of training a ship has completed and its readiness to deploy. The schedule contains three progressive readiness goals: emergency surge, surge-ready, and routine deployable status. Each readiness goal specifies phases of training that must be completed to achieve the goal. To be placed in emergency surge status, a ship or an air squadron needs to have completed its unit-level phase training. Achieving surge-ready status requires the completion of integrated phase training. Attaining routine deployable status requires achievement of all necessary capabilities, completion of underway sustainment phase training, and certification of the unit for forward deployed operations. The surge capability schedule provides a readiness snapshot for each ship, allowing decision makers to quickly determine which ships are available to meet the needs of the mission. Figure 2 illustrates how the Navy notionally identifies the eight aircraft carriers available for surge deployments. The carriers numbered 1 through 6 are expected to be ready to deploy within 30 days' notice. The carriers labeled "+1" and "+2" are expected to be able to surge within 90 days' notice. The six surge-ready carriers include two carriers on deployment (numbered 3 and 4), one carrier that is part of the forward deployed naval force based in Japan (number 6), and three carriers in the sustainment phase (numbered 1, 2, and 5). These six carriers are expected to have completed postdeployment depot-level maintenance and their unit-level phase training. The two additional surge carriers are expected to have completed depot-level maintenance but not to have completed unit-level phase training. The remaining four carriers are in the maintenance phase or deep maintenance.
Based on the Navy's experiences during the past 2 years, Fleet Forces Command has convened a cross-functional working group to develop a refined version of the Fleet Response Plan. This update, known as Fleet Response Plan-Enhanced, is intended to further define the Fleet Response Plan, modify terminology for progressive readiness states to better reflect their meaning, tie in elements such as a human capital strategy, and expand the focus of the plan beyond carrier strike groups to the entire Navy. It may also extend the Fleet Response Plan's current employment cycle length of 27 months. The Fleet Response Plan-Enhanced is still under development at this time. The Navy's management approach in establishing the Fleet Response Plan as its new readiness construct has not fully incorporated sound management practices needed to effectively guide, monitor, and assess implementation. Studies by several organizations have shown that successful organizations in both the public and private sectors use sound management practices to assist agencies in measuring performance, reporting results, and achieving desired outcomes. These practices provide management with a framework for effectively implementing and managing programs and shift program management focus from measuring program activities and processes to measuring program outcomes. Sound management practices include (1) establishing a coherent mission and integrated strategic goals to guide the transformation, including resource commitments; (2) setting implementation goals and a timeline to build momentum and show progress from day one; and (3) establishing a communication strategy to create shared expectations and report related progress. The Navy's implementation of the Fleet Response Plan has included some aspects of these practices. 
For example, the Navy has established some strategic goals needed to meet the intent of the plan, such as the progressive readiness levels of emergency surge, surge-ready, and routine deployable status. The Navy also has established specific training actions to support these goals, such as that carrier strike groups must complete unit-level training to be certified as emergency surge-ready. However, other actions taken by the Navy do not fully incorporate these practices. For example, the Navy has identified the 6 + 2 surge capability as a readiness goal and performance measure for carrier strike groups, but no such goal was established for the rest of the fleet. The Navy also has some unofficial goals and performance measures regarding manning and maintenance, but these unofficial goals and performance measures have not been formally established. For example, briefings on the Fleet Response Plan state that the Navy desires and needs fully manned ships (i.e., manning at 100 percent of a ship's requirement) for the program to be successful. Moreover, according to Navy officials, the Navy has not established milestones for achieving its results. In addition, 2 years after initiating implementation of the Fleet Response Plan, the Navy still does not have an official written definition of the Fleet Response Plan that clearly establishes a coherent mission and integrated strategic goals to guide the transformation, including resource commitments. This definition would describe the Fleet Response Plan's total scope and contain guidance with formal goals and performance measures. The Navy recently has taken some action to address this area. In February 2005, the Navy directed the Center for Naval Analyses to conduct a study to develop formal definitions and guidance as well as identify goals and performance measures for the plan. 
However, it remains to be seen whether this study will be completed as planned by November 2005; if it will recommend developing and implementing sound management practices, such as goals, measures, milestones, and timelines; and whether any management improvement recommendations made in the study will be implemented by the Fleet Forces Command, the Navy command responsible for implementing the Fleet Response Plan. Without goals, performance measures, timelines, milestones, benchmarks, and guidance to help effectively manage implementation of the Fleet Response Plan and determine if the plan is achieving its goals, the Navy may find it more difficult to implement the Fleet Response Plan across the entire naval force. Moreover, despite the Navy's unofficial goal that the Fleet Response Plan be budget neutral, as articulated in briefings and by senior leaders, the Navy has not yet clearly identified the resources needed to achieve its goals or provided a rationale for how these resources will contribute to achieving the expected level of performance. Navy officials have said that current operations and maintenance funding levels, as well as manning at 100 percent of required positions, have contributed to successful implementation of the Fleet Response Plan. However, officials do not know what level of manning or funding is actually required for program success over the long term to avoid any unintended consequences, such as greater amounts of deferred maintenance. According to Navy officials, it is difficult to attribute costs to the plan because there is no single budget line item that tracks the costs associated with the Fleet Response Plan. Without knowing the funding needed, the Navy may not be able to assess the impact of possible future changes in funding on implementing the plan. 
Furthermore, without a comprehensive plan that links costs with performance measures and outcomes, neither the Navy nor Congress may be able to determine if the Fleet Response Plan is actually achieving its unofficial goal of being budget neutral. Finally, the Navy also has not developed a comprehensive communications strategy that reaches out to employees, customers, and stakeholders and seeks to genuinely engage them in a two-way exchange, which is a critical step in successfully implementing cultural change or transformation. We looked for formal mechanisms that communicated the details of the Fleet Response Plan and spoke with personnel from carrier strike groups, aircraft carriers, air wings and an air squadron, one surface combatant ship, and other command staff. We found that while the Fleet Response Plan was communicated extensively to senior-level officers, and the Navy provided numerous briefings and messages related to the plan, communication and understanding of the plan did not flow through to the lower ranks. While the concept of the Fleet Response Plan is generally understood by some senior-level officials, many of the lower grade personnel on these ships were unaware of the scope, goals, and other aspects of the plan. In the absence of clear communication throughout the fleet via an overall communications strategy that could increase employee awareness of the Fleet Response Plan, its successful implementation could be impeded. Sound management practices, such as those noted above, were not fully used by the Navy because senior leaders wanted to quickly implement the Fleet Response Plan in response to the Chief of Naval Operations' desires. 
However, without an overall management plan containing all of these elements to guide the implementation of such a major change, it may be difficult for the Navy and Congress to determine the extent to which the Fleet Response Plan is achieving the desired results, measure its overall progress, or determine the resources needed to implement the plan. The Navy has not fully tested and evaluated the Fleet Response Plan or developed lessons learned to identify the effectiveness of its implementation and success over time. The methodical testing, exercising, and evaluation of new doctrines and concepts is an established practice throughout the military to gain insight into how systems and capabilities will perform in actual operations. However, instead of methodically conducting realistic tests to evaluate the Fleet Response Plan, the Navy has tried to demonstrate the viability of the plan by relying on loosely linked events that were not part of an overall test and evaluation strategy, which impairs the Navy's ability to validate the plan and evaluate its success over time. In addition, the Navy has not used its lessons learned system to share the results of its Fleet Response Plan tests or as an analytical tool to evaluate the progress of the plan and improve implementation, which limits the Navy's ability to identify and correct weaknesses across the fleet. Methodically testing, exercising, and evaluating new doctrines and concepts is an important and established practice throughout the military. DOD has long recognized the importance of using tabletop exercises, war games, and experimentation to explore military doctrine, operational concepts, and organizational arrangements. Collectively, these tests and experiments can provide important insight into how systems and capabilities will perform in actual operations. U.S. 
Joint Forces Command, which has lead responsibility for DOD experimentation on new concepts of operation and technologies, states that its experimental efforts aim to foster military innovation and improvement by exploring, developing, and transferring new concepts and organizational ideas into operational reality. Particularly large and complex issues may require long-term testing and evaluation that is guided by study plans. Joint Forces Command's Joint Warfighting Center has an electronic handbook that provides guidance for conducting exercises and lays out the steps in an exercise life cycle: design; planning; preparation; execution; and analysis, evaluation, and reports. The Army also has well-established guidance governing service studies, analyses, and evaluations that the Navy feels is representative of best practices for military operations research. This provides an important mechanism through which problems pertaining to critical issues and other important matters are identified and explored to meet service needs. As shown in figure 3, the Army's process involves six major steps that create a methodical process for developing, conducting, documenting, and evaluating a study. Following a formal study process enables data evaluation and development of lessons learned that could be used to build on the existing knowledge base. In a roundtable discussion with the Fleet Forces Command on the rationale behind Summer Pulse 2004, the Navy's major exercise for the Fleet Response Plan, a senior Navy official stated, "From the concept, ... you need to exercise, ... you need to practice, ... you need to demonstrate it to know you got it right and what lessons are there to learn from how we did it." 
Other governmental agencies, like GAO, and the private sector also rely on detailed study plans, or data collection and analysis plans, to guide the development of studies and experiments and the collection and analysis of data, and to provide a feedback loop that links the outcomes of the study or experiment event and subsequent analysis to the original goals and objectives of the study or event. GAO guidance states that data collection and analysis plans "should carry forward the overall logic of the study so that the connection between the data that will be collected and the answers to the study questions will become evident." Recent Navy guidance also recognizes the need for a thorough evaluation of complex initiatives. In April 2005, the Navy issued a Study Planning and Conduct Guide assembled by the Navy Warfare Development Command. This guide stresses the importance of establishing a long- range plan for complex and novel problems and lays out the rationale for detailed study plans for exercises and experiments, as they establish a structure in which issues are explored and data are collected and analyzed in relation to the established goals or objectives for the event. Furthermore, the Navy's guide notes that random, inadequately prepared events and a determination just to study the problem do not lead to successful resolution of problems that may arise in programs and concepts that the Navy is testing and evaluating. The Navy has not methodically conducted realistic tests of the Fleet Response Plan to demonstrate the plan's viability and evaluate its progress and success over time, instead relying on loosely linked events and some routine data to demonstrate the viability of the plan. The events identified by the Navy as successful tests of the Fleet Response Plan are Summer Pulse 2004, the emergency deployment of the U.S.S. 
Abraham Lincoln, and Global War on Terrorism Surge 2005, but of these events only Summer Pulse 2004 was driven by the Fleet Response Plan with the intent of demonstrating that large numbers of ships could be surged. In addition, these events were not part of an overall test and evaluation strategy that yielded specific information from which to assess the value of the plan in increasing readiness and meeting the new 6 + 2 surge capability goal for carrier strike groups. Summer Pulse 2004 encompassed a number of previously scheduled deployments, exercises, and training events that took place between June and August of 2004. The intent of Summer Pulse 2004 was to demonstrate the Fleet Response Plan's new readiness construct and the Navy's ability to deploy multiple carrier strike groups of varying levels of readiness. However, Summer Pulse 2004 was not a methodical and realistic test of the Fleet Response Plan for three reasons. First, Summer Pulse 2004 did not follow best practices regarding study plans and the ability to evaluate the impact and outcomes of the plan. The Navy did not develop a formal study plan identifying study objectives, data collection requirements, and analysis, or produce a comprehensive after-event report describing the study's findings. Navy officials have stated that the elements of a formal study plan were there for the individual deployments, exercises, and training events constituting Summer Pulse 2004, but were not brought together in a single package. While the Navy may have had the study elements present for the individual exercises, they were not directly linked to testing the Fleet Response Plan. Without such a comprehensive study plan and overall evaluation, there is no ability to discern potential impacts on fleet readiness, maintenance, personnel, and other issues that are critical to the Fleet Response Plan's long-term success. 
Second, Summer Pulse 2004 was not a realistic test because all participating units had several months' warning of the event. As a result, five carriers were already scheduled to be at sea and only two had to surge. Because six ships are expected to be ready to deploy with as little as 30 days' notice under the plan and two additional carriers within 90 days, a more realistic test of the Fleet Response Plan would include no-notice or short-notice exercises. Such exercises conducted without advance notification to the participants would provide the highest degree of challenge and realism. Without such exercises, the Navy might not be able to realistically practice and coordinate a full surge deployment. Third, Summer Pulse 2004 was not a sufficient test because the Navy involved only seven carriers instead of the eight carriers called for in the plan. Therefore, it did not fully test the Navy's ability to meet deployment requirements for the expected force. Another event cited by the Navy as evidence of the Fleet Response Plan's success is the deployment of the U.S.S. Abraham Lincoln carrier strike group while it was in surge status in October 2004. Originally scheduled to deploy in the spring of 2005, the Lincoln was deployed early to support operations in the Pacific Command area of operation and provide aid to areas devastated by a tsunami in the Indian Ocean in December 2004. Navy officials said that the Fleet Response Plan enabled the Navy to identify a carrier to send to the Pacific and to quickly tailor its training package based on its progressive readiness status. The Navy touted this rapid response relief work by a strike group deployed during surge status as a Fleet Response Plan success story. We agree that the Lincoln carrier strike group was able to respond quickly. However, the extent to which this event realistically tested the Fleet Response Plan's expectations for surging one carrier strike group is not known. 
As with Summer Pulse 2004, the Lincoln deployment was not a methodical test of the Fleet Response Plan because there was no plan to systematically collect or analyze data that would evaluate the outcomes of the Lincoln deployment against Fleet Response Plan-related study goals. The Navy also pointed to a third event, its recent Global War on Terrorism Surge 2005, as an indicator that the Fleet Response Plan works. The Global War on Terrorism surge was a response to a request for forces from which the Navy is looking to glean Fleet Response Plan-related information about what did and did not work when the ships return. However, this is not a good test of the Fleet Response Plan because there is no plan showing what specific data are being collected or what analytical approaches are being employed to assess the ships' experiences. As of September 2005, no other events had been scheduled to further test and evaluate the Fleet Response Plan. The Navy has not developed the kind of comprehensive plans to test and evaluate the Fleet Response Plan as recommended by DOD and Navy guidance and best practices because Navy officials have stated that existing readiness reporting processes effectively evaluate the Fleet Response Plan's success on a daily basis. They said after-action reports from training exercises and the Joint Quarterly Readiness Review assist with this function. Navy officials explained that they implemented the Fleet Response Plan the same way they had implemented the Inter-Deployment Training Cycle, the predecessor to the Fleet Response Plan's Fleet Readiness Training Plan. While this may be true, the Inter-Deployment Training Cycle was focused on the specific training needed to prepare units for their next deployment, not for implementing a new readiness construct that emphasized surge versus routine deployments. Furthermore, the Inter-Deployment Training Cycle did not contain stated goals whose validity the Navy needed to test.
In addition, ongoing readiness reports do not provide information on important factors such as costs, long-term maintenance implications, and quality of life issues. The Summer Pulse 2004, Lincoln surge deployment, and Global War on Terrorism Surge 2005 testing events were not part of a methodical test and evaluation approach. Therefore, the Navy is unable to convincingly use these events to evaluate the Fleet Response Plan and determine whether the plan has been successful in increasing readiness or achieving other goals. Moreover, without effective evaluation of the Fleet Response Plan, the Navy may be unable to identify and correct potential problem areas across the fleet. Without a comprehensive long-range plan that establishes methodical and realistic testing of the Fleet Response Plan, the Navy may be unable to validate the Fleet Response Plan operational concept, evaluate its progress and success over time, and ensure that it can effectively meet Navy goals over the long term without any adverse, unintended consequences for maintenance, quality of life, and fleet readiness. The formal Navy repository for lessons learned, the Navy Lessons Learned System, has not been used to disseminate Fleet Response Plan-related lessons learned or to analyze test results to evaluate the progress of the plan and improve implementation. The Navy Lessons Learned System has been designated by the Chief of Naval Operations as the singular Navy program for the collection, validation, and distribution of unit feedback as well as the correction of problems identified and derived from fleet operations, exercises, and miscellaneous events. However, there are no mechanisms or requirements in place to force ships, commands, and numbered fleet staffs to submit all lessons learned to the Navy Lessons Learned System, although such mechanisms exist for the submission of port visit and other reports. 
For the events that the Navy cites as tests of the Fleet Response Plan, it did not analyze and evaluate the results and produce formal lessons learned to submit to the Navy Lessons Learned System for recordation and analysis. Any evaluation done of the testing events has not been incorporated into the Lessons Learned System, preventing comprehensive analyses of lessons learned and identification of problems and patterns across the fleet that may require a high-level, Navy-wide response. Some ship and carrier strike group staff informed us that they prefer informal means of sharing lessons learned, because they feel the process through which ships and commands have to submit lessons learned for validation and inclusion in the database can be complex and indirect. This may prevent ship and command staffs across the fleet from learning from the experiences of others, but it also prevents the Navy Lessons Learned System from performing comprehensive analyses of the lessons learned and possibly identifying problems and patterns across the fleet that may require a high-level Navy-wide response. In addition, the lessons learned are recorded by mission or exercise (e.g., Operation Majestic Eagle) and not by operational concept (e.g., the Fleet Response Plan), making identification of Fleet Response Plan-specific lessons learned difficult and inconsistent. Over the last 10 years, we have issued several reports related to lessons learned developed by the military. We have found that service guidance does not always require standardized reporting of lessons learned and lessons learned are not being used in training or analyzed to identify trends and performance weaknesses. We emphasized that effective guidance and sharing of lessons learned are key tools used to institutionalize change and facilitate efficient operations. 
We found that despite the existence of lessons learned programs in the military services and the Joint Staff, units repeat many of the same mistakes during major training exercises and operations. Our current review indicates that the Navy still does not include all significant information in its lessons learned database. Therefore, Navy analysts cannot use the database to perform comprehensive analyses of operational concepts like the Fleet Response Plan to evaluate progress and improve implementation. Officials from the Navy Warfare Development Command stated that the Navy is currently drafting a new Chief of Naval Operations Instruction governing the Navy Lessons Learned System that will address some of these issues. Navy Warfare Development Command officials hope that the new instruction will result in several improvements over the current system. First, they would like to see a dual reporting system, so that lessons learned are simultaneously sent to the Navy Lessons Learned System for preliminary evaluation when they are submitted to the numbered fleets for validation. This would allow Navy Lessons Learned analysts to look at unvarnished data for patterns or issues of interest to the Chief of Naval Operations, without taking away the numbered fleets' validation processes. In addition, officials would like to establish deadlines for the submission of lessons learned to ensure timeliness. Not only will these changes add value to the data stored in the Navy Lessons Learned System, but they will keep the data flowing while ensuring that data are actually submitted and not lost as they move up the chain of command. According to Navy Lessons Learned officials, other branches of the military already allow operators in the field to submit lessons learned directly to their lessons learned systems, enabling value-added analysis and the timely posting of information. 
By addressing these issues, the Navy can help ensure that the lessons learned process will become more efficient, be a command priority, and produce actionable results. Two years after implementing a major change in how it expects to operate in the future, the Navy has not taken all of the steps needed to enable the Navy or Congress to assess the effectiveness of the Fleet Response Plan. As the Navy prepares to implement the Fleet Response Plan across the entire naval force, it becomes increasingly important that the Navy effectively manages this organizational transformation so that it can determine if the plan is achieving its goals. The absence of a more comprehensive overarching management plan to implement the Fleet Response Plan has left essential questions about definitions, goals, performance measures, guidance, timelines, milestones, benchmarks, and resources unanswered, even though sound management practices recognize the need for such elements to successfully guide activities and measure outcomes. The absence of these elements could impede effective implementation of the Fleet Response Plan. Furthermore, without a comprehensive plan that links costs with performance measures and outcomes, neither the Navy nor Congress may be able to determine if the Fleet Response Plan is budget neutral. More effective communications throughout the fleet using an overall communications strategy could increase employee awareness of the plan and help ensure successful implementation. The Navy also has not developed a comprehensive long-range plan for testing and evaluating the Fleet Response Plan. Without a well-developed plan and methodical testing, the Navy may not be aware of all of the constraints to successfully surging its forces to crises in a timely manner. 
Moreover, the absence of an overarching testing and evaluation plan that provides for data collection and analysis may impede the Navy's ability to use its testing events to determine whether the Fleet Response Plan has been successful in increasing readiness and to identify and correct problem areas across the fleet. Failure to document and record the results of testing and evaluation efforts in the Navy Lessons Learned System could limit the Navy's ability to validate the value of the concept, identify and correct performance weaknesses and trends across the fleet, perform comprehensive analyses of lessons learned, and disseminate these lessons and analyses throughout the fleet. To facilitate successful implementation of the Fleet Response Plan, enhance readiness, and ensure that the Navy can determine whether the plan has been successful in increasing readiness and can identify and correct performance weaknesses and trends across the fleet, we recommend that the Secretary of Defense take the following two actions: Direct the Secretary of the Navy to develop a comprehensive overarching management plan based on sound management practices that will clearly define goals, measures, guidance, and resources needed for implementation of the Fleet Response Plan, to include the following elements: establishing or revising Fleet Response Plan goals that identify what Fleet Response Plan results are to be expected and milestones for achieving these results, developing implementing guidance and performance measures based on these goals, identifying the costs and resources needed to achieve each performance goal, and communicating this information throughout the Navy. Direct the Secretary of the Navy to develop a comprehensive plan for methodical and realistic testing and evaluation of the Fleet Response Plan.
Such a comprehensive plan should include a description of the following elements: how operational tests, exercises, war games, experiments, deployments, and other similar events will be used to show the performance of the new readiness plan under a variety of conditions, including no-notice surges; how data will be collected and analyzed for these events and synthesized to evaluate program success and improvements; and how the Navy Lessons Learned System will collect and synthesize lessons from these events to avoid repeating mistakes and improve future operations. In written comments on a draft of this report, DOD generally concurred with our recommendations and cited actions it will take to implement the recommendations. DOD concurred with our recommendation that the Navy should develop a comprehensive overarching management plan based on sound management practices that would clearly define the goals, measures, guidance, and resources needed for successful implementation of the Fleet Response Plan, including communicating this information throughout the Navy. DOD noted that the Navy has already taken action or has plans in place to act on this recommendation, and described several specific accomplishments and ongoing efforts in this regard. DOD also noted that the Navy intends to communicate through message traffic, white papers, instructions, lectures, and meetings with Navy leadership. We agree that these means of communication are an important part of an effective communication strategy; however, we do not believe that these methods of communication constitute a systemic strategy to ensure communication at all personnel levels. We believe the Navy would benefit from a comprehensive communication strategy that builds on its ongoing efforts, but encompasses additional actions to ensure awareness of the plan throughout the Navy. DOD partially concurred with our recommendation to test and evaluate the Fleet Response Plan. 
DOD noted that it plans to use a variety of events and war games to evaluate the Fleet Response Plan, but it does not see a need to conduct no-notice surges to test the Fleet Response Plan. DOD stated that it believes no-notice surges are expensive and unnecessary and could lead to penalties on overall readiness and the ability to respond to emergent requirements. DOD also noted that the Navy has surged single carrier strike groups, expeditionary strike groups, and individual ships or units under the Fleet Response Plan, and it cited several examples of such surges. We commend the Navy's plans to use a variety of events to evaluate the Fleet Response Plan and its use of the Navy Lessons Learned System to report and evaluate the lessons learned in the Global War on Terrorism Surge 2005 exercise held earlier this year. However, we continue to believe that no-notice surges are critical components of realistic testing and evaluation plans and that the benefits of such exercises can outweigh any additional costs associated with conducting such tests on a no-notice basis. Both we and Congress have long recognized the importance of no-notice exercises. For example, in a 1989 report, we noted that DOD was instituting no-notice exercises to assess the preparedness of combatant commands' state of training of their staffs and components. In addition, in 1990 the Department of Energy conducted no-notice tests of security personnel in response to our work and out of recognition that such tests are the best way to assess a security force's ability at any given time. Furthermore, in recent years, the Department of Homeland Security, Department of Energy, and others have conducted no-notice exercises because they add realism and demonstrate how well organizations are actually prepared to respond to a given situation. 
Despite the importance of no-notice exercises, the Navy has not conducted no-notice exercises to test and evaluate the centerpiece surge goal of 6 + 2 for carrier strike groups. We believe that the smaller surges cited by DOD can provide insights into the surging process, but we do not believe that such surges can effectively test the Navy's readiness for a full 6 + 2 carrier strike group surge. DOD also provided technical and editorial comments, which we have incorporated as appropriate. DOD's comments are reprinted in appendix II of this report. We are sending copies of this report to other interested congressional committees; the Secretary of Defense; the Secretary of the Navy; and the Director, Office of Management and Budget. We will make copies available to others upon request. In addition, the report will be available at no charge on the GAO Web site at http://www.gao.gov. If you or your staff have any questions about this report, please contact me at (202) 512-4402 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made major contributions to this report are listed in appendix III. To assess the extent to which the Navy has employed a sound management approach in implementing the Fleet Response Plan, we interviewed Navy headquarters and fleet officials; received briefings from relevant officials; and reviewed key program documents. 
In the absence of a comprehensive planning document, we compared best practices for managing and implementing major efforts to key Navy messages, directives, instructions, and briefings, including, but not limited to, the Culture of Readiness message sent by the Chief of Naval Operations (March 2003); the Fleet Response Concept message sent by the Chief of Naval Operations (May 2003); the Fleet Response Plan Implementation message sent by the Commander, Fleet Forces Command (May 2003); the Fleet Response Plan Implementation Progress message sent by the Commander, Third Fleet (September 2003); and the U.S. Fleet Forces Command's Fleet Training Strategy instruction (May 2002 and an undated draft). We also conducted meetings with several of the commanding officers, executive officers, and department heads of selected carrier strike groups, aircraft carriers, and air wings to obtain information on how the plan had been communicated, how the plan had changed their maintenance and training processes, the impact on their quality of life, the cost implications of the plan, and other factors. To assess the extent to which the Navy has tested the effectiveness of the Fleet Response Plan and shared results to improve its implementation, we obtained briefings; interviewed Navy headquarters and fleet officials; and reviewed test and evaluation guidance for both the Navy and other federal agencies. To evaluate the three Fleet Response Plan demonstrations identified by the Navy, we interviewed officials from the Fleet Forces Command and the Navy Warfare Development Command, reviewed existing documentation on the demonstrations, queried the Navy Lessons Learned System for lessons learned from the demonstrations, and compared our findings to accepted best practices for tests and evaluations. Further, we reviewed Navy Lessons Learned System instructions and queried the system to determine recorded lessons learned pertaining to the Fleet Response Plan. 
We validated the Navy Lessons Learned System data and determined the data were sufficiently reliable for our analysis. We conducted our review from January 2005 through August 2005 in accordance with generally accepted government auditing standards at the following locations: The Joint Staff, Washington, D.C. U.S. Pacific Command, Camp H. M. Smith, Hawaii Offices of the Chief of Naval Operations, Washington, D.C. In addition to the contact named above, Richard Payne, Assistant Director; Renee Brown; Jonathan Clark; Nicole Collier; Dawn Godfrey; David Marroni; Bethann Ritter; Roderick Rodgers; John Van Schaik; and Rebecca Shea made significant contributions to this report. | The Navy has been transforming itself to better meet 21st century needs. Since 2000, the Congress has appropriated about $50 billion annually for the Navy to operate and maintain its forces and support around 376,000 military personnel. In recognizing that the Navy faces affordability issues in sustaining readiness within its historical share of the defense budget, the Chief of Naval Operations announced a concept called the Fleet Response Plan to enhance its deployment readiness status. The Fleet Response Plan is designed to more rapidly prepare and sustain readiness in ships and squadrons. GAO evaluated the extent to which the Navy has (1) employed a sound management approach in implementing the Fleet Response Plan and (2) tested and evaluated the effectiveness of the plan and shared results to improve implementation. In establishing the Fleet Response Plan, the Navy has embraced a major change in the way it manages its forces. However, the Navy's management approach in implementing the Fleet Response Plan has not fully incorporated sound management practices needed to guide and assess implementation. 
These practices include (1) establishing a coherent mission and strategic goals, including resource commitments; (2) setting implementation goals and a timeline; and (3) establishing a communication strategy. While the Navy has taken a number of positive actions to implement the plan, it has not provided readiness goals for units other than carrier strike groups; resource and maintenance goals; performance measures and timelines; or a communications strategy. Sound management practices were not fully developed because senior leaders wanted to quickly implement the plan in response to changes in the security environment. However, without an overall management plan containing all of these elements, it may be difficult for the Navy to determine whether its efforts to improve the fleet's readiness are achieving the desired results, adequately measuring overall progress, or identifying what resources are needed to implement the Fleet Response Plan. The Navy has not fully tested and evaluated the Fleet Response Plan or developed lessons learned to identify the effectiveness of its implementation and success over time. Systematic testing and evaluation of new concepts is an established practice to gain insight into how systems and capabilities will perform in actual operations. However, instead of methodically conducting realistic tests to evaluate the Fleet Response Plan, the Navy has tried to demonstrate the viability of the plan by relying on loosely linked events that were not part of an overall test and evaluation strategy. This approach could impair the Navy's ability to validate the plan and evaluate its success over time. In addition, the Navy has not used its lessons learned system to share the results of its Fleet Response Plan events or as an analytical tool to evaluate the progress of the plan and improve implementation, which limits the Navy's ability to identify and correct weaknesses across the fleet. | 7,518 | 551 |
The NFIP seeks to minimize human suffering and flood-related property losses by making flood insurance available on reasonable terms and encouraging its purchase by people who need flood insurance protection--particularly those living in flood-prone areas known as special flood hazard areas (SFHA). Prior to the flood insurance program's inception, private insurance companies generally did not offer coverage for flood disasters because of the high risks involved, such as high-risk homeowners being more likely to purchase flood insurance. The National Flood Insurance Act of 1968 (P.L. 90-448) established the program to identify SFHAs, make flood insurance available to property owners living in communities that joined the program, and encourage floodplain management efforts to mitigate flood hazards and thereby reduce federal expenditures on disaster assistance. In order for a community to join the program, any structures built within an SFHA after it has been identified as such are required to be built to the program's building standards, which are aimed at minimizing flood losses. FEMA estimates that its implementation of the program's standards for new construction is now saving about $1 billion annually in flood damage avoided. The 1973 Flood Disaster Protection Act (P.L. 93-234) required flood insurance for borrowers whose mortgages are on structures located in SFHAs in participating communities and are originated, guaranteed, or serviced by federal agencies or federally regulated institutions.Subsequently, the National Flood Insurance Reform Act of 1994 (P.L. 103- 325) directed federal regulators of lending institutions to assess penalties on any regulated lending institution found to have a pattern or practice of violating the act. Violations include failing to require flood insurance coverage for properties in SFHAs used to secure mortgage loans. 
In addition, the act mandated that regulated lenders (1) purchase flood insurance for borrowers who are required to have it but fail to purchase it and (2) escrow funds for flood insurance premiums if other funds are also escrowed. The owners of properties in SFHAs with no mortgages or properties with mortgages held by unregulated lenders are not legally required to buy flood insurance. Because risk levels are the same for homeowners in SFHAs regardless of whether flood insurance is required, FEMA encourages all homeowners residing in SFHAs to buy flood insurance. FEMA's Mitigation Directorate maintains and updates flood insurance rate maps (FIRM), which identify the geographic boundaries of SFHAs. FIRMs are derived from base maps, which show the basic geographic and political boundaries of a community. Various mapping technologies are used to establish flood elevations on FIRMs and to delineate the boundaries of SFHAs. Base maps are generally obtained from local communities or the U.S. Geological Survey (USGS). While flood maps should be updated as necessary to remain accurate, approximately 63 percent of the nation's 100,000 flood maps are at least 10 years old. Consequently, the Mitigation Directorate has developed a Flood Map Modernization Plan to update the maps and convert them to a digital format. Digital mapping processes, along with other technologies, will improve the collection of data on structures in SFHAs and allow for the electronic distribution of these data through the Internet and on CD-ROM. In accordance with the Government Performance and Results Act (GPRA), FEMA has established various goals and strategies to determine the success of the NFIP in fulfilling its mission to minimize property losses after flood disasters and to reduce losses from future disasters. According to FEMA officials, these goals allow the agency to monitor its progress in meeting its performance goals and address key outcomes. 
While the results achieved under these goals--increasing the number of insurance policies in force and reducing flood-related losses--provide valuable insights into how well the NFIP's mission is being accomplished, they do not gauge participation in the program by the most vulnerable residents--those living in SFHAs. Participation rates--the percentage of structures in SFHAs that are insured--are an effective way to measure the results of the NFIP because they are objective, measurable, and quantifiable. By using participation rates to measure performance, FEMA could assess other program results, such as the extent to which the most vulnerable residents are participating in the program; determine whether the financial risk to the government from floods is increasing or decreasing; and focus marketing and compliance activities to maximize program participation in SFHAs. Like other federal agencies, FEMA is mandated under GPRA to develop annual performance plans that link the agency's long-term strategic planning to its daily activities. FEMA established three performance goals that pertain to the flood insurance program. These goals include reducing flood losses, increasing the number of flood insurance policies sold, and improving the program's financial status. These endeavors are part of FEMA's mission to protect lives and reduce losses from future disasters through insurance and mitigation efforts. Table 1 describes FEMA's fiscal year 2002 Performance Plan goals for the NFIP and the strategies by which the agency intends to accomplish these goals. In developing annual performance goals, agencies should focus on the results they expect their programs to achieve--the differences the programs will make in people's lives. The three NFIP performance goals address the program's objectives of minimizing human suffering and property losses caused by floods. 
However, opportunities are developing for FEMA to obtain valuable information about the program's success through analysis of the rate of participation for those communities involved in the program. The participation rate is obtained by dividing the number of properties located in SFHAs with flood insurance by the total number of properties in these SFHAs. This information would allow FEMA to assess whether the program is penetrating those areas most at risk of flooding, determine whether the financial risks to the government in these areas are increasing or decreasing, and better target marketing efforts to increase participation. In other words, through analysis of participation rates, FEMA would be better able to maximize the effectiveness and efficiency of the program in protecting lives and reducing financial losses. FEMA currently collects data on the number of active flood insurance policies. Its goal is to increase the number of NFIP policies in force by 5 percent annually. While FEMA tracks the growth in the number of active policies, its estimates of the number of households located in SFHAs without flood insurance coverage vary. A DeKalb County, Georgia, study illustrates why participation-rate data can be a more useful measure of the program's success than a tally of policies in force. According to the study, the number of policies in force in DeKalb County grew from the previous year by 13 percent in 1998 and by 17 percent in 1999 but fell to 3 percent in 2000. In fiscal year 1999, DeKalb County officials conducted a study of NFIP participation. This study was initiated to provide information about flood hazards, prevention, and mitigation. Local officials made flood-zone determinations on every structure in the county using FIRMs, tax maps, and limited geographic information system technology. This effort resulted in the creation of an electronic database of the addresses of all structures in the SFHAs. 
According to the data collected, there were 17,078 buildings in the SFHAs, of which 3,145, or 18 percent, had flood insurance. Thus, while an analysis of the number of policies in force showed significant growth in 1998 and 1999, these data did not capture the fact that fewer than 20 percent of the homeowners in DeKalb County's SFHAs had flood insurance. FEMA's policy growth target also does not take into account whether the policy growth is greater or less than the population change in DeKalb County's SFHAs. For example, a 5-percent increase in the number of policies at a time when the SFHA's population is increasing by 20 percent may not represent program success for DeKalb County or any other community participating in the NFIP. Nor does the policy growth target take into account changes that occur when flood maps are updated, which could result in the addition of some structures to an SFHA. Such information is important for communities like DeKalb County, where new maps took effect this month. Knowledge of DeKalb County's participation rate would also help FEMA better market its flood insurance program there. As noted in table 1, marketing and educational outreach efforts are two of FEMA's strategies to increase the number of policies in force. A 5-percent increase in the number of policies might lead to the erroneous conclusion that DeKalb County did not need additional marketing or outreach campaigns to increase public awareness of flood insurance. A participation rate of 18 percent, however, might indicate that, among other things, additional marketing and educational outreach was necessary for DeKalb County residents. 
Increasing the share of structures in SFHAs with flood insurance would provide added income to the NFIP's insurance fund and decrease the financial burden that flooding places on the federal government and the citizens who are victims of floods when uninsured structures suffer flood damage and may qualify for other forms of federal disaster relief. Moreover, increased participation would provide a broader base of policyholders so that the primary objective of insurance--the pooling of risk--would be more fully realized. FIA officials agree that program participation rates are a useful measure that can provide insights for measuring the program's success, including the effectiveness of marketing. The data currently available to determine flood insurance participation rates within SFHAs are not always accurate or complete. While FIA maintains data on the number of flood insurance policies, the information it has on the total number of structures within SFHAs is poor, according to FIA's Acting Administrator. FIA acknowledges weaknesses in its estimates of the total number of structures within SFHAs nationwide and is taking steps to obtain more accurate data. New technologies are also becoming available that may be used to estimate the number of structures within floodplains, thereby increasing the reliability of the data needed to determine participation rates. Similarly, local communities are increasingly using these technologies to obtain a more reliable count of the number of structures within SFHAs. While the cost of obtaining more reliable data is not fully known, FEMA is engaging in partnerships to test new technologies that will allow it to share the costs with local communities and other federal agencies. Two numbers are needed to determine participation rates in the NFIP-- the number of insured structures and the total number of structures located within SFHAs. 
When flood insurance policies are sold, private insurance companies that have agreements with FIA to sell NFIP policies collect information on the insured structure, such as whether it is located within an SFHA, its address, and the name of the mortgage lender. They report this information to FIA, which maintains a database on the number of flood insurance policies in force including the number in SFHAs. FEMA also maintains a database containing estimates of the number of structures within SFHAs. However, FIA's Acting Administrator acknowledges that the data on both the national and local community levels are of varying quality. FEMA has been unable to identify one definitive source of information on the number of structures within SFHAs but is taking steps to obtain more reliable information. FEMA collects data for its Biennial Report on the number of structures within SFHAs from local communities participating in the NFIP. Every 2 years, participating communities report on, among other things, the number of structures within SFHAs as well as within the entire community. However, communities do not always report or provide accurate information. According to a Mitigation Directorate official, about 10 percent of the communities do not report any information. Consequently, older data on the number of structures in these communities are used. Moreover, the communities that do report such information do not always update or report accurate data, since they use different ways to determine the number of structures within SFHAs. For example, some communities have submitted reports showing no increase in the number of structures, but significant increases in population. In other cases, communities reported more structures within the SFHA than within the entire community. According to this official, smaller rural communities may rely on local officials to use their personal knowledge or conduct drive-bys to estimate the number of structures within the SFHA. 
In contrast, large urban areas typically use technologies such as geographic information systems (GIS) to estimate the number of structures within the SFHA. FIA officials also told us they have information on the number of structures in SFHAs from other databases, but the accuracy of these data is also low. For example, FEMA has a database that estimates the number of structures in SFHAs nationally at six to eight million. However, FIA officials told us that these data are based on the assumption that there is a uniform distribution of structures in SFHAs. Other agencies, such as the U.S. Bureau of the Census, maintain data on street names, addresses, and locations, but their data are not in a format that is useful for determining the number of structures in SFHAs. Similarly, data on the total number of structures cannot be captured from FIRMs, which FEMA currently uses to identify SFHAs, because FEMA's Mitigation Directorate does not include data on structures on these maps. Existing FIRMs identify only the boundaries of SFHAs, streams, and selected roads. Furthermore, FEMA's Mitigation Directorate does not use FIRMs to identify structures because (1) FEMA's regulations on floodplain mapping do not require the depiction of structures on FIRMs; (2) the map scales used for FIRMs are too small to legibly show structures, and enlarging the scales would be cost prohibitive; and (3) the information available on the location of structures is inconsistent. Four studies conducted between 1997 and 2000 that were designed to examine compliance with the mandatory purchase of flood insurance provide some information on participation rates within SFHAs. One study was conducted by FEMA's Inspector General (IG), one was sponsored by FEMA, and private companies conducted the remaining two. Each of the studies was limited to a few communities; none produced nationally representative results or included all of the structures in the appropriate SFHAs in their analysis. 
See table 2 for a synopsis of each of these four studies. While these studies provide some useful information, they are of limited value in understanding the percentage of structures in SFHAs covered by flood insurance. several current mapping technologies can be used to facilitate the collection of data on the number of structures in SFHAs. These technologies can be used not only to show buildings and houses on maps but also to pinpoint the exact location of such structures. Combining these technologies with the digital flood maps that FEMA is already producing would allow for increased accuracy in the identification of structures within SFHAs and the calculation of participation rates. For example, USGS has produced computer-generated images of aerial photographs--that is, pictures taken from airplanes of the land below--for about 74 percent of the United States. These images are called digital orthophoto quadrangles (DOQ), and essentially combine the characteristics of a photograph with the geometric qualities of a map. FIA currently uses these images to produce some if its flood maps. While DOQs show pictures of structures, each structure must be digitized in order to be identified by a geographic information system. Local communities are also beginning to use these emerging technologies, although to widely differing degrees. In DeKalb County, Georgia, local officials have purchased DOQs of its 270 square miles from a contractor and digitized the structures in the photos. The county plans to geographically reference each of the structures to create a base map that shows the accurate location of structures. The county can then lay digital flood-maps over its base maps to determine the number of structures in the local SFHAs. According to county officials, once this technology is in place, it will be easy to determine the number of structures in local SFHAs. NFIP participation rates will also be easy to calculate. 
A DeKalb County official told us that this digitized mapping technology has many practical applications for the county, including engineering, planning and zoning, crime analysis, and disaster recovery, and it will allow maps to be generated for presentations at public hearings and other meetings. FEMA officials told us that similar efforts are occurring in Charlotte, North Carolina, and Louisville, Kentucky. A 1998 survey by the National States Geographic Information Council and the Federal Geographic Data Committee found that 69 percent of the GIS data users from state, regional, and local governments responding to its survey create, update, integrate, and distribute digital geographic data. This indicates that a number of localities have some technology available to create digital base maps and that the potential exists for localities to use such technology to identify structures within SFHAs. However, FIA officials told us that the number of communities that currently have detailed data available is small. They also told us that as more FIRMs are produced digitally and more communities improve the ability of their mapping technologies to collect data on properties and buildings, measuring the number of structures located within SFHAs will become easier and more efficient. The costs of using technology to accurately identify the number of structures in SFHAs are not fully known. In March 2000, FEMA estimated the total costs to modernize flood maps from fiscal year 2001 through fiscal year 2007 to be $773 million above expected annual funding levels, with digitization and map maintenance costs alone totaling $156 million.The modernization of maps includes converting paper flood maps to a digital format, which is the first step in using available technology to identify the number of structures within SFHAs. 
FEMA continues to refine the cost estimate as it updates its projection of needs and improves its cost data, including the impacts on costs of partnerships with communities and other local, regional, state, and federal agencies, and new technologies. The partnerships that FEMA has developed with state, local, and other federal agencies should reduce some of its costs to modernize its flood maps. Along with enabling the agency to share some of the costs to modernize flood maps, the partnerships will facilitate the development of technology that can be used to estimate the number of structures within SFHAs. For example, through FEMA's Cooperating Technical Partners initiative, 62 partnerships had been developed with local communities as of September 2000. Through this effort, communities, states, and regional agencies perform all or portions of data collection and mapping tasks to create their own FIRMs. An FIA official told us that the cost benefits to FEMA from this effort have not yet been determined. FEMA has also entered into partnerships with other federal agencies to fund cooperatively the production of DOQs and high-accuracy elevation data. As discussed previously, DOQs provide detailed images of land, including the location of houses. Elevation data are useful because they help make flood maps more accurate. Both of these technologies can be manipulated with geographic information systems to more accurately identify the number of structures within SFHAs. While FIA has factored in the costs of cooperatively producing DOQs with other agencies in its mapping modernization cost estimate, funding arrangements to produce elevation data with other federal agencies have not yet been determined. Program participation rates are an effective way to gain insights into and improve the performance of the NFIP program. Incorporating participation rates into FEMA's goals can provide results that are in line with GPRA--objective, measurable, and quantifiable. 
While it will be many years before the data needed to determine national participation rates become available, some communities are already collecting such data. These communities are using technologies that allow them to count the number of structures in SFHAs and some are using these technologies to determine participation rates. As our preceding discussion of DeKalb County, Georgia, demonstrates, such community-level data can provide FIA with useful information on the degree of participation by residents living in SFHAs. In addition to our work on the NFIP, we have two other studies under way involving FEMA. The first responds to your request, in the September 16, 1999, Senate Report (106-161) accompanying the fiscal year 2000 appropriations bill, that we evaluate FEMA's processes for ensuring that disaster assistance funds are used effectively and efficiently. This report, which we expect to issue this summer, will provide information on (1) the adequacy of the criteria FEMA employs to determine if a presidential disaster declaration is warranted and the consistency with which FEMA applies these criteria and (2) the policies and procedures FEMA has developed to ensure that individual Public Assistance Program projects in disaster areas meet eligibility requirements. We also plan to issue a report in late summer that looks at all federal agencies involved in combating terrorism--including FEMA--with a specific emphasis on (1) the overall framework for managing federal agencies' efforts; (2) the status of efforts to develop a national strategy, plans, and guidance; (3) the federal government's capabilities to respond to a terrorist incident; (4) federal assistance to state and local governments to prepare for an incident; and (5) the federal structure for developing and implementing a strategy to combat cyber-based terrorism. For future information on this testimony, please contact JayEtta Hecker at (202) 512-2834. 
Individuals making key contributions to this testimony included Martha Chow, Lawrence Cluff, Kerry Hawranek, Signora May, John McGrail, Lisa Moore, Robert Procaccini, and John Strauss. Combating Terrorism: FEMA Continues to Make Progress in Coordinating Preparedness and Response (GAO-01-15, Mar. 20, 2001). Disaster Relief Fund: FEMA's Estimates of Funding Requirements Can Be Improved (GAO/RCED-00-182, Aug. 29, 2000). Observations on the Federal Emergency Management Agency's Fiscal Year 1999 Performance Report and Fiscal Year 2001 Performance Plan (GAO/RCED-00-210R, June 30, 2000). Disaster Assistance: Issues Related to the Development of FEMA's Insurance Requirements (GAO/GGD/OGC-00-62, Feb. 25, 2000). Flood Insurance: Information on Financial Aspects of the National Flood Insurance Program (GAO/T-RCED-00-23, Oct. 27, 1999). Flood Insurance: Information on Financial Aspects of the National Flood Insurance Program (GAO/T-RCED-99-280, Aug. 25, 1999). Disaster Assistance: Opportunities to Improve Cost-Effectiveness Determinations for Mitigation Grants (GAO/RCED-99-236, Aug. 4, 1999). Disaster Assistance: Improvements Needed in Determining Eligibility for Public Assistance (GAO/RCED-96-113, May 23, 1996). Flood Insurance: Financial Resources May Not Be Sufficient to Meet Future Expected Losses (GAO/RCED-94-80, Mar. 21, 1994). | This testimony discusses the preliminary results of GAO's ongoing review of the National Flood Insurance Program (NFIP), which is run by the Federal Emergency Management Administration's (FEMA) Federal Insurance Administration (FIA) and Mitigation Directorate, a major component of the federal government's efforts to provide flood assistance. This program creates standards to minimize flood losses. GAO found that FEMA has several performance goals to improve program results, including increasing the number of insurance policies in force. 
Although these goals provide valuable insight into the degree to which the program has reduced flood losses, they do not assess the degree to which the most vulnerable residents--those living in flood-prone areas--participate in the program. Capturing data on the number of uninsured and insured structures in flood-prone areas can provide FEMA with another indication of how well the program is penetrating those areas with the highest flood risks, whether the financial consequences of floods in these areas are increasing or decreasing, and where marketing efforts can better be targeted. However, before participation rates can be used to measure the program's success, better data are needed on the total number of structures in flood-prone areas. FIA tracks the number of insurance policies in these areas, but data on the overall number of structures are incomplete and inaccurate. Some communities are developing better data on the number of structures in flood-prone areas. FEMA is also trying to improve the quality of its data on the number of structures in flood-prone areas and is working to develop new mapping technologies that could facilitate the collection of such data. The cost of this new technology is not fully known, but the expense will be shared among federal, state, and local agencies. | 5,002 | 364 |
The U.S. government classifies information that it determines could damage the national security of the United States if disclosed publicly. Currently, all classified information falls under two authorities: one for national defense and foreign relations, the other for nuclear weapons and technology. Since 1940, classified national defense and foreign relations information has been created, handled, and safeguarded in accordance with a series of executive orders. Executive Order 12958, Classified National Security Information, as amended, is the most recent. It establishes the basis for designating National Security Information (NSI). It demarcates three security classification levels, the unauthorized disclosure of which could reasonably be expected to cause exceptionally grave damage (Top Secret), serious damage (Secret), or damage (Confidential) to national security. It also lists the types of information that can be classified and describes how to identify and mark classified information. In 2005, about one quarter of DOE classification decisions concerned NSI. The advent of nuclear weapons during World War II led to a new category of classified information. In 1946, the Congress enacted the Atomic Energy Act, which established a system for governing how U.S. nuclear information is created, handled, and safeguarded. Nuclear information categorized as Restricted Data (RD) or Formerly Restricted Data (FRD) is not governed by Executive Order 12958. RD is defined as data concerning the design, manufacture, or utilization of atomic weapons; the production of special nuclear material; and the use of special nuclear material in the production of energy. This includes information about nuclear reactors that produce plutonium and tritium, radioactive isotope separation techniques, and the quantities of nuclear materials involved in these processes. FRD relates primarily to data regarding the military use of nuclear weapons.
Examples of FRD include weapons stockpile data, weapon yields, the locations of nuclear weapons, and data about weapons safety and storage. Like NSI, classified nuclear information has three classification levels: Top Secret, Secret, or Confidential. Naval Nuclear Propulsion Information (NNPI) is an exceptional category that may fall under either of the two classification authorities. NNPI is deemed by both DOE and the Department of Defense (DOD) to be sufficiently sensitive to merit special protections and may be classified under the Atomic Energy Act or Executive Order 12958, depending on its subject and details. Two categories of nuclear information can be withheld from the public without being classified: Unclassified NNPI and Unclassified Controlled Nuclear Information (UCNI). Both are information the government considers sufficiently sensitive to withhold from public release, but not so sensitive as to warrant designation as RD, FRD, or NSI. UCNI is a category created under the authority of the Atomic Energy Act that enables DOE officials to share information with state and local law enforcement and emergency services personnel who, while lacking security clearances, may have a legitimate need to know operational details about, for example, planned shipments of special nuclear materials. According to the current executive order, documents containing only NSI must be "portion marked"--that is, classified paragraph by paragraph. For example, a document containing NSI may have paragraphs classified as Top Secret, Secret, or Confidential, along with others that are unclassified. However, documents containing any RD or FRD are classified in their entirety at the level of the most sensitive information in the document. Portion marking of documents containing RD and FRD is not required by the Atomic Energy Act and is discouraged by DOE policy. 
Executive Order 12958, as amended, states that NSI shall be declassified as soon as it no longer meets the standards for classification. The point at which information is to be declassified is set when the decision is made to classify it, and it is linked to an event, such as a completed mission, or to a period of time. Classified records that are older than 25 years and have permanent historical value are automatically declassified unless an exemption is granted because their contents remain sensitive and their release could harm national security. Agencies have adopted processes to facilitate declassification in compliance with the executive order. Unlike documents containing NSI, documents containing RD or FRD are not reviewed automatically for possible declassification. The reason is that the information in these two categories is mostly scientific and technical and may not become less sensitive with the passage of time. In fact, such data may be useful to nations and terrorist groups that are trying to build nuclear weapons. At a time of increased concern about nuclear proliferation, some of the oldest and simplest nuclear technology can be useful for making weapons of mass destruction. For this reason, documents about nuclear weapons and technologies from the 1940s and 1950s remain especially sensitive and worthy of protection. DOE implements the executive order and classification statutes by issuing departmental regulations and directives and by making extensive use of classification guides. DOE's directive, Identifying Classified Information, is the department's comprehensive guide to classifying, declassifying, marking, and protecting information, documents, and material. The directive also establishes policies and procedures, such as departmentwide training and certification requirements for staff authorized to classify or declassify information and for periodic self-assessments. 
Classification guides are manuals specifying precisely which DOE information must be classified, how it should be categorized (NSI, RD, or FRD), and at what level (Top Secret, Secret, or Confidential) it should be protected. DOE has a detailed and comprehensive set of classification guides that are integral to the efficient functioning of the department's classification activities. The department limits the use of "source documents" for the purpose of making classification decisions. Source documents may be used to classify documents containing NSI, but only when no guidance is available. For example, if a DOE classifier is evaluating a new document with the same information found in another document already classified as Secret, then this new document may also be classified as Secret. RD and FRD documents can never be used as source documents. DOE's Office of Classification's systematic training, comprehensive guidance, and rigorous oversight programs have had a largely successful history of ensuring that information is classified and declassified according to established criteria. DOE's training requirements and classification guidance are essential internal controls that provide a strong framework for minimizing the risk of misclassification. However, since responsibility for classification oversight was shifted from the Office of Classification to the Office of Security Evaluations in October 2005, the pace of oversight has been interrupted--creating uncertainty about how oversight will be performed and whether it will continue to be effective. Systematic training requirements are an important element of DOE's framework for maximizing the proper classification of documents. Only staff who have successfully completed training are authorized to classify or declassify documents. Staff must be recertified as classifiers and/or declassifiers every 3 years to retain their authority. 
Staff are typically trained as "derivative classifiers" and, in some cases, as "derivative declassifiers" as well. Their authority is limited to areas in which they have special knowledge and expertise, and they are only authorized to classify (or declassify) documents "derivatively"--that is, only if the document in question contains information that a DOE or other U.S. government agency classification guide specifically requires be classified or declassified. There are currently about 4,600 derivative classifiers in DOE, nearly all of whom do classification work only as a collateral duty. Most derivative classifiers in DOE are scientists, engineers, or other technically trained people who work in programs or areas involving classified information and who can properly classify the documents these programs produce. Relatively few DOE staff (just 215 as of May 2006) are authorized to declassify documents. Because a declassified document may become publicly available, derivative declassifiers are among the most experienced derivative classifiers. Only original classifiers, of whom there are currently 25 throughout the DOE complex, are authorized to classify previously unclassified information. All DOE original classifiers are either very senior, full-time classification professionals, such as the director and deputy director of the Office of Classification, or one of the department's top-level political appointees, such as the Administrator, National Nuclear Security Administration. DOE has developed an extensive collection of more than 300 classification guides, or manuals, specifying precisely which DOE information must be classified, how it should be categorized, and at what level (Top Secret, Secret, or Confidential) it should be protected. The Office of Classification oversees the regular updating of all classification guides used in DOE and must ultimately approve the use of every guide. 
DOE prohibits classification decisions based on source documents for documents containing RD and FRD; for documents containing NSI from other federal agencies, it permits the use of source documents only when no guidance is available. The Information Security Oversight Office considers the use of classification guides to be a best practice because they provide a singular, authoritative voice that is less open to individual interpretation or confusion than source documents, and so using these guides is less likely to result in errors. According to the Information Security Oversight Office, DOE's use of classification guides is among the most extensive in the federal government. Classification guides are integral to the efficient functioning of the department's classification program. Some classification guides are more general in nature, such as those dealing with physical security, and are used widely throughout DOE. Others, known as "local guides," are used at a few sites or even a single site because they provide guidance specific to a single DOE program or project. For example, a classification guide used by contractors working on a decontamination and cleanup project at a site in Oak Ridge, Tennessee, provides specific guidance on nuclear waste and storage unique to this site. DOE has also implemented an extensive and rigorous oversight program. From 2000 through 2005, the Office of Classification and its predecessor offices conducted on-site inspections of classification activities at 34 DOE field offices, national laboratories, and weapons manufacturing facilities. In calendar years 2004 and 2005, the Office of Classification conducted an average of 10 oversight inspections a year. Classification activities were evaluated in depth in eight different functional areas, including site-provided classification training, self-assessment efforts, and overall senior management support for (and awareness of) classification activities. 
Before a team of 3 to 10 Office of Classification inspectors arrived at a site, the office would send the site's classification officer a "data call" requesting detailed and specific answers to dozens of questions about the procedures and practices of the site's classification program. For example, to ascertain how effectively classification guidance was being used, the data call asked what guidance was in use at the site; the names of authorized classifiers who had guides; and whether any local (site-specific) guides were in use, and if so, when they were last validated by Office of Classification officials. Similarly detailed information was requested about each of the other classification program elements. Having such detailed information in hand prior to arrival at the site allowed inspection teams to undertake a comprehensive evaluation in just 2 to 5 days because they could focus more on validating the information provided in the data call than on the time-consuming task of gathering data themselves. The Office of Classification staff's expertise in classification matters was augmented with subject area experts. For example, to ensure the inspection team had adequate expertise to make valid assessments of classification decisions about nuclear weapons design at Los Alamos National Laboratory, a staff member with nuclear weapons design experience was assigned to the team. Moreover, in many cases, members of the inspection team had more than 20 years of classification experience. As a result of the extensive information provided by the data call and the level of experience of the inspection team, the team generally submitted a draft inspection report to the site's classification officer before leaving. Under DOE policy, any findings requiring immediate correction resulted in the creation of a corrective action plan, which had to be completed within 60 days of the inspection. 
DOE officials told us progress on implementing corrective action plans was reported to the Office of Classification quarterly. In September 2005, just prior to the shift in responsibility for classification oversight, the Information Security Oversight Office reviewed DOE's classification program. Officials at the Information Security Oversight Office found DOE's program to be much better than that of the average federal agency. They singled out DOE's training program and extensive use of classification guidance as especially impressive. One official called DOE's program for ensuring that all staff authorized to classify and declassify documents are recertified every 3 years "outstanding." Another official called DOE's extensive use of classification guides a "best practice." Overall, Information Security Oversight Office officials were impressed with DOE's classification program, noting that robust oversight is a very important part of an effective program for managing classified information. Since responsibility for classification oversight was shifted from the Office of Classification to the Office of Security Evaluations, however, the pace of oversight has been interrupted--creating uncertainty about how oversight will be performed and whether it will continue to be effective. The Office of Security Evaluations is the DOE office responsible primarily for the oversight of physical security at DOE sites, with a special emphasis on Category 1 sites (sites containing special nuclear materials). Since October 2005, the Office of Security Evaluations has completed one inspection of two offices at the Pantex Site in Texas, and another inspection of four offices at the Savannah River Site is under way. In April 2006, Office of Security Evaluations officials provided us plans for performing additional oversight inspections for the remainder of 2006. These plans included inspections evaluating classification activity at eight DOE offices at three additional sites. 
Classification oversight has been incorporated into larger oversight efforts on physical security at DOE sites. Classification oversight ceased from October 2005 until February 2006, when the Office of Security Evaluations began its inspection of two offices at the Pantex Plant, a nuclear weapons manufacturing facility in Texas. Before the shift in responsibility, DOE officials did not conduct any risk assessment of the shift's likely effects on the classification oversight program, for three reasons: (1) they did not consider the shift to be a significant organizational or management challenge because the upper-level management remained the same; (2) the Office of Security Evaluations would continue to draw on many of the same experienced Office of Classification staff who have been performing classification oversight for many years; and (3) responsibility for other key internal controls for managing classification activities, namely training and guidance, would remain with the Office of Classification. The director of the Office of Security Evaluations and the acting deputy director of the Office of Classification told us that the goal of shifting responsibility for classification oversight from one office to the other was to consolidate all oversight functions in one area. The idea arose in the course of a periodic reassessment of the organization of the Office of Security and Safety Performance Assurance--the larger organization of which these and several other offices are part--and a judgment by senior DOE management that one group should do all the oversight. The Office of Security Evaluations seemed the most logical place to locate classification oversight, according to senior DOE management. DOE officials also told us that the Office of Security and Safety Performance Assurance was not the only part of DOE affected by this drive to consolidate functions in single offices, and there was no intent to downgrade oversight. 
According to the Director of the Office of Security Evaluations, the procedures for conducting future oversight are still evolving--including the number of sites to be inspected and the depth of analysis to be performed. The office currently plans to evaluate classification activities at 14 offices within five DOE sites in calendar year 2006, integrating classification oversight into its regularly scheduled inspections of Category 1 sites along with inspections at a few non-Category 1 sites. The director of the Office of Security Evaluations said the goal is to visit each of DOE's 10 Category 1 sites every 2 years. However, this schedule has recently been delayed as the office has been tasked by senior DOE management to perform security reviews in other areas of DOE operations. Because classification oversight is now one component within the much larger oversight agenda of the Office of Security Evaluations--an agenda focused on the physical security of DOE's most sensitive sites--it is uncertain whether classification oversight will have a lower priority than it did when it was solely an Office of Classification responsibility. However, if all of the visits planned for 2006 are completed, the Office of Security Evaluations will be conducting oversight at a pace similar to that prior to October 2005. Because classification oversight results will now be reported as one component in a much larger report on the overall security of DOE sites, it is also unclear whether the new format will have the same depth of analysis or be as comprehensive, detailed, and useful as the format used by the Office of Classification. The Office of Security Evaluations reports are bigger and have a much higher profile with senior DOE management than reports by the Office of Classification. As such, they are written to convey information to a broader and less technically oriented audience. 
Each element of security is rated as "effective performance" (green), "needs improvement" (yellow), or "significant weakness" (red). To accommodate this shift, the format for reporting the results of inspections of classification activities has changed to fit into this larger, well-established Office of Security Evaluations reporting format. These reports have relatively brief executive summaries but are supplemented by several appendixes, one for each component of site security. The executive summary includes the highlights of the inspection, an overall evaluation of security at the site, the formal findings (that is, deficiencies uncovered), and a brief scope and methodology section (which includes a listing of the personnel participating in the inspection). It is uncertain whether the results of the inspection of classification activities will be included in the executive summary, or whether inclusion will depend on the results being particularly noteworthy. Not all aspects of an inspection will be mentioned in the summary section, and most of what is reported on classification and other topics will appear in the respective appendixes. The Office of Security Evaluations' full report will be classified because it will contain information on vulnerabilities in site security. However, according to the office's director, the appendix on classification will likely be unclassified. Since the shift in responsibility, the Office of Security Evaluations has completed one classification inspection of two offices at the Pantex Site, and the new procedures for oversight are still evolving. It is uncertain whether the reporting on classification oversight will be as detailed, specific, and, ultimately, as useful as it was prior to the October 2005 shift in responsibility. 
While the overall reporting format for the Office of Security Evaluations reports is firmly in place, the director of the office told us that the details of how to assess the effectiveness of the classification program are still evolving. Initially, the Office of Security Evaluations plans to gather similarly detailed and comprehensive information from the sites it inspects using the same "data call" as the Office of Classification; the data call requests detailed and specific answers to dozens of questions about the procedures and practices of the site's classification program. The director of the Office of Security Evaluations stressed--and the deputy director of the Office of Classification agreed--that they plan to have the information reported in the classification appendix written in language similar to that in Office of Classification reports, and findings and recommendations for improvement will be conveyed in language no less specific and "actionable" than in the previous reports. Nonetheless, until the Office of Security Evaluations performs several classification inspections and establishes its own record of accomplishment in overseeing DOE classification activities, it is not clear whether oversight will be as effective as it was before the shift in responsibility. Without continued effectiveness, DOE classification activities could become less reliable and more prone to misclassification. On the basis of reviews of over 12,000 classified documents totaling nearly a quarter million pages at 34 sites between 2000 and 2005, DOE officials have found that very few documents are misclassified. Office of Classification inspectors found that 20 documents had been misclassified, an error rate of about one-sixth of 1 percent. At more than two-thirds of the sites (25 of 34), inspectors found no classification errors. The largest number of misclassified documents found at any single site was five, at the Los Alamos National Laboratory in May 2005. 
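The reported error rate follows directly from the counts above. A quick arithmetic check (illustrative only; the report gives the total only as "over 12,000" documents, so 12,000 is used here as a lower bound):

```python
# Quick check of the reported misclassification rate.
# "Over 12,000" documents were reviewed; 12,000 is used as a lower bound,
# so the true rate is at most the value computed here.
misclassified = 20
documents_reviewed = 12_000

rate = misclassified / documents_reviewed        # fraction misclassified
percent = 100 * rate                             # as a percentage
print(f"{percent:.2f}% of documents reviewed")   # about one-sixth of 1 percent
```

At 20 of 12,000 documents, the rate is 0.17 percent, consistent with the "about one-sixth of 1 percent" figure in the report.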
Four of these five documents were classified, but not at the proper level or category. The fifth document, which contained nuclear weapons information, should have been classified but was unclassified and found in the laboratory's technical library. (See table 1.) Most misclassified documents remained classified, just not at the appropriate level or category. Of greater concern are documents that should be classified but mistakenly are not. Such documents may end up in libraries or on DOE Web sites, where they could reveal sensitive RD and FRD to the public. These errors can be uncovered only through some form of oversight, such as the document reviews that occurred in preparation for, and during, Office of Classification inspections. For example, during an inspection at the Sandia National Laboratories in March 2005, Office of Classification inspectors reviewed more than 170 unclassified documents in the laboratory's holdings and found 2 documents that contained classified information. Without systematic oversight, these kinds of errors are unlikely to be discovered and corrected. While DOE's extensive document reviews provided depth and rigor to its oversight inspections, two notable shortcomings in this process were (1) the inconsistent way that inspectors gained access to the documents they reviewed and (2) the failure to adequately disclose these procedures in inspection reports. At the six DOE sites we visited, the procedures that the Office of Classification inspection teams used to obtain documents varied widely. For example, at the Los Alamos National Laboratory, inspectors were granted unfettered access to any storage vault and library, and they themselves chose the documents for review. Once in a vault or library, inspectors used the document indexes or interviewed the librarians to determine which documents and topics had recently been classified or declassified. 
The inspectors then requested the documents of most interest, or they browsed the collection and pulled files randomly from the shelves. By contrast, at the NNSA Service Center in Albuquerque, site officials selected documents from several different locations, and the inspectors chose from among them. Because the Office of Classification inspectors could not select their own samples, their independence was limited--which could undermine the credibility of their findings. Because DOE does not have a complete inventory of its classified documents, it cannot select a strictly random sample. Nonetheless, DOE officials acknowledged they could improve their selection procedures to make them more consistent and random. Furthermore, in the 34 inspection reports we analyzed, Office of Classification inspectors did not disclose to the reader key facts about how information was gathered, what limitations they agreed to, and how this affected their findings. According to Standards for Internal Control in the Federal Government, independent inspections should properly document and report on the processes they use in their evaluations. The Office of Classification's reports provided no detail about how documents were chosen. Such detail would increase public confidence that DOE's classification oversight is transparent and robust. Since the 1950s, DOE's Office of Classification and its predecessor organizations have developed strong systems of internal controls for managing classified information. At the core of these systems are (1) DOE's requirement that staff authorized to classify documents must complete training and be periodically recertified, (2) its comprehensive guidance, and (3) its program of regular and rigorous oversight to ensure that DOE sites are following agency classification policies. These training, guidance, and oversight programs have provided a proven framework that has contributed to DOE's success in managing classified information. 
However, the recent reduction in oversight activity following a shift in responsibilities raises questions about whether this framework will continue to be as strong. If the oversight inspections planned for the remainder of 2006 are effectively completed, this will demonstrate a resumption of the pace of oversight conducted prior to October 2005. However, if these inspections are not completed, or are not as comprehensive, then the extent and depth of oversight will be diminished, and DOE classification activities may become less reliable and more prone to misclassification. In addition, by implementing more random selection procedures for identifying classified documents to review--and by disclosing these procedures clearly in its reports--DOE has the opportunity to assure both itself and the public that its oversight is, indeed, effective. DOE is the agency most responsible for safeguarding the nation's nuclear secrets, and its classification and declassification procedures are especially vital to national security. At a time when risks of nuclear proliferation are increasing, it is imperative that DOE build on its past successes in order to continue to be effective. To help ensure that DOE classification activities remain effective and result in documents that are classified and declassified according to established criteria, we recommend that the Secretary of Energy take the following three actions: ensure that the classified information oversight program provides oversight to a similar number of DOE sites as it did before October 2005 and provides a similar depth of analysis; strengthen the review of classified documents by applying selection procedures that more randomly identify documents for review; and disclose the selection procedures used for document review in future classification inspection reports. In commenting on a draft of this report, DOE agreed with the findings and recommendations of the report. 
DOE was pleased that its classification program is being recognized as particularly effective in protecting information vital to national security. However, while DOE agreed with our recommendation that steps be taken to ensure that the classification oversight program provide oversight to a similar number of sites at a similar depth of analysis, it asserted that it is in fact already taking the needed actions and has, overall, "retained the effective framework previously established by the Office of Classification." Although we are encouraged by DOE's efforts, until the agency establishes a record of accomplishment under the new organizational structure, it will not be clear whether oversight will be as effective as it has been in the past. DOE also concurred with our recommendations to strengthen the review of classified documents by applying selection procedures that more randomly identify documents for review and disclose these procedures in future reports and outlined steps it will take to implement these two recommendations. Comments from DOE's Director, Office of Security and Safety Performance Assurance are reprinted in appendix II. DOE also provided technical comments, which we incorporated into the report as appropriate. We are sending copies of this report to the Secretary of Energy; the Director, Office of Management and Budget; and interested congressional committees. We will also make copies available to others upon request. In addition, this report will be available at no charge on the GAO Web site at http://www.gao.gov. If you or your staff have any questions concerning this report, please contact me at (202) 512-3841 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff that made major contributions to this report are listed in appendix III. 
The Department of Energy (DOE) classifies and declassifies information under authorities granted by the Atomic Energy Act, first passed in 1946, and under presidential executive orders governing national security information. These authorities and corresponding implementing directives provide for three classification levels: Top Secret, Secret, and Confidential. DOE uses three categories to identify the different types of classified information: Restricted Data, Formerly Restricted Data, and National Security Information. In addition to classified information, certain types of unclassified information are sensitive and require control to prevent public release. The markings used and the controls in place depend on the statutory basis of the unclassified control system and vary in DOE from Official Use Only information to Unclassified Controlled Nuclear Information. At a practical level, unclassified information is controlled or not controlled depending on its sensitivity, any overriding public interest requiring release, or operational considerations involving the benefit of control versus the cost of control (for example, when information must be shared with uncleared state or local government officials). The information presented below summarizes the various levels and categories used by DOE to classify and control information. All classified information and documents are classified at one of three levels, listed in descending order of sensitivity: Top Secret (TS), Secret (S), or Confidential (C). Restricted Data (RD). Classified under authority of the Atomic Energy Act (AEA) of 1954, as amended. Defined in the AEA as all data concerning: the design, manufacture, or utilization of atomic weapons; the production of special nuclear material (examples include production reactors and isotope separation, such as gaseous diffusion, gas centrifuge, and laser isotope separation); and the use of special nuclear material in the production of energy. 
Examples of the latter include naval reactors and space power reactors. RD does not include information declassified or removed from the RD category. RD documents are not portion marked; an entire document is classified at the level of the most sensitive information it contains. Formerly Restricted Data (FRD). Classified under authority of the AEA of 1954, as amended. Information that has been removed from the RD category because DOE and the Department of Defense have jointly determined that the information (1) now relates primarily to the military utilization of atomic weapons and (2) can be adequately safeguarded as defense information. Examples include weapon stockpile quantities, weapons safety and storage, weapon yields, and weapon locations. FRD documents are not portion marked. National Security Information (NSI). Classified under the authority of Executive Order 12958, as amended. Information that pertains to the national defense or foreign relations of the United States and is classified in accordance with the current executive order as Top Secret, Secret, or Confidential. NSI documents may be classified up to a 25-year limit unless they contain information that has been approved for exemption from declassification under Executive Order 12958, as amended, and based on an approved declassification guide. For example, DOE treats certain nuclear-related information that is not RD or FRD, such as security measures for nuclear facilities, as exempt from declassification until such facilities are no longer in use. Many of these facilities have been in use for over 50 years. NSI documents are portion marked by paragraph. Confidential Foreign Government Information - Modified Handling Authorized (C/FGI-MOD). An agency must safeguard foreign government information under standards providing a degree of protection at least equivalent to that required by the government or international organization that furnished the information. 
If the FGI requires a level of protection lower than that for Confidential, the United States can, under Executive Order 12958 section 4.1(h), classify and protect it as C/FGI-MOD, which provides protection and handling instructions similar to that provided to United States Official Use Only. Before C/FGI-MOD was created, the only legal way for such information to be controlled was at the Confidential level, which resulted in over-protection, increased security cost, and operational complexity. Each classified document must be marked to show its classification level (and classification category if RD or FRD), who classified it, the basis for the classification, and the duration of classification (if NSI). Lack of a category marking indicates the classified document is NSI. A document containing only NSI must be portion marked. An RD document, for example, will be marked TSRD (Top Secret Restricted Data), showing the classification level and category. RD documents are similarly marked SRD (Secret Restricted Data), or CRD (Confidential Restricted Data). A document should never simply be marked "RD." The same rules apply to FRD information (TSFRD, SFRD, and CFRD). A classified document that is not RD or FRD is an NSI document. NSI documents are marked as TSNSI (Top Secret National Security Information), SNSI (Secret National Security Information), or CNSI (Confidential National Security Information); or simply Top Secret, Secret, or Confidential. Controlled under authority of the AEA of 1954, as amended. the design of nuclear material production facilities or utilization facilities; security measures for protecting such facilities, nuclear material contained in such facilities, or nuclear material in transit; The design, manufacture, or utilization of any atomic weapon or component if it has been declassified or removed from the RD category. 
UCNI markings - A document containing UCNI must be marked at the top and bottom of each page with "Unclassified Controlled Nuclear Information" or "UCNI" and include, on the front of the document, a marking that identifies the Reviewing Official making the determination, the date of the determination, and the guidance used. Unclassified information that may be exempt from public disclosure under provisions of the Freedom of Information Act (FOIA) that is not otherwise subjected to a formally implemented control system. A decision to control information as OUO does not mean that such information is automatically exempt from disclosure if requested under the FOIA. That determination is made by a FOIA Authorizing Official only when the document is requested. The OUO marking merely serves as a warning that the document reviewer considers the information to be sensitive and indicates why by including on the document the FOIA exemption that the document reviewer thinks applies. OUO markings - Documents determined to contain OUO information are and when they do, they state which FOIA exemption applies. This classification guide is then cited on the OUO stamp.) NNPI concerns all classified and controlled unclassified information related to the naval nuclear propulsion program. This marking supplements existing classification and control systems and is not a separate category outside of the authorities provided under the AEA or Executive Order 12958 for, as an example, classified NNPI. The use of "NNPI" is an additional marking applied to some of the previously defined categories of information to indicate additional controls for protection or access. Classified under the authority of the AEA of 1954, as amended, or Executive Order 12958, as amended. 
All classified information concerning the design, arrangement, development, manufacture, testing, operation, administration, training, maintenance, and repair of propulsion plants of naval nuclear powered ships and prototypes, including associated shipboard and shore-based nuclear support facilities. Markings can be RD or NSI. C-NNPI documents containing RD information are marked TSRD, SRD, or CRD. C-NNPI NSI documents are typically marked Secret NOFORN ("not Documents containing information classified under the authority of the AEA are not portion marked. Controlled in accordance with Naval Sea Systems Command Instruction C5511.32B and protected pursuant to export control requirements and statutes. All unclassified but controlled information concerning the design, arrangement, development, manufacture, testing, operation, administration, training, maintenance, and repair of propulsion plants of naval nuclear powered ships and prototypes, including associated shipboard and shore-based nuclear support facilities. U-NNPI documents will be marked and controlled as NOFORN (not releasable to foreign nationals). In addition, Nancy Crothers, Robin Eddington, Doreen Feldman, William Lanouette, Greg Marchand, Terry Richardson, Kevin Tarmann, and Ned Woodward made significant contributions to this report. | In recent years, the Congress has become increasingly concerned that federal agencies are misclassifying information. Classified information is material containing national defense or foreign policy information determined by the U.S. government to require protection for reasons of national security. GAO was asked to assess the extent to which (1) DOE's training, guidance, and oversight ensure that information is classified and declassified according to established criteria and (2) DOE has found documents to be misclassified. 
DOE's Office of Classification's systematic training, comprehensive guidance, and rigorous oversight programs had a largely successful history of ensuring that information was classified and declassified according to established criteria. However, an October 2005 shift in responsibility for classification oversight to the Office of Security Evaluations has created uncertainty about whether a high level of performance in oversight will be sustained. Specifically, prior to this shift, the Office of Classification had performed 34 inspections of classification programs at DOE sites since 2000. These inspections reviewed whether DOE sites complied with agency classification policies and procedures. After the October 2005 shift, however, the pace of this oversight was interrupted as classification oversight activities ceased until February 2006. So far in 2006, one classification oversight report has been completed for two offices at DOE's Pantex Site in Texas, and work on a second report is under way at four offices at the Savannah River Site in South Carolina. More oversight inspections evaluating classification activity at eight DOE offices are planned for the remainder of 2006. In addition, according to the Director of the Office of Security Evaluations, the procedures for conducting future oversight are still evolving--including the numbers of sites to be inspected and the depth of analysis to be performed. If the oversight inspections planned for the remainder of 2006 are completed, it will demonstrate resumption in the pace of oversight conducted prior to October 2005. However, if these inspections are not completed, or are not as comprehensive as in the past, the extent and depth of oversight will be diminished and may result in DOE classification activities becoming less reliable and more prone to misclassification. 
On the basis of reviews of classified documents performed during its 34 oversight inspections, the Office of Classification believes that very few of DOE's documents had been misclassified. The department's review of more than 12,000 documents between 2000 and 2005 uncovered 20 documents that had been misclassified--less than one-sixth of 1 percent. DOE officials believe that its misclassification rate is reasonable given the large volume of documents processed. Most misclassified documents remained classified, just not at the appropriate level or category. Of greater concern are the several documents that should have been classified but mistakenly were not. When mistakenly not classified, such documents may end up in libraries or on DOE Web sites where they could reveal classified information to the public. The only notable shortcomings we identified in these inspections were the inconsistent way the Office of Classification teams selected the classified documents for review and a failure to adequately disclose these procedures in their reports. Inspection teams had unfettered access when selecting documents to review at some sites, but at others they only reviewed documents from collections preselected by site officials. Office of Classification reports do not disclose how documents were selected for review. | 7,725 | 677 |
Over the past decade, DOD has increasingly relied on contractors to provide a range of mission-critical services, from operating information technology systems to providing logistical support on the battlefield. The growth in spending on services clearly illustrates this point. DOD's obligations on service contracts, expressed in constant fiscal year 2006 dollars, rose from $85.1 billion in fiscal year 1996 to more than $151 billion in fiscal year 2006, a 78 percent increase. More than $32 billion--or 21 percent--of DOD's obligations on services in fiscal year 2006 were for professional, administrative, and management support contracts. Overall, according to DOD, the amount obligated on service contracts exceeded the amount the department spent on supplies and equipment, including major weapon systems. Several factors have contributed to the growth in service contracts. For example, after the September 2001 terrorist attacks, increased security requirements and the deployment of active duty and reserve personnel left DOD with fewer military personnel to protect domestic installations; the U.S. Army awarded contracts worth nearly $733 million to acquire contract guards at 57 installations. Growth was also caused by changes in the way DOD acquired certain capabilities. For example, DOD historically bought space launch vehicles, such as the Delta and Titan rockets, as products. Now, under the Evolved Expendable Launch Vehicle program, the Air Force purchases launch services using contractor-owned launch vehicles. Similarly, the Air Force and Army turned to service contracts for simulator training primarily because efforts to modernize existing simulator hardware and software had lost out in the competition for procurement funds. Buying training as a service meant that operation and maintenance funds could be used instead of procurement funds.
Overall, however, our work found that to a large degree, this growth simply happened and was not a managed outcome. As the amount and complexity of contracting for services have increased, the size of the civilian workforce has decreased. More significantly, DOD carried out this downsizing without ensuring that it had the requisite skills and competencies needed to manage and oversee service acquisitions. Consequently, DOD is challenged in its ability to maintain a workforce with the requisite knowledge of market conditions, industry trends, and the technical details about the services they procure; the ability to prepare clear statements of work; and the capacity to manage and oversee contractors. Participants in an October 2005 GAO forum on Managing the Supplier Base for the 21st Century commented that the current federal acquisition workforce significantly lacks the new business skills needed to act as contract managers. In June 2006, DOD issued a human capital strategy that acknowledged that DOD's civilian workforce is not balanced by age or experience. DOD's strategy identified a number of steps planned over the next 2 years to more fully develop a long-term approach to managing its acquisition workforce. For example, DOD's Director of Defense Procurement and Acquisition Policy testified in January 2007 that DOD has been developing a model that will address the skills and competencies necessary for DOD's contracting workforce. That model will be deployed this year. The Director stated that this effort would allow DOD to assess the workforce in terms of size, capability, and skill mix, and to develop a comprehensive recruiting, training, and deployment plan to meet the identified capability gaps. A report we issued in November 2006 on DOD space acquisition provides an example of downsizing in a critical area--cost estimating. 
In this case, there was a belief within the government that cost savings could be achieved under acquisition reform initiatives by reducing technical staff, including cost estimators, since the government would be relying more on commercial-based solutions to achieve desired capabilities. According to one Air Force cost-estimating official we spoke with, this led to a decline in the number of Air Force cost estimators from 680 to 280. According to this official, many military and civilian cost-estimating personnel left the cost-estimating field, and the Air Force lost some of its best and brightest cost estimators. In turn, because of the decline in in-house resources, space program offices and Air Force cost-estimating organizations are now more dependent on support from contractors. For example, at 11 space program offices, contractors accounted for 64 percent of cost-estimating personnel. The contractor personnel now generally prepare cost estimates while government personnel provide oversight, guidance, and review of the cost-estimating work. Reliance on support contractors raises questions from the cost-estimating community about whether the numbers and qualifications of government personnel are sufficient to provide oversight of and insight into contractor cost estimates. Turning to Iraq, DOD has relied extensively on contractors to undertake major reconstruction projects and provide support to troops in Iraq. DOD is responsible for a significant portion of the more than $30 billion in appropriated reconstruction funds and has awarded and managed many of the large reconstruction contracts, such as the contracts to rebuild Iraq's oil, water, and electrical infrastructure, as well as to train and equip Iraqi security forces. Further, U.S.
military operations in Iraq have used contractors to a far greater extent than in prior operations to provide interpreters and intelligence analysts, as well as more traditional services such as weapons systems maintenance and base operations support. These services are often provided under cost-reimbursement-type contracts, which allow the contractor to be reimbursed for reasonable, allowable, and allocable costs to the extent prescribed in the contracts. Further, these contracts often contain award fee provisions, which are intended to incentivize more efficient and effective contractor performance. If contracts are not effectively managed and given sufficient oversight, the government's risk is likely to increase. For example, we have reported that DOD needs to conduct periodic reviews of services provided under cost-reimbursement contracts to ensure that services are being provided at an appropriate level and quality. Without such a review, the government is at risk of paying for services it no longer needs. Our work, along with that of the Inspectors General, has repeatedly found problems with the practices DOD uses to acquire services. Too often, the department obtains services based on poorly defined requirements and inadequate competition. Further, DOD's management and use of contractors supporting deployed forces suffers from the lack of clear and comprehensive guidance, among other shortfalls. Similarly, once a contract is in place, DOD does not always oversee and manage contractor performance, in part due to capacity issues. Many of these problems show up in the department's use of other agencies' contracts. Collectively, these problems expose DOD to unnecessary risk, complicate efforts to hold DOD and contractors accountable for poor acquisition outcomes, and increase the potential for fraud, waste, or abuse of taxpayer dollars. Poorly defined or broadly described requirements have contributed to undesired service acquisition outcomes.
To produce desired outcomes within available funding and required time frames, DOD and its contractors need to clearly understand acquisition objectives and how they translate into the contract's terms and conditions. The absence of well-defined requirements and clearly understood objectives complicates efforts to hold DOD and contractors accountable for poor acquisition outcomes. Contracts, especially service contracts, often do not have definitive or realistic requirements at the outset needed to control costs and facilitate accountability. This situation is illustrated in the following examples: In June 2004, we found that during Iraqi reconstruction efforts, when requirements were not clear, DOD often entered into contract arrangements that introduced risks. We reported that DOD often authorized contractors to begin work before key terms and conditions, such as the work to be performed and its projected costs, were fully defined. In September 2006, we reported that, under this approach, DOD contracting officials were less likely to remove costs questioned by auditors if the contractor had incurred these costs before reaching agreement on the work's scope and price. In one case, the Defense Contract Audit Agency questioned $84 million in an audit of a task order for an oil mission. In that case, the contractor did not submit a proposal until a year after the work was authorized, and DOD and the contractor did not negotiate the final terms of the contract until more than a year after the contractor had completed the work. We will issue a report later this year on DOD's use of undefinitized contract actions. In July 2004, we noted that personnel using the Army's Logistics Civil Augmentation Program (LOGCAP) contract in Iraq, including those who may be called upon to write statements of work and prepare independent government cost estimates, had not always received the training needed to accomplish their missions. 
We noted, for example, that the statement of work required the contractor to provide water for units within 100 kilometers of designated points but did not indicate how much water needed to be delivered to each unit or how many units needed water. Without such information, the contractor may not be able to determine how to meet the needs of the Army and may take unnecessary steps to do so. Further, we have reported that contract customers need to conduct periodic reviews of services provided under cost-reimbursable contracts to ensure that services are being supplied at an appropriate level. Without such a review, the government is at risk of paying for services it no longer needs. For example, the command in Iraq lowered the cost of the LOGCAP contract by $108 million by reducing services and eliminating unneeded dining facilities and laundries. Competition is a fundamental principle underlying the federal acquisition process. Nevertheless, we have reported on the lack of competition in DOD's acquisition of services since 1998. We have reported that DOD has, at times, sacrificed the benefits of competition for expediency. For example, we noted in April 2006 that DOD awarded contracts for security guard services supporting 57 domestic bases, 46 of which were awarded on an authorized, sole-source basis. DOD awarded these sole-source contracts despite recognizing that it was paying about 25 percent more than it had previously paid for contracts awarded competitively. In this case, we recommended that the Army reassess its acquisition strategy for contract security guards, using competitive procedures for future contracts and task orders. DOD agreed and is in the process of revising its acquisition strategy. In Iraq, the need to award contracts and begin reconstruction efforts quickly contributed to DOD's using other than full and open competition during the initial stages of reconstruction.
While full and open competition can be a tool to mitigate acquisition risks, DOD procurement officials had only a relatively short time--often only weeks--to award the first major reconstruction contracts. As a result, these contracts were generally awarded using other than full and open competition. We recently reported that DOD competed the vast majority of its contract obligations from October 1, 2003, through March 31, 2006. We were able to obtain data on $7 billion, or 82 percent, of DOD's total contract obligations during this period. Our ability to obtain complete information on DOD reconstruction contract actions, however, was limited because not all DOD components consistently tracked or fully reported this information. Since the mid-1990s, our reports have highlighted the need for clear and comprehensive guidance for managing and overseeing the use of contractors that support deployed forces. As we reported in December 2006, DOD has not yet fully addressed this long-standing problem. Such problems are not new. In assessing LOGCAP implementation during the Bosnian peacekeeping mission in 1997, we identified weaknesses in the available doctrine on how to manage contractor resources, including how to integrate contractors with military units and what type of management and oversight structure to establish. We identified similar weaknesses when we began reviewing DOD's use of contractors in Iraq. For example, in 2003 we reported that guidance and other oversight mechanisms varied widely at the DOD, combatant command, and service levels, making it difficult to manage contractors effectively. Similarly, in our 2005 report on private security contractors in Iraq, we noted that DOD had not issued any guidance to units deploying to Iraq on how to work with or coordinate efforts with private security contractors.
Further, we noted that the military may not have a clear understanding of the role of contractors, including private security providers, in Iraq and of the implications of having private security providers in the battle space. In our view, establishing baseline policies for managing and overseeing contractors would help ensure the efficient use of contractors in places such as Iraq. DOD addressed some of these issues when it issued new guidance in October 2005 on the use of contractors who support deployed forces. However, as our December 2006 report made clear, DOD's guidance does not address a number of problems we have repeatedly raised--such as the need to provide adequate contract oversight personnel, to collect and share lessons learned on the use of contractors supporting deployed forces, and to provide DOD commanders and contract oversight personnel with training on the use of contractors overseas before deployment. Since our December 2006 report was issued, DOD officials indicated that DOD was developing a joint publication entitled Contracting and Contractor Management in Joint Operations, which is expected to be distributed in May 2007. Our work has also highlighted the need for DOD components to comply with departmental guidance on the use of contractors. For example, in our June 2003 report we noted that DOD components were not complying with a long-standing requirement to identify essential services provided by contractors and develop backup plans to ensure the continuation of those services during contingency operations should contractors become unavailable to provide those services. Other reports highlighted our concerns over DOD's planning for the use of contractor support in Iraq, including the need to comply with guidance to identify operational requirements early in the planning process. 
When contractors are involved in planning efforts early and given adequate time to plan and prepare to accomplish their assigned tasks, the quality of the contractor's services improves and contract costs may be lowered. DOD's October 2005 guidance on the use of contractor support to deployed forces went a long way toward consolidating existing policy and providing guidance on a wide range of contractor issues. However, as of December 2006, we found little evidence that DOD components were implementing that guidance, in part because no individual within DOD was responsible for reviewing DOD's and the services' efforts to ensure the guidance was being consistently implemented. In our 2005 report on LOGCAP, we recommended that DOD designate a LOGCAP coordinator with the authority to participate in deliberations and advocate the most effective and efficient use of the LOGCAP contract. Similarly, in 2006 we recommended that DOD appoint a focal point within the Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics--at a sufficiently senior level and with the appropriate resources--dedicated to leading DOD's efforts to improve its contract management and oversight. DOD agreed with these recommendations. In October 2006, DOD established the office of the Assistant Deputy Under Secretary of Defense for Program Support to serve as the office of primary responsibility for contractor support issues, but the office's specific roles and responsibilities have not yet been clearly defined. GAO has reported on numerous occasions that DOD did not adequately manage and assess contractor performance to ensure that the business arrangement was properly executed. Managing and assessing post-award performance entails various activities to ensure that the delivery of services meets the terms of the contract and requires adequate surveillance resources, proper incentives, and a capable workforce for overseeing contracting activities.
If surveillance is not conducted, is insufficient, or is not well documented, DOD is at risk of being unable to identify and correct poor contractor performance in a timely manner and of potentially paying too much for the services it receives. Our work has found, however, that DOD is often at risk. In March 2005, for example, we reported instances of inadequate surveillance on 26 of 90 DOD service contracts we reviewed. In each instance, at least one of the key factors needed to ensure adequate surveillance did not take place. These factors are (1) training personnel in how to conduct surveillance, (2) assigning personnel at or prior to contract award, (3) holding personnel accountable for their surveillance duties, and (4) performing and documenting surveillance throughout the period of the contract. Officials we met with during our review expressed concerns about support for surveillance. These included comments from Navy officials who told us that surveillance remained a part-time duty that they did not have enough time to undertake and that, consequently, it was treated as a low-priority task. More recently, in December 2006 we reported that DOD does not have sufficient numbers of contractor oversight personnel at deployed locations, which limits its ability to obtain reasonable assurance that contractors are meeting contract requirements efficiently and effectively. For example, an Army official acknowledged that the Army is struggling to find the capacity and expertise to provide the contracting support needed in Iraq. A LOGCAP program official noted that if adequate staffing had been in place, the Army could have realized substantial savings on the LOGCAP contract through more effective reviews of new requirements.
A Defense Contract Management Agency official responsible for overseeing the LOGCAP contractor's performance at 27 locations noted that he was unable to visit all of those locations during his 6-month tour to determine the extent to which the contractor was meeting contract requirements. The lack of visibility on the extent of services provided by contractors to deployed forces contributes to this condition. Without such visibility, senior leaders and military commanders cannot develop a complete picture of the extent to which they rely on contractors to support their operations. We first reported the need for better visibility in 2002 during a review of the costs associated with U.S. operations in the Balkans. At that time, we reported that DOD was unaware of (1) the number of contractors operating in the Balkans, (2) the tasks those contractors were contracted to do, and (3) the government's obligations to those contractors under the contracts. We noted a similar situation in 2003 in our report on DOD's use of contractors to support deployed forces in Southwest Asia and Kosovo. Our December 2006 review of DOD's use of contractors in Iraq found continuing problems with visibility over contractors. For example, when senior military leaders began to develop a base consolidation plan, officials were unable to determine how many contractors were deployed and therefore ran the risk of over- or under-building the capacity of the consolidated bases. DOD's October 2005 guidance on contractor support to deployed forces included a requirement that the department develop or designate a joint database to maintain by-name accountability of contractors deploying with the force and a summary of the services or capabilities contractors provide. The Army has taken the lead in this effort, and recently DOD designated a database intended to provide improved visibility over contractors deployed to support the military in Iraq, Afghanistan, and elsewhere. 
According to DOD, in January 2007, the department designated the Army's Synchronized Predeployment & Operational Tracker (SPOT) as the departmentwide database to maintain by-name accountability of all contractors deploying with the force. According to DOD, the SPOT database includes approximately 50,000 contractor names. Additionally, in December 2006, the Defense Federal Acquisition Regulation Supplement was amended to require the use of the SPOT database by contractors supporting deployed forces. In January 2005, we identified management of interagency contracts as a high-risk area because of their rapid growth, the limited expertise of users and administrators, and unclear lines of accountability. Since DOD is the largest user of interagency contracts in the government, it can ill afford to expose itself to such risks. Relying on other agencies for contracting support requires sound practices. For example, under an interagency arrangement, the number of parties in the contracting process increases, and ensuring the proper use of these contracting arrangements must be viewed as a shared responsibility that requires agencies to define clearly who does what in the contracting process. However, the problems I discussed previously regarding defining requirements, ensuring competition, and monitoring contractor performance are frequently evident in interagency contracting. Additionally, DOD pays a fee to other agencies when using their contracts or contracting services, which could potentially increase DOD's costs. Our work, as well as that of the Inspectors General, found competition-related issues in DOD's use of interagency contracting vehicles. DOD is required to foster competition and provide all contractors a fair opportunity to be considered for each order placed on GSA's multiple-award schedules, unless certain exceptions apply.
DOD officials, however, have on numerous occasions avoided the time and effort necessary to award individual orders competitively and instead awarded all the work to be performed to a single contractor. We found that this practice resulted in the noncompetitive award of many orders that were not always adequately justified. In April 2005, we reported that a lack of effective management controls--in particular, insufficient management oversight and a lack of adequate training--led to breakdowns in the issuance and administration of task orders for interrogation and other services in Iraq by the Department of the Interior on behalf of DOD. These breakdowns included: issuing 10 of 11 task orders that were beyond the scope of the underlying contracts, in violation of competition rules; not complying with additional DOD competition requirements when issuing task orders for services on existing contracts; not properly justifying the decision to use interagency contracting; not complying with ordering procedures meant to ensure best value; and not adequately monitoring contractor performance. Because officials at Interior and the Army responsible for the orders did not fully carry out their responsibilities, the contractor was allowed to play a role in the procurement process normally performed by government officials. Further, the Army officials responsible for overseeing the contractor, for the most part, lacked knowledge of contracting issues and were not aware of their basic duties and responsibilities. In July 2005, we reported on various issues associated with DOD's use of franchise funds at the departments of the Interior and the Treasury--GovWorks and FedSource--that acquired a range of services for DOD. For example, GovWorks did not receive competing proposals for work and added substantial work to the orders without determining that prices were fair and reasonable.
FedSource generally did not ensure competition for work, did not conduct price analyses, and sometimes paid contractors higher prices for services than were specified in the contracts, with no justification in the contract files. At both funds, we found that the files we reviewed lacked clear descriptions of the requirements the contractor was supposed to meet. For its part, DOD did not analyze contracting alternatives and lacked information about purchases made through these arrangements. We also found that DOD and franchise fund officials were not monitoring contracts and lacked criteria against which contractor performance could be measured to ensure that contractors provided quality services in a timely manner. We identified several causes for the lack of sound practices. In some cases, there was a lack of clear guidance, and contracting personnel were insufficiently trained on the use of interagency contracting arrangements. In many cases, DOD users chose the speed and convenience of an interagency contracting arrangement to respond to and meet needs quickly. Contracting service providers, under a fee-for-service arrangement, sometimes inappropriately emphasized customer satisfaction and revenue generation over compliance with sound contracting policies and procedures. These practices put DOD at risk of not getting required services at reasonable prices and of unnecessarily wasting resources. Further, DOD does not have useful information about purchases made through other agencies' contracts, making it difficult to assess the costs and benefits and make informed choices among the alternative methods available. Similarly, the DOD Inspector General recently reported on issues with DOD's use of contracts awarded by the departments of the Interior and the Treasury, GSA, and the National Aeronautics and Space Administration (NASA).
For example, in November 2006, the Inspector General reported that DOD contracting and program personnel did not comply with acquisition rules and regulations when using contracts awarded by NASA, such as not always complying with fair opportunity requirements or not adequately justifying the use of a non-DOD contracting vehicle. As a result, the Inspector General concluded that funds were not used as intended by Congress, competition was limited, and DOD had no assurance that it received the best value. Additionally, the Inspector General found that DOD used Interior and GSA to "park" funds that were expiring. The agencies subsequently placed contracts for DOD using the expired funds, thereby circumventing appropriations law. The Inspector General concluded that these problems were driven by a desire to hire a particular contractor, the desire to obligate expiring funds, and the inability of the DOD contracting workforce to respond to its customers in a timely manner. DOD and other agencies have taken steps to address some of these issues, including issuing an October 2006 memorandum intended to strengthen internal controls over the use of interagency contracts and signing a December 2006 memorandum of understanding with GSA to work together on 22 basic contracting management controls, such as ensuring that sole-source justifications are adequate, that statements of work are complete, and that interagency agreements describe the work to be performed. Similarly, GSA has worked with DOD to identify unused and expired DOD funds maintained in GSA accounts. Further, according to the Inspector General, Interior has withdrawn numerous warrants in response to these findings. For several years, Congress and GAO have identified the need to improve DOD's overall approach to acquiring services. In 2002, we noted that DOD's approach to buying services was largely fragmented and uncoordinated.
Responsibility for acquiring services was spread among individual military commands, weapon system program offices, and functional units on military bases, with little visibility or control at the DOD or military department level. Although DOD has taken action to address the deficiencies and implement legislative requirements, its actions to date have not equated with progress. DOD's current approach to acquiring services suffers from the absence of key elements at the strategic and transactional levels and does not position the department to make service acquisitions a managed outcome. Considerable congressional effort has been made to improve DOD's approach to acquiring services. For example, in 2001, Congress passed legislation to ensure that DOD acquires services by means that are in the best interest of the government and managed in compliance with applicable statutory requirements. In this regard, sections 801 and 802 of the National Defense Authorization Act for Fiscal Year 2002 required DOD to establish a service acquisition management approach, including developing a structure for reviewing individual service transactions based on dollar thresholds and other criteria. Last year, Congress amended requirements pertaining to DOD's service contracting management structure, workforce, and oversight processes, among others. We have issued several reports that identified shortcomings in DOD's approaches and its implementation of legislative requirements. For example, we issued a report in January 2002 that identified how leading commercial companies took a strategic approach to buying services and recommended that DOD evaluate how a strategic reengineering approach, such as that employed by leading companies, could be used as a framework to guide DOD's reengineering efforts.
In September 2003, we reported that DOD's actions to implement the service acquisition management structure required under sections 801 and 802 did not provide a departmentwide assessment of how spending for services could be more effective, and we recommended that DOD give greater attention to promoting a strategic orientation by setting performance goals for improvements and ensuring accountability for achieving those results. Most recently, in November 2006, we issued a report that identified a number of actions that DOD could take to improve its acquisition of services. We noted that DOD's overall approach to managing services acquisitions suffered from the absence of several key elements at both the strategic and transactional levels. The strategic level is where the enterprise, DOD in this case, sets the direction or vision for what it needs, captures the knowledge to enable more informed management decisions, ensures departmentwide goals and objectives are achieved, determines how to go about meeting those needs, and assesses the resources it has to achieve desired outcomes. The strategic level also sets the context for the transactional level, where the focus is on making sound decisions on individual service acquisitions. Factors for good outcomes at the transactional level include valid and well-defined requirements, appropriate business arrangements, and adequate management of contractor performance. DOD's current approach to managing the acquisition of services tended to be reactive and did not fully address the key factors for success at either the strategic or the transactional level. At the strategic level, DOD had not developed a normative position for gauging whether ongoing and planned efforts can best achieve intended results. Further, DOD lacked good information on the volume and composition of services, perpetuating the circumstance in which the acquisition of services tended to happen to DOD, rather than being proactively managed.
For example, despite implementing a review structure aimed at increasing insight into service transactions, DOD was not able to determine which or how many transactions had been reviewed. The military departments had only slightly better visibility, having reviewed proposed acquisitions accounting for less than 3 percent of dollars obligated for services in fiscal year 2005. Additionally, most of the service acquisitions the military services reviewed involved indefinite delivery/indefinite quantity contracts. DOD's policy for managing service acquisitions had no requirement, however, to review individual task orders that were subsequently issued, even if the value of the task order exceeded the review threshold. Further, the reviews tended to focus more on ensuring compliance with applicable statutes, regulations, and other requirements than on imparting a vision or tailored method for strategically managing service acquisitions. Our discussions with officials at buying activities that had proposed service acquisitions reviewed under this process revealed that, for the most part, officials did not believe the review significantly improved those acquisitions. These officials indicated that the timing of the review process--which generally occurred well into the planning cycle--was too late to provide opportunities to influence the acquisition strategy. These officials told us that the reviews would be more beneficial if they were conducted earlier in the process, in conjunction with the program office or customer, and in the context of a more strategic approach to meeting the requirement, rather than simply as a secondary or tertiary review of the contract. At the transactional level, DOD tended to focus primarily on those elements associated with awarding contracts, with much less attention paid to the formulation of service acquisition requirements and to assessment of the actual delivery of contracted services.
Moreover, the results of individual acquisitions were generally not used to inform or adjust strategic direction. As a result, DOD was not in a position to determine whether its investments in services were achieving their desired outcomes. Further, DOD and military department officials identified many of the same problems I discussed previously in defining requirements, establishing sound business arrangements, and providing effective oversight, as the following examples show: DOD and military department officials consistently identified poor communication and the lack of timely interaction between acquisition and contracting personnel as key challenges to developing good requirements. An Army contracting officer issued a task order for a product that the contracting officer knew was outside the scope of the service contract. The contracting officer noted in an e-mail to the requestor that this deviation was allowed only because the customer needed the product quickly and cautioned that no such allowances would be granted in the future. Few of the commands or activities could provide us reliable or current information on the number of service acquisitions they managed, and others had not developed a means to consistently monitor or assess, at a command level, whether such acquisitions were meeting the performance objectives established in the contracts. To address these issues, we made several recommendations to the Secretary of Defense. DOD concurred with our recommendations and identified actions it has taken, or plans to take, to address them. In particular, DOD noted that it is reassessing its strategic approach to acquiring services, including examining the types and kinds of services it acquires and developing an integrated assessment of how best to acquire such services.
DOD expects this assessment will result in a comprehensive, departmentwide architecture for acquiring services that will, among other improvements, help refine the process to develop requirements, ensure that individual transactions are consistent with DOD's strategic goals and initiatives, and provide a capability to assess whether service acquisitions are meeting their cost, schedule, and performance objectives. In closing, I would like to emphasize that DOD has taken, or is in the process of taking, action to address the issues we identified. These actions, much like the assessment I just mentioned, however, will have little meaning unless DOD's leadership can translate its vision into changes in frontline practices. In our July 2006 report on vulnerabilities to fraud, waste, and abuse, we noted that leadership positions are sometimes vacant, that the culture of streamlining acquisitions for speed may not have been balanced with good business practices, and that even in newly formed government-industry partnerships, the government needs to maintain its oversight responsibility. Understanding the myriad causes of the challenges confronting DOD in acquiring services is essential to developing effective solutions and translating policies into practices. While DOD has generally agreed with our recommendations intended to improve contract management, much remains to be done. At this point, DOD does not know how well its services acquisition processes are working, which part of its mission can best be met through buying services, and whether it is obtaining the services it needs while protecting DOD's and the taxpayer's interests. Mr. Chairman and members of the subcommittee, this concludes my testimony. I would be happy to answer any questions you might have. In preparing this testimony, we relied principally on previously issued GAO and Inspectors General reports.
We conducted our work in May 2007 in accordance with generally accepted government auditing standards. For further information regarding this testimony, please contact John P. Hutton at (202) 512-4841 or [email protected]. Contact points for our Offices of Congressional Relations and Public Affairs can be found on the last page of this testimony. Key contributors to this testimony were Theresa Chen, Timothy DiNapoli, Kathryn Edelman, and John Krump. This is a work of the U.S. government and is not subject to copyright protection in the United States. It may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.

The Department of Defense (DOD) is relying more and more on contractors to provide billions of dollars in services. Congress has pushed DOD to employ sound business practices when using the private sector for services. This testimony discusses DOD's (1) increasing reliance on contractors; (2) efforts to follow sound business practices when acquiring services; and (3) actions to improve its management and oversight of services. This testimony is based on GAO's work spanning several years as well as recent reports issued by the Inspectors General. Over the past decade, DOD has increasingly relied on contractors to provide a range of mission-critical services, from operating information technology systems to providing logistical support on the battlefield. The growth in spending on services clearly illustrates this point. DOD's obligations on service contracts, expressed in constant fiscal year 2006 dollars, rose from $85.1 billion in fiscal year 1996 to more than $151 billion in fiscal year 2006, a 78 percent increase. While obligations increased, the size of the civilian workforce decreased.
Moreover, DOD carried out this downsizing without ensuring that it had the requisite skills and competencies needed to manage and oversee service acquisitions. Overall, our work found that to a large degree, this growth in spending on services simply happened and was not a managed outcome. The lack of sound business practices--poorly defined requirements, inadequate competition, the lack of comprehensive guidance and visibility on contractors supporting deployed forces, inadequate monitoring of contractor performance, and inappropriate use of other agencies' contracts and contracting services--exposes DOD to unnecessary risk, wastes resources, and complicates efforts to hold contractors accountable for poor service acquisition outcomes. For example, DOD awarded contracts for security guard services supporting 57 domestic bases, 46 of which were done on an authorized, sole-source basis. DOD awarded the sole-source contracts despite recognizing that it was paying about 25 percent more than it had previously paid for contracts awarded competitively. Further, the lack of sufficient surveillance on service contracts placed DOD at risk of being unable to identify and correct poor contractor performance in a timely manner and of potentially paying too much for the services it receives. Overall, DOD's management structure and processes overseeing service acquisitions lacked key elements at the strategic and transactional levels. DOD has taken some steps to improve its management of services acquisition, including developing a competency model for its contracting workforce; issuing policies and guidance to improve its management of contractors supporting deployed forces and its use of interagency contracts; and developing an integrated assessment of how best to acquire services. DOD leadership will be critical for translating this assessment into policy and, most importantly, effective frontline practices.
At this point, DOD does not know how well its services acquisition processes are working, which part of its mission can best be met through buying services, and whether it is obtaining the services it needs while protecting DOD's and the taxpayer's interests.