single-family loan insurance programs, such as the borrower’s address, Social Security number, income, and debt are not collected by HUD when Title I loans are made. HUD does collect all of the information available on borrowers, property, and loans when Title I loans default and lenders submit claims. Title I officials told us they collected little information when loans were made because they consider the program to be lender-operated. As a result, HUD cannot identify the characteristics of borrowers and
neighborhoods served by the program, nor can it identify certain potential abuses of the program. For example, HUD does not collect borrowers’ Social Security numbers and property addresses when loans are made. Therefore, HUD would have difficulty determining if some borrowers are obtaining multiple Title I loans or if some borrowers are exceeding the maximum amount of Title I loans per property when loans are made. HUD regulations limit the total amount of indebtedness on Title I loans to $25,000 for each
single-family property. In this regard, our examination of HUD’s Title I claims data found a number of instances in which the same Social Security number was used for multiple claims. As discussed previously, claims on about 10 percent of the program’s loans can be expected over the life of program loans. Our examination of 16,556 claims paid by HUD between January 1994 and August 1997 revealed 247 instances in which the same Social Security number appeared on multiple claims. These cases totaled about $5.2 million in paid claims. In several instances, claims were paid on as many as five loans having the same Social Security number during the 3-1/2-year period. Our Office of Special Investigations, together with HUD’s Office of the Inspector General, is inquiring further into the circumstances surrounding these loans. However, because these loans may have been for multiple properties, or multiple loans on the same property that totaled less than $25,000, they may not have violated program regulations. Allowing individual borrowers to accumulate large amounts of HUD-insured Title I debt, however, exposes HUD to large losses in the case of financial stress on the part of such heavily indebted borrowers. In addition, while information available to HUD allows identification of potential abuses of the $25,000 indebtedness limit after loans have defaulted, control over the indebtedness limitation is not possible for the 90 percent of the program’s loans that do not default, because borrowers’ Social Security numbers
and property addresses are not collected when the loans are made.

While HUD collects more extensive information on program loans when they default, we found problems with the accuracy of some of the information recorded in its claims database. Our random sample of 53 loans on which a claim had been denied and subsequently paid by HUD found that 7 loans, or 13 percent, had been miscoded as dealer loans when they were direct loans, or as direct loans when they were dealer loans. This is important because HUD recently cited high default rates on dealer loans, among other reasons, in proposing regulations to eliminate the dealer loan portion of the program. Given this miscoding of loans as dealer or direct, we question HUD’s ability to identify default experience by loan type.

In addition, HUD’s information on claims denied and subsequently approved was problematic. Although HUD can deny claims for property improvement loans for a number of reasons, HUD did not have a system in place to provide information on why claims are denied or approved for payment following a denial. HUD could not provide us with information on how many claims it denied because of poor underwriting or other program abuses, or on which lenders had a higher-than-average number of claims denied for specific program violations. In addition, we were unable to determine from HUD’s data system why a denied claim was subsequently paid following an appeal by the lender or waiver by HUD. Such information is important in determining how well lenders are complying with program regulations, whether internal controls need to be strengthened, and which lenders should be targeted for review by HUD’s Office of Quality Assurance. We also found that files for claims that were initially denied by HUD and subsequently paid frequently did not contain the names of the program officials who decided the denied claims should be paid or the reasons for their decisions. Of the 53 randomly selected loan claim files we examined, 50 contained no evidence of further review by a HUD official following the initial denial, nor any basis for eventually paying the claim. Unless information on who decides to deny claims and the reasons for denials and subsequent payments is documented, HUD has no basis for reviewing the reasonableness of those decisions. HUD recently made changes to its claims database system to identify the reasons claims are denied. Program officials agreed that such information is important in determining how well program regulations
are being complied with and in targeting lenders for quality assurance reviews. Claims examiners are now required to identify their reasons for denial, including the section of the regulation that was violated. However, the change does not address the problem of missing documentation in the claims file explaining the reasons for paying claims that were previously denied.

HUD’s monitoring reviews of Title I lenders to identify compliance problems have declined substantially in recent years. Between fiscal years 1995 and 1997, HUD performed 33 Title I on-site quality assurance reviews of lenders. Most of these reviews (26) were performed in fiscal year 1995; during fiscal years 1996 and 1997, HUD performed five and two on-site lender reviews, respectively. According to HUD officials, prior to fiscal year 1997, HUD had a staff of 23 individuals to monitor the 3,700 lenders approved by FHA to make Title I loans and the approximately 8,000 other FHA-approved lenders making loans under other FHA insurance programs. Because of these limited monitoring resources, these officials said, HUD decided to focus its lender monitoring on major, high-volume FHA programs. Monitoring priorities have also led to few follow-up reviews by HUD. As a result, it is difficult to determine the impact that the quality assurance reviews that were performed had on improving lenders’ compliance.

When making Title I loans, lenders are required to ensure that borrowers represent acceptable credit risks, with a reasonable ability to make payments on the loans, and to see that the property improvement work is completed. However, our examination of 53 loan claim files revealed that one or more required documents needed to ensure program compliance were missing from more than half (30) of the files. In 12 cases, the required original loan application, signed by the borrower, was not in the loan file. The original loan application is important because it is used by the claims examiner to review the adequacy of the lender’s underwriting and to ensure that the borrower’s signature and Social Security number match those on other documents, including the credit report. Furthermore, for 23 of the 53 claim files, we found that the required completion certificates, certifying that the property improvement work had been completed, were missing or were signed but not dated by the borrowers. According to program guidelines, claims submitted for payment after defaults have occurred on dealer loans should not be paid unless a signed completion certificate is in the file. We found that completion certificates were missing from the files for 13 dealer loans and were not dated for another 4 dealer loans. Lastly, of 33 loans on which program regulations required that an inspection be conducted by the lender, 18 loan files did not contain the inspection report. We also reviewed the 53 claim files to determine how well lenders were complying with underwriting standards. All documentation supporting the underwriting determination should be retained in the loan file, according to HUD regulations.
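The document checks applied to these files can be sketched as a simple screen over one claim record. The field names and the exact rule set below are illustrative assumptions drawn from the requirements described above, not HUD’s actual data layout:

```python
# Sketch of a Title I claim-file completeness screen. The record fields
# and rules are simplified assumptions based on the requirements described
# in the text (signed original application; signed AND dated completion
# certificate for dealer loans; inspection report where one was required).

def completeness_problems(claim_file: dict) -> list[str]:
    """Return a list of document problems found in one claim file."""
    problems = []

    # Every file needs the original loan application signed by the borrower.
    if not claim_file.get("has_signed_original_application"):
        problems.append("missing signed original loan application")

    # Dealer-loan claims should not be paid without a signed completion
    # certificate; a signature without a date is also treated as a defect.
    if claim_file.get("loan_type") == "dealer":
        if not claim_file.get("has_completion_certificate"):
            problems.append("missing completion certificate")
        elif not claim_file.get("completion_certificate_dated"):
            problems.append("completion certificate signed but not dated")

    # Where regulations required a lender inspection, the report must be on file.
    if claim_file.get("inspection_required") and not claim_file.get("has_inspection_report"):
        problems.append("missing inspection report")

    return problems

# Example: a dealer loan with an undated completion certificate and a
# missing inspection report is flagged twice.
example = {
    "loan_type": "dealer",
    "has_signed_original_application": True,
    "has_completion_certificate": True,
    "completion_certificate_dated": False,
    "inspection_required": True,
    "has_inspection_report": False,
}
print(completeness_problems(example))
# ['completion certificate signed but not dated', 'missing inspection report']
```

Run against all 53 sampled files, a screen of this kind yields the counts reported above (missing applications, missing or undated certificates, missing inspection reports).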
HUD can deny a lender’s claim if the lender has not followed HUD underwriting standards in making the loan. However, HUD does not examine the quality of a lender’s loan underwriting during the claims process if 12 loan payments were made by the borrower before defaulting on the loan. Since 27 percent of the Title I loans that default do so within the first year, this practice, in effect, exempts the majority of defaulted loans from an examination of the quality of the lenders’ underwriting. Of the 53 loans
in our sample, 13 defaulted within 12 months of loan origination and were subject to an underwriting review by HUD. We focused our underwriting examination on these 13 loan claim files. We found that for 4 of the 13 loans, on which HUD eventually paid claims, lenders made questionable underwriting decisions. Title I program regulations require that the credit application and review by the lender establish that the borrower is an acceptable credit risk, has 2 years of stable employment, and has income adequate to meet the periodic payments required by the loan, as well as the borrower’s other housing expenses and recurring charges. However, for four of these loans, information in the files indicated that the borrowers may not have had sufficient income to qualify for the loan or had poor credit. For example, on one loan, the lender used a pay stub covering the first 2 weeks of March to calculate the borrower’s annual income. The pay stub showed that the borrower’s year-to-date earnings were $6,700 by the middle of March, and this amount was used to calculate that his annual income was $34,000, or about $2,800 per month. However, the pay stub also showed that for the 2-week period in March, the borrower worked a full week with overtime and earned only $725, or about $1,600 per month. The file contained no other documentation, such as income tax returns, W-2 forms, or verification from the employer, to support the higher monthly income. Program officials told us that it was acceptable to use
one pay stub to calculate monthly income; however, the “yearly earnings to date” figure should not be used because it can at times inflate the actual income earned during a normal pay period. The borrower, with about $1,600 per month in corrected income, still met HUD’s income requirements for the amount of the loan. However, HUD denied the original claim because its underwriting standards had not been followed, in that the borrower had poor credit at the time the loan was made. In a letter responding to HUD’s denial of its claim, the lender acknowledged that the borrower had limited credit at the time the loan was made, but pointed to the (miscalculated) higher income of $2,800 per month to justify making the loan. This reasoning was apparently accepted by HUD, as there was no evidence in the claim file that HUD questioned the error in calculating the borrower’s monthly income. The borrower defaulted on the loan after making two payments, and HUD paid a claim of $14,000.

Similar problems with lenders’ noncompliance with Title I program regulations have been identified by HUD. As noted previously, between fiscal years 1995 and 1997, HUD performed 33 Title I on-site quality assurance reviews of lenders. Among other things, HUD cited lenders for engaging in poor credit underwriting practices and for having loan files with missing inspection reports or inspection reports that were not signed or dated. HUD sent the lenders letters detailing its findings and requested a written response addressing the findings. HUD, however, did not perform follow-up on-site reviews on 32 of the lenders to ensure that they had taken corrective actions. Of the 33 on-site reviews, nine resulted in lenders being referred to HUD’s Mortgagee Review Board for further action. The Board assessed four of these lenders a total of $23,500 in civil penalties.

Under its HUD 2020 Management Reform Plan and related efforts, HUD has been making changes to Title I program operations. HUD has relocated its claims examination unit to the Albany (New York) Financial Operations Center and contracted with Price Waterhouse to develop claims examination guidelines. According to program officials in Albany, the new claims process will be more streamlined and automated and will include lenders filing claims electronically. In addition, HUD is consolidating all single-family housing operations from 81 locations across the nation into four Single-Family Homeownership Centers. Each center has established a quality assurance division to (1) monitor lenders, (2) recommend sanctions against
lenders and other program participants such as contractors and loan officers, (3) issue limited denials of program participation against program participants, and (4) refer lenders for audits and investigations. However, since HUD’s quality assurance staff will monitor lenders involved in all FHA single-family programs, the impact of this change on improving HUD’s oversight of Title I lenders is unclear. Overall, by the end of fiscal year 1998, the quality assurance staff will increase to 76, up from 43 in February 1998. HUD expects that the addition of more quality assurance staff will increase the number of reviews of lenders and allow more comprehensive reviews of lender operations.

In closing, Mr. Chairman, our preliminary analysis shows weaknesses in HUD’s management of its Title I property improvement loan insurance program and in its oversight of program lenders. These weaknesses center on the absence of information needed to manage the program and on HUD’s oversight of lenders’ compliance with program regulations.
HUD officials attributed these weaknesses to the program’s being lender-operated, limited staff resources, and HUD’s assignment of monitoring priorities. Because of these weaknesses, we are concerned that HUD may have little assurance that the property improvement program is operating efficiently and free of abuse. The challenge faced by HUD in managing and overseeing this program centers on how to obtain the information needed to manage the program and to strengthen the oversight of lenders for this program, which is relatively small compared with other FHA housing insurance programs. Our report will include any recommendations or options we have to offer to strengthen HUD’s management and oversight of the program.

Mr. Chairman, this concludes my statement. We would be pleased to respond to any questions that you or Members of the Subcommittee may have.
DLA is DOD’s logistics manager for all departmental consumable items and some repair parts. Its primary business function is materiel management: providing supply support to sustain military operations and readiness. In addition, DLA performs five other supply-related business functions: distributing materiel from DLA and service-owned inventories, purchasing fuels for DOD and the U.S. government, storing strategic materiel, marketing surplus DOD materiel for reuse and disposal, and providing numerous information services, such as item cataloging, for DOD and the U.S. government, as well as selected foreign governments. These six business functions are managed by field commands that report to and support the agency’s central command authority.

In 2000, DLA refocused its logistics mission from that of a supplier of materiel to a manager of supply chain relationships. To support this transition, the agency developed a strategic plan (known as DLA 21) to reengineer and modernize its operations. Among the goals of DLA 21 are to optimize inventories, improve efficiency, increase effectiveness through organizational redesign, reduce inventories, and modernize business systems.

DLA relies on over 650 systems to support warfighters by allowing access to global inventories. Whether it is ensuring that there is enough fuel to service an aircraft fleet, providing sufficient medical supplies to protect and treat military personnel, or supplying ample food rations to our soldiers on the frontlines, information technology plays a key role in ensuring that Defense Department agencies are prepared for their missions. Because of its heavy reliance on IT to accomplish its mission, DLA invests extensively in this area. For fiscal year 2002, DLA’s IT budget is about $654 million.

Our recent reviews of DLA’s IT management have identified weaknesses in such important areas as enterprise architecture management, incremental investment management, and software acquisition management. In June 2001, we reported that DLA did not have an enterprise architecture to guide the agency’s investment in its Business Systems Modernization (BSM) project—the agency’s largest IT project. The use of an enterprise architecture, which describes an organization’s mode of operation in useful models, diagrams, and narrative, is required by the OMB guidance that implements the Clinger-Cohen Act of 1996 and is a commercial best practice. Such a “blueprint” can help clarify and optimize the dependencies and relationships among an agency’s business operations and
the IT infrastructure and applications supporting them. An effective architecture describes both the environment as it is and the target environment that an organization is aiming for (as well as a plan for the transition from one to the other). We concluded that without this architecture, DLA will be challenged in its efforts to successfully acquire and implement BSM. Further, we reported that DLA was not managing its investment in BSM in an incremental manner, as required by the Clinger-Cohen Act of 1996
and OMB guidance and in accordance with best commercial practices. An incremental approach to investment helps to minimize the risk associated with such large-scale projects as BSM. Accordingly, we recommended that DLA make the development, implementation, and maintenance of an enterprise architecture an agency priority and take steps to incrementally justify and validate its investment in BSM. According to DLA officials, the agency is addressing these issues. In January 2002, we reported a wide disparity
in the rigor and discipline of software acquisition processes between two DLA systems. Such inconsistency in processes for acquiring software (the most costly and complex component of systems) can lead to the acquisition of systems that do not meet the information needs of management and staff, do not provide support for necessary programs and operations, and cost more and take longer than expected to complete. We also reported that DLA did not have a software process-improvement program in place to effectively strengthen its corporate software acquisition processes, having eliminated the program in 1998. Without a management-supported software process-improvement program, it is unlikely that DLA can effectively improve its institutional software acquisition capabilities, which in turn means that the agency’s software projects will be at risk of not delivering promised capabilities on time and within budget. Accordingly, we recommended that DLA institute a software process-improvement program and correct the
software acquisition process weaknesses that we identified. According to DLA officials, the agency is addressing each of these issues.

In May 2000, we issued the Information Technology Investment Management (ITIM) maturity framework, which identifies critical processes for successful IT investment and organizes these processes into an assessment framework comprising five stages of maturity. This framework supports the fundamental requirements of the Clinger-Cohen Act of 1996, which requires IT investment and capital planning processes and performance measurement. Additionally, ITIM can provide a useful roadmap for agencies when they are implementing specific, fundamental IT capital planning and investment management practices. The federal Chief Information Officers Council has favorably reviewed the framework, and it is also being used by a number of executive agencies and organizations for designing related policies and procedures and self-led or contractor-based assessments. ITIM establishes a hierarchical
set of five different maturity stages. Each stage builds upon the lower stages and represents increased capabilities toward achieving both stable and effective (and thus mature) IT investment management processes. Except for the first stage—which largely reflects ad hoc, undefined, and undisciplined decision and oversight processes—each maturity stage is composed of critical processes essential to satisfy the requirements of that stage. These critical processes are defined by core elements that include organizational commitment (for example, policies and procedures), prerequisites (for example, resource allocation), and activities (for example, implementing procedures). Each core element is composed of a number of key practices. Key practices are the specific tasks and conditions that must be in place for an organization to effectively implement the necessary critical processes. Figure 1 shows the five ITIM stages and a brief description of each stage. Using ITIM, we assessed the extent to which DLA satisfied
the five critical processes in stage 2 of the framework. Based on DLA’s acknowledgment that it had not executed any of the key practices in stage 3, we did not independently assess the agency’s capabilities in this stage or stages 4 and 5. To determine whether DLA had implemented the stage 2 critical processes, we compared relevant DLA policies, procedures, guidance, and documentation associated with investment management activities to the key practices and critical processes in ITIM. We rated the key practices as “executed” based on whether the agency demonstrated (by providing evidence of performance) that it had met the criteria of the key practice. A key practice was rated as “not executed” when we found insufficient evidence of a practice during the review, or when we determined that there were significant weaknesses in DLA’s execution of the key practice.

As part of our analysis, we selected four IT projects as case studies to verify application of the critical processes and practices. We selected projects that (1) supported different DLA business areas (such as materiel management), (2) were in different lifecycle phases (for example, requirements definition, design, operations and maintenance), (3) represented different levels of risk (such as low or medium) as designated by the agency, and (4) included at least one investment that required funding approval by a DOD authority outside of DLA (for example, the Office of the Secretary of Defense (OSD)). The four projects are the following:

Business Systems Modernization: This system, which supports DLA’s materiel management business area, is in the concept demonstration phase of development. DLA reported that it spent about $136 million on this system in fiscal year 2001, and it has budgeted about $133 million for fiscal year 2002. BSM is intended to modernize DLA’s materiel management business function, replacing two of its standard systems (the Standard Automated Materiel Management System and the Defense Integrated Subsistence Management System). The project is also intended to enable the agency to reengineer its logistics practices to reflect best commercial business practices. For example, in support of DLA’s goal of reducing its role as a provider and manager of materiel and increasing its role as a manager of supply chain relationships, BSM is to help link customers with appropriate suppliers and to incorporate commercial business practices regarding physical distribution and financial management. The agency has classified this project as high risk, and OSD has funding approval authority for this project.

Hazardous Materials Information System (HMIS): This system, which supports DLA’s logistics operations function, was implemented in 1978. In fiscal year 2001, DLA reported that it spent about $1 million on this system and budgeted about $2.4 million for fiscal year 2002. In 1999 DLA began a redesign effort to transform HMIS into a Web-based system with a direct interface to the manufacturers and suppliers of hazardous material. The project is in the development stage. It contains data on the chemical composition of materials classified as “hazardous” for the purposes of usage, storage, and transportation. The system is used by Emergency Response Teams whenever a spill or accident occurs involving hazardous materials. The agency classified this project as low risk, and funding approval occurs within DLA.

The Defense Reutilization and Marketing Automated Information System (DAISY): This system, which supports DLA’s materiel reuse and disposal mission, is in
the operations and maintenance lifecycle phase. The agency reported that it spent approximately $4.4 million on DAISY in fiscal year 2001, and it has budgeted about $7 million for fiscal year 2002. This system is a repository for transactions involving the reutilization, transfer, donation, sale, or ultimate disposal of excess personal property from DOD, federal, and state agencies. The excess property includes spare and repair parts, scrap and recyclable material, precious metals recovery, hazardous material, and hazardous waste disposal. Operated by the Defense Reutilization and Marketing Service, the system is used at 190 locations worldwide. The agency classified this project as low risk, and funding approval occurs within DLA.

Standard Automated Materiel Management System (SAMMS): This system, which supports DLA’s materiel management business area, is 30 years old and approaching the end of its useful life. The agency reports that investment in SAMMS (budgeted at approximately $19 million for fiscal year
2002) is directed toward keeping the system operating until its replacement, BSM, becomes fully operational (scheduled for fiscal year 2005). This system provides the Inventory Control Points with information regarding stock levels, as well as with the capabilities required for (1) acquisition and management of wholesale consumable items, (2) direct support for processing requisitions, (3) forecasting of requirements, (4) generation of purchase requests, (5) maintenance of technical data, (6) financial management, (7) identification of items, and (8) asset visibility. The agency has classified the maintenance of SAMMS as a low-risk effort, and funding approval occurs within DLA.

For these projects, we reviewed project management documentation, such as mission needs statements, project plans, and status reports. We also analyzed charters and meeting minutes for DLA oversight boards, DLA’s draft Automated Information System Emerging Program Life Management (LCM) Review and Milestone Approval Directive and Portfolio Management and Oversight Directives, and DOD’s 5000 series guidance on systems acquisition. In addition, we reviewed documentation related to the agency’s self-assessment of its IT investment operations. To supplement our document reviews, we interviewed senior DLA officials, including the vice director (who sits on the Corporate Board, DLA’s highest level investment decisionmaking body), the chief information officer (CIO), the chief financial officer, and oversight board members. We also interviewed
the program managers of our four case study projects, as well as officials responsible for managing the IT investment process and other staff within Information Operations. To determine what actions DLA has taken to improve its IT investment management processes, we interviewed the CIO and officials of the Policy, Plans, and Assessments and the program executive officer (PEO) operations groups within the Information Operations Directorate. These groups are primarily responsible for implementing investment
management process improvements. We also reviewed a draft list of IT investment management improvement tasks. We conducted our work at DLA headquarters in Fort Belvoir, Virginia, from June 2001 through January 2002, in accordance with generally accepted government auditing standards. In order to have the capabilities to effectively manage IT investments, an agency should (1) have basic, project-level control and selection practices in place and (2) manage its projects as a portfolio of investments, treating
them as an integrated package of competing investment options and pursuing those that best meet the strategic goals, objectives, and mission of the agency. DLA has a majority of the project-level practices in place. However, it is missing several crucial practices, and it is not performing portfolio-based investment management. According to the CIO, the evolving state of its investment management capabilities is the result of agency leadership’s only recently viewing IT investment management as an area of management focus and priority. Without crucial processes and related practices in place, DLA lacks essential management controls over its sizable IT investments.

At ITIM stage 2 maturity, an organization has attained repeatable, successful IT project-level investment control processes and basic selection processes. Through these processes, the organization can identify expectation gaps early and take appropriate steps to address them. According to ITIM, critical processes at stage 2 include (1) defining investment board operations, (2) collecting information about existing investments, (3) developing project-level investment control processes, (4) identifying the business needs for each IT project, and (5) developing a basic process for selecting new IT proposals. Table 1 discusses the purpose of each of the stage 2 critical processes.

To its credit, DLA has put in place about 75 percent of the key practices associated with stage 2 critical processes. For example, DLA has oversight boards to perform investment management functions, and it has basic project-level control processes to help ensure that IT projects are meeting cost and schedule expectations. However, DLA has not executed several crucial stage 2 investment practices. For example, the business needs for IT projects are not always clearly identified and defined, basic investment selection processes are still being developed, and policies and procedures for project oversight are not documented. Table 2 summarizes the status of DLA’s stage 2 critical
processes, showing how many associated key practices the agency has executed. DLA’s actions in each of the critical processes are discussed in the sections that follow. To help ensure executive management accountability for IT capital planning and investment decisions, an organization should establish a governing board or boards responsible for selecting, controlling, and evaluating IT investments. According to ITIM, effective IT investment board operations require, among other things, that (1) board membe
To help ensure executive management accountability for IT capital planning and investment decisions, an organization should establish a governing board or boards responsible for selecting, controlling, and evaluating IT investments. According to ITIM, effective IT investment board operations require, among other things, that (1) board membership have both IT and business knowledge, (2) board members understand the investment board's policies and procedures and exhibit core competencies in using the agency's IT investment policies and procedures, (3) the organization's executives and line managers support and carry out board decisions, (4) the organization create organization-specific process guidance that includes policies and procedures to direct the board's operations, and (5) the investment board operate according to written policies and procedures. (The full list of key practices is provided in table 3.)

DLA has established several oversight boards that perform IT investment management functions. These boards include the following:

- The DLA Investment Council, which is intended to review, evaluate, and approve new IT and non-IT investments between $100,000 and $1,000,000.

- The Program Executive Officer (PEO) Review Board, which is intended to review and approve the implementation of IT investments that are budgeted for over $25 million in total or over $5 million in any one year.

- The Corporate Board, which is intended to review, evaluate, and approve all IT and non-IT investments over $1 million.

DLA is executing four of the six key practices needed for these boards to operate effectively. For example, the membership of these boards integrates both IT and business knowledge. In addition, board members informed us of their understanding of their board's informal practices. Further, according to IT investment officials, project managers, and agency documentation, the boards have a process for ensuring that their decisions are supported and carried out by organization executives and line managers. This process involves documenting board decisions in meeting minutes, assigning staff to carry out the decisions, and tracking the actions taken on a regular basis until the issues are addressed.

Nonetheless, DLA is missing the key ingredient associated with two of the board oversight practices that are needed to operate effectively: organization-specific guidance. This guidance, which serves as official operations documentation, should (1) clearly define the roles of key people within its IT investment process, (2) delineate the significant events and decision points within the processes, (3) identify the external and environmental factors that will influence the processes (that is, legal constraints, the behavior of key subordinate agencies and military customers, and the practices of commercial logistics that DLA is trying to emulate as part of DLA 21), and (4) explain how IT investment-related processes will be coordinated with other organizational plans and processes. DLA does not have guidance that sufficiently addresses these issues. Policies and procedures governing operations are in draft for one board and have not been developed for the two other boards. Without this guidance governing the operations of the investment boards, the agency is at risk of performing key investment decisionmaking activities inconsistently. Such guidance would also provide a degree of transparency that is helpful in both communicating and demonstrating how these decisions are made. Table 3 summarizes the ratings for each key practice and the specific findings supporting the ratings.
An IT project inventory provides information to investment decision-makers to help evaluate the impacts and opportunities created by proposed or continuing investments. This inventory (which can take many forms) should, at a minimum, identify the organization's IT projects (including new and existing systems) and a defined set of relevant investment management information about them (for example, purpose, owner, lifecycle stage, budget cost, physical location, and interfaces with other systems). Information from the IT project inventory can, for example, help identify systems across the organization that provide similar functions and help avoid the commitment of additional funds for redundant systems and processes. It can also help determine more precise development and enhancement costs by informing decisionmakers and other managers of interdependencies among systems and how potential changes in one system can affect the performance of other systems.

According to ITIM, effectively managing an IT project inventory requires, among other things, (1) identifying IT projects, collecting relevant information about them, and capturing this information in a repository, (2) assigning responsibility for managing the IT project inventory process to ensure that the inventory meets the needs of the investment management process, (3) developing written policies and procedures for maintaining the IT project inventory, (4) making information from the inventory available to staff and managers throughout the organization so they can use it, for example, to build business cases and to support project selection and control activities, and (5) maintaining the IT project inventory and its information records to contribute to future investment selections and assessments. (The full list of key practices is provided in table 4.)

DLA has executed many of the key practices in this critical process. For example, according to DLA's CIO, IT projects are identified and specific information about them is entered into a central repository called the DLA Profile System (DPS). DPS includes, among other things, project descriptions, key contact information, lifecycle stage, and system interfaces. In addition, the CIO is responsible for managing the IT project identification process to ensure that DPS meets the needs of the investment management process. However, DLA has not defined written policies and procedures for how and when users should add to or update information in DPS. In addition, DLA is not maintaining DPS records, which would be useful during future project selections and investment evaluations, and for documenting the evolution of a project's development. Without appropriate policies and procedures in place to describe the objectives and information requirements of the inventory, DPS is not being used to full effect as a tool to assist in the fundamental analysis essential to effective decisionmaking. Table 4 summarizes the ratings for each key practice and the specific findings supporting the ratings.
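The sketch below illustrates the kind of record an inventory such as DPS holds and one analysis the report describes: flagging systems that provide similar functions. The field names follow the report's description of DPS (description, key contact, lifecycle stage, system interfaces); the projects themselves and the redundancy check are hypothetical.

```python
from collections import defaultdict
from dataclasses import dataclass, field

# Minimal inventory record with the kinds of fields the report attributes
# to DPS. The example projects are invented for illustration.
@dataclass
class ProjectRecord:
    name: str
    description: str
    business_function: str
    contact: str
    lifecycle_stage: str            # e.g., concept, development, operations
    interfaces: list = field(default_factory=list)

inventory = [
    ProjectRecord("SysA", "Depot inventory tracking", "asset-tracking",
                  "J. Smith", "operations", ["SysC"]),
    ProjectRecord("SysB", "Warehouse asset tracker", "asset-tracking",
                  "R. Jones", "development"),
    ProjectRecord("SysC", "Contract payment system", "payments",
                  "L. Chen", "operations", ["SysA"]),
]

# One use the report describes: spotting systems that provide similar
# functions -- a signal of potentially redundant investment.
by_function = defaultdict(list)
for rec in inventory:
    by_function[rec.business_function].append(rec.name)

redundant = {fn: names for fn, names in by_function.items() if len(names) > 1}
print(redundant)  # {'asset-tracking': ['SysA', 'SysB']}
```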
Investment review boards should effectively oversee IT projects throughout all lifecycle phases (concept, design, development, testing, implementation, and operations/maintenance). At stage 2 maturity, investment review boards should review each project's progress toward predefined cost and schedule expectations, using established criteria and performance measures, and should take corrective actions to address cost and milestone variances. According to ITIM, effective project oversight requires, among other things, (1) having written policies and procedures for project management, (2) developing and maintaining an approved management plan for each IT project, (3) having written policies and procedures for oversight of IT projects, (4) making up-to-date cost and schedule data for each project available to the oversight boards, (5) reviewing each project's performance by regularly comparing actual cost and schedule data to expectations, (6) ensuring that corrective actions for each underperforming project are documented, agreed to, implemented, and tracked until the desired outcome is achieved, and (7) using information from the IT project inventory. (The complete list of key practices is provided in table 5.)

DLA has executed most of the key practices in this area. In particular, DLA relies on the guidance in the Department of Defense 5000 series directives for project management and draft guidance in an Automated Information System (AIS) Emerging Program Life-Cycle Management (LCM) Review and Milestone Approval Directive for specific IT project management. In addition, for each of the four projects we reviewed, a project management plan had been approved, and cost and schedule controls were addressed during project review meetings. Further, based on our review of project documentation and discussions with project managers, up-to-date cost and schedule project data were provided to the PEO Review Board. This board oversees project performance regularly by comparing actual cost and schedule data to expectations and has a process for ensuring that, for underperforming projects, corrective actions are documented, agreed to, and tracked.

Notwithstanding these strengths, DLA has some weaknesses in project oversight. Specifically, although the Corporate Board and the Investment Council have written charters, there are no written policies or procedures that define their role in collectively overseeing IT projects. Without these policies and procedures, project oversight may be inconsistently applied, leading to the risk that performance problems, such as cost overruns and schedule slippages, may not be identified and resolved in a timely manner. In addition, according to representatives from the oversight boards, they do not use information from the IT project inventory to oversee projects because they are more comfortable with more traditional methods of obtaining and using information (that is, informally talking with subject matter experts and relying on experience). The inventory is of value only to the extent that decisionmakers use it. As discussed earlier, while the inventory need not be the only source of information, it should nevertheless serve as a reliable and consistent tool for understanding project and overall portfolio decisions. Table 5 summarizes the ratings for each key practice and the specific findings supporting the ratings.
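The core stage 2 control practice, comparing actual cost and schedule data to expectations and flagging variances for corrective action, can be sketched as follows. The projects, figures, and 10 percent variance threshold are illustrative assumptions, not DLA's actual oversight criteria.

```python
# Flag projects whose actual cost or schedule exceeds expectations by more
# than a variance threshold. All figures and the 10% threshold are
# illustrative, not drawn from DLA's oversight process.
THRESHOLD = 0.10  # 10% allowable variance

projects = [
    # name, planned cost ($M), actual cost ($M), planned months, actual months
    ("Project A", 12.0, 12.5, 24, 25),
    ("Project B", 30.0, 36.0, 36, 44),
]

def needs_corrective_action(planned, actual, threshold=THRESHOLD):
    """True when actual exceeds planned by more than the threshold."""
    return (actual - planned) / planned > threshold

flagged = []
for name, p_cost, a_cost, p_sched, a_sched in projects:
    if needs_corrective_action(p_cost, a_cost) or needs_corrective_action(p_sched, a_sched):
        flagged.append(name)

print(flagged)  # ['Project B']
```

A flagged project would then enter the corrective-action loop the report describes: the action is documented, agreed to, implemented, and tracked until the variance is resolved.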
Defining business needs for each IT project helps ensure that projects support the organization's mission goals and meet users' needs. This critical process creates the link between the organization's business objectives and its IT management strategy. According to ITIM, effectively identifying business needs requires, among other things, (1) defining the organization's business needs or stated mission goals, (2) identifying users for each project who will participate in the project's development and implementation, (3) training IT staff adequately in identifying business needs, and (4) defining business needs for each project. (The complete list of key practices is provided in table 6.)

DLA has executed all but one of the key practices associated with effectively defining business needs for IT projects. For example, DLA's mission goals are described in DLA's strategic plan. In addition, according to IT investment management officials, the IT staff is adequately trained in identifying business needs because they generally have prior functional unit experience. In addition, according to DLA directives, IT projects are assigned an Integrated Process Team (IPT) to guide and direct the project through the development lifecycle. The IPTs are composed of IT and functional staff. Moreover, DOD and DLA directives require that business requirements and system users be identified and that users participate in the lifecycle management of the project. According to an IT investment official, each IT project has a users' group that meets throughout the lifecycle to discuss problems and potential changes related to the system. We verified that this was the case for the four projects we reviewed.

While the business needs for three of the four projects we reviewed were clearly identified and defined, DLA has reported that this has not been done consistently for all IT projects. According to IT investment management officials, this inconsistency arose because policies and procedures for developing business needs were not always followed or required. DLA officials have stated that they are developing new guidance to address this problem. However, until this guidance is implemented and enforced, DLA cannot effectively demonstrate that priority mission and business improvement needs are forming the basis for all its IT investment decisions. Table 6 summarizes the ratings for each key practice and the specific findings supporting the ratings.
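The missing practice, defining a business need for every project and tying it to a mission goal, amounts to a traceability check like the one below. The goals, projects, and need statements are hypothetical.

```python
# Check that every IT project states a business need tied to a mission goal.
# Goals, projects, and need statements are hypothetical examples.
mission_goals = {"G1": "Improve supply availability", "G2": "Reduce logistics cost"}

projects = {
    "Distribution Planner": {"business_need": "Cut order backlog", "goal": "G1"},
    "Fuel Tracker":         {"business_need": "Lower fuel handling cost", "goal": "G2"},
    "Legacy Reports":       {"business_need": None, "goal": None},  # need never defined
}

undefined = [name for name, p in projects.items()
             if not p["business_need"] or p["goal"] not in mission_goals]

print(undefined)  # ['Legacy Reports']
```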
Selecting new IT proposals requires an established and structured process to ensure informed decisionmaking and infuse management accountability. According to ITIM, this critical process requires, among other things, (1) making funding decisions for new IT proposals according to an established process, (2) providing adequate resources for proposal selection activities, (3) using an established proposal selection process, (4) analyzing and ranking new IT proposals according to established selection criteria, including cost and schedule criteria, and (5) designating an official to manage the proposal selection process. (The complete list of key practices is provided in table 7.)

DLA has executed some of the key practices for investment proposal selection. For example, DLA executives make funding decisions for IT investments using DOD's Program Objective Memorandum (POM) process, which is part of DOD's annual budgeting process. Through this process, proposals for new projects or enhancements to ongoing projects are evaluated by DLA's IT and financial groups and submitted to OSD through DLA's Corporate Board with recommendations for funding approval. In addition, according to the CIO, adequate resources have been provided to carry out activities related to the POM process.

Nonetheless, DLA has yet to execute some of the critical practices related to this process area. Specifically, DLA acknowledges that the agency is not analyzing and prioritizing new IT proposals according to established selection criteria. Instead, the Corporate Board uses the expertise of the IT organization and its own judgment to analyze and prioritize projects. To its credit, DLA recognizes that it cannot continue to rely solely on the POM process to make sound IT investment selection decisions. Therefore, over the past two budget cycles the agency has been working to establish an IT selection process that is more investment-focused and includes increased involvement from IT Operations staff, necessary information, and established selection criteria. Until DLA implements an effective IT investment selection process that is well established and understood throughout the agency, executives cannot be adequately assured that they are consistently and objectively selecting the proposals that best meet the needs and priorities of the agency. Table 7 summarizes the ratings for each key practice and the specific findings supporting the ratings.
An IT investment portfolio is an integrated, enterprisewide collection of investments that are assessed and managed collectively based on common criteria. Managing investments within the context of such a portfolio is a conscious, continuous, and proactive approach to expending limited resources on an organization's competing initiatives in light of the relative benefits expected from these investments. Taking an enterprisewide perspective enables an organization to consider its investments comprehensively so that the collective investments optimally address its mission, strategic goals, and objectives. This portfolio approach also allows an organization to determine priorities and make decisions about which projects to fund based on analyses of the relative organizational value and risks of all projects, including projects that are proposed, under development, and in operation.

According to ITIM, stage 3 maturity includes (1) defining portfolio selection criteria, (2) engaging in project-level investment analysis, (3) developing a complete portfolio based on the investment analysis, (4) maintaining oversight over the investment performance of the portfolio, and (5) aligning the authority of IT investment boards. Table 8 describes the purposes for the critical processes in stage 3. According to DLA officials, they are currently focusing on implementing stage 2 processes and have not implemented any of the critical processes in stage 3. Until the agency fully implements both stage 2 and stage 3 processes, it cannot consider investments in a comprehensive manner and determine whether it has the appropriate mix of IT investments to best meet its mission needs and priorities.

DLA recognizes the need to improve its IT investment processes, but it has not yet developed a plan for systematically correcting weaknesses. To properly focus and target IT investment process improvements, an organization should fully identify and assess current process strengths and weaknesses (that is, create an investment management capability baseline) as the first step in developing and implementing an improvement plan. As we have previously reported, this plan should, at a minimum, (1) specify measurable goals, objectives, milestones, and needed resources, and (2) clearly assign responsibility and accountability for accomplishing well-defined tasks. The plan should also be documented and approved by agency leadership. In implementing the plan, it is important that DLA measure and report progress against planned commitments, and that appropriate corrective action be taken to address deviations.

DLA does not have such a plan. In March 2001, it attempted to baseline agency IT operations by reviewing its project-level investment management practices using ITIM. This effort identified practice strengths and weaknesses, but DLA considered the assessment to be preliminary (to be followed by a more comprehensive assessment at an unspecified later date) and limited in scope. DLA used the assessment results to establish broad milestones for strengthening its investment management process. The agency did not, however, develop a complete process improvement plan. For example, it did not (1) specify the resources required to accomplish the various tasks, (2) clearly assign responsibility and accountability for accomplishing the tasks, (3) obtain support from senior-level officials, or (4) establish performance measures to evaluate the effectiveness of the completed tasks. At the same time, the agency has separately begun other initiatives to improve its investment management processes, but these initiatives are not aligned with the established milestones or with each other. The DLA CIO characterizes the agency's approach to its various process improvement efforts as a necessary progression that includes some inevitable “trial and error” as it moves toward a complete process improvement plan. Without such a plan, which would allow the agency to systematically prioritize, sequence, and evaluate improvement efforts, DLA jeopardizes its ability to establish a mature investment process that includes selection and control capabilities that result in greater certainty about future IT investment outcomes. Until recently, IT investment management has not been an area of DLA management att