Organizing and Managing Program Evaluation


  1. Organizing and Managing Program Evaluation
  2. Presentation Outline
     - Setting the Stage: U.S. Housing Ministry
     - Broad Organization of the Evaluation Function
     - Defining Evaluation Topics
     - Managing Evaluation Projects
     - Concrete Examples
  3. Principal tasks
     - Insuring mortgages – FHA
     - Mortgage refinance – GNMA
     - Direct loans for elderly and handicapped housing
     - Housing subsidies to assist low-income families
     - Grants to localities for community development
     - Promote "fair housing"
  4. Budget and Program Overview: Fiscal Year 2000

     Program Block           Budget ($ billion)   No. of Major Programs   Percent of Budget Total
     Community development          7.8                    12                       24
     Housing assistance            23.3                    11                       72
     Housing finance                0.8                  25-30                      2.5
     Fair housing                   0.4                     2                       1.4
     Total                         32.2                  60-65                      100
  5. Organization
     - Minister
     - Deputy Minister
       - Program Assistant Ministers (A/M)
       - A/M for Administration (budget, personnel)
       - CFO (financial management; efficiency)
       - Inspector General
       - A/M for Policy Development & Evaluation (PDE)
  6. Office of Policy Development and Evaluation
     - Provide advice and information to the minister for decision making
     - Monitor and evaluate the ministry's programs
     - Conduct research on priority housing and community development issues
     - Key participant in the budget process
     - Feeds evaluation results into policy
  7. 2. Organization of the Evaluation Function
  8. Division of tasks among offices
     - Program monitoring, including performance measurement
       - Program A/Ms
       - CFO
       - PDE
     - Program evaluation – PDE
       - Within PDE, Office of Research & Evaluation
  9. Office of Research & Evaluation
     - Defines the evaluation agenda
     - Working with program offices, defines the objectives of the evaluations and methods
     - Conducts competitions to select firms to do the work
     - Oversees the work of contractors
  10. 3. Defining Evaluation Topics
  11. Who nominates programs for evaluation?
      - Congressional mandates
      - Office of Management & Budget (Office of the President)
      - Program A/Ms
      - Office of Policy Development and Evaluation
  12. Federal Government Budget Cycle -- Illustrated for the FY2001 Budget

      Date                      Action
      August 1998               OMB sends "management letter" to agency
      January 1999              Minister issues instructions for the budget process, including guidance on priorities
      February-March            Offices work on submission
      April-July                Internal review process
      July-August               Finalization
      September                 Submission to OMB
      October-November          Formal hearings with OMB
      mid-November              OMB "pass-back"
      mid-Nov to mid-December   Appeals; final decisions
      mid-January 2000          Submission to Congress
      May-August / October 1    Congress acts; new fiscal year
  13. 4. Managing evaluation projects
  14. Basic steps in the process
      - Defining the questions to be addressed
      - Determining the methodology
      - Estimating the budget
      - Preparing the TOR or RFP
      - Competition and award
      - Overseeing the contractor
      - Getting policy impacts
  15. 1. Defining the questions to be addressed
      - Understand the program before meeting with program officials about the evaluation
      - "No client, no impact."
        - Make the program office your partner
        - Understand who else may be a "primary intended user"
        - Consult them at the beginning and keep them fully informed
  16. Reality check
      - For each question, ask
        - Can the question be answered with program evaluation tools?
        - What is the relevant indicator?
        - What is the source of the data? What does it cost?
      - Is the overall cost affordable?
      - If the answers are positive, proceed. If not, redesign.
  17. 2. Determine the methodology
      - Depends on the type of evaluation: process, impact, or benefit-cost
      - Prepare a clear statement of method that can be critically reviewed
      - Where data are to be collected, estimate sample sizes (see the sketch below)
      - If you do not have the capacity on staff, hire an expert to help
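Where an impact evaluation will compare outcome rates between groups, a standard two-proportion power calculation is one reasonable starting point for the sample-size estimate. The sketch below is illustrative only, not part of the original presentation: the baseline rate, detectable difference, significance level, and power are assumed values, and an expert should confirm the method fits the actual design.

```python
from math import ceil
from statistics import NormalDist

def sample_size_two_proportions(p1, p2, alpha=0.05, power=0.80):
    """Approximate sample size per group to detect a difference between two
    proportions (two-sided test, normal approximation)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# Hypothetical example: detect an increase in an employment rate from 40% to 48%
print(sample_size_two_proportions(0.40, 0.48))   # roughly 600 households per group
```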
  18. 3. Estimate the budget
      - Labor for structuring the project, analysis, reporting (build up task-by-task)
      - Travel
      - Data collection
        - Cost per household or program manager interviewed; office file checked
      - Overheads
        - Office space, communications, support staff, graphics, computers…
  19. Staff loading chart for analysis and results write-up for an impact analysis using econometric methods (days; costed in the sketch below)

      Task                        Senior staff   Mid-level   Junior staff
      Data analysis
        -- specify models               12            8            --
        -- estimation                    3            5            15
        -- sensitivity tests             4            3             9
      Results write-up
        -- outline development           3            1            --
        -- writing                      10           15             2
        -- table preparation            --           --             8
      TOTAL                             32           32            34
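To turn a loading chart like this into a budget, multiply the days in each staff category by a daily rate and add the other cost elements from the previous slide. A minimal sketch follows; the staff days are taken from the chart above, but the daily rates, travel figure, interview cost, and overhead rate are hypothetical placeholders, not numbers from the presentation.

```python
# Staff days from the loading chart above; all money figures are assumed.
staff_days = {"senior": 32, "mid_level": 32, "junior": 34}
daily_rate = {"senior": 800, "mid_level": 500, "junior": 300}   # assumed rates

labor = sum(staff_days[k] * daily_rate[k] for k in staff_days)

# Other budget elements (illustrative values only).
travel = 15_000
interviews = 1_200                    # completed household interviews
cost_per_interview = 60               # rule-of-thumb from a survey firm
data_collection = interviews * cost_per_interview
overhead_rate = 0.40                  # share of labor for space, support staff, etc.

total = labor + travel + data_collection + labor * overhead_rate
print(f"Labor ${labor:,}; total budget ${total:,.0f}")
```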
  20. 4. Preparing the Analysis Plan or RFP
      - Share the statement with the client even if they are not experts in evaluation or social sciences
      - Even if internal staff will do the analysis, the discipline of the Analysis Plan is very useful
      - The RFP should be read by the review panel to be sure they understand it and agree with it
  21. The RFP
      - Be clear about
        - Objectives; questions to be answered
        - Expected products, schedule
        - Sample sizes, illustrative list of locations, etc., to facilitate analysis of budgets
        - Do not give too much guidance on methodology
        - Provide an indication of level-of-effort
        - Results dissemination: buy some time
        - Include the evaluation criteria and the weight assigned to each
  22. 5. Competition and award
      - Review panel is formed before the RFP is issued
      - Includes people with the necessary competence (use a consultant if necessary)
      - Have someone from the client's staff participate
      - Formal criteria and scoring sheets
      - "Orals" and "Best and Finals"
      - Absolute confidentiality
  23. Standard factors evaluated (a weighted-scoring sketch follows below)
      - Contractor's understanding of the issues
      - Quality of the methodological design
      - Quality of the management plan
      - Quality of the proposed staff
      - Past experience of the organization
      - Price
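One way to implement the "formal criteria and scoring sheets" idea from the previous slide is a simple weighted score per proposal, using the factors listed above. The weights and the panel scores in this sketch are invented for illustration; they are not HUD's actual values.

```python
# Criteria follow the slide above; weights sum to 1.0 and are assumed, not official.
weights = {
    "understanding_of_issues": 0.20,
    "methodological_design":   0.25,
    "management_plan":         0.15,
    "proposed_staff":          0.20,
    "past_experience":         0.10,
    "price":                   0.10,
}

def weighted_score(panel_scores):
    """panel_scores maps each criterion to a panel score on a 0-100 scale."""
    return sum(weights[c] * panel_scores[c] for c in weights)

# Hypothetical scores for two competing firms
firm_a = {"understanding_of_issues": 85, "methodological_design": 90,
          "management_plan": 70, "proposed_staff": 80,
          "past_experience": 75, "price": 60}
firm_b = {"understanding_of_issues": 75, "methodological_design": 70,
          "management_plan": 85, "proposed_staff": 70,
          "past_experience": 90, "price": 95}
print(weighted_score(firm_a), weighted_score(firm_b))   # 79.5 vs. 77.75
```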
  24. 6. Overseeing the contractor
      - Standard tools
        - Detailed review of
          - Work plan
          - Methodology
        - Monthly progress reports
        - Briefings at key points
        - "Milestone" reports
        - Draft final report
  25. What to do if the contractor underperforms?
      - More frequent, intense reviews; be clear on areas of weakness
      - Informal contact with senior management at the firm
      - Official letter to the firm stating problems
      - Legal action to deny payment (very unusual)
      - Important that end-of-project records of performance be maintained
  26. 7. Getting policy impacts
      - Work with the client
        - Don't overlook the positive points
        - Make sure they understand the basis for criticisms
        - Have specific recommendations for improvements
      - Report the results to other interested parties, but do it in a constructive way
  27. Sharing results
      - Never "slant" the results or your credibility will be severely undermined
      - Make sure you are informed by the contractor about their plans for release of the results
  28. Specific examples of program evaluations
  29. Demonstration project: "Moving to Opportunity"
  30. Subject
      - Using housing vouchers to help poor families leave inner-city neighborhoods for middle-class areas
      - Evidence from non-rigorous studies that such moves had significant positive impacts on the families who moved
  31. Context & Client
      - Use of housing vouchers as a successful anti-poverty instrument was very exciting
      - Congress could mandate changes in the use of vouchers, increasing use for this purpose
      - Congress mandated a rigorous evaluation to determine if there is a sound basis for change
      - Funds included in the HUD budget
  32. Specific objectives
      - Determine the impacts on participating families of moving to middle-class neighborhoods
      - Impacts on both
        - adults (employment, welfare dependence)
        - children (school achievement, crime, drugs)
  33. Participants in determining questions & design
      - HUD evaluation office
      - Congressional staff
      - Advisory panel, including foundations (who contributed funds for the evaluation)
  34. Evaluation design
      - 10-year impact evaluation; 5 cities, almost 5,000 households
      - Random assignment (see the sketch below)
      - Three groups selected from residents of public or assisted housing
        - Families given vouchers to rent housing in middle-class areas
        - Families given vouchers to use anywhere
        - Control group, starting in public housing
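A minimal sketch of the random-assignment step, assuming a simple lottery of applicant households into the three groups. The group labels and the roughly 5,000-household scale echo the design above, but the mechanics (seeded shuffle, round-robin assignment, household IDs) are only illustrative, not the study's actual procedure.

```python
import random

groups = ["mto_voucher",       # voucher restricted to middle-class (low-poverty) areas
          "regular_voucher",   # voucher usable anywhere
          "control"]           # remain in public/assisted housing

def assign(household_ids, seed=42):
    """Randomly assign each applicant household to one of the three groups."""
    rng = random.Random(seed)
    ids = list(household_ids)
    rng.shuffle(ids)
    # Deal households out round-robin so group sizes stay balanced.
    return {hid: groups[i % len(groups)] for i, hid in enumerate(ids)}

assignment = assign(range(5000))
print(sum(1 for g in assignment.values() if g == "mto_voucher"))   # about 1,667
```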
  35. Measurement example: education achievement
      - Achievement measured by
        - Scores on a series of standardized tests
        - School performance: grades
        - Advanced coursework
        - College applications by seniors
  36. Procurement: interim evaluation
      - Contract awarded through a competition
      - 5 firms competed
      - Evaluation board: only PDE staff
      - Size of contract: $8.2 million
      - 3-year contract
  37. Monitoring contractor performance (interim report only)
      - Workplan required, including detailed analysis plan
      - Monthly progress reports
      - Several briefings to HUD, Hill staff, and foundations
      - HUD staff time: 1 staff year
  38. Conclusions for education achievement
      - Results are for MTO vouchers in comparison with both other groups
      - Findings for 1994 through 1997
      - No significant effect on achievement on any measure
      - No significant effect on the quality of schools attended
  39. Policy impact
      - Results are very fresh
      - Interim findings; 5 years to go
      - Results contrary to expectations
      - Sufficient to prevent major change in the use of housing vouchers
      - Congress will probably wait for the final results
  40. Lending for Economic Development in Low-Income Neighborhoods
  41. Subject-1
      - HUD has several programs that lend money to private businesses, guarantee loans made by banks, or provide credit enhancements for guaranteed loans. All involve subsidies.
      - CDBG – loans of grant funds by the city through banks
      - S.108 – loan guarantees
      - Credit enhancements on S.108 loans
  42. Subject-2
      - These loans are to businesses in CDBG target neighborhoods or primarily serving poor households
      - Evaluation issue: how successful are these programs in generating economic development?
  43. Context & Client
      - Programs not previously carefully evaluated
      - Client: Office of Community Planning and Development (request of the A/M)
      - Request to evaluation office
  44. Specific questions (selected)
      - Impact of the programs on business development and job creation
      - How do these loans perform?
      - What is the feasibility of creating a secondary market for these loans?
      - So a combination of impact and process evaluation
  45. Participants in determining questions & design
      - HUD evaluation office
      - HUD program office
  46. Evaluation design: business development & job creation
      - Measurements (see the sketch below)
        - No. of jobs created vs. no. planned
        - Cost per job created vs. cost per job of other federal economic development programs
        - Survival rate of businesses compared with all small businesses
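The three measurements above reduce to simple ratios over the loan records. A sketch follows, assuming each record carries the subsidy amount, the jobs promised in the application, the jobs actually created, and whether the business is still operating; the field names and the sample figures are hypothetical, not drawn from the study's data.

```python
def business_development_metrics(loans):
    """loans: list of dicts with keys 'subsidy', 'jobs_planned', 'jobs_created',
    and 'still_operating' (hypothetical field names)."""
    jobs_created = sum(l["jobs_created"] for l in loans)
    jobs_planned = sum(l["jobs_planned"] for l in loans)
    subsidy = sum(l["subsidy"] for l in loans)
    return {
        "jobs_created_vs_planned": jobs_created / jobs_planned,
        "cost_per_job_created": subsidy / jobs_created,
        "business_survival_rate":
            sum(l["still_operating"] for l in loans) / len(loans),
    }

# Tiny made-up illustration
sample = [
    {"subsidy": 50_000, "jobs_planned": 20, "jobs_created": 18, "still_operating": True},
    {"subsidy": 30_000, "jobs_planned": 10, "jobs_created": 9,  "still_operating": False},
]
print(business_development_metrics(sample))
```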
  47. Program experience analyzed
      - Loans originated
        - CDBG: 1996-1999
        - S.108: 1994-1999
  48. Evaluation design: loan performance
      - Measurement: default rates compared with those of bank loans to similar businesses
        - closed loans, 1996-99
        - loans still open at the time of the study
      - Loss rates
  49. Procurement
      - Contract through competition
      - 3 firms competed
      - Evaluation board: only PDE staff
      - Size of contract: $2.1 million
        - Huge data collection effort
      - 2.5-year contract
  50. Monitoring contractor performance
      - Workplan, analysis design report, and data collection plan required
      - Monthly progress reports: narrative, issues, financial report
      - Two reports:
        - Interim, on data assembled
        - Final
      - Two briefings
      - HUD staff time: 4 staff months
  51. Conclusions for business and job development
      - Jobs created equaled 93% of those indicated in loan applications
      - Average cost to create one job
        - CDBG = $2,675
        - S.108 (loan guarantee) = $7,865
        - Compared with other government economic development programs: $936 - $6,250
      - Overall positive impact on business survival rates
  52. Conclusions for loan performance
      - Default rates on closed loans
        - CDBG = 26%
        - S.108 = 42%
      - Unsubsidized loans have lower default rates but are made to the most creditworthy borrowers
  53. Policy impact
      - Results are only a few months old
      - Likely that the program office will use the results to provide better guidance to cities on the use of the loan programs, particularly the characteristics of excessively risky loans
      - Explicit guidance on underwriting standards may result
  54. Key Steps in Managing a Program Evaluation
      - Have a strongly interested client
      - Rigorously define the questions to be addressed
      - Have a general idea of the methodology
      - Open competition
      - Invest in monitoring contractor performance
      - Work with the program office to get the results used

Editor's notes

  • Most of you have been to presentations about the technical aspects of conducting program evaluations. Such presentations focus on the methods for carrying out evaluations—sample sizes, control groups, statistical analysis receive prominent attention. This presentation is designed for those government managers who will be responsible for organizing and managing evaluations within their ministries. The material presented may also be of interest to others who want to understand broadly how a government evaluation office should operate.
  • To make the later parts of the discussion concrete it is necessary to establish the context in which the evaluations are being defined and managed. I am using HUD because I know it well. So we open with a brief overview of the Ministry’s responsibilities and organization. Within this we focus on the location of the evaluation function.
  • GNMA – credit enhancement to mortgage-backed bond sales by banks; government guarantee that investors will be paid. Community development: CDBG
  • Define the fiscal year. All figures are approximate, with some smoothing for year-to-year variation. Excludes tax benefits, which are substantially larger (~$50 billion this year). Total spending is about $120 per person. Total number of all programs: ~260. Staff: 12,500 in 1980; 8,000 in 2000 and falling (outsourcing).
  • Flat organization chart: 12 A/Ms report to the Minister, and all A/Ms are appointed by the President. The D/M is responsible for administration. IG – handles both fraud and, increasingly, examination of inefficiency; lines of responsibility are sometimes unclear. The IG does not report directly to the minister but to the President and Congress, and is appointed by the President and confirmed by Congress. PDE – next slide.
  • Special position. In some sense the “right hand” of the minister. Centralization of the evaluation function is very valuable. Quality control, trained staff, objectivity. In conducting research and evaluation the office makes substantial use of outside contractors. PDE receives a separate appropriation (line item in ministry budget) for this.
  • Program A/Ms – primary responsibility for program monitoring CFO – primary responsibility for the new performance measurement requirements under GPRA. Annual “Performance Accountability Report” PDE—consults all of the offices on these topics. Office of Research and Evaluation – next slide
  • Office asks each A/M office for their ideas for evaluations needed in the next budget year. This is a formal request. The response from the program office forms the basis for an ongoing discussion on what will be included in the agenda. The reality is that the program offices often do not want programs evaluated—no good news from an evaluation. So PDE has the right to propose programs to be evaluated as part of the budget process—which is discussed more in a few minutes.
  • Congressional mandates: explain authorizing and appropriations committees. Requests usually come from the authorizing committees. They take two forms: (1) a request for a demonstration project with an accompanying evaluation (often the result of lobbying by industry, e.g., homeownership counseling); (2) concern about the way the program is delivering services; these requests may be supported by advocacy NGOs who monitor the program or be stimulated by less systematic analysis (GAO). These requests are written into the law with a due date for the evaluation.
  • Areas in yellow are those points at which requests for program evaluations can be made: OMB asks both for "in house" assessments and for large-scale evaluations of programs about which it has concerns. These are strong requests, not orders; usually a lot of discussion, especially about priorities. OMB staff can also veto evaluations proposed by the ministry. A/Ms – sometimes make requests; the A/M for PDE always has candidates. Review process – sometimes the need for an evaluation becomes evident and the minister puts it on the list. Final agenda – depends on the available budget, the minister's priorities, OMB's requirements, and Congressionally mandated studies.
  • Do your homework; map the program. Understand how each element works. Meetings with the program office will likely be essential. They will appreciate your effort to learn. If you meet with the program people about the evaluation before you really know the program, you are likely not to be taken seriously. Define “primary intended user” No surprises for your client
  • Consider the case of a housing allowance program (define). Question that cannot be answered: on grounds of equal treatment of the population, is targeting subsidies the correct thing to do? Example of a question that can be addressed: what percentage of eligible households are receiving housing allowances? Indicator: percent of the eligible population receiving benefits at a particular point in time. Data sources: participants – program office; eligible households – secondary data or household survey. Check the data sources on the program with the program office; staff there should know which data series are not reliable.
  • Not going to talk about alternative methods; there is really not sufficient time for this.
  • Labor: spend a lot of energy on this estimate, as it is usually two-thirds or more of the total budget (including related overheads); more on this on the next slide. Travel: important for studies involving a lot of field work or visits to other countries, etc. Data collection: after some experience, managers will develop their own estimates but can get rules-of-thumb from survey firms, e.g., cost per completed interview (in person, telephone, mail) of a national random sample of 1,200 households with interviews lasting 30 minutes. Overheads: you will learn quickly from reading proposals. At the beginning, when you do not know how much it will cost, specify the resources available in man-months; you can also set the sample size in the RFP and then change it to a more realistic level, if needed, after some initial analysis.
  • Obviously, this is a small segment of the full staff loading chart. Once you have the total days, multiply the staff days in each category by the estimated per day salary level. Note that even though the total LOE per staff is about the same, the elapsed time would be quite a bit longer because the work has to be done sequentially.
  • Define the difference between the two. Analysis plan – an outline of the work to be done by an internal group. It should contain a clear statement of the methods to be used, the data sources, the expected outcome and schedule, and the level of effort expected from various staff members. (The latter needs to be well understood at the outset.) The RFP will be less developed with respect to the methodology, giving some range for competition. On the other hand, it will contain all of the details about the agency's contracting requirements.
  • Be clear about… Sample sizes, etc.: important to give specific guidance so that you receive comparable bids. A first task in the contract can be for the contractor to review the sampling plans and make recommendations. Not too much on methodology: this is an important area of the competition, i.e., contractors can distinguish themselves with a strong presentation. With no level of effort stated, you will get estimates that vary widely because the contractors are guessing. UI's experience bidding on the KZ financial sector reform project sponsored by USAID. Dissemination: the contractor should have the right to publish the results, but the ministry can protect itself with a 60- or 90-day period after the final report is accepted before the results can be shared.
  • Participation by the client’s staff (program office) is important because: (1) they have more detailed knowledge of the program; (2) want them to endorse selection of the contractor. “Buy in”
  • Understanding the issues: both program knowledge and knowledge of other research and evaluation on the topic. Methodology: selection of an appropriate evaluation approach, e.g., who to interview, how to interview (in-person, phone, mail; focus group); what data to analyze (file information, newly collected, etc.); analytic methods. Management plan: responsibilities clearly allocated; who is in control; who is responsible for each major task. Too many persons or organizations involved leads to loss of control and possibly duplication. Is the level of resources assigned to different tasks reasonable? Proposed staff: experience doing similar tasks in the past, both analytic and managerial. Organization's past experience: reliable? PPRs should be requested and some checked by panel members. Price.
  • AT THE BEGINNING: importance of making these deliverables and very carefully reviewing them. Time for outside consultants if you need them. Milestone reports : require reports at each major phase of the project so that you know if it is beginning to drift off of its intended direction. Draft final report : this is the time to request changes. If you ask later, the contractor can reasonably ask for additional funding.
  • Contractors want future business and are very likely to respond to clear statements of dissatisfaction. Very important to be precise about problems. Sometimes the project manager will hide problems from senior officials at the company. That is why one should contact senior management at the firm if the project manager remains unresponsive to the requests for improvements.
  • Evaluations will inevitably discover areas in which performance can be improved. That is why they are done. And appropriately, addressing these problems receives the majority of attention. BUT, it is important for inter-office relations, as well as making an objectively balanced presentation, to highlight the positive findings as well. Leading with the "good news" is often a good presentation strategy. The client is the organization that can make change. They have to understand the findings and be convinced that the findings are valid. It is at this point that the working relationships you have built with the other offices pay off.
  • Pt. 2: No surprises! Positive attitude about information being publicly available.
  • Define vouchers. Define the nature of the earlier studies for Chicago (the Gautreaux case).
  • Comment on: size/composition of Congressional staff; foundation participation.
  • Participants were all residents of public housing who applied to participate in a new program. So all were motivated to some degree to leave the “projects”
  • Quality of schools measured by (examples): percent receiving free lunches; percent white; pupil-teacher ratio; percentile rank on state exam. Results were surprising. The treatment group should have been moving to better neighborhoods (as defined in terms of poverty rates), but that condition seemed not to have been sufficient to change the average school profile. This is different from earlier relocations. Results for employment are similar, measured by: currently employed; currently employed at a job offering health insurance; currently employed full-time; annual earnings; current weekly earnings; employed over one year in main job.
  • Draft report was submitted to HUD in March 2003.
  • Programs: CDBG – direct loan program, through banks. S.108 – guarantees loans made by private banks; the guarantees are future funds to be received by the city through CDBG. Credit enhancements are through the Section 108 loan guarantee program and the Brownfields Economic Development Initiative.
  • Qualifying businesses – area standard: the business is located in a CDBG neighborhood (80 percent of CDBG funds must be spent in areas where households have incomes less than 50 percent of the median for the urban area), OR the business must primarily serve low-income families (50 percent of area median).
  • Loan performance means default rates. We do not further consider the issue of securitizing these loans.
  • Standards to use in the evaluation were a significant issue. Jobs were those that the loan application claimed would be created as a consequence of receiving the loan. If anything, applicants had an incentive to overstate the number of jobs to give a greater justification for the subsidy. Results of lending programs working in other environments are of limited use but offer some guidance: if these loans did as well as those in other environments, the program is doing very well. General business survival rates are a hard-to-interpret standard because of the very high rate of death of new firms; in the U.S., about 20% of new firms fail each year in the first two years.
  • In reality banks are not making many loans to firms located in these neighborhoods and then only to the stronger firms. Indeed the reason the programs exist is to provide credit where it would not otherwise be available.
  • Why was the contract so costly? Data collection in 51 high-volume communities: loan files from city agencies (976 third-party loans); information from borrowers on financial condition, employment, etc. over a several-year period; supplementary data on businesses and loans made by banks; several hundred interviews with program directors and borrowers.
  • Explain difference between workplan and design report
  • The study observed borrowers, on average, about 3 years after they had taken a loan. Failure rates for them were: CDBG loans, 20%; S.108 loans, 25%. This compares with annual rates of 20% for all businesses in the early years, falling to about 7-8% by the ninth year. So overall the loans look good.
  • The guidance on underwriting guidelines would be based on extensive analysis of the probability of default on these types of loans that was conducted as part of the project.
