Saturday 1 February 2014

Program Evaluation Assessment

The Upward Bound Math and Science initiative (UBMS) was established in 1990 within the scope of the existing Upward Bound (UB) program. It was created, within the United States Department of Education, with the goal of improving the performance and enrollment of economically disadvantaged K-12 students in math and science. Additionally, it sought to address the underrepresentation of disadvantaged groups in math and science careers. These goals were pursued by funding UBMS projects at colleges and universities that provided hands-on experience in laboratories, at field sites, and with computers, and through summer programs aimed at university preparation in math and science. UBMS grew from 30 programs in 1990 to 126 programs in 2006, serving 6,707 students at an annual cost of $4,990 per student.

UBMS has been evaluated three times: in 1998, 2007, and 2010. My assessment will focus on the 2010 evaluation, which combined data from 2001-2002 and 2003-2004 and focused entirely on postsecondary outcomes. Each subsequent evaluation builds on previous work, which represents the first difficulty I had with this report. Old data were often presented for comparison against new data and, in some situations where new data were not available, old data were incorporated directly into the analysis. Without very careful reading it was often quite difficult to distinguish which was being reported.
This was a purely summative, outcomes-based evaluation presented exclusively from a quantitative, statistical perspective. The summative, quantitative approach is effective for justifying the cost per student, since it provides hard numbers to compare against hard numbers. The method does not map cleanly onto any specific program evaluation model, although it comes closest to Tyler's goal-attainment model: the purpose of the report is to connect the objectives of UBMS with measurable outcomes.

The evaluation had two components. The first was a descriptive analysis of survey information about the credentials and demographics of staff, student recruitment and enrollment, program offerings, and other demographic characteristics; this analysis was not the focus of the study and served purely descriptive purposes. The bulk of the study consisted of an impact analysis in which outcomes for UBMS participants were measured against participants in the regular Upward Bound program rather than against the general population. This design isolates the additional success provided by UBMS over and above any success attributable to UB. However, since the successes of UB are not enumerated in this evaluation, the numbers presented are somewhat misleading: they are primarily meaningful in comparison to data that are missing.

Even with this caveat, the results are impressive. The evaluation found statistically significant increases in enrollment at four-year institutions (12%), enrollment at selective institutions (18.6%), enrollment in math and science courses (36.5% more credits completed), and social science degree completion (11%). There was also an increase in completion of math and physical science degrees, though it was not statistically significant. Overall, despite some minor data and analysis issues, the UBMS evaluation paints a clear picture of the program's successes.

1 comment:

  1. Very interesting choice, Jeff. We should all be so lucky to have multiple evaluations to build upon. Did they use similar data collection strategies in all of the evaluations? Since they have gone to great lengths to provide longitudinal information, I am wondering who the audience was for the report. I think that you are correct to bring Tyler into the mix. I am also wondering if Provus might be applied, based on the financial lens that you identify.
