Final deliverable presented by my project team to the UC Berkeley Business Services department, outlining key performance metrics for the ongoing procurement reform initiative.
This presentation aims to add specificity to the selected metrics, and to present our recommendations and the scorecard.

Midterm Deliverable vs. Final Deliverable

During our midterm presentation we presented:
- Our background research and initial takeaways
- Preliminary metrics, along with our reasoning and expected trends
- The next steps we would take before the final deliverable

This final presentation serves to present:
- Updated information on data sourcing and metric calculation
- Actionable recommendations for the advancement of the procurement initiative
- A dynamic one-page procurement performance scorecard
How to Read Each Metric Page

- Metric Calculation: the calculation boxes describe how to use the data pieces to determine the current value of the metric; boxes in yellow indicate relevant data pieces which should be considered when calculating the metric.
- Reasoning: this section describes the reasoning behind the metric, its various uses, and its advantages.
- Variations: this section highlights how and with whom the metric should be shared, as well as any variations or adaptations of that metric.
This metric will be most powerful in the early stages of the BearBuy rollout, defending the procurement initiative by showing its potential to generate cost savings in terms departments understand and care about.
BearBuy should decrease CTOT (cycle time of transaction) by eliminating inefficiencies caused by non-value-adding activities. End users will be more likely to use BearBuy if the approval and payment processes occur faster on the new system than on the current one.
Two separate surveys should be designed: one that specifically targets the BearBuy system and another that addresses customer service by the CPO, so as to receive clearer results. Additionally, a clear distinction should be made between the campus-wide procurement team and department-specific procurement teams, per Barbara Lane.
Number of Suppliers measures the flexibility of BearBuy in accommodating user desires. It can be leveraged to convince users that progress is being made toward letting them use their preferred vendors.
Our research has led to several key recommendations for the future success of the Procurement Initiative at UC Berkeley.
University of California, Berkeley | Procurement KPI Scorecard

EFFECTIVENESS
Effectiveness metrics at the top provide high-level campus administrators with the indicators that affect the bottom line. They offer measurements across time to track the success of the Procurement Initiative. This section of writing can be used to summarize results or highlight key initiatives that have driven changes.

EFFICIENCY
Efficiency metrics below track how well the initiative has streamlined Procurement Services and increased the quality of services to end users. Again, this section can be used to describe new developments from Procurement Services, such as new local contracts, end-user concerns gathered from surveys, or the success of incentives in increasing supplier performance.

[Scorecard gauge figures, each showing a CURRENT value against a GOAL range: Supplier Information (Local 12.5%, Minority 22%), Supplier Performance, Invoice Turnaround, Error Rate (3.5%), Return Rate (5.2%). Disclaimer: data is arbitrary.]
Metric Summary

Effectiveness:
- Total Cost Savings
- Return on Investment
- Spend Under Management
- Unrealized Cost Savings
- Adoption Rate

Efficiency:
- Transactions Requiring Post-Issuance Activities
- Cycle Time of Transaction
- End User Satisfaction
- Supplier Performance
- Number of Suppliers
Bibliography

- Leadership Development Program Report: "Berkeley Buying Power: Driving Contract Spend to Savings." Tipping Point Solutions, September 2010.
- The Hackett Group Report: "2005 Performance Metrics and Practices of World-Class Procurement Organizations." The Hackett Group, 2005.
- Penn Procurement: "Performance Metrics – Penn Purchasing Services." University of Pennsylvania, accessed Feb 2011. <http://www.purchasing.upenn.edu/supply-chain/performance-metrics.php>
- Notre Dame Sustainability: "Procurement Metrics – Office of Sustainability." University of Notre Dame, accessed Feb 2011. <http://green.nd.edu/sustainability-at-nd/procurement/procurement-metrics/>

Expert Sources
- Ralph Meier, Chief Procurement Officer, University of Pennsylvania
- Jim Hine, Executive Director of Procurement, University of California, San Francisco
- Erin Wixson, Co-Author of the Berkeley Buying Power Report
- Andrea Rex, Co-Author of the Berkeley Buying Power Report
- Heidi Hoffman, Operational Excellence Procurement Initiative Manager, University of California, Berkeley
- Barbara Lane, Assistant Dean, College Administration
- Shana Amenaghawon, Financial Services Director, College of Chemistry
- Karen Kato, Enterprise Data Warehouse & Database Manager, IST
- Rich Taylor, Director of Procurement Strategies
Editor's notes
Olivier
Akshay
Akshay
Akshay
Josh
Josh: Just remembered what Karen Kato said about this metric: it might be negative! That's not good and should be mentioned by whoever presents this slide. Unrealized Cost Savings is measured as the sum, over purchases, of the difference between the price actually paid for a good and the price of a comparable good from a supplier under a contract negotiated by the OP.
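The calculation described in the note above can be sketched in a few lines. This is a minimal illustration, not the team's actual implementation; the function name and sample prices are hypothetical. It also shows why the total can go negative, as the note warns, whenever OP contract prices exceed the prices actually paid:

```python
# Unrealized Cost Savings: sum over purchases of
#   (price actually paid) - (price of a comparable good under an OP-negotiated contract).
# A positive term is a missed saving; a negative total means contract
# prices were higher on average than what departments actually paid.

def unrealized_cost_savings(purchases):
    """purchases: iterable of (price_paid, contract_price) pairs."""
    return sum(paid - contract for paid, contract in purchases)

# Hypothetical sample data:
sample = [
    (105.00, 98.50),   # paid above contract price -> missed savings
    (42.00, 45.00),    # paid below contract price -> negative term
    (310.00, 299.99),
]
print(round(unrealized_cost_savings(sample), 2))  # -> 13.51
```

If every department already paid less than the contract price, the metric would come out negative, which a presenter should be prepared to explain.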
Diana
Diana: We were told this might be hard to measure. It's OK to admit that, and to say that if it does turn out to be measurable, we think it would be very valuable.
Diana: Remember to discuss the limitations of CTOT due to SciQuest, i.e. we don't know when requisitions are placed, only when the Placement Order is put out.
Sharvani
Sharvani
Sharvani: You can mention that Supplier Performance is made up of these types of metrics, and that a combination could easily be devised to make one consolidated score (i.e. just sum them all?).
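The consolidated score suggested in the note above could be sketched as follows. The component metric names are hypothetical examples, and an equal-weight average is used rather than a raw sum so the consolidated score stays on the same 0–100 percentage scale as its components:

```python
# One consolidated Supplier Performance score from component metrics.
# Component names (on-time delivery, order accuracy, invoice accuracy)
# are illustrative; each value is a percentage in [0, 100].

def supplier_performance_score(components):
    """Equal-weight average of component percentages."""
    return sum(components.values()) / len(components)

example = {
    "on_time_delivery": 96.0,
    "order_accuracy": 92.0,
    "invoice_accuracy": 88.0,
}
print(round(supplier_performance_score(example), 1))  # -> 92.0
```

Unequal weights could be substituted later if Procurement Services decides some components matter more than others.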
Sharvani: Sustainability = buying local. Could discuss how the other option is to measure supplies like paper and see how much of that is "green".