Intended Result: Preparing discrete decision-units that produce a clear result for evaluation. Think about evaluating these decision-units against each other, and not necessarily about evaluating departments against each other.

5. Score Offers/Programs against Results. Once the organization has identified its priority results and more precisely defined what those results mean in terms of meeting the unique expectations of the community, it must develop a process to objectively evaluate how the offers/programs achieve or influence the priority results. Scoring can be approached in several ways, but the system must ensure that scores are based on the demonstrated and measurable influence the offers/programs have on the results. In many organizations, such as the cities of Lakeland, Walnut Creek, and San Jose, programs were scored against all of the organization's priority results. The idea was that a program that influenced multiple results must be a higher priority; programs that achieved multiple results made the best use of taxpayer money. Alternately, organizations such as Mesa County, the City of Savannah, Polk County, and Snohomish County matched each offer with only one of the priority results and evaluated it based on its degree of influence on that result. Using this scenario, a jurisdiction should establish guidelines to help it determine how to assign an offer/program to a priority area, and how to provide some accommodation for those offers/programs that demonstrate critical impacts across priority result areas. Both of these approaches have been used successfully in PDB.

Priority-driven budgeting is a natural alternative to ...

There are two basic approaches to scoring offers/programs against the priority results. One approach is to have those who are putting forth the offers/programs assign scores based on a self-assessment. This approach engages the owners in the process and taps into their unique understanding of how the offers/programs influence the priority result. When taking this approach, it is critical to incorporate a peer review or other quality control process that allows review by peers in the organization and external stakeholders (citizens, elected officials, labor unions, business leaders, etc.). During the peer review, the owner of the offer/program would need to provide evidence to support the scores assigned.

A second approach to scoring establishes evaluation committees that are responsible for scoring the offers/programs against their ability to influence the priority result. Owners of offers/programs submit them for review by the committee, which in turn scores the programs against the result. The PDB process becomes more like a formal purchasing process, based on the assumption that those doing the evaluations might be more neutral than those proposing the offers/programs. Committees could be made up entirely of staff, including people who have specific expertise related to the result being evaluated and others who are outside of that particular discipline. An alternate committee composition would include both staff and citizens to gain the unique perspectives of both external and internal stakeholders.

Regardless of who is evaluating the offers/programs and assigning the scores, there are two key points. To maintain the objectivity and transparency of the PDB process, offers/programs must be evaluated against the priority results as commonly defined (see step 3). Also, the results of the scoring process must be offered only as recommendations to the elected officials, who have the final authority to make resource allocation decisions.

Organizations should establish the elected governing board's role at the outset. In some jurisdictions, the board is heavily integrated into the PDB process, participating in the scoring and evaluation step. Board members can question the assigned scores, ask for the evidence that supports a score, and ultimately request that a score be changed based on the evidence presented and their belief in the relative influence that an offer/program has on the priority result it has been evaluated against. In other organizations, such as Snohomish County, Washington, the PDB process is implemented as a staff-only tool that is used to develop a recommendation to the governing body.

Intended Result: Scoring each unit of prioritization in a way that indicates its relevance to the stated priorities.

6. Compare Scores Between Offers/Programs. A real moment of truth comes when scoring is completed and the information is first compiled, revealing the top-to-bottom comparison of prioritized offers/programs. Knowing this, an organization must be sure that it has done everything possible prior to this moment to ensure that there are no surprises, that the results are as expected, and that the final comparison of offers/programs in priority order is logical and intuitive.

14 Government Finance Review | April 2010
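To make the two scoring models concrete, the following sketch shows how a staff analyst might tabulate and rank scores under each approach. All offer names, result areas, and scores here are hypothetical placeholders, not figures from any of the jurisdictions mentioned; the 0-4 scale is simply an illustrative assumption.

```python
# Illustrative sketch only: hypothetical offers, results, and 0-4 scores.

# Approach 1 (e.g., scoring against ALL priority results):
# each offer receives a score for every result, and offers are
# ranked by their summed influence across all results.
scores_all = {
    "Street Maintenance": {"Safety": 3, "Mobility": 4, "Economy": 2},
    "Crime Prevention":   {"Safety": 4, "Mobility": 0, "Economy": 1},
    "Business Grants":    {"Safety": 0, "Mobility": 1, "Economy": 4},
}

def rank_by_total(scores):
    """Rank offers by total influence across every priority result."""
    totals = {offer: sum(by_result.values()) for offer, by_result in scores.items()}
    return sorted(totals, key=totals.get, reverse=True)

# Approach 2 (e.g., matching each offer to ONE priority result):
# an offer is assigned to a single result and ranked by its
# degree of influence on that result alone.
scores_single = {
    "Street Maintenance": ("Mobility", 4),
    "Crime Prevention":   ("Safety", 4),
    "Business Grants":    ("Economy", 4),
}

def rank_within_result(scores, result):
    """Rank only the offers assigned to the given priority result."""
    matched = [(offer, s) for offer, (r, s) in scores.items() if r == result]
    return [offer for offer, _ in sorted(matched, key=lambda x: x[1], reverse=True)]
```

Under the first approach, an offer that touches several results (here, Street Maintenance) rises to the top of a single organization-wide list; under the second, each priority result gets its own ranked list, which is why the article notes that guidelines are needed for offers with critical impacts across result areas.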