What Are You Evaluating? Part 3

Over the last two weeks we have discussed the evaluation of the purposes and mechanics of a citizen engagement process. This week we conclude this brief series with a few thoughts on evaluating outcomes – the changes in policy, actions, or resource allocation that were influenced by the engagement process. Evaluating the purpose and mechanics of a given engagement process helps you understand how best to operate it; evaluating outcomes helps you measure the overall value of the process in relation to the actual delivery of services and the efficiency and efficacy of governing. Evaluating and reporting outcomes will also build trust in the process and assure the public that their input is being used, which in turn can lead to increased public involvement in future processes.

As with the other forms of evaluation, it is best to keep future evaluation in mind from the start.  If you can identify or develop data sets that will be relevant before, during, and after a given engagement process, you will have different data points to compare at the end of the process and will be able to more effectively demonstrate where changes have occurred and why.

Some questions to consider early in a process, when developing an evaluation plan for outcomes, are: What data is currently collected within the community that might serve as a baseline? What additional data might be useful, and can a baseline be obtained through an initial survey? What types of issues have been arising in our community, both with regard to specific policies and with the way those policies are made? What existing processes or efforts could be informed by the information obtained during the engagement process, and how will we connect with them? If policy or action recommendations are to be made during the process, what is the likely time period for implementation, and thus for evaluation?

During the process you might gather, through surveys and evaluations, information that answers the following questions: What subjects are citizens most focused on? What interests, concerns, or values are being expressed? What information is being relied on? What information is missing or misunderstood? What kinds of changes are being proposed or recommendations made? What time frames are being discussed for implementation? These questions will help you fine-tune your evaluation plan.

After the process you can review the data and ask: What have we learned, and how can it help us make better decisions? What ongoing efforts could be informed by the information obtained or included in an implementation plan? If there are barriers (including lack of resources) to implementing certain changes or recommendations made by citizens, what are they, and how might we address them? How can we track and report progress, and over what time frame should that occur? Asking and answering these questions can lead to further dialogue, education, and reduced conflict over the decisions made.

When evaluating the outcomes of your engagement process, you can use a wide array of survey tools and local data. If you choose to focus on quantitative data, consider looking to an organization like the Baltimore Neighborhood Indicators Alliance for examples of measurement. Depending on the issue, you could also focus on simple, easily conveyed indicators such as money saved or spent, or changes in energy use. For other issues, qualitative data may be used. For example, Columbia, Missouri’s vision tracking report, which we helped to develop, simply indicates for each specific, identified goal whether the goal has been completed, progress is being made, further action or approval is needed, or the goal is no longer being pursued.

A common complaint citizens make about engagement efforts is that their recommendations just “sat on the shelf.” Members of your community want to see that their input is used. This is why tracking and reporting how various substantive decisions and actions are affected by that input is the type of evaluation the public is most likely to be interested in. Dissemination and discussion of such evaluations can and should occur over a defined time period, since implementation often takes many years.

What Are You Evaluating? Part 2

Last week we discussed evaluating the purpose of a citizen engagement process. This week we continue the series with a few suggestions on how you might evaluate the mechanics of a citizen engagement process.

Once you know your purpose, you can move on to the mechanics of an engagement process. Questions you might ask as you set up the process include: What type of process would best engage our audience? What kind of process can we implement in a responsible and sustainable manner? What resources (including information, volunteers, rooms, equipment, food, etc.) will we need? What funds or in-kind donations are available? What kinds of outreach are needed? What level of participation are we hoping for? What training or orientation will be needed for the process to be productive? Setting baseline (high and low) targets in these areas and developing related checklists for implementation will help with recruitment, volunteer training, and ongoing evaluation.

Once a process is under way you can ask: Are we on track? Is the process operating as planned? Were our original assumptions and projections correct, or do we need to adjust to changing circumstances? In evaluating an ongoing process you have to be willing to make changes to the setup and to how the dialogue is being facilitated. You may also need to reassess your expectations and slow the process down, allowing more time for discussion, or break it into stages. In a process focused on building citizen engagement, for instance, if citizens are in fact engaged, pushing too hard to “complete” the process rather than allowing extra time can create dissatisfaction, disengagement, and distrust.

After an engagement process you can evaluate the mechanics of what worked and what might be improved, and document lessons learned. Questions you might ask here include: Did things flow smoothly? Were resources available when needed? Was the process operated cost-effectively? Were issues sequenced effectively? Were good connections made between the various stages of the process? Were some channels of communication and outreach more productive than others, and if so, which ones? Did this vary by community? How satisfied was the public with the opportunities provided for input, and did they feel their input was heard and valued? These evaluations can be shared with internal audiences for future planning, and making them public can also build trust with your constituents and demonstrate that you respect and want to encourage ongoing engagement.