Pilot Evaluation (16)

Description:

This task covers developing a plan to evaluate the pilot against metrics derived from the pilot goals and objectives, and then conducting a complete pilot evaluation based on that plan. The evaluation plan shows how pilot goals break down into specific metrics, which pilot data to collect, and how to compute each metric from that data. The final evaluation report should document the pilot background, the high-level design, and why certain decisions were made; describe the pilot system as implemented and pilot operations; and include data analysis, lessons learned during the pilot, and pilot conclusions.
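As a minimal illustration of how a pilot goal might break down into a computable metric, the sketch below derives a mileage-reporting accuracy rate from a handful of simulated pilot records. All field names, values, and the 5% tolerance are assumptions for illustration, not requirements from this guidance:

```python
# Hypothetical sketch: computing one evaluation metric from pilot data.
# Example goal: "accurate mileage reporting" -> metric: share of records
# whose reported mileage falls within 5% of odometer-verified mileage.
# Record fields and the 5% threshold are illustrative assumptions.

records = [
    {"vehicle_id": "A1", "reported_miles": 102.0, "verified_miles": 100.0},
    {"vehicle_id": "B2", "reported_miles": 98.0,  "verified_miles": 100.0},
    {"vehicle_id": "C3", "reported_miles": 120.0, "verified_miles": 100.0},
]

def within_tolerance(rec, tolerance=0.05):
    """True if reported mileage is within `tolerance` of verified mileage."""
    diff = abs(rec["reported_miles"] - rec["verified_miles"])
    return diff <= tolerance * rec["verified_miles"]

# Metric: fraction of records meeting the tolerance.
accuracy_rate = sum(within_tolerance(r) for r in records) / len(records)
print(f"Mileage-reporting accuracy: {accuracy_rate:.0%}")  # -> 67%
```

In practice each metric in the evaluation plan would name its data source (e.g., vendor transmissions) and its computation in this same goal-to-metric-to-data chain, so the final report can trace every result back to a pilot goal.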


Details:

First, complete the evaluation plan as written. Then, write the pilot report, including the following sections:

  • Background. What led to the pilot, especially regarding policy development?
  • High-level design. What were the basic design and duration, and who were the participants? Why were the decisions made the way they were?
  • Pilot system. How were mileage reporting, account management, and customer service implemented?
  • Live pilot operations. What were the pilot events as they unfolded, from enrollment to closeout?
  • Data analysis. What were the results of the pilot evaluation plan, and what did the results mean?
  • Lessons learned during the pilot.
  • Pilot conclusions. What can be concluded about pilot goals? What are the next steps?
  • Scalability. What did the pilot reveal about program scalability? What additional processes or tools will make a larger program successful?

Primary Use:

Determine how to demonstrate whether the pilot goals have been met and draw conclusions about the pilot that can be shared with the public and legislators.


Best Practices/Lessons Learned:

  • Direct pilot vendors to record and transmit data regularly.
  • Write surveys in a neutral, non-leading way so data are useful.
  • Begin data analysis during the pilot to confirm the system is working correctly; the analysis can only be finished after the pilot is complete.
  • Prepare versions of the report for both technical and nontechnical audiences, or keep technical material in appendices and have the main body of the report aimed at nontechnical audiences.
  • Use simple, intuitive graphics wherever possible.
  • Analyze, illustrate, and discuss raw metrics rather than simply presenting the numbers; metrics alone may not mean anything to readers.

State Government Context and Assumptions:

The lead RUC agency does pilot evaluation planning before the pilot commences, and the evaluation is completed immediately following the pilot.