Many of us are looking for a simple template we can use to report our success stories. This is especially helpful for the teams we work with for collecting data on similar program efforts that need to be aggregated for a regional, state, or national report. If creation and use of a success story template appeals to you, take a look at this Journal of Extension article and the related link showing how a template was developed for an Extension food safety program team: http://www.joe.org/joe/2009june/tt5.php
I’d be happy to share more about this process and help teams create public value templates like this.
At our recent Families inservice, one small group discussion focused on improving program evaluation and success stories. They asked for writing tips. Here’s a module from my work in Virginia that provides some helpful tips: http://connect.ag.vt.edu/impactwriting
You may also find this success story review form to be helpful: Success Story Review. The group that discussed this topic at inservice also reminded us to tie our success stories back to our work team logic models, to work with our external relations coworkers, and to send our success stories to the counties and REEDs we work with.
JOE Advanc Pub Value 12-10 edits
Some of you have been asking me how we as an organization get closer to measuring and telling our public value stories. Later this month in the Journal of Extension, I have an article coming out that suggests how we do this. Here is your preview copy! Let me know what you think.
I continue to find great nuggets of wisdom in Mike Patton’s book on Developmental Evaluation. One observation he makes that fits our work well is that as Extension educators, we are operating in the “muddled middle”. He considers this the area between grassroots/bottom-up efforts and top-down mandates. This often requires us to move away from best practices determined by randomized control trials to effective principles developed from context-specific case studies. He specifically believes that effective principles can provide guidance for how we should act in complex environments where best practices don’t seem to work. I believe one of our roles as “context sensitive” educators is to record and share these principles so that others don’t have to start from scratch. We need to consider how we can more fully share these observations and practices at conferences and meetings and through publications. I can tell that Mike Patton was an Extension worker and hasn’t forgotten the realities of his Extension experience.
Over the last several months, several of us have had conversations about focus group interviews as a valuable way to collect needs assessment and program evaluation data. Those of us who have been doing this for a while have discovered that “unfocused” focus groups have some nuanced benefits not found in other group methods. I recently had an article accepted by “The Qualitative Report” on this topic. Some of you have asked to read it, so here it is: QR_THE_UNFOCUSED_FOCUS_GROUP
I’d love to hear what you think of this as we continue to learn from each other and build best practices in our work.
Environmental Scanning, Programming
I attended the American Evaluation Association meeting last week to finish out my commitment as an officer of the Extension evaluator’s group and to present a variety of workshops. I always come home with a great update on Extension around the country as well as lots of ideas for improving our programming and reporting. The last two years I’ve found the “developmental evaluation” movement to be very compelling for some of our work. Michael Patton from Minnesota has finally published his book on this topic (I’m reading it now). This type of evaluation adds a new approach to the traditional formative and summative evaluation processes we are so familiar with. Developmental evaluation is most appropriate for programs and program environments that are complex or chaotic. As the world around us changes more frequently, I find this to be more of a reality. In complex and chaotic environments, evaluation tends to be rapid, just in time, and focused on processes and responses to the changing environment. This flies in the face of the evidence-based movement that requires a static environment to ensure program fidelity. I believe we should consider the appropriate fit between program environments and evaluation approaches as the world of education changes. Feel free to check out “Michael Patton” and “developmental evaluation” in Google Scholar to learn more. I’d love to hear what you think of this approach to evaluation and the implications for our work.
Professional Development, Reporting