Gratitude for 8 Years of Bee's Knees!

A drawing of many 8s using a range of colors and pens

Yesterday, my business, Bee's Knees Consulting LLC, turned 8 years old. Thank you to my clients, friends, and colleagues for sharing your work, engagement, encouragement, opportunities, wisdom, inspiration, and joy. Every year, I get to learn and try new things and build new connections (in my heart and brain, and between people and organizations) in service.

8 has always been my favorite number for all sorts of reasons. For fun, I looked up some musings about 8 and found that 8 often represents balance and stability, abundance, constant flow, and renewal. There are 8 major planets in our solar system, and oxygen is the eighth element of the periodic table. What does this mean for Bee's Knees? No idea, but particularly after the past few years, it certainly feels good to reflect on balance, renewal, and breathing, as well as on what I want to do more of, less of, when, and how. I'm excited for the year ahead.

In gratitude,

Gretchen

P.S. It was fun to do a meditative drawing of 8s last night at a craft night hosted by two of my favorite local businesses, Tiny Turns Paperie and Remnant Brewing.

#eval #evaluation #nationalservice #consulting #programdesign

Places, Spaces, Science, and Bravery

The moments when seemingly disparate threads of our interests and professional directions intertwine and reveal a pattern, connecting past and present, are beautiful to me. This post is about how a recent learning experience reminded me that "place" is one unifying theme in my work and life.

On June 13, I joined one of the best Greater Boston Evaluation Network presentations I've ever attended—Using Space and Place to Enhance Program Evaluation by Katie Butler. Katie is the founder and principal evaluator of The Geoliteracy Project. In her talk, she featured ways we could use place (physical location) and space (your context for a place, e.g., your classroom) to enhance evaluations, whether process or outcome studies. I've thought about her presentation many times since, and about the ways place and space have been a part of my work (and life).

I've often reflected on how fortunate I was to grow up in a safe neighborhood, with kind neighbors, plenty of space and a garden, in a relatively affordable city (Pittsburgh), within walking distance of cultural institutions, my schools, and parks. The economic and structural situation of my tiny family changed dramatically for the worse between my infancy and school age, and later, there would have been no way we could have landed in that home if we hadn't already been there. We had no car after I turned 4, nor could we afford one, so walkability and public transit were critical and thankfully excellent. Place contributed to my resilience.

When I was at City Year, the organization talked about how a zip code should not determine the quality of a child's education and future. In my work with Fishing Partnership, "place" is a critical part of nearly every conversation (e.g., the community health workers who are from the fishing communities they serve, bringing preventive healthcare to the docks where fishermen are, the contributions of the fishing industry to their port towns and the economy of Massachusetts, the risks of the sea and the ways fishermen serve as first responders). Anyway, during her presentation, Katie's passion and skills were evident, and I kept thinking about a formative experience in my career and one of my most important mentors.

In my first job out of college, I got to practice so many skills that I still use today. For 3 years, I served as a Research Associate for a large NIH study at the University of Pittsburgh Medical Center with Herb Needleman, M.D. We studied the neurocognitive and behavioral effects of low levels of lead exposure among boys living in the city of Pittsburgh who had first been a part of another longitudinal study. Herb was a hero and my mentor. He trusted me and treated me like a colleague from the first time he interviewed me, and I treasured our lunches years later on my visits home from graduate school. On my first day of work, Herb gave me a manual for the new software he had purchased to build our study's database and trusted me to build it. Herb and my other exceptionally wonderful boss, mentor, and friend, Julie Riess, Ph.D., nurtured my learning and interests. Thanks to them, my first publication was in JAMA and meaningful to policy (Needleman HL, Riess JA, Tobin MJ, Biesecker GE, Greenhouse JB. Bone Lead Levels and Delinquent Behavior. JAMA. 1996;275(5):363–369); you can read more about it here.

Maybe another time, I'll describe all the things I got to do and learn while working with Herb and how it was a lucky combination of events that I even ended up there. What I most want to do today is to honor Herb. There are so many books and articles about him and by him, but for now, I hope you will read the links below. In recent years, we've witnessed the lead water crisis in Flint, Michigan, and the devastating effects of environmental toxins in too many other places. We see how we are connected in this world as the air quality in the U.S. is affected by the fires in Canada. Science and bravery still matter.

https://www.pbs.org/wgbh/nova/article/herbert-needleman/

https://ehp.niehs.nih.gov/doi/10.1289/EHP2636

  

Putting Capacity Back into Capacity-Building

A sunflower plant grows with other plants and supports in a community garden

(I wrote this for the American Evaluation Association’s AEA365 site during a special Organizational Learning-Evaluation Capacity Building week in April 2023).

Hi, I am Gretchen Biesecker, Principal Consultant with Bee's Knees Consulting LLC in Somerville, MA. A large part of my practice focuses on evaluation capacity-building with nonprofits small and large, including AmeriCorps programs across the U.S. AmeriCorps is a federal agency that "brings people together to tackle the country's most pressing challenges through national service and volunteering." Through a national network, AmeriCorps enrolls 200,000 Americans each year to meet critical needs in education, the environment, disaster services, and public health, among other areas.

Sometimes my capacity-building work can get pretty meta! For instance, my colleague Marc Bolan and I conducted a randomized controlled trial of an AmeriCorps program's efforts to build evaluation capacity among organizations hosting their AmeriCorps members. The goal of the study was to measure the program's impact on participants' knowledge, attitudes, and confidence relating to evaluation, and their capacity to carry out evaluation and performance measurement activities.

More commonly, over the past 7 years, I've worked with AmeriCorps programs and their state commissions to build their knowledge, understanding, confidence, and use of evaluation and data. I conduct trainings in person and via Zoom and offer office hours or individual coaching sessions. I also lead cohorts of 4-6 programs that want to improve how they collect or use data, articulate research questions and ideas for evaluation studies, or better put results into action.

Lessons Learned

• Evaluators need to develop better tools and approaches to measure the outcomes of our capacity-building work across a range of programs and settings. Few tools exist, and those that do can be too long and full of jargon to work well among nonprofits.

• Too often, evaluation capacity-building focuses on deficits, rather than capacities and assets that already exist. I see this when I look at survey tools and assessments designed to measure capacity-building, when I review evaluation training materials, and when program staff share past experiences with me. I’ve seen assessments that yield scores based on the absence of capacities. Not only can this approach feel demoralizing to programs, but it flies in the face of the very name—capacity-building!

Instead, I find that most programs and their staff want to learn and improve in their work, and they believe in evaluation. Their lack of capacity is not rooted in a lack of interest—it’s stalled by a lack of time, positive experiences, resources, and funding. When we start by focusing on capacities that programs identify and want to work on, ask about their assets and build on them, and create dedicated time and space, we see success.

Hot Tips

• Appreciative inquiry, an approach that focuses on strengths, works well in evaluation capacity-building. Asking questions and focusing on tiny and bold steps that could lead to improvements creates excitement, confidence, and positive momentum.

• Building evaluation capacity takes time and money, or in other words… CAPACITIES! We need more foundations and funders to pay for this work. When evaluators take the time to deeply understand the programs they serve, and programs can get work done within the learning experience (e.g., writing an evaluation plan as part of a cohort experience), capacity grows.

• A combination of small group and 1:1 work with programs can be powerful. Individual, 1:1 time allows programs to ask specific questions and reflect on their unique challenges, strengths, and plans. Small group time helps peers learn and share ideas and realize that many evaluation-related challenges and ideas affect us all.

15 Tips for Success as an Internal Evaluator

Recently, the wonderful Tiffany Berry suggested that I write down some lessons or tips I'd learned from my time as an internal evaluator at City Year (CY). Some of these tips come up as Bee's Knees coaches other internal evaluators. The tips are in no particular order (yet), although I feel like the first one is a very important place to start. Too many people have had negative experiences with evaluation and evaluators. Let's change that.

1. Adopt a customer service mindset. Seek to anticipate the needs of other departments and leaders and to be consistently responsive. It's not always saying yes--it's managing expectations. Be clear on what you do, what you don't do, and why. And rather than saying "no" to a data or eval request, problem-solve together with the person making the request (e.g., "I can't give you x, but could y work?"). Follow up to "close the loop" on requests and communications; check in after you help to see how things went.

2. Share enthusiasm for the work, things you are excited about, and small wins. Encourage each member of your team to be mindful of this: even when you are having a "how's it going?" conversation with a colleague in the elevator, share something good going on with eval, which also helps educate others on the work you do. This helps ripple your work beyond your department--others can share the story of eval and the cutting-edge or latest advancements you are making. You can be serious in purpose and as rigorous as possible in the work, but still infuse your own personality, fun, and excitement into how you talk about it. A lot of people aren't inherently excited by numbers and data, so it can be extra powerful to revise their perceptions with your enthusiasm!

3. Start with what you can control. When it feels like you have obstacles ahead, or that there are some big things beyond your control, start with the things you can control or do to keep up your morale, to keep the work moving.

4. Celebrate small wins. You are truly running a marathon, and some parts of evaluation move so slowly or can appear to be happening in a black box... it's important to recognize and share small wins along the way to help sustain yourself and to "show your work."

5. Avoid short-term decisions that will cause long-term pain. When possible, before taking on new work or responding in a particular way to a short-term fire drill, consider how your response or action will affect the long term.

6. Explain the "whys" behind what you do, the choices you make, even when people are not asking. They may not always agree with your position or understand something about evaluation, but you can help them learn. Sharing the why builds knowledge and eval capacity within your org, it builds trust and confidence in you, and it's a good gut check for yourself-- if you don't know the why, then you may want to rethink your position. 

7. Stage out the work. Plan for training and delays. Help leaders and others see how what you propose for today links to a more ideal state. Name stages of your work (e.g., Design, Pilot, Refine, Scale) to help others connect to and remember the story of how you are building eval capacity for your organization. Make sure you allot time and care to training and messaging.

8. Look for outside examples, benchmarks to help make the case and inform what you do. Sometimes leadership needs to see that other organizations have done what you propose. Sometimes you need a gut check backed by research and peers from other organizations.

9. Take care of each other as a team...even if you are a team of one. Make others you work with feel part of your eval team too. For instance, Program Design and Development/Advancement staff are your best friends (and hopefully IT peeps too, if you are part of a larger organization). Invite them to celebrate small wins with you, and thank them for their contributions.

10. Set, share, and keep a data schedule. At first, this might lay out when you will have tools ready and launched (e.g., when surveys will "open" or upgrades to systems will be ready) and data collection or entry due dates. As you gain experience, you'll develop a better sense of an annual eval ecosystem, and you'll have a sense of internal capacity. Then, it can be extraordinarily helpful to add dates by which you'll have results from each of your tools or studies ready and posted. Honor those dates. This helps your colleagues see what kinds of data you are managing and feel less in the dark about when they might see results. Also, it can serve as a helpful reference doc to share, so you aren't answering the same questions again and again. And, it can help Development and other staff plan or negotiate reporting dates in grants.

11. Create business processes that are clear (this relates to having a customer service mindset). Clear business processes help others know how to work with you. They help you triage data requests, clarify roles--who can help with what--and help keep your team more sane with some clear boundaries. It may take your team and others some time to adjust to new processes, but usually people will appreciate the clarity and consistency.

For example, early on at CY, we set up a simple online survey (but we didn't call it that--it was a "data request form") that anyone requesting data would be asked to fill out. It asked key questions like who was making the request, what data were needed and by when, which department, and why. This helped us avoid a lot of wasted time going back and forth asking people to give us the specifics we needed in order to fulfill their request. And, it helped us keep track of patterns of requests (e.g., which departments wanted what and when, who might benefit from a little nudge to stop waiting until the last minute to make requests, common data points requested that we might want to make sure everyone could access, busy times of year that we may not have anticipated, etc.). We set our process up to auto-email every member of our team when a request came in, and then we would triage and jump in to fulfill it. That way, we all were aware of the kinds of requests the team got, and we didn't drop the ball or lose a request in one person's inbox. Even if someone stopped by or called us, we would still have them fill out the form. This is one example of a business process, but you may have other regular types of work or requests that no one in the past has taken time to articulate and hold; see the sketch below for one way such a form could be structured.
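To make that concrete, here is a minimal sketch of what a data request record and an auto-notify step could look like if you built a lightweight intake yourself. Everything here (the field names, the notify_team helper, the example addresses) is a hypothetical illustration based on the fields described above, not the actual form we used at CY.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical record mirroring the intake questions described above:
# who is asking, what data are needed, by when, for which department, and why.
@dataclass
class DataRequest:
    requester: str
    department: str
    description: str              # what data are needed
    needed_by: date               # when the data are needed
    purpose: str                  # why the data are needed
    submitted_on: date = field(default_factory=date.today)

def notify_team(request: DataRequest, team_emails: list[str]) -> str:
    """Compose the auto-email sent to every team member, so requests are
    triaged collectively and never stuck in one person's inbox."""
    body = (
        f"New data request from {request.requester} ({request.department})\n"
        f"What: {request.description}\n"
        f"Needed by: {request.needed_by:%Y-%m-%d}\n"
        f"Why: {request.purpose}\n"
        f"To: {', '.join(team_emails)}\n"
    )
    # A real system would hand this off to a mailer or ticketing tool;
    # returning the text keeps the sketch self-contained.
    return body

# Example: log one request and build the team notification.
req = DataRequest(
    requester="A. Staffer",
    department="Development",
    description="FY attendance counts by site",
    needed_by=date(2016, 3, 1),
    purpose="Grant report due to funder",
)
print(notify_team(req, ["eval-team@example.org"]))
```

However you implement it (an online survey tool works fine, as it did for us), the point is the same: capture the who/what/when/why up front, and route every request to the whole team.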

12. Be a proactive listener. Look for ways to proactively share data or updates with other departments based on what you hear they are working on or interested in. You may get new ideas from other departments too, from other disciplines or fields, or from the news. For instance, at one point I realized that our leadership, including some Executive Directors who had mostly business backgrounds, were reading a book called The 4 Disciplines of Execution and adopting some of its language and approaches (even in areas unrelated to eval). I made sure my team read the book, and we started to use examples and language from it before anyone even thought to ask us to. I continue to use examples from this book when talking about performance measurement, progress monitoring, and dashboards.

13. Keep reading and learning--make time for this. Our internal eval team took time each year to read a few books not directly related to eval, like Made to Stick, about how to make communications and messages stickier, because so often we needed to send out requests, updates, instructions, reminders, summaries, etc.; we saw that another department was using some of its tips and how their messaging improved. We would choose a book to read as a team and break it into chapters or sections to read and discuss every few weeks. We also would read articles related to education and evaluation, of course. Personally, I also found it helpful to skim business news articles for ideas in outlets like Fast Company and Business Insider.

14. Stay connected to field staff and the sites where your organization works (e.g., schools, neighborhoods, etc.). Leverage evaluation pilots to visit, and take time during other opportunities, like summer training, to get to know field staff. Make sure each new evaluation staff member takes time as part of onboarding to connect to field staff and to see your program in action.

15. Use empathy in your design. Walk in the shoes of your audience. Simplify your message and presentation of data. Make it fun and clear. Connect to the culture of your organization.

Of course, internal evaluators need solid technical skills in evaluation and research to be successful; you've got to know your stuff. But you've also got to learn ways to work with others in complex environments. These 15 tips cover important things that most academic programs don't teach. I learned so much from City Year and as a staff member within other nonprofits. I'm thankful to get to pass the learning on.

What tips do you have for internal evaluators? 

Happy Bee-Day!

Photo by villagemoon/iStock / Getty Images

 

We just celebrated our first birthday, and it's been one sweet, busy-bee year. Thank you to friends, colleagues, and clients for your support and business this year.

Today, we are working with 5 fantastic clients on an ongoing basis-- in most cases, across multiple years.

Some highlights of our first year

  • Founded in August 2015
  • Opened for business in September 2015
  • Engaged two excellent affiliates in over 140 hours of work to date across multiple Bee's Knees projects
  • Served more than 9 non-profits, building evaluation capacity and learning with and from them
  • Designed and led multiple evaluation trainings for all AmeriCorps programs in MA
  • Joined the National Evaluation Advisory Board of After-School All-Stars
  • Conducted evaluation activities across a variety of programs and domains, including: addiction medicine education; parenting education and family support; early childhood literacy; positive youth and leadership development; community health work and safety and survival training for fishermen; immigrant and refugee assistance; and after-school education and mentoring.
  • Wrote successfully funded federal and state grant proposals; kicked off facilitating a year-long Theory of Change process; helped programs improve the design, efficiency, and implementation of their data collection processes; developed presentations for the American Evaluation Association; analyzed data and developed new ways to visualize and report on results; learned some new Salesforce tricks; connected and reconnected with valued colleagues and more....

With gratitude for this work and to all of you,

Gretchen

Cultivating joy in evaluation

Isabella Stewart Gardner Museum Courtyard


Recently, I've been in a number of situations where discussions of evaluation seem devoid of joy or fun. Evaluation is hard, hard work in multiple ways. Data collection often takes many hands and more time than expected, and missing just a few details can result in messy data and hours of sleuthing. Often analysts spend something like 80% of their time cleaning data. Systems sometimes feel like black boxes. Goal-setting is tough, especially setting the right targets when you are doing something new. Passionate people, doing good work, worry about hitting their goals and meeting funders' expectations. Internal evaluation staff can feel attacked when program staff question the results they see. The very word "evaluation" can feel icky. 

"Success is not the key to happiness. Happiness is the key to success. If you love what you are doing, you will be successful." -Albert Schweitzer

So, how can we bring the joy back into our evaluation work and cultivate a positive culture of data? Here are a few strategies to start.

Celebrate small successes

Remember how I said this is hard work? Everyone makes mistakes in data entry, syntax, etc. at one time or another. Figure out ways to recognize when individuals or groups get their data in on time and completely, show improvement, or try to innovate for efficiency. Given how long it can take to get from design to seeing outcomes, be sure to celebrate your small successes to stay energized. A colleague just told me that at her non-profit, which used Efforts to Outcomes (ETO), they would pass a little E.T. action figure around to those who did the best job of entering their data on time.

Build on the culture of your organization

Years ago, I taught a course on children's social-emotional development to one of the best groups of 70 undergraduates ever. Early in the course, I discovered that many of them, who were mostly child development and psychology majors, happened to be getting dual degrees in music or were very involved in music performance. This inspired me to use music in teaching that course, because I knew they loved it. For instance, we studied patterns of parent-child attachment using pop-song lyrics that matched those patterns.

One reason that the hanging nasturtiums and other flowers in the garden above are so beautiful and inspiring is that they fit with the architecture and traditions of the Isabella Stewart Gardner Museum.

What do the people you are working with enjoy? Does your organization have existing stories, structures, or traditions that you can connect your evaluation work to?

Don't use data as a weapon

I hope this doesn't require explanation. Truly, think about the tone you set when you are reviewing results with your staff, particularly when you are using data for performance measurement and management. 

Emphasize that evaluation=learning

Evaluation can help us grow and improve, and share promising and evidence-based programs and practices. If you've created a culture of continuous improvement, with staff who are invested in doing their best work to best serve others, then this reminder can be really helpful and freeing. (It can also help you remember to match evaluation plans and methods to questions that you care about and want to learn about.)

Use visuals and metaphors

We all like and deserve to see beautiful, intriguing, fun, clear, and helpful images. You think you don't need to make pretty charts for your internal staff? Well, I've been there--you have little time, and you know you are among friends who don't need perfection, so you pull together something quick and rough. That's understandable. But also, when you can make the time, don't your internal colleagues deserve to see beautiful, intriguing, fun, clear, and helpful images? Couldn't that help inspire them to get more excited about data?

Non-evaluation metaphors can also help clarify and de-mystify evaluation concepts. Use ones that make sense or are funny and authentic to you. I've seen two cyclist/evaluators share their experiences using data to improve in their sport to illustrate evaluation concepts. It didn't matter that there were no other cyclists in the crowd. People appreciated their passion and thinking about evaluation outside of the pressure of their own work.

Crack yourself up

I wish I could remember who said it, but recently I heard a comedian talking about how he talks about things in his act that make him laugh. He doesn't worry about what an audience will think is funny. If he worries about that, then he won't be himself and won't be funny. 

So, when you can, infuse some humor into your work. Start with things that crack you up. Your nerdy humor truly can help others enjoy evaluation (more?).

For example, my City Year team, inspired by Jess Tau's dog, came up with some memes to include in our newsletters to our evaluation staff around the country. We loved them, and soon others did too. It helped make some very tedious reminders more fun. And I'm guessing memes are already so-3-years-ago, but I've been playing around with an Oprah meme-generator I discovered today. Uh-oh. Look out! (Warning: the heat map meme is dripping with sarcasm.)

Involve everyone

Many of us have been trained in excellent academic programs. We've been coached and have learned to give presentations, following a traditional formula for conferences (e.g., theoretical background, research questions, methods, results, discussion, etc.). There are definite times and places for that formula.

Guess what? If your go-to method of sharing information about evaluation activities, processes, or even sharing results follows that formula, you are likely missing out on some big opportunities and maintaining a distance between evaluators and other staff. Find out what people want to learn and what data they would find useful. Involve them in reviewing data and interacting with it. Honor their knowledge of context, their interest in problem-solving, and their curiosity.

This is a starting list of ideas. A next step is to collect and share some specific examples. If you have ideas, things you've tried or seen and loved, I'd love to hear about them.

 

Getting squishy

New Bedford, MA


Eleanor Roosevelt said, "Do one thing every day that scares you." City Year said, "Do 3 squishy things a day...You know that you are truly leading when you do at least three things a day that make you a little bit uncomfortable." Word.

Any evaluator worth her/his salt will tell you that time out in the field, seeing our programs in action, is invaluable. We can experience, or at least get a sense of, what it's like to be a staff member, to be a participant in the program, to be a piece of data moving through different processes and systems, and to see the contexts that interact with all of these. Walking in others' shoes offers new perspectives, and some of the top design firms in the world, like IDEO, stress the importance of design informed by empathy.

This week, I got to join two trainings with Fishing Partnership Support Services, an innovative community health program. Long hours, strenuous labor, time away from families, harsh weather, and inconsistent income are just a few of the stressors affecting the health of fishing workers and their families. Fishing Partnership employs trusted women from the fishing community as insurance Navigators and provides health interventions (e.g., vaccines, health screenings, and dental screenings), safety trainings, and financial planning and stress-reduction workshops. Program activities are located harborside, where fishing workers and families are.

On Wednesday, as part of their monthly staff meeting, the team and I participated in Narcan training in New Bedford. Opioid addiction and overdoses are affecting communities across New England, including my own. Narcan can prevent deaths by overdose, and Fishing Partnership wanted to learn more about it and be ready to respond, if ever needed, just as they ensure staff have CPR training. As we left the training, this news story broke about heroin arrests in New Bedford, underscoring the need. 

Then, on Thursday, I went through this Safety & Survival Training with about 40 fishermen.

There are more than 11,000 fishing workers and families in New England, and the industry is a big part of our economy, but you may not know that:

  • Commercial fishing is one of the most dangerous occupations in the United States. Groundfish fishermen in the Northeast are 37 times more likely to die on the job than police officers.
  • Safety equipment in good order, precautions, and hands-on practice drills can make a life-saving difference. That's why Fishing Partnership offers free trainings like the one I joined. But until two fishermen decided to do something in 2005, after losing too many of their friends, these kinds of trainings were not offered in New England.
  • And there's evidence that they work. Efforts in Alaska, pioneered over the last 22 years by the Alaska Marine Safety Education Association (AMSEA), resulted in a dramatic 67% decline in commercial fishing deaths in that state, and in the past 16 months there have been 0 fatalities.

I got a refresher in CPR, learned about emergency evacuation procedures, MacGyvered repairs to simulated leaks in a boat, put out a fire, plunged into water in an immersion suit, and scrambled/got pulled into a small life raft built for 6.

I got to see behind the scenes of the effort, activities, and care that the Fishing Partnership staff put into the training. I got to learn how to make a May Day call correctly. I got to experience what the training is like for the fishermen. I saw their reactions, their genuine engagement, and how much they appreciated the learning.

I got to push myself out of my day to day and comfort zone. And I got to crack myself up thinking things like, “Some evaluators help their clients put out fires. Bee’s Knees puts out FIRES.”

So, if you are an evaluator, make sure you take time to get out there and spend time with your programs, especially if you are helping to build evaluation tools, processes, and capacity. If you are a program, don’t forget to invite evaluators to really experience your work; sometimes we nerds can be a little shy, but it's good for all of us to get squishy.

A bee by any other name...

Photo by Valengilda/iStock / Getty Images

You might be wondering where the name Bee's Knees Consulting came from.

Well, I wanted a name that would:

  • stand out*
  • grow with me and my business
  • inspire a little joy
  • convey values that matter to me and my company

What are those values? They include:

  • Excellence, becoming the best
  • Adaptability
  • Communication
  • Collaboration
  • Servant Leadership

Bees and beehives display many behaviors related to these values, and they demonstrate that small steps and adaptations can yield big gains. Here are a few short articles about this. Who knew I was biomimicking?

Leadership Lessons from a Beehive

6 Lessons that Businesses Can Learn from Bees

What Workplaces Need to Learn from Bees

Thanks to a recommendation from some City Year pals, I was also digging the podcast StartUp around the time I was starting my business, particularly this episode on naming a company. It gave me some ideas, some good laughs, and a sense of freedom to choose something unique.

My maternal grandmother was another inspiration--she used phrases like "bee's knees" and "the cat's meow," popular in her youth. In my work, I use some tried and true, traditional program evaluation approaches, combined with new twists. "Bee's knees" evokes that tradition while it remains current (just check out some drink menus these days). And, as a bonus, the black and gold colors of bees reflect my Pittsburgh roots.

Best,

Gretchen 

*I noticed that a lot of evaluation consultants name their firms after themselves. There is nothing wrong with that, but my name is a mouthful and I wanted my company to be more than just about me and my name. My last name is pronounced Bee-secker (or bee-sucker, if you were a certain boy in my elementary school), so Bee's Knees is a play on that too. Others seem to pick names that are descriptive of where they are (e.g., Boston) or what they do (e.g., Research).

Resolutions, Goals, and Data

A favorite moment from a spring run


Happy New Year and Happy MLK Day!

It's a new year and a time of reflection and new resolutions. I took a little time to look back at 2015. I'm thankful for many wonderful professional and personal experiences last year and for the work I get to do. Later today, I'm joining my community in this day of service.

As I was reviewing some data I'd tracked about myself (would you expect anything less?), I started thinking about connections to program evaluation.

Context matters. It helps you make meaning of the numbers.

Last year, I ran 698 miles across 174 runs. Is that a lot? A little? If you are a runner, and an elite one at that, then 698 may not seem like many. If you hate running, it may seem like a lot.

Here are just a few ways I analyzed my data and put my numbers in context. (For fellow data nerds, a short code sketch after these lists shows how stats like these can be computed.)

Time as context

  • I ran an average of 1.9 miles per day, 13.3 miles per week, and 58.2 miles per month. On average, I ran about 3.3 days per week. 
  • In my best month (May), I ran 20 times, more than 93 miles. In my worst (September), I ran 6 times and only about 20 miles. For more than half of the year, I ran at least 15 times per month.

Past performance

  • In 2015, I increased the number of miles I ran by 69% compared to 2014.
  • One month in 2014, I ran over 15 times, but I never ran 20 times in a month as I did multiple times in 2015.
  • For both 2014 and 2015, I ran the most in the month of May.

Distance

  • 412 miles (my total for 2014) is about the distance from Boston to Washington DC
  • 698 miles (my total for 2015) is more than the distance from Boston to Columbus OH
    •  (Btw, this context is kind of depressing. After a year! I'd only get as far as Columbus!)

Evidence

  • I compared my miles to research on how many miles lead to different outcomes.
    • Improved health? Some research like this suggests that 20-30 minutes 2-3 times per week would suffice. So I did well in relation to this goal.
    • Improved time for a half-marathon? A lot of research-based training plans suggest a minimum of about 20-30 miles per week, so in the months I trained, I aimed for that. 

Goals

  • I compared my results to goals I'd set. At one point in 2015, I'd hoped to hit 730 miles for the year, or about an average of 2 miles per day. I didn't quite make it :( That's another way to make meaning of my results.
    • I've got company. You may have seen that in 2016, Mark Zuckerberg plans to run 365 miles, or 1 mile a day, for the year, and you can join this pledge.
    • This month, I'm running or walking outdoors 3 miles every day as part of a local running store's Winter Warrior Challenge. So far, I'm on track. Brrr.

Bringing in other contextual data to make meaning

  • So, I know that May has two long-distance races that I've done and trained for in both 2014 and 2015--the 10-mile Broad Street Run in Philly and the Run to Remember half-marathon in Boston. That helped explain why I ran more miles in May than in other months in both of those years, but also...
  • May was a peak month for other activities I track. Using my Goodreads data, I saw that I read the most books in the month of May--5 books that month out of 26 finished for the year. What's up with me and May? Spring fever?
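Here is that minimal sketch of how context stats like the ones above can be computed from a simple run log. The numbers below are made-up examples (a real log would have one entry per run), and all names are hypothetical.

```python
from collections import defaultdict
from datetime import date

# Hypothetical run log: (date, miles). A real year would have ~174 entries.
runs = [
    (date(2015, 5, 2), 4.0),
    (date(2015, 5, 31), 13.1),  # e.g., a half-marathon
    (date(2015, 9, 20), 3.1),
]
prior_year_total = 412.0  # e.g., the 2014 total, for year-over-year context

total = sum(miles for _, miles in runs)
print(f"Total: {total:.0f} miles across {len(runs)} runs")

# Time as context: averages per day, week, and month.
print(f"Per day: {total / 365:.1f} | per week: {total / 52:.1f} | "
      f"per month: {total / 12:.1f} | runs per week: {len(runs) / 52:.1f}")

# Past performance: year-over-year percent change.
pct = (total - prior_year_total) / prior_year_total * 100
print(f"Change vs. prior year: {pct:+.0f}%")

# Monthly totals, to spot best and worst months.
by_month = defaultdict(float)
for d, miles in runs:
    by_month[d.strftime('%B')] += miles
best = max(by_month, key=by_month.get)
print(f"Best month: {best} ({by_month[best]:.1f} miles)")
```

The same handful of operations (totals, averages over time, percent change against a baseline, and peaks) underlie most of the context framings above.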

How does this apply to program evaluation?

  • Context matters, and some of the ways I've looked at my "quantified self" data are ways that we commonly track and explore data in program evaluation.
  • Some kinds of context can make sense to lots of people, even if they aren't experts on the particular kind of data you are presenting. Time or performance against goals are just two examples. Consider your audience.
  • Now, I have some information and insights that will help me plan some goals I'm setting for 2016. Ideally, the learning from evaluation helps us learn and improve.
  • And one study or set of analyses often leads to new questions like...

How can I overcome the instinct to hibernate in January and infuse it with some May spring-fever-like activity?

How do I make sure my program is ready for an evaluation study?

Last week, I attended the American Evaluation Association's conference, and I was reminded of a common challenge and some tools to help. Often an organization or a funder wants a randomized controlled trial or other rigorous study of a program. Of course, studies that provide a comparison or control group, or some kind of counterfactual, are important. What is also important? That these studies actually study "the program" compared to "not the program."

What do I mean by this? Well, imagine you are still designing a program when you start it. Imagine you have just started to implement it in a new place--maybe a place you don't know so well yet. So then, maybe in the first days or months there, you aren't able to actually deliver much of the program as intended. You are still working out kinks in your partnerships or training staff who are new to your model. If we compare your performance in that scenario to a control situation, we may not yet be measuring your program. If we see poor results, what have we learned? We haven't really learned that your program doesn't work, or which elements are most effective for whom, because there is so much noise in the way, and your program wasn't yet really your program.

Is the first time you ever rehearse a play with a new cast in a new location a true reflection of the production's quality? If we watched you flub lines and your cast miss cues, without costumes and lights, would we really be seeing the play as written? Wouldn't it be better to give you a little time and structure to rehearse, so that we could compare your performance to another play we've seen?

So, I'm not saying a newer program can't be evaluated. But wouldn't it be helpful to know, assess, and then establish some pre-conditions before spending time and resources on a study? Recently, a colleague from an evaluation firm I've worked with shared that she often gets approached to do a study, only to find the program isn't ready. The program may not yet have a logic model or theory of change, a defined intended duration, data on participation, or a sense of the staff time it will take to work with the evaluators.

There are a few tools out there that can help. This evaluability assessment from the Corporation for National and Community Service (CNCS)/Social Innovation Fund (SIF), developed by Lily Zandniapour and Nicole Vicinanza, is one example that I like and that you could modify. It helps you think about organizational readiness, program readiness, and evaluation readiness.

Have you seen or used similar tools? How are you getting ready? Can I help? Let me know.

Best,

Gretchen