
WE President's Blog

  • Fri, October 05, 2018 4:23 PM | Anonymous member (Administrator)

    Washington Evaluators members will be well represented, not only as attendees but also as presenters, at this year's American Evaluation Association fall conference in Cleveland, OH. At least 59 Washington Evaluators members will participate as presenters, panel chairs, group leaders, poster presenters, and discussants in at least 124 sessions. Congratulations to all!

    A list of Washington Evaluators members participating in AEA sessions in any capacity is provided below (as of Oct. 5, 2018). Please refer to the official program at www.eval.org for all final session dates and times.

    Michelle Abraczinskas:

    Friday Concurrents 8:00am-9:00am - YFE3: Youth Speaking Truth To Power

    Joy Amulya:

    Thursday Concurrents 11:30am-12:15pm - 1645: Winning the war on state-sponsored propaganda: Results from an impact evaluation of a Ukrainian news media and information literacy program

    Thursday Concurrents 5:00pm-5:45pm - 1659: Using Evaluation to Inform the Design of Program Networks for Societal Change

    Saturday Concurrents 10:15am-11:00am - 1989: Democratizing evaluation: Whose Questions? Whose Data? Whose Learning?

    Stephen Axelrad:

    Thursday Concurrents 5:00pm-5:45pm - 1248: Building Collaborative Evaluations from Stakeholder Analysis

    TIG Business Meetings - TIGBM37: Military and Veteran Evaluations TIG Business Meeting

    Denise Baer:

    Wednesday Concurrents 5:45pm-6:30pm - 2702: "Capacity To" Collaboration Tool for Assessing Partnership and Coalition "democratic" Governance for Advocacy and Policy Reform

    Saturday Concurrents 9:15am-10:00am - DG2: Parliaments, Policy and Monitoring and Evaluation

    Gail Barrington:

    Pre-Conference Workshop - One Day (Wed) - 46: Intermediate Consulting Skills

    Thursday Concurrents 8:00am-9:00am - 1518: AEA's Evaluator Competencies -- what can we learn from other Evaluation societies about next steps?

    Friday Concurrents 5:45pm-6:45pm - 1247: What Happens Afterwards? Ways to De-brief with Clients

    Post-Conference Workshop - Half Day (Sat PM) - 64: Consulting After 50: Career Transition Issues

    Alemayehu Bekele:

    Thursday Concurrents 3:45pm-4:45pm - 2431: GEDI 14: Building emerging CRE Practitioners and Scholars

    Heather Britt:

    Thursday Concurrents 8:00am-9:00am - 1906: Systems Approaches for Organizational Development

    Thursday Concurrents 1:45pm-2:30pm - 1936: Principles-Focused Evaluation for Robust M&E Frameworks: Examples from countering violent extremism

    TIG Business Meetings - TIGBM54: Systems in Evaluation TIG Business Meeting

    Friday Concurrents 5:45pm-6:45pm - 1186: Applying Principles for the Effective Use of Systems Thinking and Complexity Science in Evaluation

    Adalei Broers:

    Thursday Concurrents 5:00pm-5:45pm - 2695: Five Organizations and One MEL System: Successes and Challenges

    Kerry Bruce:

    Pre-Conference Workshop - One Day (Wed) - 19: Big data and evaluation

    Friday Concurrents 8:00am-9:00am - 2395: How can MEL Tools lead Adaptive Management?

    Friday Concurrents 4:30pm-5:30pm - 2193: When Many Choices are Good Choices: Considerations in Qualitative Data Analysis Software Platforms Through a Discussion of MAXQDA, NVivo, and Dedoose

    David Bernstein:

    Wednesday Concurrents 5:45pm-6:30pm - DUP1: Power to Succeed: Exploring Employment and Living Outcomes for People with Disabilities

    Friday Concurrents 10:30am-11:15am - 1187: 35 Years of Evaluation Learning in 5 Minutes

    Friday Concurrents 3:30pm-4:15pm - 1113: Monitoring and Evaluating Inclusive Program Practices

    Val Caracelli:

    Friday Concurrents 5:45pm-6:45pm - 2391: Speaking Truth to Power When Power Isn’t Listening: The Current Challenges of Evaluation Influence

    Ratiba Cherif:

    Friday Concurrents 2:15pm-3:15pm - 2639: The OL-ECB Information Commons: Guidelines for Selecting and Curating Content

    Friday Concurrents 4:30pm-5:30pm - 2307: Harnessing creativity for evaluation

    Cynthia Clapp-Wincek:

    Friday Concurrents 8:00am-9:00am - 1643: Roundtable Discussion of International VOPE and AEA Efforts in Evaluation Policy Development and Advocacy

    Friday Concurrents 5:45pm-6:45pm - 1667: Update on AEA's Role in EvalPartners and the International Organization on Cooperation in Evaluation

    Susan Cottrell:

    Pre-Conference Workshop - Half Day (Wed PM) - 56: Using Surveys and Brainstorming Tools to Empower Staff

    Katherine Dawes:

    Wednesday Concurrents 4:30pm-5:30pm - 1045: Learning from failure: Speaking truth to each other

    Giovanni Dazzo:

    Poster Reception and Meet-the-Authors - 2092: Evaluating Visually: Translating raw data to stunning visual communications to facilitate uptake of evaluation information

    Thursday Concurrents 11:30am-12:15pm - 1382: Whose Stories Matter?: Using Secondary Analysis and Meta-monitoring to Explore Power and Values in the ‘Most Significant Change’ Process

    Thursday Concurrents 1:45pm-2:30pm - 1841: Words matter. Managing power dynamics across stakeholders to ensure participation during evaluation and use of evidence.

    Jennifer Dewey:

    TIG Business Meetings - TIGBM5: Business, Leadership, and Performance TIG Business Meeting

    Friday Concurrents 8:00am-9:00am - BLP2: Towards a General Theory of Monitoring and Evaluation

    Danuta Dobosz:

    Thursday Concurrents 10:30am-11:15am - 1694: Teaching Qualitative Methods in Evaluation Online: Instructor and Student Perspectives

    Friday Concurrents 3:30pm-4:15pm - DG1: Evidence-Based Programs and Evaluating Democracy

    Ann Doucette:

    Pre-Conference Workshop - Half Day (Wed PM) - 50: Measurement: How precise are the findings of my evaluation?

    Thursday Concurrents 8:00am-9:00am - 2776: Practitioner Challenges and Successes when Speaking Truth: Cases from Various Stages in an Evaluation

    Friday Concurrents 4:30pm-5:30pm - 1427: Giving an active voice to respondents through effective survey design

    Diana Epstein:

    Thursday Concurrents 3:45pm-4:45pm - 2502: Executive and Legislative Perspectives on Federal Evaluation Policy

    Lynne Franco:

    Pre-Conference Workshop - Half Day (Wed PM) - 55: Enhancing the truth in triangulation: Approaches to Data Analysis and Interpretation

    Thursday Concurrents 1:45pm-2:30pm - 1841: Words matter. Managing power dynamics across stakeholders to ensure participation during evaluation and use of evidence.

    Friday Concurrents 3:30pm-4:15pm - 1504: The Pathway to High Effective Coverage at Scale - a framework for enhancing truth in program strategy, monitoring and evaluation

    Rebecca Frazier:

    Friday Concurrents 10:30am-11:15am - 2679: How to Use a “Bundled” Evaluation Methodology to Speak Truth to Power

    Sierra Frischknecht:

    Thursday Concurrents 2:45pm-3:30pm - 2762: How to Make Evaluating in Transition a Joint Initiative

    Friday Concurrents 8:00am-9:00am - 2167: Capturing Leadership Outcomes with Most Significant Change

    Friday Concurrents 4:30pm-5:30pm - 2536: Networks of International Leadership Development Programs: Who has the power and why do we care?

    Friday Concurrents 5:45pm-6:45pm - HPEER2: Intro to HPEER Part 2: Conducting Evaluation Work in the Healthcare Environment

    Saturday Concurrents 9:15am-10:00am - 2519: A new look for greater use: visual reporting at MCC

    Alejandra Garcia Diaz Villamil:

    Friday Concurrents 8:00am-9:00am - ICCE4: Cultural Competency and Sense Making in International Evaluation

    Saturday Concurrents 11:15am-12:00pm - FIE1: The Power of Evidence in Gender Equality Evaluations

    Nick Hart:

    Thursday Concurrents 10:30am-11:15am - 3062: AEA Evaluation Policy Task Force (EPTF) Update

    Thursday Concurrents 11:30am-12:15pm - 1025: Promoting Evidence-Building Capacity for Learning in Government

    Thursday Concurrents 3:45pm-4:45pm - 2502: Executive and Legislative Perspectives on Federal Evaluation Policy

    Friday Concurrents 8:00am-9:00am - 1643: Roundtable: Updating AEA's "Evaluation Roadmap for a More Effective Government"

    Friday Concurrents 11:30am-12:15pm (Presidential Strand) - 2701: Power to Truth = Evidence: Establishing a Culture of Evaluation for Evidence-Based Decisions and Policymaking

    Kelva Hunger:

    Friday Concurrents 3:30pm-4:15pm - 2779: Investigating and Measuring Burnout among Program Evaluators

    Mary Hyde:

    Friday Concurrents 2:15pm-3:15pm - 1843: Using Evidence to Scale-Up Community Based Solutions That Work – A Federal Agency’s Framework and Approach

    Jonathan Jones:

    Poster Reception and Meet-the-Authors - 2092: Evaluating Visually: Translating raw data to stunning visual communications to facilitate uptake of evaluation information

    Thursday Concurrents 1:45pm-2:30pm - 1841: Words matter. Managing power dynamics across stakeholders to ensure participation during evaluation and use of evidence.

    Akashi Kaul:

    Friday Concurrents 5:45pm-6:45pm - MIE1: Culturally Responsive Strategies for Addressing the Elephants in the Room: Race and Power

    Jackie Kaye:

    Friday Concurrents 3:30pm-4:15pm - 1813: Measuring the Health of Social Movements: Exploring questions of power, truth and speaking

    Kirk Knestis:

    Pre-Conference Workshop - Half Day (Wed PM) - 48: Overcoming Traditional Weaknesses of Logic Models with a Novel "Condition Modeling" Approach

    Wednesday Concurrents 4:30pm-5:30pm - 1303: Game Changers: Lessons from Small Evaluation Firms

    Thursday Concurrents 10:30am-11:15am - 1455: Gun violence in the US: What if it’s a data problem before it’s a “gun problem” or “people problem?”

    Anne Laesecke:

    Friday Concurrents 8:00am-9:00am - 2167: Capturing Leadership Outcomes with Most Significant Change

    Friday Concurrents 4:30pm-5:30pm - 2536: Networks of International Leadership Development Programs: Who has the power and why do we care?

    Steven Lize:

    Thursday Concurrents 11:30am-12:15pm - 3061: Innovative Efforts in State-Level Evaluation Policy and Practice

    Alissa Marchant:

    Thursday Concurrents 5:00pm-5:45pm - 2219: Find your niche: Using evaluation to improve business communications

    Sarah-Kay McDonald:

    Friday Concurrents 11:30am-12:15pm (Presidential Strand) - 2701: Power to Truth = Evidence: Establishing a Culture of Evaluation for Evidence-Based Decisions and Policymaking

    Donna Mertens:

    Pre-Conference Workshop - Two Day (Mon/Tues) - 01: Eval 101

    Birds of a Feather - 1229: Intersecting transformative evaluation and social impact investing

    Friday Concurrents 5:45pm-6:45pm (Presidential Strand) - 1111: Challenging and re-framing truth and power in evaluation

    Post-Conference Workshop - Half Day (Sat PM) - 59: Mixed Methods Design in Evaluation

    Patricia Moore Shaffer:

    Thursday Concurrents 3:45pm-4:45pm - 2286: Arts Education Collective Impact Initiatives: Meeting the Challenge of Shared Measurement across Sites

    Friday Concurrents 3:30pm-4:15pm - PD1: Exploring Approaches to Program Design: Finding Voice and Reconciling Truths

    Susan Morawetz:

    Thursday Concurrents 3:45pm-4:45pm - 2582: Fighting Famines

    Ian David Moss:

    Thursday Concurrents 10:30am-11:15am - 1571: Embedding Evaluation in the Decision-Making Process

    Ioana Munteanu:

    Friday Concurrents 5:45pm-6:45pm - GE3: Strengthening Government Performance Through Evaluations

    Erin Murrock:

    Thursday Concurrents 11:30am-12:15pm - 1645: Winning the war on state-sponsored propaganda: Results from an impact evaluation of a Ukrainian news media and information literacy program

    Thursday Concurrents 5:00pm-5:45pm - 1659: Using Evaluation to Inform the Design of Program Networks for Societal Change

    Kathryn Newcomer:

    Thursday Concurrents 11:30am-12:15pm - 1025: Promoting Evidence-Building Capacity for Learning in Government

    Cheryl Oros:

    Poster Reception and Meet-the-Authors - 1877: Applying theory-driven approach in evaluating the planning and implementation of a school-based telemedicine program in rural Georgia

    Thursday Concurrents 11:30am-12:15pm - 1211: Using Theory-Driven Approach to Assess Unintended Effects

    Friday Concurrents 10:30am-11:15am - 1003: Introduction to Evaluation and Policy

    William Pate:

    Friday Concurrents 2:15pm-3:15pm - CMME1: Managing Site Specifics in Multisite Evaluation

    Friday Concurrents 4:30pm-5:30pm - 3064: Good Data, Bad Data

    Julia Rollison:

    Wednesday Concurrents 5:45pm-6:30pm - 1594: Moving Beyond Document Storage: Using SharePoint to More Effectively Manage Evaluations

    Thursday Concurrents 10:30am-11:15am - 1597: Using SharePoint to Track and Present Data for Process and Outcome Evaluations

    Katelyn Sedelmyer:

    Thursday Concurrents 3:45pm-4:45pm - CPPE5: Learning from Community Evaluations: Theory and Practice

    Friday Concurrents 4:30pm-5:30pm - OLECB1: ECB in Community-Based and Nonprofit Organizations

    Godfrey Senkaba:

    Friday Concurrents 8:00am-9:00am - 1852: Working with Assumptions: Understanding how Evaluators Make Decisions about Capturing Reality/Context

    Jacqueline Singh:

    Friday Concurrents 11:30am-12:15pm - 1285: Who has Power in Higher Education? Deeper Thinking About Evidence-Based Decision-Making

    Kelly Skeith:

    Poster Reception and Meet-the-Authors - 1815: Understanding Rights-based Politics: Using Applied Political Economy Analysis to Guide Human Rights Programming

    Thursday Concurrents 3:45pm-4:45pm - 2439: Risky Business: Evaluation in High-Risk Political Environments

    Thursday Concurrents 5:00pm-5:45pm - 2695: Five Organizations and One MEL System: Successes and Challenges

    TIG Business Meetings - TIGBM14: Democracy and Governance TIG Business Meeting

    Juna Snow:

    Poster Reception and Meet-the-Authors - 1639: Is evaluator-focused meta-evaluation occurring or just theorized?

    Thursday Concurrents 8:00am-9:00am - 2776: Practitioner Challenges and Successes when Speaking Truth: Cases from Various Stages in an Evaluation

    TIG Business Meetings - TIGBM6: Cluster, Multi-site and Multi-level Evaluation TIG Business Meeting

    Friday Concurrents 2:15pm-3:15pm - CMME1: Managing Site Specifics in Multisite Evaluation

    Friday Concurrents 3:30pm-4:15pm - GE1: A Closer Look at Telework in the Federal Environment

    Sarya Sok:

    Thursday Concurrents 11:30am-12:15pm - 1382: Whose Stories Matter?: Using Secondary Analysis and Meta-monitoring to Explore Power and Values in the ‘Most Significant Change’ Process

    Linda Stern:

    Wednesday Concurrents 4:30pm-5:30pm - 1978: Speaking Many Truths to Power: re-appropriating mobile ethnography for evaluative case studies.

    Friday Concurrents 10:30am-11:15am - 1707: Speaking Truth to Political Power: Lessons from International Democracy Assistance Evaluations

    Beeta Tahmassebi:

    Thursday Concurrents 11:30am-12:15pm - 2179: The Art of Crafting a Successful Engagement – How to balance the power dynamics between funders and consultants to ensure evaluation success

    Friday Concurrents 11:30am-12:15pm - 2161: Grassroots or Grasstops? Measuring the Effectiveness of Advocacy Strategies

    Brandie Taylor:

    Poster Reception and Meet-the-Authors - 2013: Assessment of the Centers for HIV/AIDS Vaccine Immunology and Immunogen Design at the National Institute of Allergy and Infectious Diseases

    Organizational Learning & Evaluation Capacity Building - 1996: Evaluation Training Needs Assessment at the National Institute of Allergy and Infectious Diseases

    Research, Technology & Development Evaluation - 2022: Process and Outcome Evaluation of the Antibacterial Resistance Leadership Group at the National Institute of Allergy and Infectious Diseases

    Friday Concurrents 3:30pm-4:15pm - GE1: A Closer Look at Telework in the Federal Environment

    Juha Uitto:

    Thursday Concurrents 8:00am-9:00am - 1816: Lessons from applying Rapid Impact Evaluation

    Dana Wanzer:

    Wednesday Concurrents 4:30pm-5:30pm - RE1: Contextual issues in evaluation practice

    Thursday Concurrents 8:00am-9:00am - DVR1: Research on Data Visualization

    Asia Williams:

    Thursday Concurrents 2:45pm-3:30pm - CD2: Effectiveness of skill-building and self-sufficiency initiatives: Results from two recent studies

    Friday Concurrents 5:45pm-6:45pm - MIE1: Culturally Responsive Strategies for Addressing the Elephants in the Room: Race and Power

    Saturday Concurrents 10:15am-11:00am - 2750: Catalysts of Change: Out-of-School Time STEM Programs for Underrepresented Youth

    Brian Yates:

    Pre-Conference Workshop - Half Day (Wed PM) - 45: Adding Costs to Make Your Evaluation More Impactful (and Better Used): Using Cost-Effectiveness, Cost-Benefit, Cost-Utility Analyses for Health and Human Services

    Wednesday Concurrents 4:30pm-5:30pm - 1154: Seeking and Speaking Truths in Terms that Power Understands: Problems and Solutions in Cost-Inclusive Evaluation

    Lily Zandniapour:

    Friday Concurrents 2:15pm-3:15pm - 1843: Using Evidence to Scale-Up Community Based Solutions That Work – A Federal Agency’s Framework and Approach

    Friday Concurrents 4:30pm-5:30pm - 2442: Maximizing Evaluation Use: Examples from Social Innovation Fund Intermediary Funders


  • Wed, January 31, 2018 10:43 AM | Anonymous member (Administrator)

    Dear Colleagues:

    As I said in my 2018 New Year's greeting, I am excited to engage with Washington Evaluators (WE) members and the larger DC evaluation community over the next year as we continue to highlight evaluation's critical role in informing decision-making across all sectors and levels of government. In particular, I look forward to working with WE's 2018 Board of Directors and committee/task force chairs on the many initiatives that promote evaluation as a profession, seek to improve evaluation practices, and increase the routine use of evaluation and evidence.

    As is WE’s tradition, I want to take this opportunity to share my priorities over this next year as WE’s 2018 President:

    Promoting the Field of Evaluation Locally and Nationally – AEA and its affiliates recognize that evaluation is an essential function of government. Given WE's proximity to a large federal workforce and three state governments, WE will partner with agencies to continue focusing on the Commission on Evidence-Based Policymaking's recommendations to improve evidence-building capacity in government and non-government agencies. We will periodically shed light on EvalAction 2017 by sharing ideas on ways to stay engaged in discussing the value of evaluation in government. And we are laying the groundwork to expand WE's successful Evaluation Without Borders initiative so that WE members can meaningfully connect with and give back to the Washington, DC community.

    Supporting WE Operations Through Our Members: Organizational Sponsors – Our members are the lifeblood of Washington Evaluators.  As such, we will actively seek opportunities to meaningfully engage our members, starting with our newest membership category—Organizational Sponsors.  A priority of WE is to ensure our members find value in their membership, so we will partner with different sectors of our membership to create professional development and networking events for members to learn from and interact with one another. 

    Strengthening the Sustainability of the Evaluation Community – The board is working with the points of contact for its major initiatives targeted at new and young professionals in the evaluation discipline. We are working with our New Professionals Scholarship, University Ambassadors, and Student Conference Taskforce chairs to grow efforts devoted to supporting the next generation of professional evaluators and to look for synergies among these initiatives. For starters, WE members can look forward to a brown bag event to meet and hear from the three recipients of WE's 2017 New Professionals Scholarship.

    Volunteer Engagement & Recognition – A central focus will be continuing the rich menu of opportunities for WE members to make an impact through professional development, mentoring, access to job listings and evaluation-related events, and leadership opportunities locally and nationally with AEA. We will continue the tradition started by Past President Nick Hart of recognizing individuals for contributions to WE and to the DC evaluation community at large, and will develop and grow organizational leadership pipelines on WE committees and taskforces.

    American Evaluation Association Affiliates – Last, but certainly not least, as one of 30+ AEA affiliates, WE will work through its representative to the AEA-Local Affiliates Collaborative (AEA-LAC) with other affiliates to develop and implement actions that affiliates and the parent organization, AEA, can take to mutually support one another.

    In the immediate future, we will be updating our Action Plan and will make it available to you this spring. We hope it will reflect WE's efforts to be of service to our members and fellow evaluators by providing avenues to attend events, network with colleagues in the field, address critical topics like building evidence capacity, and share expertise by volunteering. We look forward to this next year with you, and invite you to stay in touch by dropping us a line at washingtonevaluators@gmail.com.

    Regards,

    Stephanie Cabell




  • Sat, December 30, 2017 1:48 PM | Anonymous member (Administrator)

    Dear Fellow WE Members –

    I am excited to be starting 2018 as President of Washington Evaluators. It is a great honor to take the helm as president, and I thank the Board, WE's past president Nick Hart for his leadership and ongoing support, and all of you for this opportunity.

    As Nick said in his 2017 closing message to all of us, 2017 was indeed a fantastic year for evaluation in the Washington, DC, area. Whether at the local, state, or national level, Washington Evaluators was at the forefront of key discussions in the discipline and proud to be an actor in this dynamic environment. Our members reflect that dynamism through their engagement in their organizations and externally in any number of realms. I look forward to soon sharing our goals and priorities for 2018 with all of you. We will continue with many of the initiatives detailed in WE's strategic plan and will build upon our accomplishments of 2017, not the least of which is increasing the opportunities for WE members to meaningfully engage with fellow evaluators.

    For now, on behalf of the board of Washington Evaluators:  “Best wishes to all in the New Year.”

    Stephanie Cabell


  • Tue, December 05, 2017 4:19 PM | Nick Hart (Administrator)

    As 2017 comes to a close, our profession has much to celebrate. 2017 was a fantastic year for the evaluation field in Washington, DC.

    At the outset of 2017, when my term as president of Washington Evaluators began, I outlined three overarching priorities for the organization this year: to strengthen the national evaluation community, to enhance our organizational services, and to improve our infrastructure for the sustainability of Washington Evaluators. We made tremendous progress in addressing each of these three priorities throughout the year.

    I am proud of all that the Board and volunteer members of Washington Evaluators were able to accomplish in just 12 short months. Thank you to all of the volunteers who supported Washington Evaluators activities this year. As we reflect on the past year, I want to briefly highlight several accomplishments of our organization in 2017.

    Strengthening the National Evaluation Community

    Throughout the year Washington Evaluators partnered with numerous organizations to host events and dialogues to advance evaluation practice, and strengthen the interactions between evaluators not just here in DC but from around the country.

    • Early this year Washington Evaluators co-sponsored a dialogue for our field to discuss the role of race and class in evaluation. The event, co-sponsored with the American Evaluation Association and George Washington University, was the first of four sessions that ended with a capstone plenary at the fall AEA conference here in Washington.
    • In September, following the release of the Commission on Evidence-Based Policymaking's report, Washington Evaluators partnered with AEA, George Washington University, and the Society for Benefit-Cost Analysis to sponsor an event for dialogue about how to proceed in implementing the Commission's recommendations. The event attracted more than 150 attendees including individuals from fields that partner with evaluators to begin a discussion that continues today about next steps for improving the entire evidence-building community. 
    • In November, Washington Evaluators' volunteers led excellent initiatives during the fall AEA #Eval17 conference, and more than one-third of our membership participated in panels.  Brian Yoder's leadership in EvalAction, an initiative conducted in partnership with AEA's Evaluation Policy Task Force, led to more than 100 evaluators volunteering to visit with congressional staff and Congressmen to discuss the field of evaluation this fall.  Giovanni Dazzo and Jonathan Jones co-chaired the Local Arrangements Working Group in preparation for the fall AEA conference, and did a tremendous job launching new initiatives such as the Evaluation without Borders effort to encourage evaluators to give back to the community during the conference. And Washington Evaluators members contributed $600 to support five graduate students from around the country participating in the #Eval17 conference.

    Each of these events and contributions made substantial in-roads to strengthening the evaluation profession not just here in DC, but by demonstrating the value of evaluation and future directions for evaluation across the country.

    Enhancing Evaluation Services and Benefits in DC

    While Washington Evaluators this year exhibited leadership for evaluators across the country, our volunteers also designed and led numerous efforts to enhance the benefits of membership for our local evaluators right here in Washington, DC.

    • As Washington Evaluators revamped its communications efforts under the leadership of Patricia Shaffer, our members received improved weekly digests with job announcements and opportunities for events around the city.
    • Our program committee, led by Giovanni Dazzo, coordinated ten professional development events, ranging from discussions with the Government Accountability Office to a session with former AEA president Rodney Hopson. Many of these events were made available to members through new virtual participation options.
    • The program committee also sought out new opportunities for evaluators to productively network with each other at social events, including Washington Evaluators' first event at Nationals Park, and our membership committee chaired by Robin Kelley hosted its second members-only meet and greet.
    • Washington Evaluators launched a new short-term mentoring program to better meet the needs of our members, led by Nick Zyznieuski. Nearly a dozen of our members participated in the program as mentees this year and we expect the initiative to grow further in coming years.
    • Finally, recognizing that DC is a large city and that downtown events during the workday can be difficult to attend, this year Washington Evaluators hosted the Sine Qua Non dinner series, an effort to connect evaluators who live and work in close proximity to one another. Throughout the year, Washington Evaluators volunteers hosted nine dinners attended by dozens of members to discuss their work, the state of the evaluation profession, and suggestions for the DC evaluation community.

    We hope that all Washington Evaluators members personally experienced many of the specific benefits of membership throughout the year by attending an event or participating in one of the many activities available to members.

    Reinforcing Organizational Infrastructure

    While the business matters of Washington Evaluators are rarely the most exciting for many members, the organization undertook many encouraging actions this year that will hopefully shape its future direction for years to come.

    • The Board developed a robust, long-term strategic plan and, with member input, approved a plan outlining new and ambitious goals for the organization. To support implementation of the strategic plan, the Board also approved an action plan for 2017 and completed 43 of the 50 items in that plan in full and 5 in part or with some modification from the original plan (2 items were deferred for future action).
    • The Communications Committee in 2017 engaged in a full rebranding of the organization, introducing a new logo and launching a new website that is easier to navigate and has greater visual appeal. In addition, the social media presence was substantially enhanced with active and growing engagement on Twitter and LinkedIn.
    • The Membership Committee led an effort to revise how members join and retain a relationship with the organization; this year Washington Evaluators launched a two-year membership option and a new organizational sponsor status. To date, more than 30 members have switched to two-year memberships and four organizations have signed up as sponsors.
    • The long-term infrastructure of the organization is built on the engagement and participation of dozens of volunteers. In recognition of the important role volunteers play in the success of Washington Evaluators, we are pleased to be able to recognize a volunteer of the year for the first time in 2017.

    Looking Forward

    2017 was a phenomenally energizing year for Washington Evaluators as an organization and for all evaluators in Washington, DC. In addition to the many achievements of Washington Evaluators throughout the year, policymakers in DC renewed calls for institutionalizing evaluation in the federal government. With the American Evaluation Association's annual conference in DC as a backdrop, the U.S. Congress advanced legislation to encourage more evaluation in agencies across government. We have much to look forward to in coming years!

    Thank you to all who supported the many activities of the organization and helped strengthen our evaluation community in 2017. Please join me especially in thanking the entire Washington Evaluators Board of Directors and leaders of our many task forces for their leadership this year.

    I hope you will continue to be engaged next year as well to support our growing community of evaluation practice!

    NICK HART, PH.D. is the 2017 President of Washington Evaluators.

  • Fri, November 24, 2017 2:32 PM | Nick Hart (Administrator)

    As the number of Washington Evaluators members and volunteers continues to grow, the Board of Directors of the organization has acknowledged a growing need to recognize our stellar volunteers. Washington Evaluators does not currently have a single staff member, so every service, event, and resource is produced by an all-volunteer team who provide countless hours of exceptional service to our local evaluation community.

    Because so many volunteers offer their time to strengthen our profession and often do not ask for any recognition or compensation, I am proud to announce that this year Washington Evaluators will help fill this gap by launching a new "Volunteer of the Year Award."

    The Volunteer of the Year Award is intended to recognize outstanding volunteers who provide dedicated and selfless service to the organization and the Washington, DC evaluation community. The intent is that recipients will have made significant contributions to the organization's goals and mission throughout the year.

    The creation of this award was contemplated by the Board of Directors through a strategic planning process in 2017 and a subsequent action plan intended to strengthen the organization's infrastructure and long-term sustainability.

    For 2017, nominations will be accepted through December 6, 2017. The recipient will be announced at the 2017 Holiday Party. Learn more about the nomination criteria, eligibility, selection process, and timing of the award here.

    NICK HART, PH.D. is the 2017 President of Washington Evaluators.

  • Sun, November 05, 2017 8:37 PM | Nick Hart (Administrator)

    Later this week the largest gathering of professional program evaluators in the world will convene here in Washington, DC as the American Evaluation Association launches its annual conference, Evaluation 2017. While many exciting activities will occur during the week, the conference theme -- "From Learning to Action" -- provides us all the opportunity to reflect on one basic question: why do I evaluate?

    Learning Cultures

    We live in a society that often focuses, perhaps too much, on the consequences of failure. For organizations and grantees, failing to deliver on promised activities can result in a loss of funding. In government, current political discourse would have us believe programs that operate imperfectly can or should be terminated altogether.

    Instead of focusing on the consequences of failure, we could choose to focus on the benefits of failure. Consider failure from a personal rather than organizational perspective. In childhood, we learn quickly from mistakes like touching a hot pan on the stovetop or, in my case, shooting my brother with a BB gun. The benefits are that we generally avoid touching hot objects and take greater care with gun safety in the future. Over the course of our lives we make thousands of "mistakes" that productively inform our future behaviors.

    Learning from failure is a natural part of the human experience, just as much as learning from success. Because organizations are made up of humans, we should expect that both failure and success are similarly an organic component of organizational learning.

    A learning culture must become more pervasive and routine in organizations and in our government -- it's how we improve, it's how we enhance ourselves, and it's how we make the world a better place to live.  Learning cultures are what drive continuous improvements in the outcomes that matter. Learning cultures are how we ensure those in our society who need help and support receive effective assistance. And learning cultures are how we develop the information to act, ensuring our children grow into a better world that we have prepared for them. Recognizing that failure is inevitable and can be used to productively improve is a key component of a learning culture.

    Why I Evaluate

    This perspective on the purpose of a learning culture is one that is very timely for me. My son was born just over one week ago. His entry into the world has left me reflecting in recent days on many of life's priorities and the process of learning.

    It's difficult to imagine becoming a parent who only admonishes my son's inevitable "failures" in life. It's also difficult to imagine only praising his successes. Both failure and success will present incredible learning opportunities and invaluable teaching moments.

    How we act in response to any form of information is a direct reflection on our values. In my son and in my government, I value continuous improvement to be the best person or entity possible. I value a recognition that even in mistakes or failures, we can always improve ourselves to be our best reflection of the world. I value learning because it enables action in our lives, for our families, and for our futures.

    So why do I evaluate? I evaluate to learn and improve through appropriate action. I evaluate to make the world a little better for my son. I evaluate to help make society stronger. 

    #WhyEval: A Call for Reflection

    Evaluation is not merely a profession; it derives from a greater motivation, goal, and purpose. During the American Evaluation Association's conference this week, I encourage you to consider what drives you to support evaluation:

    • What is your motivation?
    • What is your goal?
    • What is your purpose?

    As you reflect, I also encourage you to share why you evaluate (#WhyEval) with others as we all strive to better understand how learning segues to action in our own work and in our own lives.

    NICK HART, PH.D. is the President of Washington Evaluators in 2017 and Director of the Evidence-Based Policymaking Initiative at the Bipartisan Policy Center. 

  • Sat, October 14, 2017 8:53 AM | Nick Hart (Administrator)

    Originally posted on AEA365's A Tip a Day by and for Evaluators blog during the Local Arrangements Working Group's sponsored week in July 2017

    As the American Evaluation Association’s 2017 conference returns to Washington, DC, this fall, on behalf of the Washington Evaluators affiliate allow me to welcome you to DC for #Eval17!  I am Nick Hart, current president of Washington Evaluators, AEA’s DC-based affiliate.

    Washington Evaluators launched in 1984 and has grown to more than 300 local evaluators today. Our goal is to strengthen the evaluation community in the Washington, DC area. We pride ourselves on having a diverse representation of government, non-profit, academic, and independent evaluators that comprise our membership.

    This year our membership worked to produce a new strategic plan to ensure the services and professional development opportunities offered truly serve our community. We now have four key strategic goals: strengthen the sustainability of the evaluation community; enhance evaluation relationships and interactions; support individual evaluators' professional development needs; and ensure strong administration of the organization. Each of these four strategic goals is a core component of the Washington Evaluators mission. In implementing our ambitious strategic plan, Washington Evaluators is working to create more opportunities to engage new evaluation professionals, further the professional development of long-time evaluation professionals, and offer the 30+ years of experience of our evaluation organization to other communities of practice throughout the country.

    As the seat of the United States government, Washington, DC is perhaps best known for its influence in evaluation policy. But beyond the government, DC is home to leading evaluation organizations and the brightest evaluation minds in the U.S.  Building on this broad evaluation expertise, as we prepare for an exciting #Eval17 this fall, over the course of this week on AEA365 we will be showcasing local resources, sites to visit, volunteer opportunities, a major advocacy event on Capitol Hill, and other tips for your trip to DC.

    Rad Resource:  Follow Washington Evaluators on Twitter or check out our website to learn more about the many opportunities available in the DC area.  Many of our events are open to non-members as we support the entire DC evaluation community.

    Lesson Learned:  Book your travel for the conference early. There are three airports in close proximity to DC (Dulles, Reagan, and Baltimore).  From any of these airports, the conference site is just a short Uber ride away.  All are also reachable by DC’s public transit options.

    Hot Tip:  In addition to the resources we will share in advance of the conference, Washington, DC has an excellent tourism website that explains the sites to see in America’s Front Yard, provides tips on accessing the many free museums, and explains the neighborhoods in the city.

    Get excited for a great conference this fall. We look forward to seeing you in DC! 

    NICK HART, PH.D. is the 2017 President of Washington Evaluators and a member of the American Evaluation Association's 2017 Conference Planning Committee.

  • Fri, September 29, 2017 6:38 PM | Nick Hart (Administrator)

    Cross posted from the American Evaluation Association monthly newsletter from September 2017.

    In September 2017, the U.S. Commission on Evidence-Based Policymaking proposed a bipartisan strategy – approved unanimously by the Members of the Commission – for improving the quantity and quality of evidence generated to support decision-makers in government. As the Commission published its strategy, a new initiative concurrently launched at the Bipartisan Policy Center in Washington, DC, to promote implementation of the Commission’s recommendations in months and years to come. Serving as the Commission’s policy and research director and now as the director of the Bipartisan Policy Center’s new initiative, I’m excited about the enthusiasm in Washington for ensuring policymakers have access to relevant and useful information to guide their decisions. But we must carry this enthusiasm forward to action that can improve our field, the policies we study, and ultimately the lives of individuals in our communities.

    Aligning Values with Action

    The vast majority of my professional career in evaluation has focused on supporting the policies that enable evaluation to be generated and used in government. The Commission’s recommendations present a tremendous opportunity for the evaluation community. This is an opportunity to exhibit leadership and champion improvements in the availability of evidence, to ultimately improve how government’s policies and programs are designed and implemented.

    As the conversation continues in coming months and years about how government can better generate and use evidence, the values articulated by AEA for evaluation are constructive guideposts. As AEA members, we value “excellence in evaluation practice” and “utilization of evaluation findings.” Each of these value statements can and should be embodied and encouraged by the policies that support evaluation in government. This is precisely the nature of my work.

    An evaluation that doesn't exist can't inform policymakers. I'm a proponent of recognizing and addressing the many institutional barriers to the supply of evaluation. There are many barriers that exist today – laws, resources, will, leadership, organizational culture, political environment, program designs. The Commission's report emphasizes three key barriers to generating evidence, including evaluation, in the United States: "unintentional limits on data access, inadequate privacy practices, and insufficient capacity to generate the amount of quality evidence needed to support policy decisions." All of these barriers are solvable and can be transformed into enablers of evaluation.

    The Opportunities Ahead

    Changing expectations for senior leaders, planning for evaluation at the outset of a program or policy, and establishing appropriate incentives are all approaches to emphasize enabling evaluation in our institutions. How do we accomplish these approaches? The Commission specifically recommends that as we improve data access and privacy protections, capacity gaps can be partially addressed by establishing a Chief Evaluation Officer position within each Federal department and that learning agendas be developed to prioritize evaluation where the need is greatest. When implemented, these recommendations will help ensure that senior leaders are attuned to the needs of evaluation practice, supporting excellence, and that the capacity exists to encourage appropriate and responsible use of evaluation findings.

    These recommendations aren't impossible. The recommendations aren't unrealistic. In fact, it's just the opposite. They are on the horizon and likely to become the norm in coming years. But as we all seek to strengthen the evaluation field, improve our practice, and make evidence available for decision-making, it's important to remember that many of these changes will not happen overnight.

    In my view, the Commission's bipartisan recommendations mark a major milestone for our country in recognizing that government needs better information to guide policymaking, and that generating this evidence is really possible. I hope the evaluation community will join me in advocating for these improvements – consistent with our values – to seize the rare opportunity to vastly improve government's capacity to support evaluation.

    NICK HART, PH.D. is the Director of the Evidence-Based Policymaking Initiative at the Bipartisan Policy Center and the former Policy and Research Director for the U.S. Commission on Evidence-Based Policymaking. He is the 2017 President of Washington Evaluators and a member of the American Evaluation Association’s Evaluation Policy Task Force.

  • Thu, September 07, 2017 10:00 PM | Nick Hart (Administrator)

    Rarely does the topic of generating evidence to support government decision-making reach an audience outside the statistical, evaluation, and policy analysis communities. But today, the U.S. Commission on Evidence-Based Policymaking submitted its bipartisan set of recommendations -- supported unanimously by Members of the Commission -- igniting a discussion about how to do better.

    In The Promise of Evidence-Based Policymaking, the Commission lays out a strategy for vastly improving the quantity and quality of evidence available in our country. The strategy seeks to overcome three prevailing challenges identified by the Commission: “unintentional limits on data access, inadequate privacy practices, and insufficient capacity to generate the amount of quality evidence needed to support policy decisions.”

    I’ve had the great privilege of working with the Commission Members over the past year as their Policy and Research Director. But my personal involvement in the project should in no way minimize this message: in coming weeks, months, and years, these recommendations will set the tone for how our country goes about developing evidence to inform decisions in government for decades to come.

    While the report of the Commission submitted to the President and Congress today addresses a range of issues and is not exclusively focused on the field of evaluation, there is no doubt that the recommendations could tremendously benefit the field if implemented. Take, for example, the Commission’s agreement with the American Evaluation Association that evaluation in government is too often “sporadic, applied inconsistently, and supported inadequately” (p. 26). One solution offered by the Commission is that departments in the Federal government should have Chief Evaluation Officers (see Recommendation 5-1). This alone is a strong statement about the value of and need for evaluation in our society.

    But there's much more. Chapter 2 of the Commission's report highlights challenges and potential solutions to data access that can benefit the evaluation community. Chapter 3 features improvements for privacy protections that go above and beyond approaches applied in much of government today. Chapter 4 offers a new solution to the long-standing issue of securely linking data together, including for evaluation. And Chapter 5 describes the basic capacity gaps in government today, along with strategies to vastly improve government's coordination and infrastructure.

     
    In my opinion, today marks a major milestone for our country in recognizing that government needs better information to guide policymaking, and that generating this evidence is really possible. I hope the evaluation community in Washington, D.C. will review, consider, discuss, and work to improve government's capacity to better enable evaluation in support of evidence-based policymaking.

    NICK HART, PH.D. is the President of Washington Evaluators in 2017 and served as the Policy and Research Director for the U.S. Commission on Evidence-Based Policymaking. The views presented here are those of the author and do not represent the official position of the U.S. Government, including the Office of Management and Budget and the Commission on Evidence-Based Policymaking.


  • Mon, August 14, 2017 8:19 PM | Nick Hart (Administrator)

    In 1984, Lee Cronbach urged that "the evaluator is an educator; his success is to be judged by what others learn." [1] It's no coincidence that 1984 is also the year in which Washington Evaluators formed as one of the country's earliest professional evaluation societies committed to fostering continuous learning in our field.

    Today, Washington Evaluators is committed to ensuring that our current cohort of professionals not only advocate to support the profession, but recruit future professionals into the field. Earlier this year, the Board of Washington Evaluators approved a new strategic plan that specifically identifies this as an objective for a goal to strengthen the evaluation community (see Objective 1.1).

    To accomplish this objective, the Washington Evaluators Board earlier this year established two new task forces to better address the needs of new professionals. First, we created a task force to develop a suite of recommendations, for future consideration, on improving the services available to new professionals.

    Second, the Washington Evaluators Board established another task force, led by Tamarah Moss from Howard University, to design a new scholarship program for new professionals. This group's efforts resulted in the August launch of the 2017 New Professionals Scholarship sponsored by Washington Evaluators. The new scholarship is intended to support new professionals in integrating evaluation practices and approaches within their respective organizations by encouraging participation in the American Evaluation Association's annual conference, as well as engagement over the next year with AEA and the Washington Evaluators membership.

    Through this new scholarship opportunity, Washington Evaluators hopes to strengthen the sustainability of the evaluation community by recruiting and helping to educate the next generation of evaluators. The scholarship serves as one means to recruit new professionals into the evaluation community and to facilitate continued diversity in the profession. It also ensures that those of us already engaged in the evaluation field can fulfill Cronbach's charge: to be educators and mentors to those who are new to the profession.

    Learn more about the 2017 New Professionals Scholarship here.

    NICK HART, PH.D. is the President of Washington Evaluators in 2017. The views presented here are those of the author and do not represent the official position of the U.S. Government, including the Office of Management and Budget and the Commission on Evidence-Based Policymaking.

    ______________________________

    References

    [1] Cronbach, L., et al. 1984. Towards Reform of Program Evaluation. Washington: Jossey-Bass Publishers.   


