Workshop Proposals

The Active Community Engagement Model: fostering active participation in evaluation projects among diverse community stakeholders

HALF DAY

Level: Beginner     Language: English

Steve Dooley

Stephen Dooley is a faculty member at Kwantlen Polytechnic University and is the Director of the Centre for Interdisciplinary Research: Community Learning and Engagement (CIRCLE). Steve is also involved in a number of evaluations of community programs for at-risk youth. He brings to the workshop an extensive background in applied community development.

Dr. Nathalie Gagnon
Kwantlen Polytechnic University

Dr. Nathalie Gagnon is a faculty member in the Criminology department at Kwantlen Polytechnic University. Nathalie is involved in the evaluation of a number of community programs for at-risk youth and other vulnerable groups.

Dr. Roger Tweed
Kwantlen Polytechnic University

Dr. Roger Tweed is a faculty member at Kwantlen Polytechnic University. He conducts applied research on facilitators of positive life change. He has studied facilitators of positive life change among people who are homeless and is currently studying strengths (social and personal) that keep kids away from violence.

Dr. Gira Bhatt
Kwantlen Polytechnic University

Dr. Gira Bhatt is the principal investigator and director of the Acting Together SSHRC-CURA project. She completed her PhD at Simon Fraser University. Currently she is a faculty member in the Psychology department at Kwantlen Polytechnic University. Her research interests are in the applied social psychology of the self.

Abstract

Participants will be introduced to the Active Community Engagement (ACE) model, a set of processes, procedures, and techniques developed to engage diverse community stakeholders in the evaluation process. ACE has been used successfully in a federal context with as many as 15 stakeholders. This workshop combines practical lessons with hands-on experience. The challenges of measuring community participation in applied evaluation research will also be discussed.

Audience

This workshop is applicable for people conducting evaluations in community-based settings where the stakeholder voice is important but not often heard, and for evaluation professionals looking for practical community engagement skills.


Diagnosing Diversity: International development lessons for evaluation practice

HALF DAY

Level: Intermediate     Language: English

Catherine Elkins
Consultant, and Faculty, Sanford School of Public Policy at Duke University

Catherine Elkins, MALD, PhD, has experience spanning the foreign service, development, and research. She consults internationally and teaches graduate seminars in evaluation at Duke University. She combines management and strategic communications skills with interdisciplinary professional expertise to build knowledge of effective and efficient ways to enhance impact.

Abstract

International development interventions challenge the status quo. International development evaluators must stay constantly attuned to the dynamic patterns of groups, sub-groups, and other identities. Diagnosing these relationships is necessary for developing strategies and methods relevant to all program stakeholders. Using a participatory approach, the workshop will examine international methods and approaches within a diagnostic framework to build skills and tools for assessing interventions, incentives, and implications for evaluation practice in any context.

Participants will increase their capacity to implement professional evaluation skills while accounting for diversity and its relevant impacts. Evaluation and monitoring professionals, program managers and designer/developers will find this workshop especially informative.


Cleaning Your Evaluation Data: Strategies for Dealing with Dirty Data

HALF DAY

Jennifer Ann Morrow

Dr. Jennifer Ann Morrow is an Assistant Professor of Evaluation, Statistics, and Measurement at the University of Tennessee in Knoxville, TN. She focuses her research on three main areas: program evaluation, college student development, and teaching research methods and statistics.

Gary Skolits, University of Tennessee

Dr. Gary J. Skolits is the director of the Institute for Assessment and Evaluation as well as an Associate Professor of Evaluation, Statistics, and Measurement. His research interests include evaluation methods, the evaluation of educational interventions, P-16 and college access program evaluation, and higher education accountability.

Abstract

Evaluation data can be messy and rarely reach evaluators ready to be analyzed. Missing data, coding mistakes, and outliers are just some of the problems evaluators encounter prior to analysis. The resources available in the literature tend to be complex and not always user-friendly. The goal of this workshop is to walk through a data cleaning process, offering suggestions for each step along the way.

The target audience is evaluators who work with quantitative evaluation data as part of their evaluations. A basic understanding of statistical analysis is all that is required.


Waawiyeyaa (Circular) Evaluation Tool: Its Use and Lessons Learned

HALF DAY

Level: Intermediate     Language: English

Andrea L.K. Johnston

Andrea L. K. Johnston, CEO of Johnston Research Inc., is a skilled trainer and facilitator. Johnston Research Inc. is a forerunner in utilizing new technology and media to develop culture-based evaluation tools that can properly assess and improve culture-based social programming. Since 2001, Johnston has managed 120 evaluation and research projects.


Difference and Value in Design: Understanding, Deciding Upon, and Assessing the Value of Qualitative Approaches in Work With Diverse/Marginalized People

HALF DAY

Level: Intermediate     Language: English

Susan Scott

Susan Scott is an Assistant Professor of Social Work at Lakehead University, Orillia Campus. She has been a program evaluator for over 30 years in many capacities including: government in-house specialist; agency specialist; consultant to governments, transfer payment agencies, and the private sector.

Anne Marie Walsh

Anne Marie Walsh is an Assistant Professor of Social Work at Lakehead University's campus in Orillia, Ontario.

Abstract

The workshop will examine the value of a number of qualitative approaches, both generally and in work with diverse/marginalized populations. Participants will learn how the value of a qualitative approach depends on the group involved and the evaluation questions posed. They will also learn how to determine an appropriate approach that respects diversity and questions the status quo when necessary, and how to develop recommendations that address the issues and aid stakeholder understanding of services and the populations using them.

Objectives:

Learning objectives for this workshop include the following:

  1. To understand the differences between various qualitative methods (e.g., narrative, phenomenological, grounded theory, ethnographical, case study);
  2. To understand the circumstances under which various qualitative methods (e.g., narrative, phenomenological, ethnographical, case study) can be considered;
  3. To understand the value of using various qualitative approaches when evaluating programs and services for marginalized or diverse populations; and,
  4. To understand processes for determining the best possible qualitative approaches when dealing with marginalized populations and diversity issues.

Audience

The target audience includes intermediate and senior-level evaluators, although some beginning-level evaluators may also benefit from involvement.


Exploring Systems Concepts and Mental Models to Address Diversity Issues in Program Evaluation

HALF DAY

Level: Intermediate     Language: English

Wendy Rowe

Dr. Wendy Rowe is an evaluation and organizational consultant with over 25 years of experience in program evaluation, organizational assessment and development, and strategic planning. She has coordinated well over 150 projects, many of them with Aboriginal organizations and communities, including Round Lake Treatment

Dr. Niels Agger-Gupta

Dr. Niels Agger-Gupta, Royal Roads University, has been a core faculty member in the School of Leadership Studies since 2007 and, since August 2010, has been the Programme Head for the MA-Leadership programme. He has supervised capstone projects/theses since 2004, initially as an Adjunct Faculty member with the former Knowledge Management Programme at RRU.

Abstract

Understanding systems thinking and mental models can help evaluators better understand the complex environments in which they work. This workshop explores basic systems concepts and applies them to various dimensions of evaluation. We will stretch our own mental models and examine how we tend to view programs as systems. We will seek to understand the role of feedback and learning in supporting the growth and development of programs, employing basic concepts from complex adaptive systems.

This will be an interactive workshop session. Readings will be sent out to all registered participants one month before the workshop.

Audience

The workshop targets evaluators and program managers who are working in or on behalf of larger organizational systems, such as the federal/provincial government, health authorities or foundations. This workshop is relevant for evaluators who are engaged in reviews of public policies and/ or programs that serve multiple sectors across multiple stakeholder groups.


Outcome Mapping: A Hands-on Workshop

FULL DAY

Level: Intermediate     Language: English

Kay Crinean

Kay Crinean is the President of Collective Wisdom Solutions (management consultants). She is an experienced management consultant, evaluator, facilitator, change agent and entrepreneur. She has extensive experience in collaborative and multi-stakeholder ventures, and has created and run two unique multi-stakeholder organizations in Canada and the UK.

Gabrielle Donnelly

Gabrielle Donnelly, Collective Wisdom Solutions, is a consultant with a focus on socio-economic research and analysis as well as project management. Her background includes working in the fields of social, cultural, economic and community development, both domestically and internationally, with business, government, non-profits and academia.

Abstract

Outcome Mapping is a participatory method for planning, monitoring, and evaluation. It articulates progress in complex programs that have influence over, but little control of, outcomes. It focuses on changes in the behaviour of key stakeholders and participants, and is oriented towards social and organizational learning. By clarifying intermediate progress markers, it makes step-by-step progress visible.

Issues addressed include:

  • Moving beyond attribution to contribution to social change
  • Avoiding relying solely on anecdotal evidence
  • Increasing stakeholder buy-in
  • Detecting, explaining and responding to unexpected results
  • Deepening understanding of the change processes
  • Early assessment of progress and mid-course improvements


Hearing Silenced Voices: Techniques to Include Traditionally Disenfranchised Populations in Evaluation

FULL DAY

Level: Intermediate     Language: English

Denise Belanger

Ms. Belanger's 15 years of experience in evaluation have focused on all aspects of the research process, from methodological development to instrument design and reporting. She was a key liaison with longitudinal evaluation projects in Central and Eastern Europe and Argentina, many of which focused on children/youth in marginalized communities.

Linda E. Lee

Linda E. Lee, Proactive Information Services, has worked in the areas of evaluation, research, and program development since the late 1970s, conducting hundreds of studies involving children/youth. She has expertise in program implementation and evaluation, research methods, curriculum development, organizational change, and the development of both traditional and innovative evaluation and assessment instruments.

Abstract

Ensuring the participation of those "without a voice" and vulnerable populations can be a struggle for evaluators. Children and youth, persons with disabilities, or those with limited literacy hold important views regarding programs, services, and situations which affect them, but their perspectives are not always included. The workshop will explore the why and how of including traditionally disenfranchised populations. Participants will be introduced to the use of rubrics, case studies, visual methods, mapping, and learning walks as tools for eliciting often-silenced voices. Ethical considerations will be discussed.

Audience

This workshop is designed for "intermediate" evaluators with an understanding of qualitative methods.


Capturing Successes of Diversity-focused Programs: What Are Strengths and Limits of Performance Indicators and How Do We Create Complementary/Alternative Approaches?

HALF DAY

Level: Intermediate     Language: English

Tammy Horne

Tammy Horne is a community-based consultant in Edmonton, working in Alberta and across Canada. She conducts workshops on indicators, their limitations, and complementary/alternative approaches for diverse program types. Tammy has also conducted gender-based analysis. She is a past instructor for CES Essential Skills and was Program Co-chair of the CES 2010 conference.

Abstract

Reporting progress on indicators has become central to evaluators' work as emphasis on accountability has grown. However, simply monitoring indicators can lead to describing change without exploring its implications. Over-emphasis on pre-determined indicators has been questioned; critiques include (a) poorly developed indicators, (b) indicators chosen for convenience rather than relevance, and (c) missed program context, complexity, and unexpected outcomes.

This workshop will be extensively hands-on as participants learn to distinguish between useful and poor indicators, write well-constructed indicators, distinguish between monitoring and evaluation, determine when indicators are most likely to provide useful data for decision-making, and develop participatory strategies.

Audience

Basic knowledge of indicators and evaluation (outcomes, processes, logic models) will be assumed.


Rapid Evaluation Methods for Summative, Developmental and Formative Settings

FULL DAY

Level: Intermediate     Language: English

Andy Rowe

Andy has over thirty years evaluation experience working with governments, foundations, multilateral donors and communities in North America, Europe, Asia and Africa. He has a PhD from the London School of Economics and is a past President of CES. Current assignments focus on sustainability in resource and development settings.

Abstract

Full summative evaluations are intrusive and expensive, should be reserved for mature interventions, and employ comparative approaches that are not workable in many settings. Rapid evaluation techniques enable summative evaluation previews in settings where full summative evaluations are unlikely. Such techniques are also useful early in the life of a program, building on the capacities of staff to create initial snapshots and observe changes attributable to the program.

This workshop focuses on rapid evaluation methods to achieve high quality summative evaluation. It includes new approaches for knowledge use, comparison, and simplified measurement techniques that yield valid and reliable judgments about the outcomes attributable to the intervention.

Audience

Participants should already be familiar with the main evaluation categories and methods and have at least some experience in applying these.


Designing and Advancing Evaluation Quality

HALF DAY

Level: Beginner     Language: English

Cheryl Poth

Cheryl is a faculty member of the Center for Research in Applied Measurement and Evaluation within the Department of Educational Psychology at the University of Alberta. She teaches the doctoral level program evaluation course and brings over a decade of evaluation experience and research interests in developmental evaluation approaches.

Brenda Stead

Brenda Stead, Stead and Associates; Brenda is an evaluation and management consultant, a sessional lecturer at the University of New Brunswick and a former evaluation manager with the federal government. With CES National Council, she is on the Professional Development Committee and is the CES representative on the Joint Committee on Standards for Educational Evaluation.

Abstract

The Program Evaluation Standards, Third Edition (2010), developed by the Joint Committee on Standards for Educational Evaluation (JCSEE), has been approved by the American National Standards Institute and adopted by the Canadian Evaluation Society (CES). This half-day workshop will introduce participants to the standards that define quality evaluation. The session will begin by applying dimensions of evaluation quality to a practical problem. Attendees will also have the opportunity to report on their own dilemmas and explore how the application of standards may serve to increase and balance utility, feasibility, propriety, accuracy, and evaluation accountability.

Audience

The workshop covers the five dimensions and 30 standards that, as a set, define quality evaluation. The target audience is evaluators, managers, and stakeholders who wish to improve the quality of their evaluations.


Conducting Complex Evaluations

HALF DAY

Level: Intermediate     Language: English

Simon Roy

Simon Roy is a Partner at Goss Gilroy Inc. and specializes in socio-economic and evaluation research. Dr. Roy has conducted evaluations in a wide variety of areas, including human resources and training, economic development, languages and culture, science programs, and Aboriginal affairs. He has a PhD in Sociology.

Benoit Gauthier

Benoit Gauthier, President, Circum Networks; Mr. Gauthier has strong experience in organizational research (program evaluation, market research, social research and policy analysis). He holds degrees in political science and public administration, and the titles of Credentialed Evaluator, Certified Management Consultant, and Certified Marketing Research Professional. He has received awards from the CES and the CESEF.

Abstract

Evaluations can be highly complex for a number of reasons. This complexity can lead to a number of undesired outcomes, including delayed timelines, ill-adapted methodologies, inappropriate findings, conclusions or recommendations, conflicts, and other problematic situations. In this workshop, the presenters will share key strategies and tips for four types of challenging evaluations: horizontal and multi-program evaluations, evaluations of highly complex subject matters/programs, evaluations with tight timelines/budgets, and evaluations of sensitive programs or issues. Participants will learn about practical strategies and tools to help them deal with these complexities and achieve successful evaluations.

Audience

Some experience in evaluation will allow participants to fully understand some of the suggested strategies and techniques shared by the presenters.


Getting Started: Introductory Consulting Skills for Evaluators

HALF DAY

Level: Beginner     Language: English

Gail V. Barrington, PhD, CMC, CE

Dr. Gail Barrington contributes to the evaluation profession through her practice, writing, teaching, training, mentoring and service. She founded Barrington Research Group, Inc. in 1985 and has conducted over 125 evaluation and research studies. She received the 2008 CES Contribution to Evaluation in Canada award and in 2012 published her

Abstract

Evaluators who are thinking about going out on their own need some simple but important skills to be successful. This practical workshop offers a synthesis of management consulting concepts, entrepreneurial skills, and small business processes. Valuable samples, worksheets, insider tips, and personal anecdotes will help participants determine whether consulting is an appropriate career choice for them. The personal characteristics and essential values of the independent consultant, the details of setting up shop, how to set fees and track time, and how to get work are all explored. Participants will problem-solve around this career choice and develop an agenda for action.

Audience

This workshop is relevant for any evaluator who is thinking about becoming an independent consultant. It is useful for those in the academic community as well as those currently working for government or not-for-profits who are considering a consulting career. It appeals to graduate students, mid-careerists, and pre-retirees as well as consultants who have recently taken the plunge.

CES Halifax 2012

  • Pre-Conference Workshops: May 13, 2012
  • Conference: May 13-16, 2012
  • Westin Hotel, Halifax, Nova Scotia