Creative Evaluation

At Inspire to Change, we support purpose-driven change and changemakers in a variety of ways, including through Creative Evaluation. Creative Evaluation (CE) discovers impact by combining elements of developmental evaluation, principles-focused evaluation, and arts-based evaluation to understand highly complex human systems and to work toward understanding and solving difficult social issues. Developmental evaluation examines how human systems operate in dynamic, novel environments with complex interactions, focusing on innovation and strategic learning. Principles-focused evaluation examines whether the principles that inform and guide our decisions are clear, meaningful, and actionable; whether they are followed; and whether they are leading to desired results. Arts-based evaluation examines how human cultures encode wisdom and values in the arts, and is especially effective at capturing emotional and cultural realities.


We succeed when our partners can identify and respond to change. Partners must recognize and understand their current situation, gain confidence that they can manage change, act with intention toward a desired vision, and work through limitations, differences, interests, and other barriers to resolve problems while considering the impact of their actions. This creates tension for researchers and evaluators: proponents of complexity thinking challenge their colleagues to take a critical and reflective look at evaluation assumptions and practices as they relate to these competing paradigms and to explore the implications for practice (Eoyang, 2007; Parsons, 2012; Patton, 2010; Westley, Zimmerman, & Patton, 2007).

The dominant existing paradigm takes the stance that the evaluand can be viewed as a mechanistic set of interactions embedded within a closed system, and that the evaluand can be held still and described. Through this paradigm, there is a search for something that works: a model to be defined, improved, tested, and replicated. Formative-summative is the traditionally dominant distinction in the field of evaluation: a formative evaluation is meant to improve a program model, stabilize it, standardize it, and prepare it for a summative evaluation, which is meant to test, prove, and validate the now-fixed program model. Both formative and summative evaluation thus assume that the purpose of the evaluation is to test a model. This implies a certain degree of hubris on the part of evaluators and program administrators, because it implies that they control both the program and the systems it interacts with. As the dominant paradigm, this method of evaluation engages a worldview that is both powerful and enduring, stemming from the post-positivist assumption that there must be a best way to do everything. An evaluator working in this paradigm must also make basic assumptions about linear organizational dynamics: predictability, low dimensionality, system closure, stability, and equilibration.

Developmental evaluation (DE) emerged in response to the need to support real-time learning in complex and emergent situations. Traditional forms of evaluation work well in situations where the progression from problem to solution can be laid out in a relatively clear sequence of steps (Gamble, 2008). However, initiatives with multiple stakeholders, high levels of innovation, fast-paced decision-making, and areas of uncertainty require more flexible approaches (Patton, 2008).

DE is appropriate when the problem addressed assumes a world of multiple causes, diversity of outcomes, inconsistency of interventions, emergent phenomena, and interactive effects at every level, including the system level (Patton, 1994). Unlike formative and summative evaluations, developmental evaluation does not assume a logical chain of events that can predict outcomes, nor does it assume that outcomes can be produced on a predictable timeline. DE is especially appropriate for investigating issues of racial equity and social justice, where meaningful changes cannot and should not wait for a published evaluation report.

Table X: Developmental Evaluation Guiding Principles for Social Justice (McKegg, Wehipeihana, and Murphy Johnson, 2017)

DE for Social Justice Purpose. Illuminates, informs, and supports what is being developed and how it addresses the root causes of systemic inequities, identifying the implications.

Utilization-Focused Evaluation. Pays attention to intended use by intended users from beginning to end, facilitating the evaluation progress to ensure utility and actual use.

Innovation Niche. Illuminates how the change is new, novel, or adapts and interprets old wisdom to new contexts.

Systems Thinking. Thinks systemically throughout, understanding interrelationships, engaging with contrasting perspectives, and reflecting ethically on boundaries of the social system that the innovation and evaluation are being developed within.

Complexity Concepts. Interprets development through a complexity lens, recognizing that situations are often uncertain, emergent, and dynamic, and evaluation is responsive to this reality.

Co-creation. Develops the innovation and evaluation together – interwoven, interdependent, iterative, and co-created – so that developmental evaluation becomes part of the change process.

Timely and Culturally Appropriate Feedback. Informs ongoing adaptation as needs, findings, and insights emerge, responding to the natural rhythms and cultural norms of the context the development and evaluation are happening within.

Evaluation Rigor. Asks probing evaluation questions; thinks and engages evaluatively; questions assumptions; applies evaluation logic; uses appropriate methods; synthesizes and makes meaning from a values-inspired framework and stays empirically grounded.

Principles-Focused Developmental Evaluation (P-FDE) uses guiding principles to anchor decision-making. In a Collective Impact model, the guiding principles articulate how partners will work together toward a shared vision.

In practical terms, P-FDE involves four types of activities: a) applied and systematic inquiry, b) meaning-making, c) relationship building, and d) co-creation with key stakeholders to support ongoing learning and adaptation. Each of these activities happens in the context of trusting relationships that support critical examination of the work.

  • Applied and systematic inquiry

We will use conventional evaluation methods (surveys, interviews, and observations) and less-conventional methods (arts-based inquiry) to produce data, reports, and presentations about the development of PPP and its related initiatives within alumni institutions.

  • Meaning-making

We will work with partners in PPP to make meaning and integrate learnings from past developmental evaluations through guiding-principles-based activities such as rubric development and use, and arts-based activities.

  • Relationship building

We will hold regular check-ins with key PPP leadership and stakeholders, as well as attend virtual events hosted by PPP. 

  • Co-creation

We will work with partners at PPP to co-create the evaluation methods that elicit the perspectives of multiple stakeholders.

Arts-Based Evaluation

Arts-Based Evaluation (ABE) collects, analyzes, and reports data through artistic methods, including painting, drawing, sculpture, fiber arts, dance, theater, photography, animation, music, poetry, fiction, culinary arts, and other forms. At Inspire to Change, we use arts-based evaluation techniques to supplement quantitative and qualitative evaluation methods.

Arts-based evaluation techniques typically uncover or explain emotional salience among participants. Points of emotional salience can reveal why a logic model works with one group of stakeholders and fails with another, why participants engage or disengage, and which areas require further attention. Arts-based data collection techniques allow participants to express thoughts or feelings that they may not be ready to put into words. Arts-based data analysis techniques help evaluators uncover hidden assumptions or associations connected to the data. Arts-based data reporting techniques create accessible entries into the evaluation for all stakeholders while fostering an emotional connection between the participants and the data. Although arts-based evaluation can collect, analyze, and report data entirely through the methods of the arts, evaluators most often use these methods in conjunction with qualitative and quantitative methods of collecting, analyzing, and reporting data.

Arts-based research methods are a set of related emergent research methods that use different artistic mediums (e.g., performance, writing, photography, video) to understand participants’ experiences or perspectives and address social research problems (McNiff, 2008; Leavy, 2009). The breadth of the field is large and descriptions of the approach vary. McNiff (2008) views arts-based research as remaining solely in the art: “Arts-based research can be defined as the systematic use of the artistic process, the actual making of artistic expressions in all of the different forms of the arts, as a primary way of understanding and examining experience by both researchers and the people that they involve in their studies. These inquiries are distinguished from research activities where the arts may play a significant role but are essentially used as data for investigations that take place within academic disciplines that utilize more traditional scientific, verbal, and mathematic descriptions and analyses of phenomena” (p. 29).

The scientific method seeks to uncover the so-called “objective truth”. This approach stems from the modernist belief that a single reality exists independent of our values and beliefs (Cosgrove & McHugh, 2008). Arts-based methods, on the other hand, represent a postmodern research framework in that they do not seek to determine objective reality. Rather, arts-based research practices uncover multiple ways of knowing and subjective realities (Holm, 2008; Leavy, 2009). Harper (2003), as quoted in Holm (2008), relates this view to photo-elicitation, an arts-based research method: “the power of the photo lies in its ability to unlock the subjectivity of those who see the image differently from the researcher” (p. 328). Rudkin and Davis (2007), citing numerous sources, claim arts-based methods elicit different responses than more conventional qualitative methods, allowing participants to “shape their own messages and convey them in ways they deem meaningful” (p. 109).

McNiff approaches art as the data itself. On the other hand, Leavy (2009) and Holm (2008) extend the realm of arts-based research to mixed-method approaches, such as using art as a method to generate dialogue and form discussion. Regardless of how these researchers see the application of arts-based research, they approach the set of arts-based methods as a distinct alternative to conventional qualitative research methods. 

Leavy (2009) compares quantitative, qualitative, and arts-based research approaches (Table 1). Leavy’s (2009) chart demonstrates the relationship between qualitative and arts-based research methods and contrasts these with quantitative approaches.

[Table 1, from Leavy (2009), survives only in fragments. Recoverable cells include: forms of data (arts-based: stories, images, sounds, scenes, sensory); modes of gathering (data discovery; data collection; data or content generation); orientation to values (value neutral; value laden; political, consciousness-raising, emancipation); and aims (prove / convince; compel, move, aesthetic power).]

By combining elements of Developmental Evaluation, Principles-Focused Evaluation, and Arts-Based Evaluation for a social justice purpose, Creative Evaluation engages complexity without oversimplifying it, supporting our shared journey to make the world more whole, just, and beautiful.
