"Historically, the evaluation component of the classic consulting model has largely been downplayed or ignored. Today, with increasing pressure from organizations to demonstrate the value of our efforts, having both a well-designed and articulated evaluation strategy is key, as is a detailed multi-measure and multi-level measurement process."

The Art and Science of Evaluating Organization Development Interventions

By Allan H. Church

How do we evaluate the impact of our organization change programs, processes, and initiatives? What are the best ways to measure the success or failure of various interventions? How do we know we have made a difference? While the field of organization development (OD) has its origins in action research and enhancing the health and development of organizations and their people, if we are honest with ourselves, our focus as a field on formally evaluating the impact of our work has lagged far behind. The level of emphasis we have collectively placed as a field on the debate over having a clear and consistent definition of OD, the right set of core OD values, and the newest types of tools and methods has overshadowed the rigor and attention given to measuring the impact of our work.

Why is this? It may reflect practitioners' lack of measurement and analytics capability (e.g., Church & Dutta, 2013). Research conducted on practitioners in the field (Shull, Church, & Burke, 2014) would suggest this might be part of the story: only 29% cited using evaluation methods in their OD work, suggesting that for many it is simply not part of their practice, and the same study reported that evaluation practice was ranked 21st among the capabilities that drive their work. The limited attention given to evaluation as a core competency would also support this as a reason for our lack of rigor. Or it may simply be because it's too difficult to design a meaningful and robust evaluation process.
Whether you are an external consultant or internal practitioner, there are a host of challenges associated with the measurement of causal relationships resulting from organizational interventions at the individual or systems level. Despite the definitive writings of Kirkpatrick (1998), the prospect of evaluating much of what we do in the social sciences and in organizational settings involves dynamic, interdependent, and often long-lead times that can far outlast the consultant-client (or employee-employer) relationship. In fact, one of the most frustrating elements of being an external practitioner is the lack of visibility to the long-term impact of one's work. This is often cited as a key reason why people take internal positions; while it's fantastic to experience multiple client organizations and settings, the external role rarely affords that longer-term view.

All of this sits squarely in juxtaposition with the client's interest in measuring results. That interest has always been present (arguably few paid organizational interventions are done purely for the sake of humanistic values alone), but in today's hypercompetitive business environment the emphasis has never been stronger. Given the dynamics and challenges cited above (e.g., capability, values, complexity, and personal interest in the game), how do we as OD practitioners move the needle and more holistically embrace the evaluation conundrum in our efforts?
The purpose of this paper is to discuss this issue in depth and attempt to answer these questions by focusing on the art and science involved in evaluating OD interventions. The paper begins with a brief overview of the evolution of the evaluation phase of the classic consulting paradigm from an afterthought to a core element required in the field today. This will be followed by a discussion of three key requirements for setting an evaluation strategy. Lack of attention to these areas works against practitioners and their clients' ability to appropriately conceptualize and implement suitable outcome measures. These requirements will be presented in the context of why they cause issues and how best to address them up-front in the process. The paper then offers three additional recommendations for creating an effective evaluation process that will yield the right kinds of information needed to demonstrate the impact of OD and related applied organizational interventions.
Recognizing Our Biases

Before proceeding, however, and as any OD practitioner should do, it is important to recognize several biases inherent in the approach to be presented here. First, the perspective is from that of an organizational psychologist by formal training with a significant grounding in applied statistics and measurement theory (and therefore represents a scientist-practitioner mindset). Thus, there is an inherent bias that evaluation methods for all types of organizational initiatives (whether directed at organization change, enhancing development, or improving performance) should follow some degree of rigor and contain both quantitative as well as qualitative components of OD as a data-driven process (Waclawski & Church, 2002).

Second, the insights and observations offered are based on the author's personal external and internal consulting experience and evaluation research over the past 25 years following the implementation of a variety of large-scale organizational change initiatives and global talent management processes. The focus therefore is less on one-off individual coaching engagements, team building efforts, or group interventions, and more on measuring the impact of broader initiatives, systems, and processes. Thus, while the suggestions offered reflect a certain normative and science-based paradigm, and are constrained by the experiences of the author, it is hoped that they will appeal to a much broader spectrum of OD applications.

Figure 1. Classic OD Consulting Process Model (Entry, Contracting, Data Gathering, Data Analysis, Feedback & Interpretation, Intervention(s), Evaluation / Success Metrics)

The Evolution of Evaluation in OD

Although the evaluation stage of the classic consulting model that OD shares with many other applied social science disciplines, such as industrial-organizational psychology (I-O) and human resource development (HRD), has been present since Burke (1982), Nadler (1977), and others (e.g., Rothwell, Sullivan, & McLean, 1995) discussed the framework, the focus continues to be relatively limited in the field.

As many authors have noted (e.g., Anderson, 2012; Burke, 2011; Church, 2003; Cummings & Worley, 2015), the evaluation stage in the model is often given lip service or overlooked entirely. This is true whether you look at classic approaches to doing OD work or the newest dialogic approaches (Bushe & Marshak, 2015—where evaluation, for example, is not even listed in the topic index). A quick scan of the EBSCO database shows no academic articles published at all, for example, on the terms evaluation and OD since 2012.

Instead, the emphasis has often been placed on the actions taken or the change in behaviors and culture observed. While some research on the evaluation of various individual OD methods has certainly been done (e.g., Basu & Das, 2000; Terpstra, 1981; Woodman & Wayne, 1985), and there is a plethora of qualitative and quantitative case studies both individually and in OD texts citing the impact of various programs, relatively few authors have taken a more focused view of how to systematically measure our efforts. Moreover, many of these have come from authors with cross-disciplinary backgrounds (e.g., Armenakis, Bedeian, & Pond, 1983; Edwards, Scott, & Raju, 2003; Martineau & Preskill, 2002;
Rothwell et al., 1995). At the same time, scholar-practitioners in other related disciplines (e.g., Holton, 1996; Kirkpatrick, 1998; Roberts & Robertson, 1993; Svyantek, O'Connell, & Baumgardner, 1992; Terpstra, 1981; Woodman & Wayne, 1985) took this topic on years ago. Surprisingly enough, those with more academic orientations have advanced the field in far more significant ways than more traditional OD scholar-practitioners, with the introduction of scorecards (e.g., Becker, Huselid, & Ulrich, 2001), more recently bottom-line linkages (e.g., Lawler, 2014; Savitz & Weber, 2013), and the application of decision-science (Boudreau & Ramstad, 2007) to their work.

Given the trends in the field it would seem we are at a cross-roads. There is increasing pressure to measure the impact of our efforts (Anderson, 2012; Shull et al., 2014), at multiple levels in the organization (Lawler, 2014), and using more types of complex and potentially anti-OD oriented "Big Data" applications (Church & Dutta, 2014). Yet arguably few OD practitioners have the right set of capabilities included in their formal training (Church, 2001) or core toolkits (Church & Dutta, 2013; Shull et al., 2014) to address these trends. Consequently, we are simply not engaging in rigorous evaluation methods of our organizational interventions. Let us now explore the reasons in more depth in the hopes of finding some answers and potential solutions to this challenge in the field.

Three Key Requirements for Setting an Evaluation Strategy

There are many reasons why OD professionals might not pursue or even actively seek to avoid engaging in the evaluation stage of their work. Some of these reflect internal states and motivations (e.g., values, personal investment in the outcome) and/or a lack of specific capabilities and skills (e.g., in research design and statistics). While very important for setting a baseline, they do not necessarily tell the complete story. Instead, the emphasis here is on a third set of reasons—that is, the dynamics and complexities of measuring change real time in organizations. Listed below are three key challenges and requirements for setting an effective evaluation strategy.

1. Clarifying the Definition of Impact

Whether we start with a measurement theory approach (what is the criterion?) or an OD consulting model (what is included in the contract?), the importance of clarifying up front the outcomes to be measured is critical to the success of any intervention (see Figure 1). This is true whether the effort is a simple team building exercise, an executive coaching assignment, the implementation of a large-scale engagement survey program, or a whole systems process intervention. Whatever the initiative, it is imperative that outcomes be clearly articulated and that there is alignment up front in the contracting or project charter stages. Importantly, this is more than a set of objectives or goals for the project. The definition of impact, i.e., the intended outcomes, needs to be crystal clear in such a way that it can be measured at one or more specific levels of analysis.

Many practitioners have as their goal culture change, behavior change, process improvement, organizational effectiveness, enhanced team functioning, etc. These are all excellent objectives but they are not tight enough to be used as measures of impact. You need to be able to answer the question of "what will be the measurable indicators of a positive outcome as a result of this effort?" These can be quantitative or qualitative (some of both methods are generally best for maximizing perceived evaluation credibility) but they need to be measurable and aligned.

Marshall Goldsmith, for example, in his coaching practice is known for his contracting efforts around behavior change. As part of the commitment to his work, he uses a pre- and post-behavioral feedback assessment tool following a yearlong engagement. If he does not see change in the measure, it is a direct indicator of the success or failure of the coaching project. While this might sound simple enough, there are important measurement aspects, such as the quality of the feedback measure used, the nature of the raters selected, the rating process (confidential vs. anonymous), etc., that can impact the outcomes in ways that might be unexpected.

While other interventions (e.g., a cultural integration effort following a merger and acquisition) are much more complex than this, the same principle applies. In this context, the outcome measure might be comprised of targeted levels of turnover, identification and retention of key talent, improvements in levels of engagement or other cultural indicators on an employee survey, increased performance in business units, or an increase in the innovation pipeline 2–3 years later. There are no correct answers but there are aligned ones. The focus needs to be on the realistic measurable indicators of impact
that can be linked to the timing of a specific intervention.

2. Setting Realistic Time Horizons for Measurement

OD practitioners all know the simple fact that change takes time. Based on the prevailing problem, scope, and intervention, this can range anywhere from minutes following a process observation to years after a leadership transition. Unfortunately, clients are not always of the same mind. While fewer and fewer executives seem to believe in the fallacy of changing corporate culture overnight, their sense of timing and urgency is often directly proportional to the pace of their business. For example, consumer products organizations generally move faster than pharmaceuticals. The point is that as part of the outcome alignment process OD professionals need to ensure that the timing window of the evaluation and measurement components is clear and reasonable.

Another issue that can occur is an overemphasis up front on planning for the timing of the intervention launch, and less attention paid to the appropriate lag time required to observe the impact of the effort. Often this is because the bulk of the development work and consulting delivery costs are front-loaded. There is more concern about meeting the deadline to deliver a new program or roll out a change agenda on target than there is on determining the best window (and method) for measuring the impact of that work over time.

A related issue, and common fallacy in organizations, is the use of the "pilot" concept as a means for testing the impact of a new program. While launching an intervention in a small-scale environment or controlled area of the business can be very useful for ironing out the implementation kinks, rarely does this offer an effective means for predicting the potential impact of a much larger scale program. This is because larger scale OD efforts need to be aligned to a larger set of systems factors which require much broader thinking about organizational impact than what typically occurs in a small pilot context. The question to ask yourself here is "given what we are anticipating measuring, when do we expect to see this outcome change as a result of our efforts?"

One important caveat should be raised here. Although the discussion so far might suggest that all OD interventions have a distinct beginning and end to them, we know this is not the case. While the classic consulting model tends to present the world in this semi-linear fashion, the vast majority of our work rarely begins from a blank slate. Burke et al. (1997) have termed this effect "midstream consulting" and it applies in just about every case whether internal or external. Further, there is often not a definitive end to the engagement. In fact, many organizational processes (and in particular those implemented for employee development, performance management, and talent management purposes) continue to evolve long after the initial design and implementation phases. From an evaluation perspective, then, the measurement aspect of assessing impact needs to be seen as occurring at discrete points in time and not as an end-state. This is an important distinction as it enables the practitioner to contract regarding "points of impact" measurement at different stages of evolution, and not rely exclusively on a single evaluation metric. Not only should this remove some of the burden of having to show impact all at once, but the measurement quality will improve as well. Time series studies and multi-method approaches are far more rigorous and valid than are single program reviews.

Case in point: in the mid-2000s an applied study was done at PepsiCo on the impact of their global employee organizational health survey program (Church & Oliver, 2006). The research was conducted in an effort to answer senior leadership's questions regarding the impact of the survey on key employee outcomes. The researchers analyzed survey data over time, including the use of an action planning variable, and demonstrated the impact of taking action from the results on both softer survey outcomes of employee satisfaction and commitment as well as hard metrics such as turnover, lost time, and accidents at the plant level. Specifically, they reported that employee satisfaction and commitment a year later (via the second survey) were significantly impacted by managers who both shared the results and took action versus those who only shared results or did nothing at all. While these insights were extremely well received in the organization, several years later the same questions emerged under new leadership, so the study was conducted again (Church et al., 2012). The same results were evident across multiple years (see Figure 2), demonstrating the power of an organizational survey with an action planning focus at the local level to drive organizational change and employee engagement.

Figure 2. The Impact of Taking Action From Survey Results on Employee Satisfaction
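To make the shape of this kind of comparison concrete, here is a minimal sketch in Python using entirely hypothetical numbers and group labels (the real study's data and methods are described in Church & Oliver, 2006):

```python
# Hypothetical sketch only: compare second-survey satisfaction across
# three kinds of manager follow-up behavior. Not the actual PepsiCo
# data or analysis code.
import pandas as pd
from scipy import stats

df = pd.DataFrame({
    "manager_action": ["shared_and_acted", "shared_only", "did_nothing"] * 3,
    "satisfaction_t2": [4.2, 3.6, 3.3, 4.0, 3.7, 3.1, 4.3, 3.5, 3.4],
})

# Mean satisfaction a year later, grouped by what managers did with results
print(df.groupby("manager_action")["satisfaction_t2"].mean())

# One-way ANOVA as a simple omnibus test of whether the groups differ
groups = [g.values for _, g in df.groupby("manager_action")["satisfaction_t2"]]
f_stat, p_value = stats.f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```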
3. Applying Systems Thinking to Systems Interventions

This third recommendation is simple enough. OD practitioners have a deep understanding of systems thinking, so it should be easy to apply that same approach to the evaluation of their efforts. This means considering variables across all levels and sub-systems involved (Katz & Kahn, 1978) and aligning the measurement of those impacting and impacted by the change. Whether you prefer the Burke-Litwin Model (1992) or some other approach to conceptualizing an organizational system, it is critical that broader thinking be applied than just a micro analysis of a single intervention. This is actually one area where OD practitioners should have the advantage over practitioners from other disciplines such as I-O psychologists, who tend toward a more individual level of analysis. Unfortunately, OD practitioners seem to ignore their own strengths when it comes to evaluation.

Too many practitioners treat the evaluation process as an afterthought or something to be considered once they are further along in the intervention. Though there is certainly wisdom in revising, aligning, and adjusting the measurement approach if needed as the intervention progresses, it should have a solid basis in evidence-based science articulated at the beginning of the intervention. When practitioners fail to focus on evaluation early in the process, it is no surprise that when pushed by clients to show results, they need to scramble to put something in place. This often results in a poorly designed and/or implemented measurement approach which can yield inaccurate results and might even derail the intervention itself.

As it turns out, often the best form of evaluating impact is to design a new measurement process at the start of the intervention or change effort to enable a pre-post comparison. The Marshall Goldsmith approach is in fact a perfect example of this principle in action, as is the PepsiCo organizational health survey, where the questions regarding action planning were asked the year before the company was interested in learning about the impact of the results. Even if you are not in a position to implement a pre-measurement tool, there are scenarios where it might take significant time and resources to collect the necessary information to show results, and this would need to start earlier on in the effort. Take, for example, an organization that would like to know the impact of a new leadership curriculum on online learning utilization. While the data regarding learning system utilization might not be centralized at the start of the project, by the end centralization would be necessary to aggregate the data. This would require lead time and dedicated resources.
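As a simple illustration of the pre-post principle, the sketch below pairs each leader's pre-intervention behavioral feedback score with the post-intervention score and tests whether the change is reliable. The ratings are hypothetical, not Goldsmith's instrument or data:

```python
# Hypothetical pre-post sketch: paired comparison of behavioral feedback
# scores (1-5 scale) for the same ten leaders before and after coaching.
import numpy as np
from scipy import stats

pre  = np.array([3.1, 3.4, 2.9, 3.6, 3.2, 3.0, 3.5, 3.3, 2.8, 3.4])
post = np.array([3.6, 3.9, 3.1, 3.8, 3.7, 3.4, 3.9, 3.6, 3.2, 3.8])

# Paired t-test: did the same leaders' ratings move between time points?
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"mean change = {np.mean(post - pre):+.2f}, t = {t_stat:.2f}, p = {p_value:.4f}")
```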
Kirkpatrick's famous multi-level framework (1998) is perhaps the best and most easily recognized approach to setting a systemic strategy for evaluation. While initially designed for the evaluation of learning interventions, it has since been expanded to include additional concepts such as ROI (Phillips, 1991) and can easily be adapted to OD and related work including talent management (e.g., Church, 2013; DeTuncq & Schmidt, 2013; Holton, 1996). The core idea is that there are multiple levels of impact that can be measured in various ways (as aligned and timed per above). Essentially the model measures outcomes at the following levels: (1) reaction, (2) learning, (3) application, (4) business impact, and (5) bottom-line/long-term outcomes.

The key is spending the time required at the initial stages of the effort to design an integrated systems approach to the five levels of analysis, with the right types of metrics and under the right time horizons. This needs to be measurable to: (a) have sufficient rigor to demonstrate the impact of your efforts, (b) satisfy your clients, and (c) be reasonably executed with enough latitude to adjust for potential contingencies. Figure 3 provides a simple framework for how this might be applied.

Figure 3. A Multi-Level Framework for Aligning Evaluation Efforts

Only by setting the strategy for evaluation up front can you get ahead of the potential issues that will naturally come with these types of efforts.
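One way to make that up-front design tangible is to write the plan down as a simple structure that practitioner and client agree to during contracting. The sketch below uses hypothetical metrics and time windows loosely mapped to the five levels described above:

```python
# Hypothetical evaluation plan: one entry per level of impact, each with
# example metrics and a realistic measurement window agreed up front.
evaluation_plan = {
    1: {"level": "reaction",        "metrics": ["participant pulse survey"],       "window": "at launch"},
    2: {"level": "learning",        "metrics": ["knowledge check", "self-report"], "window": "1-3 months"},
    3: {"level": "application",     "metrics": ["360 feedback", "observation"],    "window": "6-12 months"},
    4: {"level": "business impact", "metrics": ["turnover", "engagement scores"],  "window": "1-2 years"},
    5: {"level": "bottom line",     "metrics": ["sales", "productivity", "ROI"],   "window": "2-3 years"},
}

for num, spec in evaluation_plan.items():
    print(f"Level {num} ({spec['level']}): {', '.join(spec['metrics'])} -- {spec['window']}")
```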
Three Recommendations for Building an OD Evaluation Process

Now that the importance of and critical factors involved in having an evaluation strategy have been discussed, let us turn to the process itself. What elements and types of data should be included in the evaluation process? What are the key factors in conveying the messages around the impact of your efforts? What is the role of OD values in analytics (is that an oxymoron?) and what are the pitfalls to avoid? Listed below are three key recommendations for developing an impactful evaluation process. While they are not meant to reflect all the elements of what's involved in designing evaluation research (see instead texts such as Edwards et al., 2003; Kirkpatrick, 1998), these three should be helpful in designing an evaluation approach that helps put your OD efforts in the best possible light yet stays grounded in solid measurement theory and rigor.
1. Design Using a Multiple Measures and Levels (MML) Approach

One of the aspects that makes OD efforts so exciting and attractive to people as a profession is the variety of projects that comprise the spectrum of our work. Whether the interventions are focused on the individual, group, or organizational levels, because they are grounded in the social and behavioral sciences there are always a myriad of complex human dynamics involved (unlike, say, pure management consulting which often focuses more on business strategy, design, or financials). Unfortunately, this element of OD work also makes it particularly challenging to evaluate at times. Often the interventions are focused on less quantifiable aspects of human interactions such as group dynamics, power, archetypes, norms, and culture. Even when behaviors are involved (e.g., leadership competencies, management practices, communication or collaboration skills, digital capability, learning, etc.) they are not always easily measured by standard tools or off-the-shelf assessments. In addition, the measurement of performance has come under fire recently in the literature (e.g., Church, Ginther, Levine, & Rotolo, 2015) as being negatively received by leaders and poorly designed and implemented in companies today. It is no wonder then that recent conversations with senior leaders have resulted in their questioning the use of internal performance management data as a valid indicator of the impact of OD interventions. If the PMP (performance management process) is not measuring the right things, how can it serve as a criterion for something else? The bottom line here is that to have an effective evaluation process in today's multi-faceted data-driven landscape you need to have a multiple measures and levels (MML) solution.

What does this mean in practice? Maximizing your ability to measure the true impact of your efforts requires more than one type of tool, process, or information system producing data. Preferably these multiple measures are done at different levels of analysis (e.g., organization-macro, group-meso, and individual-micro), and you collect at least two different types of data at any given moment in time. By using multiple levels of analysis, you enable a more complex and interdependent way of assessing impact and change. So, for example, from a multi-levels approach, in measuring the rollout of a new set of corporate values you might measure how senior executives model the behavior in town halls and other public forums (via personal observation or interview feedback), how middle managers are rated as demonstrating the behaviors in the workplace (via 360 feedback), and how employees feel about the authenticity of the new values for the culture (via employee focus groups or a pulse survey).

Conversely, by using a multiple measures approach at the same level and point in time you enable a process of triangulation. This allows you to see if all measures are showing the same type of change as a result of your intervention or if one indicator is moving when another is not (or going in the opposite direction). Continuing with the values rollout example, from a multiple measures perspective, in order to add a second indicator of the practice of the new values at the managerial level you might incorporate an audit of performance reviews along with the 360-feedback process. The question would be: are managers being reviewed by their bosses based on the values, and are they being rated by others as demonstrating them? A nice side benefit of this approach would be the correlations you could run between manager ratings on the 360 survey and manager ratings (if available) on the performance tool. By using this MML approach applied to OD interventions you are essentially following a similar path to what I-O psychologists call the multi-trait multi-method (MTMM) approach, which is used for assessing individual skills and capabilities.
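A minimal sketch of that side benefit might look like the following; the ratings are hypothetical, and in practice the scores would come from your own 360 and performance instruments:

```python
# Hypothetical triangulation sketch: correlate managers' 360 ratings on
# the new values with their performance-review ratings on the same values.
import numpy as np
from scipy import stats

ratings_360  = np.array([3.2, 4.1, 2.8, 3.9, 3.5, 4.4, 3.0, 3.7])  # 360 survey
ratings_perf = np.array([3.0, 4.3, 3.1, 3.6, 3.4, 4.5, 2.7, 3.9])  # performance review

# If both instruments capture the same behavior, r should be positive
# and substantial; divergence between them is itself a useful insight.
r, p = stats.pearsonr(ratings_360, ratings_perf)
print(f"r = {r:.2f}, p = {p:.3f}")
```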
Survey feedback, behavioral feedback, performance ratings, observations, interviews, focus groups, etc. are just some of the ways you can collect data and use them for your evaluation purposes. The practice of OD is replete with all sorts of both qualitative and quantitative data-driven tools (Waclawski & Church, 2002), and the savvy applied behavioral scientist can convert the output of almost anything into a measure that can be used for some level of analysis (even if it's non-parametric). Linking that softer data to perceived harder metrics such as those in the list below is where the rubber meets the road:

» performance (caveats notwithstanding)
» turnover
» quality of new hires
» promotion rates
» talent transfers
» external reputation indices and awards
» market share
» EBITA (earnings before interest, taxes, and amortization)
» diversity representation
» customer satisfaction data
» learning completion rates
» sales
» productivity
» product shrinkage
» senior leadership tenure, etc.

For example, in evaluating the impact of PepsiCo's Potential Leader Development Center (PLDC), a custom talent assessment process and part of their broader LeAD program aimed at identifying high potentials early in their careers, the program team employed an MML approach (Church & Rotolo, 2016). Specifically, they wanted to know whether being transparent (i.e., sharing the results of the assessment of "potential") with over 5,000 employees globally had any impact on how people felt about the program itself and/or resulted in any unintended changes in turnover or performance. As part of the evaluation strategy, at six months and one year after the individual feedback reports had been delivered, the team surveyed participants regarding their attitudes about the program content and mechanics and their perceptions of the feedback they had received. The data was then linked at the individual level to assessment results (i.e., leadership potential scores), annual business performance ratings, promotion rates, and turnover. After an in-depth analysis, the project team was able to answer senior leaders' key questions regarding the impact of the program one year later on talent movement, the extent to which results were used in individual development planning, and on employee perceptions of the culture.

More specifically, the research indicated that: (1) the assessment process was effective at predicting future success—i.e., actual performance and promotion rates one year later were significantly correlated with performance on the assessment tools; (2) transparency of how employees scored (their level of LIFT, a proxy for potential) had no negative impact on satisfaction with the program (70% favorable), perceptions of organizational commitment, or actual turnover; and (3) the program met its goal of providing developmental feedback to all participants, with the vast majority (77% and 83% respectively) indicating that the results had helped them increase their effectiveness as a leader and showed an investment by the company in their personal growth and development. In short, the data provided statistically significant and meaningful results regarding the impact of the program and dispelled management's myths regarding the potential negative outcomes of being transparent with the results. It would have been impossible to demonstrate these relationships without using this type of MML evaluation approach.
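As one hedged illustration of linking softer measures to harder metrics (hypothetical business-unit data, not the PLDC dataset), the basic move is to join the two at a common level of analysis and examine the relationship:

```python
# Hypothetical linkage sketch: join a soft measure (unit-level engagement)
# to a hard metric (unit-level voluntary turnover) and check the relationship.
import pandas as pd

units = pd.DataFrame({
    "unit":           ["A", "B", "C", "D", "E", "F"],
    "engagement_pct": [78, 64, 82, 59, 71, 88],           # % favorable on survey
    "turnover_pct":   [8.1, 14.2, 6.5, 16.8, 11.0, 5.2],  # annual voluntary turnover
})

# A sizable negative correlation supports (but does not prove) the story
# that engagement and retention move together at the unit level.
print(units["engagement_pct"].corr(units["turnover_pct"]))
```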
In the end, there is no single best outcome measure or set of measures. The important point is to always cycle back to your evaluation strategy and measurement construct and build from there, laying the foundation for multiple measures. It is also important to remember, however, two old adages when it comes to measurement: (a) what gets measured gets done, and (b) beware the law of unintended consequences. If the goal of your talent management program is enhanced movement and you do not account for other factors, then you will get movement even if it is of the wrong type (Church, 2013; Church & Waclawski, 2010). Remember to think holistically and at the systems level when designing your measurement process.
2. Build a Meaningful Story Through Insights

In as much as the first recommendation is about getting your hands on a significant amount of data from different sources, the second key to creating a sustainable evaluation process is the ability to build a meaningful story out of the insights you generate from that data. There are two parts to this.

First, having data by itself with no connectivity or insights is meaningless. It's just information. This is true whether it's one simple set of evaluation ratings or five years of culture data linked to business unit performance and employee turnover. How significant is the overall effect? Where is the change really happening and where is it failing? What are the dynamics and interplay between culture, behavior, and turnover? All of this needs to be addressed in a way that answers the questions posed at the start of the intervention.

The data you gather for your evaluation efforts must be analyzed in such a way as to create insights into what is going on in the organization. In terms of the values rollout example, if the 360-feedback data were to show that managers were engaging
in the desired behaviors but the reward systems were not reinforcing them, this would be a key insight and useful in explaining why the program could be failing or having less of an impact than might be desired. This is why analysis skills, though important by themselves, are not enough (Church & Dutta, 2013). The ability to see connections and derive meaning from those connections, in some cases even causality between variables if the methodology and research design support these conclusions, is critical. It is a skill that comes with practice and experience working with large and often complex datasets (Church & Waclawski, 2001).

Second is the ability to tell a compelling story from those insights. Here your communications skills are tested. Even if you have the greatest dataset in the world regarding your OD intervention, if you put it in front of senior executives they may not know what to do with it. In fact, the more complex your data the worse it gets. A collection of interesting insights alone is often not enough to determine your intervention's success. You need to be able to put it all together into a compelling package that is tailored to the right audience (Church & Waclawski, 2001). What has been happening since you launched your intervention? What other changes at the systems level have occurred? How have these affected the change program and how do you know that (e.g., what other measures or data do you have)? Can you pinpoint the drivers of impact and identify clear areas for action going forward?

Although the approach outlined here is similar to the data analysis, feedback, and interpretation stages (aka diagnosis) of the classic OD consulting model, there is a slightly different spin. While the same analytical techniques and capabilities generally apply, the emphasis in evaluation is on what has changed both positively and/or negatively as a consequence of your interventions. The point is to isolate the direct and indirect results of your efforts, and the potential moderating effect of other factors that you have measured over time as well.
3. Maintain a Watchful OD Values Lens to Your Evaluation Work

The final recommendation for building an effective evaluation process concerns a return to the use of OD values. This paper has not been focused on the OD values component of evaluation, yet it is vital to ensuring our work is evaluated effectively and in the right context. While there has always been an element of the dark side involved in data analysis and storytelling (see How to Lie with Statistics by Huff, 1993), the rise of Big Data and analytics functions in organizations is exacerbating this problem exponentially. There are several reasons for this situation.

First, the fascination with Big Data and machine intelligence is making all types of analyses even more complex. OD interventions and I-O practices such as selection and high-potential identification are now part of the ongoing debates in the field (e.g., Chamorro-Premuzic et al., 2016; Church, 2014; Church & Silzer, 2016; Rotolo & Church, 2015). As information flows up and down at an increasingly rapid pace (remember Big Data is characterized by volume, velocity, variety, and veracity), the challenge of determining causality versus random relationships is even more apparent. Unfortunately, many analytics teams take the data blender approach (e.g., throw a large cluster of variables in a blender, mix it all up, and see what spits out) and the resulting findings can be meaningless or at best difficult to interpret. This suggests not evidence-based science but more of a fishing expedition. Relationships that make little sense can emerge and limited judgement is applied. Without a values filter applied to the analysis of data, relationships identified might be spurious, or even potentially unethical from an OD perspective.
3​. Maintain a Watchful OD Values Lens
in a major violation of employee trust and ling story from those insights. Here your
to Your Evaluation Work
engagement with the organization. Once it communications skills are tested. Even if
The final recommendation for build-
gets around that leaders cannot be trusted, you have the greatest dataset in the world
ing an effective evaluation process con-
the entire process of gathering employee regarding your OD intervention, if you
cerns a return to the use of OD values.
perspectives will be destroyed. This is put it in front of senior executives they
This paper has not been focused on the
just a simple example but a powerful and may not know what to do with it. In fact,
OD values component of evaluation, yet it
quite real one. As OD practitioners, we the more complex your data the worse it
is vital to ensuring our work is evaluated
need to ensure that the data is used in the gets. A collection of interesting insights
effectively and in the right context. While
way it was intended and communicated alone is often not enough to determine
there has always been an element of the
to employees. The psychological contract your intervention’s success. You need to
dark side involved in data analysis and
and integrity of our profession needs to be able to put it all together into a com-
storytelling (see ​How to Lie with Statistics ​by
be maintained. One misstep with employ- pelling package that is tailored to the
Huff, 1993), the rise of Big Data and analyt-
ees can erase in a heartbeat the positive right audience (Church & Waclawski,
ics functions in organizations is exacerbat-
momentum gained from an entire multi- 2001). What has been happening since
ing this problem exponentially. There are
year OD intervention. you launched your intervention? What
several reasons for this situation.
Big Data by itself of course is not other changes at the systems level have
First, the fascination with Big Data and
the problem, nor is the use of advanced occurred? How have these affected the
machine intelligence is making all types of
analytics capabilities. In fact, the future change program and how do you know
analyses even more complex. OD interven-
of human capital management is going that (e.g., what other measures or data do
tions and I-O practices such as selection
to require a digital mindset and statistical you have)? Can you pinpoint the drivers
and high-potential identification are now
prowess beyond what most practitioners
The Art and Science of Evaluating Organization Development Interventions
33
While many OD practitioners continue to have limited interest and/or capability in engaging in evidence-based science for their evaluation efforts, with the increase in Big Data and the need for ROI there is nowhere to hide anymore. Going forward, practitioners need to start with the science to build these skills and ensure they are designing the right types of measures and analyzing the data appropriately. OD professionals need to be facile at both generating insights from their data and telling a compelling story about the impact of their interventions. There is clearly a science and an art to conducting evaluation efforts in OD, and none of it should be as painful or as daunting as one might think.

Conclusion

The purpose of this paper has been to discuss some of the challenges and opportunities associated with the evaluation of OD interventions in organizations. Historically, the evaluation component of the classic consulting model has largely been downplayed or ignored. Today, with increasing pressure from organizations to demonstrate the value of our efforts, having both a well-designed and articulated evaluation strategy is key, as is a detailed multi-measure and multi-level measurement process.
Allan H. Church, PhD, is Senior Vice President of Global Talent Assessment & Development at PepsiCo. Over the past 16 years he has held a variety of roles in organization development and talent management in the company. Previously he was with Warner Burke Associates for almost a decade, and before that at IBM. He is currently on the Board of Directors of HRPS, the Conference Board's Council of Talent Management, an Adjunct Professor at Columbia University, and Associate Editor of JABS. He is a former Chair of the Mayflower Group. Church received his PhD in Organizational Psychology from Columbia University, and is a Fellow of SIOP, APA, and APS. He can be reached at [email protected].

References

Anderson, D. L. (2012). Organization development: The process of leading organizational change (2nd ed.). Thousand Oaks, CA: Sage.
Armenakis, A. A., Bedeian, A. G., & Pond, S. B. (1983). Research issues in OD evaluation: Past, present, and future. The Academy of Management Review, 8(2), 320–328.
Basu, K., & Das, P. (2000). Evaluation dilemmas in OD interventions: Mixed record involving Indian rural credit institutions. Public Administration Quarterly, 24(4), 433–444.
Becker, B. E., Huselid, M. A., & Ulrich, D. (2001). The HR scorecard: Linking people, strategy, and performance. Boston, MA: Harvard Business School Press.
Boudreau, J. W., & Ramstad, P. M. (2007). Beyond HR: The new science of human capital. Boston, MA: Harvard Business School Press.
Burke, W. W. (1982). Organization development: Principles and practices. Glenview, IL: Scott, Foresman.
Burke, W. W. (2011). Organization change: Theory and practice (3rd ed.). Thousand Oaks, CA: Sage.
Burke, W. W., Javitch, M. J., Waclawski, J., & Church, A. H. (1997). The dynamics of midstream consulting. Consulting Psychology Journal: Practice and Research, 49(2), 83–95.
Burke, W. W., & Litwin, G. H. (1992). A causal model of organizational performance and change. Journal of Management, 18(3), 523–545.
Bushe, G. R., & Marshak, R. J. (Eds.). (2015). Dialogic organization development: The theory and practice of transformational change. Oakland, CA: Berrett-Koehler.
Chamorro-Premuzic, T., Winsborough, D., Sherman, R. A., & Hogan, R. (2016). New talent signals: Shiny new objects or a brave new world? Industrial and Organizational Psychology: Perspectives on Science and Practice, 9(3), 621–640.
Church, A. H. (2001). The professionalization of organization development: The next step in an evolving field. In W. A. Pasmore & R. W. Woodman (Eds.), Research in organizational change and development (Vol. 13, pp. 1–42). Greenwich, CT: JAI Press.
Church, A. H. (2003). Organization development. In J. E. Edwards, J. C. Scott, & N. S. Raju (Eds.), The human resources program-evaluation handbook (pp. 322–342). Thousand Oaks, CA: Sage.
Church, A. H. (2013). Assessing the effectiveness of talent movement within a succession planning process. In T. H. DeTuncq & L. Schmidt (Eds.), Integrated talent management scorecards: Insights from world-class organizations on demonstrating value (pp. 255–273). Alexandria, VA: ASTD Press.
Church, A. H. (2014). What do we know about developing leadership potential? The role of OD in strategic talent management. OD Practitioner, 46(3), 52–61.
Church, A. H., & Dutta, S. (2013). The promise of big data for OD: Old wine in new bottles or the next generation of data-driven methods for change? OD Practitioner, 45(4), 23–31.
Church, A. H., Ginther, N. M., Levine, R., & Rotolo, C. T. (2015). Going beyond the fix: Taking performance management to the next level. Industrial and Organizational Psychology: Perspectives on Science and Practice, 8(1), 121–129.
Church, A. H., Golay, L. M., Rotolo, C. T., Tuller, M. D., Shull, A. C., & Desrosiers, E. I. (2012). Without effort there can be no change: Reexamining the impact of survey feedback and action planning on employee attitudes. In A. B. Shani, W. A. Pasmore, & R. W. Woodman (Eds.), Research in organizational change and development (Vol. 20, pp. 223–264). Bingley, UK: Emerald Group Publishing Limited.
Church, A. H., & Oliver, D. H. (2006). The importance of taking action, not just sharing survey feedback. In A. Kraut (Ed.), Getting action from organizational surveys: New concepts, technologies and applications (pp. 102–130). San Francisco, CA: Jossey-Bass.
Church, A. H., & Rotolo, C. T. (2016). Lifting the veil: What happens when you are transparent with people about their future potential? People & Strategy, 39(4), 36–40.
Church, A. H., & Silzer, R. (2016). Are we on the same wavelength? Four steps for moving from talent signals to valid talent management applications. Industrial and Organizational Psychology: Perspectives on Science and Practice, 9(3), 645–654.
Church, A. H., & Waclawski, J. (2001). Designing and using organizational surveys: A seven-step approach. San Francisco, CA: Jossey-Bass.
Church, A. H., & Waclawski, J. (2010). Take the Pepsi Challenge: Talent development at PepsiCo. In R. Silzer & B. E. Dowell (Eds.), Strategy-driven talent management: A leadership imperative (pp. 617–640). San Francisco, CA: Jossey-Bass.
Cummings, T. G., & Worley, C. G. (2015). Organization development and change (10th ed.). Stamford, CT: Cengage Learning.
DeTuncq, T. H., & Schmidt, L. (Eds.). (2013). Integrated talent management scorecards: Insights from world-class organizations on demonstrating value. Alexandria, VA: ASTD Press.
Edwards, J. E., Scott, J. C., & Raju, N. S. (Eds.). (2003). The human resources program-evaluation handbook. Thousand Oaks, CA: Sage.
Holton, E. F., III (1996). The flawed four-level evaluation model. Human Resource Development Quarterly, 7(1), 5–21.
Huff, D. (1993). How to lie with statistics. New York, NY: W. W. Norton & Company.
Katz, D., & Kahn, R. L. (1978). The social psychology of organizations (2nd ed.). New York, NY: John Wiley.
Kirkpatrick, D. L. (1998). Evaluating training programs. San Francisco, CA: Berrett-Koehler.
Lawler, E. E., III (2014). Sustainable effectiveness and organization development: Beyond the triple bottom line. OD Practitioner, 46(4), 65–67.
Martineau, J. W., & Preskill, H. (2002). Evaluating the impact of organization development interventions. In J. Waclawski & A. H. Church (Eds.), Organization development: A data-driven approach to organizational change (pp. 286–301). San Francisco, CA: Jossey-Bass.
Nadler, D. A. (1977). Feedback and organization development: Using data-based methods. Reading, MA: Addison-Wesley.
Phillips, J. J. (1991). Handbook of training evaluation and measurement methods (2nd ed.). Boston, MA: Butterworth-Heinemann.
Roberts, D. R., & Robertson, P. J. (1993). Positive-findings bias, and measuring methodological rigor, in evaluations of organization development. Journal of Applied Psychology, 77(6), 918–925.
Rothwell, W. J., Sullivan, R., & McLean, G. N. (Eds.). (1995). Practicing organization development: A guide for consultants. San Francisco, CA: Jossey-Bass/Pfeiffer.
Rotolo, C. T., & Church, A. H. (2015). Big data recommendations for industrial-organizational psychology: Are we in Whoville? Industrial and Organizational Psychology: Perspectives on Science and Practice, 8(4), 515–520.
Savitz, A. W., & Weber, K. (2013). Talent, transformation, and the triple bottom line: How companies can leverage human resources to achieve sustainable growth. San Francisco, CA: Jossey-Bass.
Shull, A. C., Church, A. H., & Burke, W. W. (2014). Something old, something new: Research findings on the practice and values of OD. OD Practitioner, 46(4), 23–30.
Svyantek, D. J., O'Connell, M. S., & Baumgardner, T. S. (1992). Applications of Bayesian methods to OD evaluation and decision making. Human Relations, 45(6), 621–636.
Terpstra, D. E. (1981). Relationship between methodological rigor and reported outcomes in organization development evaluation research. Journal of Applied Psychology, 66(5), 541–543.
Waclawski, J., & Church, A. H. (2002). Introduction and overview of organization development as a data-driven approach for organizational change. In J. Waclawski & A. H. Church (Eds.), Organization development: A data-driven approach to organizational change (pp. 3–26). San Francisco, CA: Jossey-Bass.
Woodman, R. W., & Wayne, S. J. (1985). An investigation of positive findings bias in evaluation of organization development interventions. Academy of Management Journal, 28(4), 889–913.

Copyright © 2017 by the Organization Development Network, Inc. All rights reserved.
A New e-Book Resource for
Practitioners
ORGANIZ
ization Development in ATION

ce DEVELOP

Editors William J. Rothwell, Jacqueline MENT ​IN


M. Stavros, Roland L. Sullivan, and
John Vogelsang PRACTIC

Available from theEOrganization Development


Network ​OD Network
Organization Development in Practice b ​ rings together experienced OD professionals who share their
methods for developing more effective and resilient organizations, enabling organizational and social
change, and being responsive to continuous change.

Some of the chapters


include:
Transition ​Mark Monchek, Lynnea
The Ebb and Flow of OD Methods ​Billie T. Brinkerhoff, ​and ​Michael Pergola ​explore how
Alban ​and ​Barbara Benedict Bunker ​describe the o foster ​resiliency​, the ability to respond
first and second wave of OD methods and their effectively to change or challenges. They
perspective on what is happening in the 21st century. examine the inherent potential of resilient
When OD methods first emerged in the 1960s, they organiza- tions to reinvent themselves by
were considered innovative and exciting. OD understanding their social networks, using
practitioners have shifted their methods with time and design thinking, and utilizing
adapted to current situations. However, Alban and the fundamentals of action research in a process
Bunker question which of the current methods are called the Culture of Opportunity that leverages
new and which are just a repackaging of already the talent, relationships, knowledge, capital, and
existing practices. As the pace of change has communications that are largely fragmented and
accelerated, they also wonder whether the turbulent disconnected in most organizations. They outline
external environment has driven many to think they the process of instilling a Culture of Opportunity
need new methods when what they may need is within three distinct organizations that hit crisis
more creative adaptation of existing methods. points in response to changing environments and
How the Mind-Brain Revolution Supports difficult circumstances.
the Evolution of OD Practice ​Teri Eagan, Julie At the Crossroads of Organization
Chesley, ​and ​Suzanne Lahl ​believe that the early Development and Knowledge
promise of OD was inspired by a desire to influence
Management ​Denise Easton ​describes what
human systems towards greater levels of justice,
emerges at the intersection of OD and Enterprise
participation, and excellence. They propose that a
Knowledge Management, where a collaborative
critical and integrative neurobiological perspective
partnership accelerates the understanding,
holds the potential to advance OD in two ways: what
development, and transformation of dynamic,
we do—the nature and quality of our ability to
techno-centric systems of knowledge, information,
assess and intervene in service of more effective
learning, and networks found in 21st century
organizations and a better world; and who we
organizations. When OD is part of developing
are—our competencies, resilience, and agility as
knowledge management processes, systems, and
practitioners.
structures the organization not only survives but
Culture of Opportunity: Building Resilient thrives.
Organizations in a Time of Great
companies. In addition, he offers five suggestions
Accelerating Change: New Ways of organizations can implement, drawing on several
Thinking about Engaging the Whole System examples from corporations such as Zappos,
Paul D. Tolchinsky ​offers new ways of developing, FedEx, HCL Technologies, and companies
nurturing, and leveraging intrapreneurialship in developing internal Kick Starters and crowd
organizations. Most organizations underutilize the sourcing platforms.
capabilities and the entrepreneurial spirit of WILLIAM J. ROTHWELL,
JACQUELINE M. STAVROS, ROLAND
L. SULLIVAN, & JOHN VOGELSANG

employees. Tolchinsky describes how to unleash Editors

the entrepreneurial energy that exists in most

Guidelines for Authors


Journal of the Organization Development Network ​

Journal Information
The OD Practitioner (ODP) is published by the Organization Development Network. The purpose of the ODP is to foster critical reflection on OD theory and practice and to share applied research, innovative approaches, evidence-based practices, and new developments in the OD field. We welcome articles by authors who are OD practitioners, clients of OD processes, Human Resource staff who have partnered with OD practitioners or are practicing OD, and academics who teach OD theory and practice. As part of our commitment to ensure all OD Network programs and activities expand the culture of inclusion, we encourage submissions from authors who represent diversity of race, gender, sexual orientation, religious/spiritual practice, economic class, education, nationality, experience, opinion, and viewpoint.

The Review Process
The ODP is a peer reviewed journal. Authors can choose between two review processes and should notify the Editor which they prefer when they submit an article:

Process 1 (open peer review): Submit articles with a cover page with the article's title, all authors' identifying and contact information, and a 50–80 word biography for each of the authors; also include any acknowledgements. Two members of the ODP Review Board will review the article. They will recommend accepting the article for publication, pursuing publication after suggested changes, or rejecting the article. If they decide the article is publishable with changes, one of the two Review Board members will work with the primary author to discuss the changes. Once the author has made the changes to the satisfaction of the two Review Board members, the Editor will work with the author to prepare the article for publication.

Process 2 (blind peer review): Designed to meet the requirements of academic institutions. Submit articles with a cover page with the article's title, all authors' identifying and contact information, and brief biographies of the authors; also include any acknowledgements. Provide an abbreviated title running head for the article. Do not include any identifying information other than on the title page. Two members of the review board will independently receive the article without the author's information and without knowing the identity of the other reviewer. Each reviewer will recommend accepting the article for publication, rejecting the article with explanation, or sending the article back to the author for revision and resubmittal. Recommendations for revision and resubmittal will include detailed feedback on what is required to make the article publishable. Each review board member will send their recommendation to the ODP Editor. If the Editor asks the author to revise and resubmit, the Editor will send the article to both reviewers after the author has made the suggested changes. The two members of the Review Board will work with the author on any further changes, then send it to the ODP Editor for preparation for publication. The ODP Editor makes the final decision about which articles will be published.

Criteria for Accepting an Article

Content
» Bridges academic rigor and relevance to practice
» Is accessible to practitioners
» Presents applied research, innovative practice, or new developments in the OD field
» Includes cases, illustrations, and practical applications
» References sources for ideas, theories, and practices
» Reflects OD values: respect and inclusion, collaboration, authenticity, self-awareness, and empowerment

Stylistic
» Clearly states the purpose and content of the article
» Presents ideas logically and with clear transitions
» Includes section headings to help guide the reader
» Is gender-inclusive
» Avoids jargon and overly formal expressions
» Avoids self-promotion
If the article is accepted for publication, the author will receive a PDF proof of the article for final approval before publication. At this stage the author may make only minor changes to the text. After publication, the Editor will send the author a PDF of the article and of the complete issue of ODP in which the article appears.

Graphics
Graphics that enhance an article are welcomed. The ODP may resize graphics … The graphics should be in a format that allows editing. We will match the ODP's … column, half-page … If authors have questions about graphics, please contact the Editor.

The ODP publishes original articles, not articles that have appeared in other publications or … Authors may publish articles first published in the ODP in another publication as long as the publication gives credit to the ODP as the original place of publication.
Policy on Self-Promotion
Although publication in the ODP is a way of letting the OD community know about an author's work, and is therefore good publicity, the purpose of the ODP is to exchange ideas and information. Consequently, it is the policy of the OD Network to not accept articles that are primarily for the purpose of marketing or advertising an author's practice.

Preparing the Article for Submission

Article Length
Articles are usually 4,000–5,000 words.

Citations and References
The ODP follows the guidelines of the American Psychological Association Publication Manual (6th edition). This style uses parenthetical reference citations within the text and full references at the end of the article. Please include the DOI (digital object identifier; http://www.apastyle.org/learn/faqs/what-is-doi.aspx), if available, with references for articles in a periodical.

Submission Deadlines
Authors should email articles to the editor, John Vogelsang, at [email protected]. The deadlines for submitting articles are as follows: October 1 for the winter issue; January 1 for the spring issue; April 1 for the summer issue; and July 1 for the fall issue.
Copyright © 2017 by the Organization Development Network, Inc. All rights reserved.
Products and Services

Publications
» OD Practitioner, the flagship publication of the OD Network, is a peer-reviewed quarterly journal.
» Practicing OD provides practice-related concepts, processes, and tools in short articles by and for busy practitioners.
Both publications and their submission guidelines are available online at www.odnetwork.org.

Member Benefits
Low annual dues provide members with a host of benefits:
» Free subscriptions to our publications.
» Free access to online job ads in the OD Network Job Exchange.
» Discounts on conference registration, OD Network products (including back issues of this journal), Job Exchange postings, professional liability insurance, books from John Wiley & Sons, and more.
» OD Network Member Roster, an essential networking tool, in print and in a searchable online database.
» Online Toolkits on action research, consulting skills, and HR for OD—foundational theory and useful tools to enhance your practice.

Online Resources
In addition to the online resources for members only, the OD Network website offers valuable tools that are open to the public:
» Education directory of OD-related degree and certificate programs.
» Catalog of OD professional development and networking events.
» Bookstore of titles recommended by OD Network members.
» Links to some of the best OD resources available.
» E-mail discussion lists that allow OD practitioners worldwide to share ideas.
» Lists, with contact information, of regional and international OD networks.

Professional Development
OD Network professional development events offer cutting-edge theory and practice. Learn more at http://www.odnetwork.org.
» OD Network Conferences, held annually, provide unsurpassed professional development and networking opportunities.
» Regular webinars include events in the Theory and Practice Series, Conference Series, and OD Network Live Briefs.
Mark your calendar!

2017 OD Network Annual Conference
THE CALL OF OUR TIME: INNOVATING for IMPACT
October 14–17, 2017, Loews Chicago O'Hare
Learn more: www.odnetwork.org/2017Conference
Sponsor & Exhibitor Opportunities are Open! Visit www.odnetwork.org for details.