Indicators for Monitoring Undergraduate STEM Education

DETAILS
244 pages | 6 x 9 | PAPERBACK
ISBN 978-0-309-46788-9 | DOI 10.17226/24943

CONTRIBUTORS
Mark B. Rosenberg, Margaret L. Hilton, and Kenne A. Dibner, Editors; Committee on Developing Indicators for Undergraduate STEM Education; Board on Science Education; Division of Behavioral and Social Sciences and Education; National Academies of Sciences, Engineering, and Medicine
Committee on Developing Indicators for Undergraduate STEM Education
THE NATIONAL ACADEMIES PRESS 500 Fifth Street, NW Washington, DC 20001
This study was supported by Contract/Grant No. 1533989 between the National
Academy of Sciences and the National Science Foundation. Any opinions, findings,
conclusions, or recommendations expressed in this publication do not necessarily
reflect the views of any organization or agency that provided support for the project.
Additional copies of this publication are available for sale from the National Acad-
emies Press, 500 Fifth Street, NW, Keck 360, Washington, DC 20001; (800) 624-
6242 or (202) 334-3313; http://www.nap.edu.
The National Academy of Engineering was established in 1964 under the char-
ter of the National Academy of Sciences to bring the practices of engineering
to advising the nation. Members are elected by their peers for extraordinary
contributions to engineering. Dr. C. D. Mote, Jr., is president.
Learn more about the National Academies of Sciences, Engineering, and Medicine
at www.nationalacademies.org.
For information about other products and activities of the National Academies,
please visit www.nationalacademies.org/about/whatwedo.
Acknowledgments
This Consensus Study Report represents the work of many individuals,
especially those who served on the committee and participated in
the committee’s open sessions. Our first thanks go to the committee
members for their deep knowledge and contributions to the study.
This report was made possible by the important contributions of the
National Science Foundation (NSF). We particularly thank Susan Singer,
the former director of NSF’s Division of Undergraduate Education, who
requested the study.
The committee benefited from presentations by, and discussions with,
the many individuals who participated in our three fact-finding meetings,
in January, February, and April 2016. We thank Alicia Dowd, Pennsylvania
State University; Jeff Gold, California State University Office of the
Chancellor; Beethika Khan, National Center for Science and Engineering
Statistics; Shirley Malcom, American Association for the Advancement of
Science; Jordan Matsudaira, Cornell University; Alexei Matveev, South-
ern Association of Colleges and Schools; Emily Miller and Josh Trapani,
Association of American Universities; Chris Rasmussen, San Diego State
University; and Matthew Wilson, National Science Board.
The committee also thanks the experts who discussed the public com-
ment draft during the committee’s October 2016 public meeting: Susan
Ambrose, Northeastern University, Boston, Massachusetts; Mica Estrada,
University of California, San Francisco; Adam Gamoran, William T. Grant
Foundation; Jillian Kinzie, Indiana University; Annette Parker, South Central
College, Minnesota; Kacy Redd, Association of Public and Land-Grant Universities.
Contents
SUMMARY 1
1 INTRODUCTION 13
Interpreting the Study Charge, 15
Vision, 15
A Focus on the National Level, 18
Equity, Diversity, and Inclusion, 19
Goals and Objectives, 20
Measures and Indicators, 20
Undergraduate STEM Education, 22
Evidence-Based STEM Educational Practices and Programs, 22
Measuring College Quality in an Era of Accountability, 22
Employment Outcomes, 23
The STEM Workforce, 25
Learning Outcomes, 27
Goals of the Indicator System, 30
Study Approach and Organization of the Report, 31
References, 32
APPENDIXES
A Public Comments on Draft Report and Committee Response 201
B Possible Formulas for Calculating Selected Indicators 209
C Agendas: Workshop and Public Comment Meeting 215
D Biographical Sketches of Committee Members and Staff 221
Summary
Science, technology, engineering, and mathematics (STEM) profes-
sionals generate a stream of scientific discoveries and technological
innovations that fuel job creation and national economic growth.
Undergraduate STEM education prepares graduates for today’s STEM
professions and those of tomorrow, while also helping all students develop
knowledge and skills they can draw on in a variety of occupations and
as citizens. However, many capable students intending to major in STEM
switch to another field or drop out of higher education altogether, partly
because of documented weaknesses in STEM teaching, learning, and stu-
dent supports. More than 5 years ago, the President’s Council of Advisors
on Science and Technology (PCAST) wrote that improving undergraduate
STEM education to address these weaknesses is a national imperative.
Many initiatives are now under way to improve the quality of under-
graduate STEM teaching and learning. Some focus on the national level,
others involve multi-institution collaborations, and others take place on
individual campuses. At present, however, policy makers and the public do
not know whether these various initiatives are accomplishing their goals
and leading to nationwide improvement in undergraduate STEM educa-
tion. Recognizing this challenge, PCAST recommended that the National
Academies of Sciences, Engineering, and Medicine develop metrics to evalu-
ate undergraduate STEM education. In response, the National Science
Foundation charged the National Academies to conduct a study of indica-
tors that could be used to monitor the status and quality of undergraduate
STEM education over time.
[Summary figure: the committee’s conceptual framework for the indicator system. Inputs: 2.1 Equity of access to high-quality undergraduate STEM education. Educational processes: 1.1 Use of evidence-based practices; 1.4 Continuous improvement; 3.1 Foundational preparation; 3.2 Successful navigation. Educational environment: 1.2 Supports for evidence-based practices; 1.3 Institutional culture that values undergraduate STEM; 2.3 Representational diversity among STEM instructors; 2.4 Inclusive institutions and departments. Outcomes (graduates with STEM knowledge and skills): 2.2 Representational equity in STEM credential attainment; 3.3 STEM credential attainment.]
numbers of STEM professionals and prepare all graduates with core STEM
knowledge and skills (Goal 3).
The study was conducted in the context of calls for greater account-
ability in higher education and ongoing efforts to define and measure higher
education quality, both generally and at individual colleges and universi-
ties. For example, student learning is a desired educational outcome. This
outcome is reflected in the committee’s goal of increasing all students’
mastery of STEM concepts and skills, whether they are taking general edu-
cation courses or pursuing a STEM credential. Some higher education and
professional associations have developed common disciplinary (or general)
learning goals, along with assessments of students’ progress toward these
goals, to measure quality. However, establishing common learning goals is
challenging because STEM is characterized by rapid discoveries, ongoing
development of new knowledge and skills, and continual emergence of new
subdisciplines and interdisciplinary fields, which result in ongoing changes
to learning goals. In engineering, for example, most graduating students
take the Fundamentals of Engineering Exam, which is offered in seven
different subdisciplines. A common national assessment of core STEM
concepts and skills does not exist. Therefore, the committee proposes to
monitor progress in student learning through objectives and indicators of
the adoption of teaching practices that have been shown by research to
enhance student learning.
1 The committee uses the term “instructors” to refer to all individuals who instruct under-
graduates, including tenured and tenure-track faculty, adjunct and part-time instructors, and
graduate student instructors.
Goal 2: Strive for Equity, Diversity, and Inclusion of STEM Students and Instructors by Providing Equitable Opportunities for Access and Success
Input. Objective 2.1: Equity of access to high-quality undergraduate STEM educational programs and experiences.
Indicator 2.1.1: Institutional structures, policies, and practices that strengthen STEM readiness for entering and enrolled college students.
Although they are conducted less frequently than IPEDS, federal longi-
tudinal surveys of student cohorts, such as the 2004/09 Beginning Postsec-
ondary Students Longitudinal Study conducted by the National Center for
Education Statistics (NCES), provide useful data related to the committee’s
objectives and indicators. The survey samples are carefully designed to be
nationally representative, and multiple methods are used to obtain strong
response rates. The resulting data can be used to track students’ trajectories
across institutions and fields of study, including STEM fields. Another for-
merly useful source was NCES’s National Study of Postsecondary Faculty,
which provided data on instructors’ disciplinary backgrounds, responsibili-
ties, and attitudes, but it was discontinued in 2004.
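To make concrete how cohort data of this kind can feed an indicator, the sketch below computes one simple trajectory measure, the share of STEM entrants who complete a STEM credential, disaggregated by group. It is a minimal illustration in Python (pandas) under stated assumptions: the column names (entry_field, credential_field, group) are hypothetical rather than actual BPS variable names, and a production analysis would apply the survey’s sampling weights.

    import pandas as pd

    # Hypothetical cohort extract: one row per first-time student, with the
    # field of study at entry, the field of any credential earned within six
    # years (None if no credential), and a demographic group label.
    cohort = pd.DataFrame({
        "entry_field":      ["STEM", "STEM", "non-STEM", "STEM", "non-STEM"],
        "credential_field": ["STEM", None, "STEM", "non-STEM", None],
        "group":            ["A", "B", "A", "B", "A"],
    })

    # Share of STEM entrants who completed a STEM credential, by group.
    stem_entrants = cohort[cohort["entry_field"] == "STEM"]
    persistence = (
        stem_entrants
        .assign(completed=stem_entrants["credential_field"] == "STEM")
        .groupby("group")["completed"]
        .mean()
    )
    print(persistence)

The same pattern extends to other trajectories the longitudinal surveys support, such as transfer between institutions or switching into STEM from another field of study.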
The committee found that IPEDS and other federal data sources gen-
erally allow data to be disaggregated by students’ race and ethnicity and
gender. However, conceptions of diversity have broadened to include ad-
ditional characteristics of students that may provide unique strengths in
Introduction
Science and technology are engines of U.S. economic growth and inter-
national competitiveness in the global 21st century economy. Leading
economists (e.g., Solow, 1957; Mankiw, 2003; Romer, 1990), policy
makers, and the public all agree that technological innovation fueled by
scientific research is the primary mechanism for sustained economic growth
(Xie and Killewald, 2012). As the nation continues to recover from the
2008 economic recession, the science, technology, engineering, and math-
ematics (STEM) fields are critical drivers for the health of the economy.
Hence, a robust, skilled STEM workforce is important for the nation
(National Academy of Sciences, National Academy of Engineering, and In-
stitute of Medicine, 2007, 2010). Because undergraduate STEM education
plays a central role in developing the STEM workforce, and also contrib-
utes to a strong general education for all students, improving the quality of
undergraduate STEM education is a national imperative.
Some recent trends raise concerns about the health of the nation’s
STEM workforce (see Xie and Killewald, 2012). First, scientists’ earnings
(adjusted for inflation) have stagnated since the 1960s and have declined
relative to those of other high-status, high-education professions, such as
law and business, which could discourage individuals from entering or stay-
ing in science careers. Second, it has become more difficult for recent science
doctorates to obtain any academic position, and the available academic
positions are weighted toward more postdoctoral appointments and fewer
faculty positions, which could discourage young people from pursuing
academic research. Third, U.S. science faces increasing foreign competition
as the share of global research conducted in other countries is increasing.
These trends could lead to gradual erosion of U.S. dominance in science and
a slowdown in the economic growth fueled by technological innovation.
To strengthen the nation’s research and technology enterprise in the
face of these trends, the President’s Council of Advisors on Science and
Technology (PCAST) (2012) recommended producing 1 million additional
college graduates with degrees in STEM over the following decade. Recog-
nizing that many students with an interest in and aptitude for STEM, espe-
cially females and underrepresented minorities, are not completing degrees
in these fields (see National Research Council, 2011; National Academies
of Sciences, Engineering, and Medicine, 2016a), the PCAST report called
for widespread implementation of strategies to engage, motivate, and retain
diverse students in STEM. Such strategies are beginning to emerge from
a growing body of relevant research, but they have not yet been widely
implemented (see National Research Council, 2012; National Academies
of Sciences, Engineering, and Medicine, 2016a).
Many initiatives to improve the quality of undergraduate STEM educa-
tion are now under way. Some focus on the national level, others involve
multi-institution collaborations, and others take place on individual cam-
puses. For example, the interagency Committee on STEM Education of
the National Science and Technology Council (2013) developed a STEM
education 5-year strategic plan that identified improving the experience of
undergraduate students as a priority goal for federal investment. Within this
broad goal, the strategic plan identified four priority areas: (1) promoting
evidence-based instructional practices; (2) improving STEM experiences in
community colleges; (3) expanding undergraduate research experiences;
and (4) advancing success in the key gateway of introductory mathematics.
Other initiatives include the undergraduate STEM initiative of the Asso-
ciation of American Universities;1 a workshop and sourcebook on under
graduate STEM reform of the Coalition for Reform in Undergraduate
STEM Education;2 and the Partnership for Undergraduate Life Sciences
Education, or PULSE.3
At present, policy makers and the public do not know whether these
various federal, state, and local initiatives are accomplishing their stated
goals and achieving nationwide improvement in undergraduate STEM
education. This is partly due to a lack of high-quality national data on
undergraduate STEM teaching and learning. A recent study of barriers and
opportunities for 2-year and 4-year STEM degrees (National Academies
of Sciences, Engineering, and Medicine, 2016a) highlighted the mismatch
between currently available datasets and the realities of student trajectories.
Vision
In developing a conceptual framework and indicators to monitor im-
provement in undergraduate STEM education, the committee envisioned
what such improvement would look like. In this vision, students—from all
walks of life and with all types of experiences and backgrounds—would be
well prepared to help address global, societal, economic, and technological
challenges. Students would have the STEM background to become success-
ful in the careers of today as well as those of tomorrow as U.S. society con-
tinues to become increasingly diverse, global, and interconnected. Among
these well-prepared graduates, some would become professional scientists
and engineers, conducting research and developing new technologies to
support sustained economic growth.
BOX 1-1
Study Charge
At the same time, the committee envisions that all students, not merely
those who pursue STEM degrees and careers, would have both access and
exposure to high-quality STEM education to support the development
of STEM literacy, a relatively new concept. The committee’s adoption
of this concept was informed by a recent report (National Academies of Sciences, Engineering, and Medicine, 2016b).
4 The authors invited all members of the PULSE community (which includes 2-year and 4-year
colleges, regional comprehensive universities, and research universities) to submit their rubric
data. The respondents provided varying amounts of data: 26 institutions provided rubric data
across all five areas, 57 provided data on curriculum alignment, 35 on assessment, 49 on faculty
practice/faculty support, 28 on infrastructure, and 32 on climate for change.
Employment Outcomes
In response to growing calls for accountability, one of the most widely
used methods for measuring the quality or “value” of a college or university
is to assemble and analyze data on graduates’ earnings. However, research
has demonstrated that both graduation rates and postgraduation earnings
vary widely, depending on the type and selectivity of the institution and the
characteristics of incoming students (National Academies of Sciences, Engi-
neering, and Medicine, 2016a; Matsudaira, 2015). Although economists
are beginning to develop methods to adjust graduates’ earnings to account
for the characteristics of incoming students, these methods are not yet fully
developed, and further research would be needed to develop uniform qual-
ity measures (Matsudaira, 2015).
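To illustrate the general idea of such an adjustment, the sketch below regresses graduates’ log earnings on institution indicators together with incoming-student characteristics, so that the institution coefficients describe earnings differences net of those measured characteristics. This is a minimal illustration of regression adjustment under simplifying assumptions, not a method proposed in this report; the variable names and simulated data are hypothetical.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Simulated graduate records: institution attended and two
    # incoming-student characteristics (all values are illustrative).
    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({
        "institution": rng.choice(["U1", "U2", "U3"], size=n),
        "test_score": rng.normal(1000, 100, size=n),
        "family_income": rng.normal(60_000, 15_000, size=n),
    })
    df["log_earnings"] = (
        10.0
        + 0.001 * df["test_score"]
        + 0.000005 * df["family_income"]
        + 0.05 * (df["institution"] == "U2")
        + rng.normal(0, 0.2, size=n)
    )

    # Institution coefficients, net of the measured student characteristics.
    model = smf.ols(
        "log_earnings ~ C(institution) + test_score + family_income", data=df
    ).fit()
    print(model.params.filter(like="institution"))

The open methodological questions noted above amount to asking which incoming-student covariates belong in such a model and how to handle selection that the measured covariates do not capture.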
In addition, graduates’ earnings are influenced by labor market demand,
and the wage premium for STEM graduates varies by time, place, and field
in ways that are characteristic of a market economy. Furthermore, many
STEM majors enter occupations that are not traditionally considered part
of the STEM workforce (National Science Foundation, 2016), but their
STEM knowledge may indeed contribute to their earnings (Carnevale,
Smith, and Melton, 2011); however, it is not practical to precisely identify these
workers and measure this contribution. Moreover, individuals who have
BOX 1-2
Debates about Supply and Demand for Science, Technology,
Engineering, and Mathematics (STEM) Professionals
Since the 1950s, scientists, engineers, employers, and policy makers have
periodically raised alarms about the possibility of impending shortfalls in the sup-
ply of STEM professionals. One of the most visible reports, Rising Above the
Gathering Storm (National Academy of Sciences, National Academy of Engineer-
ing, and Institute of Medicine, 2007), conveyed deep concern about the future
supply of U.S. scientists and engineers at a time when other nations were rapidly
advancing in science and innovation. Lowell and Salzman (2007) disputed these
arguments, finding that the U.S. supply of graduates in science and engineering
was not only large, but also considered among the best in the world. Based on
their analysis of available data, they also argued that the U.S. education system
was producing far more science and engineering graduates than needed for the
available job openings. Studies by economists have also found little evidence of
a market shortage of scientists (see, e.g., Butz et al., 2003).
More recently, Xie and Killewald (2012) conducted a detailed analysis of
multiple datasets, finding little evidence of either an oversupply or a shortage of
U.S. scientists and engineers. They found that over the past four decades the
number of graduates in the biological and physical sciences, mathematics, and
engineering had grown, although at a slow rate. Contrary to the claims of an
oversupply, most of these graduates, especially bachelor’s and master’s degree
recipients, found jobs related to their training in these fields.1 However, the real
earnings of basic scientists generally declined over the same period, challeng-
ing the claims of a market shortage of basic scientists (a shortage should have
resulted in rising wages).
For more than two decades, the unemployment rate in STEM occupa-
tions has been considerably lower than the general unemployment rate (Na-
tional Science Foundation, 2016), a reflection of sustained demand for STEM
professionals. In 2013, a survey of STEM professionals with at least a 4-year de-
gree in a STEM field found that most (96.2%) were currently employed (National
Science Foundation, 2016). Furthermore, studies have found that individuals who
earned long-term certificates2 in STEM-intensive fields, such as health, nursing,
and transportation, gained positive economic returns on their educational invest-
ments in the form of higher wages and lower unemployment (Dadgar and Weiss,
2012; Stevens, Kurlaender, and Grosz, 2015). These recent data support Xie and
Killewald’s (2012) view that the supply of and demand for STEM professionals
overall is roughly in balance, with neither an oversupply nor a shortage.
1 In their analysis, Xie and Killewald (2012) did not include graduates in the social sciences.
They note that social science graduates were less likely to be employed in jobs related to
their training than were their peers in the biological and physical sciences, mathematics, and
engineering.
2 Long-term certificates are generally defined as those earned in educational programs
Learning Outcomes
In Chapter 2, the committee identifies increasing students’ mastery of
STEM concepts and skills as one of three overarching goals for improving
the quality of undergraduate STEM education. However, the committee
does not propose any indicators that would directly measure student learn-
ing because of the complexities discussed here.
There is no simple way to address questions about whether students
are acquiring the STEM concepts, skills, and abilities that will serve them
for their lives after college, for several reasons. First, expectations for the
holder of an associate’s degree or certificate are different than those for
the holder of a baccalaureate degree. Second, faculty, employers, profes-
sional societies, accreditation agencies, testing companies, and curriculum
committees all have different answers about the ideal and acceptable levels
of proficiency, and, more fundamentally, about what concepts and skills
should be measured for proficiency. These groups have launched a variety
of efforts to define proficiency, some of which focus on core knowledge
and skills for all 2-year and 4-year graduates, across all fields of study (e.g.,
Association of American Colleges & Universities, 2007; Lumina Founda-
tion, 2015), while others focus on specific disciplines (e.g., Arum, Roksa,
and Cook, 2016). Leaders in life sciences education, for example, have iden-
tified core concepts, competencies, and disciplinary practices for “biological
literacy” in undergraduate biology (Brewer and Smith, 2011).
Third, the STEM disciplines are characterized by rapid discoveries and
the ongoing development of new knowledge and skills. Within and across
these disciplines, new subdisciplines and interdisciplinary fields are continu-
ally being created, bringing differing views about the core knowledge and
skills that define successful learning. Fourth, in U.S. higher education, there
have never been national tests, graduation standards, or uniform STEM
curricula. These would be incompatible with the tradition of state and
system-level autonomy in public higher education and with the diversity
of public, private nonprofit, and private for-profit institutions that provide
undergraduate STEM education. And fifth, some learning outcomes are
ways of seeing problems, analyzing them, and solving them, with appropri-
ate tools and with collaborations among diverse groups (see, e.g., Associa-
tion of American Colleges & Universities, 2007). These outcomes are not
knowledge about specific content areas, nor can they be easily translated
to a national-level measure.
Because of these complications and practical difficulties, the committee
does not propose any indicators that would directly measure student learn-
ing. However, the committee does target increased acquisition of STEM
concepts and skills as an overarching goal for improving undergraduate
STEM education (see Chapters 2 and 3).
In the future, with the growth of online instruction and assessment,
more detailed, automated proficiency exams and fine-grained records of
accomplishment may be available: see Box 1-3. At that time, it will be im-
portant to revisit the conceptual framework and indicators proposed in this
BOX 1-3
The Potential of Online Education
The idea has been growing that universities will change dramatically, and perhaps
largely fade away, under the spread of online education increasingly enabled by
improvements in broadband Internet access and new mobile devices. Recent years
have also seen advances in the science of learning that are enabling society and
researchers to look at new education approaches. The accumulating evidence chal-
lenges the model that has long dominated higher education: the sage on the stage;
that is, the lecture.
To date, however, there is little evidence that purely online education is as effec-
tive for supporting student learning of STEM concepts and processes as face-to-
face education.
Distance education or distance learning is defined as the education of stu-
dents independent of physical presence in a traditional classroom or campus set-
ting (Maeroff, 2003). Over the past two decades, distance education (i.e., online
courses and online degree programs) has become an increasingly prominent part
of the undergraduate education landscape (Radford, 2011; Ginder and Stearns,
2014). This growth has been significantly influenced not only by the exponential
rise in the development and use of digital communication tools, but also by four
more specific factors: (1) meeting students’ demands for flexibility, (2) widening
access for disadvantaged students, (3) increasing course availability, and
(4) increasing student enrollment (Parsad and Lewis, 2008). Another attractive aspect
of distance education is students’ ability to take courses across state lines without
paying out-of-state tuition, which is generally higher than in-state tuition: in 2012,
41 percent of undergraduates and 55 percent of graduate students participating in
distance education were enrolled in institutions outside of their state of residence
(Ginder and Stearns, 2014).
Undergraduate enrollment in distance education courses varies by field
of study. In 2008, students studying computer science made up the highest
share (27%) of all participants in distance education classes (Radford, 2011).
By contrast, engineering students (16%) and natural science, mathematics, and
agriculture students combined (14%) constituted the smallest shares among all
participants (Radford, 2011). The most recent data from 2012 showed that tradi-
tional classroom instruction was dominant for all STEM disciplines except com-
puter science (Snyder, de Brey, and Dillow, 2016). Thus, although the frequency
of technology-based course instruction is becoming comparable to face-to-face
instruction in some undergraduate STEM fields, the extent to which it will be ad-
opted across all disciplines is unknown.
report. The committee envisions that its recommended indicator system will
undergo continuous improvement and updating (see Chapter 7).
mastery of STEM concepts and skills; (2) strive for equity, diversity, and
inclusion; and (3) ensure adequate numbers of STEM professionals. These
three goals are discussed in greater detail in Chapter 2. In Phase II, the com-
mittee also reviewed additional literature as it deliberated on the proposed
indicators and developed conclusions and recommendations for research
and data collection to develop the indicator system. Throughout the study
process, committee members drafted sections of text, which were shared,
reviewed, edited, and revised across members of the entire committee.
The report is organized around the major tasks outlined in the com-
mittee’s charge. Chapter 2 presents the conceptual framework for the indi-
cator system; Chapters 3, 4, and 5 discuss the committee’s three goals for
improvement in undergraduate STEM, along with objectives and indicators
to measure progress toward those goals. Chapter 6 reviews existing moni-
toring systems and data sources related to undergraduate STEM educa-
tion, and Chapter 7 discusses alternative approaches to implementing the
indicator system.
REFERENCES
American Association for the Advancement of Science. (2011). Vision and Change in Un-
dergraduate Biology Education: A Call to Action. Washington, DC: Author. Available:
http://visionandchange.org/finalreport [July 2017].
Arum, R., Roksa, J., and Cook, A. (2016). Improving Quality in American Higher Education:
Learning Outcomes and Assessments for the 21st Century. Hoboken, NJ: John Wiley
& Sons.
Association of American Colleges & Universities. (2007). College Learning for the New
Global Century. Washington, DC: Author. Available: https://www.aacu.org/sites/default/
files/files/LEAP/GlobalCentury_final.pdf [July 2017].
Biel, R., and Brame, C. (2016). Traditional versus online biology courses: Connecting course
design and student learning in an online setting. Journal of Microbiology & Biology
Education, 17, 417–422.
Bonvillian, W.B., and Singer, S.R. (2013). The online challenge to higher education. Issues in
Science and Technology, 29(4). Available: http://issues.org/29-4/the-online-challenge-to-
higher-education [October 2017].
Bowen, W.G., Chingos, M.M., Lack, K.A. and Nygren, T.I. (2012). Interactive Learning
Online at Public Universities: Evidence from Randomized Trials. Available: http://www.
sr.ithaka.org/wp-content/uploads/2015/08/sr-ithaka-interactive-learning-online-at-public-
universities.pdf [October 2017].
Brancaccio-Taras, L., Pape-Lindstrom, P., Peteroy-Kelly, M., Aguirre, K., Awong-Taylor, J.,
Balser, R., Cahill, M.J., Frey, R.G., Jack, R., Kelrick, M., Marley, K., Miller, K.G.,
Osgood, M., Romano, S., Uzman, J.A., and Zhao, J. (2016). The PULSE vision and
change rubrics, version 1.0: A valid and equitable tool to measure transformation of life
sciences departments at all institution types. CBE-Life Sciences Education, 15(4), art. 60.
Available: http://www.lifescied.org/content/15/4/ar60.full [March 2017].
Brewer, C.A., and Smith, D. (2011). Vision and Change in Undergraduate Biology Education:
A Call to Action. Final Report of a National Conference organized by the American As-
sociation for the Advancement of Science. Washington, DC: American Association for
the Advancement of Science. Available: http://visionandchange.org/files/2011/03/Revised-
Vision-and-Change-Final-Report.pdf [May 2017].
Brizius, J.A., and Campbell, M.D. (1991). Getting Results: A Guide for Government Account-
ability. Washington, DC: Council of Governors’ Policy Advisors.
Butz, W.P., Bloom, G.A., Gross, M.E., Kelly, T.K., Kofner, A., and Rippen, H.E. (2003). Is
There a Shortage of Scientists and Engineers? How Would We Know? RAND Science
and Technology Issue Paper. Available: https://www.rand.org/content/dam/rand/pubs/
issue_papers/2005/IP241.pdf [August 2017].
Carnevale, A.P., Smith, N., and Melton, M. (2011). STEM: Science, Technology, Engineer-
ing, and Mathematics. Washington, DC: Georgetown University Center on Education
and the Workforce. Available: https://cew.georgetown.edu/wp-content/uploads/2014/11/
stem-complete.pdf [July 2017].
Dadgar, M., and Weiss, M.J. (2012). Labor Market Returns to Sub-Baccalaureate Credentials:
How Much Does a Community College Degree or Certificate Pay? CCRC Working
Paper No. 45. New York: Columbia University, Teachers College, Community College
Research Center.
Ginder, S., and Stearns, C. (2014). Web Tables: Enrollment in Distance Education Courses by
State: Fall 2012. NCES 2014-023. Washington, DC: U.S. Department of Education, Na-
tional Center for Education Statistics. Available: https://nces.ed.gov/pubsearch/pubsinfo.
asp?pubid=2014023 [August 2017].
Hill, H., and Grossman, P. (2013). Learning from teacher evaluation: Challenges and oppor-
tunities. Harvard Educational Review, 82(1), 123–141.
Ho, A.D., Reich, J., Nesterko, S., Seaton, D.T., Mullaney, T., Waldo, J., and Chuang, I. (2014).
HarvardX and MITx: The First Year of Open Online Courses. HarvardX and MITx
Working Paper No. 1. Available: https://harvardx.harvard.edu/multiple-course-report
[August 2017].
Holzer, H.J., and Lerman, R.I. (2007). America’s Forgotten Middle Skill Jobs: Education and
Training Requirements in the Next Decade and Beyond. Available: http://www.urban.
org/sites/default/files/publication/31566/411633-America-s-Forgotten-Middle-Skill-Jobs.
PDF [March 2017].
Institute of Medicine. (2001). Crossing the Quality Chasm: A New Health System for the 21st
Century. Washington, DC: National Academy Press.
Kolowich, S., and Newman, J. (2013). The professors behind the MOOC hype. The Chronicle
of Higher Education, March 18. Available: http://www.chronicle.com/article/The-
Professors-Behind-the-MOOC/137905 [August 2017].
Lowell, B.L., and Salzman, H. (2007). Into the Eye of the Storm: Assessing the Evidence
on Science and Engineering Education, Quality, and Workforce Demand. Washing-
ton, DC: The Urban Institute. Available: http://www.urban.org/sites/default/files/
publication/46796/411562-Into-the-Eye-of-the-Storm.PDF [August 2017].
Lumina Foundation. (2015). The Degree Qualifications Profile: A Learning-Centered Frame-
work for What College Graduates Should Know and Be Able to Do to Earn the Associ-
ate’s, Bachelor’s or Master’s Degree. Indianapolis, IN: Lumina Foundation. Available:
https://www.luminafoundation.org/files/resources/dqp.pdf [November 2015].
Maeroff, G.I. (2003). A Classroom of One: How Online Learning Is Changing Our Schools
and Colleges. New York: Palgrave Macmillan.
Mankiw, N.G. (2003). Principles of Microeconomics (third ed.). Boston, MA: South-Western
College.
Matchett, K., Dahlberg, M., and Rudin, T. (2016). Quality in the Undergraduate Experience:
What Is It? How Is It Measured? Who Decides? Summary of a Workshop. Washing-
ton, DC: The National Academies Press. Available: http://www.nap.edu/catalog/23514/
quality-in-the-undergraduate-experience-what-is-it-how-is [July 2016].
Matsudaira, J. (2015). Defining and Measuring Quality in Higher Education. Paper com-
missioned for the Board on Higher Education and the Workforce Meeting on Quality
in Higher Education, December 14-15. Available: http://sites.nationalacademies.org/cs/
groups/pgasite/documents/webpage/pga_170937.pdf [July 2017].
National Academies of Sciences, Engineering, and Medicine. (2016a). Barriers and Oppor-
tunities for 2-Year and 4-Year STEM Degrees: Systemic Change to Support Students’
Diverse Pathways. Washington, DC: The National Academies Press. Available: http://
www.nap.edu/catalog/21739/barriers-and-opportunities-for-2-year-and-4-year-stem-
degrees [March 2016].
National Academies of Sciences, Engineering, and Medicine (2016b). Science Literacy: Con-
cepts, Contexts, and Consequences. Washington, DC: The National Academies Press.
Available: https://www.nap.edu/catalog/23595/science-literacy-concepts-contexts-and-
consequences [August 2017].
National Academy of Sciences, National Academy of Engineering, and Institute of Medicine.
(2007). Rising Above the Gathering Storm: Energizing and Employing America for a
Brighter Future. Washington, DC: The National Academies Press. Available: http://www.
nap.edu/catalog/11463/rising-above-the-gathering-storm-energizing-and-employing-
america-for [March 2016].
National Academy of Sciences, National Academy of Engineering, and Institute of Medi-
cine. (2010). Rising Above the Gathering Storm, Revisited: Rapidly Approaching Cat-
egory 5. Washington, DC: The National Academies Press. Available: https://www.nap.
edu/catalog/11463/rising-above-the-gathering-storm-energizing-and-employing-america-
for [August 2017].
National Research Council. (2011). Expanding Underrepresented Minority Participation:
America’s Science and Technology Talent at the Crossroads. Washington, DC: The
National Academies Press. Available: https://www.nap.edu/catalog/12984/expanding-
underrepresented-minority-participation-americas-science-and-technology-talent-at [Au-
gust 2017].
National Research Council. (2012). Discipline-Based Education Research: Understanding
and Improving Learning in Undergraduate Science and Engineering. Washington, DC:
The National Academies Press. Available: http://www.nap.edu/catalog/13362/discipline-
based-education-research-understanding-and-improving-learning-in-undergraduate
[March 2016].
National Research Council. (2014). Capturing Change in Science, Technology, and Innovation:
Improving Indicators to Inform Policy. Washington, DC: The National Academies Press.
Available: https://www.nap.edu/catalog/18606/capturing-change-in-science-technology-
and-innovation-improving-indicators-to [August 2017].
National Science and Technology Council. (2013). Federal STEM Education 5-Year Strate-
gic Plan. Available: https://www.whitehouse.gov/sites/default/files/microsites/ostp/stem_
stratplan_2013.pdf [March 2016].
National Science Foundation. (2014). Science and Engineering Indicators 2014. Arlington, VA:
Author. Available: https://www.nsf.gov/statistics/seind14 [February 2018].
National Science Foundation. (2015). Revisiting the STEM Workforce: A Companion to Sci-
ence and Engineering Indicators 2014. Arlington, VA: Author. Available: http://www.nsf.
gov/pubs/2015/nsb201510/nsb201510.pdf [March 2016].
National Science Foundation. (2016). Science and Engineering Indicators 2016. Arlington, VA:
Author. Available: https://www.nsf.gov/statistics/2016/nsb20161/# [July 2017].
National Science Foundation, National Center for Science and Engineering Statistics. (2017).
Women, Minorities, and Persons with Disabilities in Science and Engineering: 2017.
Special Report NSF 17-310. Arlington, VA: Author. Available: https://www.nsf.gov/
statistics/2017/nsf17310 [August 2017].
Oakes, J. (1986). Educational Indicators: A Guide for Policymakers. Santa Monica, CA:
Center for Policy Research in Education.
Odden, A. (1990). Educational indicators in the United States: The need for analysis. Educa-
tional Researcher, 19(5), 24–29.
Parsad, B., and Lewis, L. (2008). Distance Education at Degree-Granting Postsecondary
Institutions: 2006–07 (NCES 2009–044). National Center for Education Statistics, Insti-
tute of Education Sciences, U.S. Department of Education. Washington, DC. Available:
https://nces.ed.gov/pubs2009/2009044.pdf [August 2017].
Planty, M., and Carlson, D. (2010). Understanding Education Indicators: A Practical Primer
for Research and Policy. New York: Teachers College Press.
President’s Council of Advisors on Science and Technology. (2012). Engage to Excel: Produc-
ing One Million Additional College Graduates with Degrees in Science, Technology,
Engineering, and Mathematics. Available: https://www.whitehouse.gov/sites/default/files/
microsites/ostp/pcast-engage-to-excel-final_feb.pdf [March 2016].
Radford, A.W. (2011). Stats in Brief: Learning at a Distance: Undergraduate Enrollment in
Distance Education Courses and Degree Programs. (NCES 2012-154). Washington,
DC: National Center for Education Statistics, U.S. Department of Education. Available:
https://nces.ed.gov/pubs2012/2012154.pdf [August 2017].
Romer, P.M. (1990). Endogenous technological change. Journal of Political Economy, 98(5),
S71–S102.
Rothwell, J. (2013). The Hidden STEM Economy. Metropolitan Policy Program at
Brookings Institution. Available: http://www.brookings.edu/~/media/research/files/
reports/2013/06/10-stem-economy-rothwell/thehiddenstemeconomy610.pdf [July 2017].
Shavelson, R.J., McDonnell, L.M., and Oakes, J. (1989). Indicators for Monitoring Mathematics
and Science Education. Santa Monica, CA: RAND. Available: http://www.rand.org/pubs/
reports/R3742.html [July 2017].
Snyder, T.D., de Brey, C., and Dillow, S.A. (2016). Digest of Education Statistics 2014.
(NCES 2016-006). Washington, DC: National Center for Education Statistics, Institute
of Education Sciences, U.S. Department of Education. Available: https://nces.ed.gov/
pubs2016/2016006.pdf [August 2017].
Solow, R.M. (1957). Technical change and the aggregate production function. The Review of
Economics and Statistics, 39, 312–320.
Stevens, A.H., Kurlaender, M., and Grosz, M. (2015). Career Technical Education and Labor
Market Outcomes: Evidence from California Community Colleges. (NBER Working
Paper No. 21137). Cambridge, MA: National Bureau of Economic Research. Available:
http://www.nber.org/papers/w21137 [August 2017].
Weisberg, D., Sexton, S., Mulhern, J., and Keeling, D. (2009). The Widget Effect: Our National
Failure to Acknowledge and Act on Differences in Teacher Effectiveness. Brooklyn, NY:
The New Teacher Project.
Wilson, S.M., and Anagnostopolous, D. (in press). The seen and the foreseen: Will unintended
consequences thwart efforts to (re)build trust in teacher preparation? Journal of Teacher
Education.
Xie, Y., and Killewald, A.A. (2012). Is American Science in Decline? Cambridge, MA: Harvard
University Press.
Chapter 1 introduced the study charge and briefly described trends
in the larger social, economic, educational, and scientific and tech-
nological context that may influence the quality of undergraduate
STEM education. In this chapter the committee focuses more narrowly on
dimensions of undergraduate STEM education that are closely related to
student learning and success, presenting a simplified conceptual framework
to guide its development of indicators.
As background for discussing the framework, the committee notes its
conceptual process of arriving at the indicators proposed in this report.
First, the committee adopted a systems perspective on higher education.
Then, it identified three overarching goals for improving undergraduate
STEM education, asking: What are the key targets that represent the best
leverage toward the committee’s vision for undergraduate STEM educa-
tion? After identifying these goals, the committee then operationalized
each one by identifying specific objectives, or elements of the goal, that
need to be addressed in order to meet the goal in its entirety. Identifying
these discrete objectives, described below, allowed the committee to move
forward with developing specific indicators, designed to measure progress
toward meeting the specified objectives, and ultimately, to monitor the
status and quality of undergraduate STEM education. The committee’s
conceptual framework represents the process of students moving through
higher education as institutions seek to produce graduates capable of
meeting the grand challenges of society (as mentioned in Chapter 1): see
Figure 2-1. This overall framework will enable readers to envision the
conceptual basis for the proposed indicator system, indicating how each
goal (and supporting objectives) maps onto the higher education system in
all its complexity: see Figure 2-2.

[Figure 2-1: Basic conceptual framework. Inputs (incoming students) pass through educational processes (students experience evidence-based STEM education) within the educational environment, producing outcomes (graduates with STEM knowledge and skills).]
[Figure 2-2: The conceptual framework with the committee’s objectives. Inputs (incoming students): 2.1 Equity of access to high-quality undergraduate STEM programs and experiences. Educational processes: 1.1 Use of evidence-based educational practices; 1.4 Continuous improvement; 3.1 Foundational preparation for STEM for all students; 3.2 Successful navigation. Educational environment: 1.2 Supports that help STEM instructors use evidence-based practices; 1.3 Institutional culture that values undergraduate STEM; 2.3 Representational diversity among STEM instructors; 2.4 Inclusive institutions and STEM departments. Outcomes (graduates with STEM knowledge and skills): 2.2 Representational equity among STEM credential earners; 3.3 STEM credential attainment.]
In developing goals and the objectives that follow from them, the com-
mittee considered not only its basic framework (refer to Figure 2-1) but also
other models of change in higher education (e.g., Elrod and Kezar, 2015,
2016; Henderson, Beach, and Finkelstein, 2011). These various models of
undergraduate education as a complex, interacting system were helpful as
the committee considered the most important levers for improvement and
identified objectives to be monitored through an indicator system.
In addition, the committee derived its goals in part from a similar set
of statements in the recent report, Monitoring Progress Toward Successful
K–12 STEM Education (National Research Council, 2013), which in turn
followed a related report on K–12 STEM education (National Research
Council, 2011). Although there are clear parallels between the goals dis-
cussed in that pair of reports and the committee’s three goals, the commit-
tee’s goals reflect the different challenges and contexts of the K–12 and the
higher education sectors. In response to policy makers’ questions and in-
creasing accountability pressures, the higher education sector is particularly
concerned about students’ outcomes, especially the employment outcomes
that are reflected in Goal 3. However, ensuring adequate numbers of STEM
professionals (Goal 3) will not be possible without first attending to the
STEM educational processes and environment reflected in Goals 1 and 2.
These three goals are interconnected and mutually supportive, targeting
improvement in various elements of the undergraduate education system
and the interactions of these elements that together will enhance students’
success in STEM education. Advancing the goals will require strategic use
of multiple change levers within and across the multiple levels of the higher
education system, using both top-down and bottom-up approaches (Austin,
2011). The goals are applicable to all varieties of undergraduate STEM
educational experiences and are designed to enhance those experiences to
the greatest extent possible. The systems perspective reflected in these goals
is also essential in developing indicators to monitor progress, because an
educational indicator system not only measures an educational system’s
inputs, processes, and outputs, but also suggests how they work together to
produce an overall effect on students (Odden, 1990, pp. 24-25; Shavelson,
McDonnell, and Oakes, 1991).
A growing body of research has identified the STEM teaching and
learning experiences and equity and inclusion strategies that support all
students’ mastery of STEM concepts and skills and persistence to gradua-
tion. Widely deploying these evidence-based processes is essential to ensure
adequate numbers of STEM professionals. As noted in Chapter 1, the most
rapidly growing groups within the general population are often underrep-
resented in STEM education and employment fields. These groups provide
an untapped resource of talent, and Goal 2 focuses on changing the edu-
cational processes and environment to increase their engagement and suc-
cess in undergraduate STEM education (Summers and Hrabowski, 2006;
National Academy of Sciences, National Academy of Engineering, and
Institute of Medicine, 2011; National Academies of Sciences, Engineering,
and Medicine, 2016a). Thus, advancing the three complementary goals will
sustain a robust STEM workforce that contributes to national economic
growth and international competitiveness (President’s Council of Advisors
on Science and Technology, 2012; Xie and Killewald, 2012). The rest of this
section discusses the committee’s three goals in more detail.
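As a purely illustrative example of how one of these equity objectives might eventually be quantified (a hypothetical formula in the spirit of Appendix B’s possible formulas, not one the committee proposes), representational parity for a demographic group g among STEM credential earners could be summarized as the ratio

    P_g = \frac{\text{share of group } g \text{ among STEM credential earners}}
               {\text{share of group } g \text{ among all entering undergraduates}},

where values of P_g near 1 indicate parity and values well below 1 indicate underrepresentation.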
retention of students in key gateway courses in all STEM fields. The com-
mittee notes that elements of NSTC objectives (2) and (3) were specific to
the federal government’s role, calling for increased federal support of cer-
tain aspects of undergraduate STEM and do not represent broad national
objectives for the U.S. higher education system as a whole.
The Objectives
The committee selected 11 objectives for improving undergraduate
STEM, grouped under the committee’s three overarching goals.
These objectives and their relationship to the three goals are shown in
Figure 2-2. The objectives are designed to improve the quality in each com-
ponent of the basic conceptual framework: inputs, processes, environment,
and outcomes. However, the objectives primarily target improvement of the
educational processes, environments, and outcomes. Although the inputs,
the incoming students, influence the quality of undergraduate STEM educa-
tion, some of the characteristics of the students reflect K–12 preparation,
which lies outside the scope of the study charge.
The detailed framework shown in Figure 2-2 illustrates students’ en-
trance to 2-year or 4-year colleges, their STEM-related learning experiences
inside and outside the classroom, the environments that surround students
and instructors, and student outcomes, including credentials and knowledge
of STEM concepts and skills.
PROPOSED INDICATORS
The objectives identified in the detailed framework drove the com-
mittee’s development of indicators: Table 2-1 presents the committee’s
proposed indicators in concert with the committee’s framework and objec-
tives. The next three chapters of the report describe those objectives and
indicators.
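Although the report presents this material in tables, the three-level shape of the indicator system (goals contain objectives, and objectives contain measurable indicators) can also be seen in a short data-structure sketch. The Python fragment below is illustrative only: the class names are invented here, and the objective and indicators shown are taken from Table 3-1.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Indicator:
        code: str
        description: str

    @dataclass
    class Objective:
        code: str
        description: str
        indicators: List[Indicator] = field(default_factory=list)

    # One objective from Goal 1, with its two indicators (Table 3-1).
    objective_1_2 = Objective(
        code="1.2",
        description="Existence and use of supports that help STEM "
                    "instructors use evidence-based educational practices",
        indicators=[
            Indicator("1.2.1", "Extent of instructors' involvement in "
                               "professional development"),
            Indicator("1.2.2", "Availability of support or incentives for "
                               "evidence-based course development or redesign"),
        ],
    )

Representing the system this way makes explicit that each indicator is tied to exactly one objective, which in turn serves exactly one goal.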
CONCLUSION
In this chapter, the committee has proposed a conceptual framework
for the indicator system. Beginning with a model of higher education as
GOAL 2: Strive for Equity, Diversity, and Inclusion of STEM Students and Instructors by Providing Equitable Opportunities for Access and Success
Input. Objective 2.1: Equity of access to high-quality undergraduate STEM educational programs and experiences.
Indicator 2.1.1: Institutional structures, policies, and practices that strengthen STEM readiness for entering and enrolled college students.
REFERENCES
Association of American Colleges & Universities. (2015). Committing to Equity and Inclusive
Excellence: A Campus Guide for Self-Study and Planning. Washington, DC: Author.
Astin, A.W. (1993). What Matters in College? Four Critical Years Revisited. San Francisco,
CA: Jossey-Bass.
National Academies of Sciences, Engineering, and Medicine. (2016a). Barriers and Opportuni-
ties for 2-Year and 4-Year STEM Degrees: Systemic Change to Support Students’ Diverse
Pathways. Washington, DC: The National Academies Press. Available: http://www.nap.
edu/catalog/21739/barriers-and-opportunities-for-2-year-and-4-year-stem-degrees [June
2016].
National Academies of Sciences, Engineering, and Medicine. (2016b). Science Literacy:
Concepts, Contexts, and Consequences. Washington, DC: The National Academies Press.
National Academy of Sciences, National Academy of Engineering, and Institute of Medicine.
(2011). Expanding Underrepresented Minority Participation: America’s Science and
Technology Talent at the Crossroads. Washington, DC: The National Academies Press.
Available: http://www.nap.edu/catalog/12984/expanding-underrepresented-minority-
participation-americas-science-and-technology-talent-at [June 2016].
National Research Council. (2011). Successful K-12 STEM Education: Identifying Effective
Approaches in Science, Technology, Engineering, and Mathematics (STEM). Washing-
ton, DC: The National Academies Press. Available: http://www.nap.edu/catalog/13158/
successful-k-12-stem-education-identifying-effective-approaches-in-science [June 2016].
National Research Council. (2012). Discipline-Based Education Research: Understanding
and Improving Learning in Undergraduate Science and Engineering. Washington, DC:
The National Academies Press. Available: https://www.nap.edu/catalog/13362/discipline-
based-education-research-understanding-and-improving-learning-in-undergraduate [July
2017].
National Research Council. (2013). Monitoring Progress Toward Successful K-12 STEM Edu-
cation: A Nation Advancing? Washington, DC: The National Academies Press. Available:
https://www.nap.edu/search/?term=Monitoring+Progress+Toward+Successful+K-12+
STEM+Education%3A+A+Nation+Advancing%3F.+&x=16&y=6 [July 2017].
National Research Council. (2015). Enhancing the Effectiveness of Team Science. Washington,
DC: The National Academies Press. Available: http://www.nap.edu/catalog/19007/
enhancing-the-effectiveness-of-team-science [June 2016].
National Science and Technology Council. (2013). Federal Science, Technology, Engineering,
and Mathematics (STEM) 5-Year Strategic Plan. Washington, DC: Author. Available:
https://www.whitehouse.gov/sites/default/files/microsites/ostp/stem_stratplan_2013.pdf
[June 2016].
Odden, A. (1990). Educational indicators in the United States: The need for analysis. Educa-
tional Researcher, 19(5), 24–29.
OECD. (2016). OECD Economic Surveys: United States. Available: http://www.oecd.org/eco/
surveys/United-States-2016-overview.pdf [June 2016].
President’s Council of Advisors on Science and Technology. (2012). Engage to Excel: Produc-
ing One Million Additional College Graduates with Degrees in Science, Technology,
Engineering and Mathematics. Washington, DC: Author. Available: https://www.white-
house.gov/sites/default/files/microsites/ostp/pcast-engage-to-excel-final_feb.pdf [March
2016].
Rothwell, J. (2013). The Hidden STEM Economy. Metropolitan Policy Program at Brookings
Institution. Available: http://www.brookings.edu/~/media/research/files/reports/2013/06/10-
stem-economy-rothwell/thehiddenstemeconomy610.pdf [April 2015].
Seymour, E., and Hewitt, N. (1997). Talking about Leaving: Why Undergraduates Leave the
Sciences. Boulder, CO: Westview Press.
Shavelson, R.J., McDonnell, L., and Oakes, J. (1991). What are educational indicators and
indicator systems? Practical Assessment, Research and Evaluation, 2(11). Available:
http://pareonline.net/getvn.asp?v=2&n=11 [July 2017].
Summers, M.F., and Hrabowski III, F.A. (2006). Preparing minority scientists and engineers.
Science, 311(5769), 1870–1871.
Tinto, V. (1993). Leaving College: Rethinking the Causes and Cures of Student Attrition.
(second ed.). Chicago, IL: University of Chicago Press.
Weaver, G.C., Burgess, W.D., Childress, A.L., and Slakey, L. (2015). Transforming Institu-
tions: Undergraduate STEM Education for the 21st Century. West Lafayette, IN: Purdue
University Press.
Witham, K., Malcom-Piqueux, L., Dowd, A.C., and Bensimon, E.M. (2015). America’s Unmet
Promise: The Imperative for Equity in Higher Education. Washington, DC: Association
of American Colleges & Universities.
Xie, Y., Fang, M., and Shauman, K. (2015). STEM education. Annual Review of Sociology,
41, 331–357.
Xie, Y., and Killewald, A.A. (2012). Is American Science in Decline? Cambridge, MA: Harvard
University Press.
As noted in Chapter 1, the committee does not propose indicators to
directly measure student learning. Although some disciplines have
begun to identify the core concepts and skills that all undergradu-
ates should master (e.g., Arum, Roksa, and Cook, 2016; Brewer and Smith,
2011) and develop assessments of them, there is currently no agreement
on a uniform set of STEM-wide concepts and skills, nor on standardized
assessments of such concepts and skills. Rather, the committee expects that
engaging students in evidence-based STEM educational practices (Goal 1)
and striving for equity, diversity, and inclusion (Goal 2) will increase all
students’ mastery of STEM concepts and skills. Advancing these goals is
expected to improve persistence among students already interested in STEM
and attract other students to STEM majors, thus increasing the number
of students earning STEM credentials and ensuring adequate numbers
of STEM professionals (Goal 3). These expectations echo the President’s
Council of Advisors on Science and Technology (2012); it recommended
widespread adoption of evidence-based teaching and learning approaches
to increase the number of STEM graduates and ensure an adequate supply
of STEM professionals.
The major sections of this chapter address the committee’s four objec-
tives for Goal 1.
Objective 1.2: Existence and use of supports that help STEM instructors use evidence-based educational practices
  Indicator 1.2.1: Extent of instructors’ involvement in professional development
  Indicator 1.2.2: Availability of support or incentives for evidence-based course development or course redesign
Objective 1.3: Institutional culture that values undergraduate STEM instruction
  Indicator 1.3.1: Use of valid measures of teaching effectiveness
Each section discusses the meaning of the indicators and identifies the additional
research needed to fully develop them: see Table 3-1.
In the Classroom
Active Learning as a General Class of Evidence-Based Practices There is
no generally agreed-upon definition of “active learning” in the research
literature, but there are characteristics that such approaches have in com-
mon. In this report, the committee uses the term “active learning” to refer
to that class of pedagogical practices that cognitively engage students in
building understanding at the highest levels of Bloom’s taxonomy (Bloom,
Krathwohl, and Masia, 1964; Anderson, Krathwohl, and Bloom, 2001). Ac-
tive learning instructional practices have been shown to improve students’
academic achievement both generally, across all fields of study (Mayhew et
al., 2016), and in STEM specifically (National Research Council, 2012a).
These practices include collaborative classroom activities, fast feedback
using classroom response systems (e.g., clickers), problem-based learning,
and peer instruction (Bonwell and Eison, 1991; Prince, 2004): see Box 3-1.
The core idea behind all active learning approaches is that learning requires
mental activity, which is more likely to occur when students are engaged
in activities or discussions focused on the content than when students are
BOX 3-1 Peer Instruction: An Example of Active Learning
BOX 3-2 Formative Assessment in a 2-Year College Setting
BOX 3-3 High-Impact Practices
dents who intend to pursue STEM majors, but also for all students. Many
students transfer into a STEM major after initially focusing on another field
of study (Chen, 2013; National Academies of Sciences, Engineering, and
Medicine, 2016). Such transfers suggest that many students with interest
and ability in STEM would benefit from more guidance and information
about STEM programs and careers.
study found that students who received peer mentoring reported increased
sense of belonging and science identity, as well as improved self-efficacy,
all factors that are important for increasing persistence of underrepresented
minorities in STEM (Trujillo et al., 2015). Still, a recent review concluded
that studies of undergraduate mentoring programs need more rigorous
research designs (Gershenfeld, 2014).
Proposed Indicators
Given what is known about the value of evidence-based STEM educa-
tional practices and their relatively limited adoption, the com-
mittee proposes two indicators to monitor progress toward the objective of
using evidence-based practices in and outside of classrooms.
Proposed Indicators
2 See http://www.public.asu.edu/~anton1/AssessArticles/Assessments/Biology%20Assessments/
3 The week-long summer institutes focusing on life sciences education engaged participants in
active learning and formative assessment, to help them both understand and experience these
evidence-based educational practices. See http://www.hhmi.org/news/hhmi-helps-summer-
institute-expand-regional-sites [September 2017].
value teaching have found that instructors’ course development and rede-
sign can sometimes be accelerated when they receive appropriate support,
such as instructional resources, establishment of faculty learning communi-
ties (see, e.g., Tewksbury and MacDonald, 2005) and teaching and learning
centers, and help from trained instructional assistants (see, e.g., Wieman,
Perkins, and Gilbert, 2010). Faculty learning communities have also been
developed at 2-year colleges (Sipple and Lightner, 2013).
Support for the time instructors need to develop or redesign a course
usually comes from the department or institution in the form of course buy-
outs during the academic year (Dolan et al., 2016). Financial support can
come as additional compensation during the academic year as “overload”
or payment during unfunded summer months. Instructional resources can
include content repositories, course templates, assessment/learning objec-
tive alignment tools, and successful course design models. Support can also
come from different types of people, including content developers (col-
laborating faculty, co-instructors, postdoctoral fellows, graduate students,
undergraduate students), experts in pedagogy, assessment and instructional
technology, and other instructors in peer learning communities. All of these
various forms of support are helpful, if not essential, but they require spe-
cific department, college, and institutional cultures that routinely demon-
strate, in both words and actions, that evidence-based course development
and redesign are valued.
Developing a course is not a one-time activity; it is an ongoing exploration
of how to engage all students and give them the opportunity to learn
(Weaver et al., 2016). Targeted experimentation, whether
developed locally or as a replication of published work, signals an approach
that values ongoing, or continuous, educational improvement (see further
discussion below). Full engagement with evidence-based course develop-
ment or redesign forces examination of learning objectives, instructional
activities and approaches, assessment of student learning outcomes, con-
nections with preceding and subsequent courses, and interdisciplinary connections.
The work fosters instructional experimentation that activates engagement
in the scholarship of teaching and learning in a manner closely linked to the
process STEM faculty members use in their own research.
Proposed Indicators
Ory and Ryan, 2001), and many researchers argue that typical student
evaluations are not a valid measure of teaching effectiveness (e.g., Emery,
Kramer, and Tian, 2003; Zabaleta, 2007). Furthermore, there is nearly
universal agreement that high-quality assessment of teaching effectiveness
requires multiple measures (e.g., Anderson et al., 2011; Chism and Stanley,
1999). An interview-based study of 72 physics instructors (Henderson
et al., 2014, p. 16) concluded: “. . . both instructors and institutions use a
limited repertoire of the possible assessment methods. Both groups would
benefit from including additional methods.”
skills (Elrod and Kezar, 2015, 2016b). This process of ongoing evaluation
and improvement is referred to as continuous improvement: see Box 3-4.
Institutional-level improvement efforts are essential to any attempt to
improve undergraduate STEM education nationwide. Although
most institutions are engaged in multiple quality improvement efforts in
different departments, schools, and classrooms, they are often disconnected,
rather than linked for systemic continuous improvement. Therefore, the
committee sought to identify examples and evidence of 2-year and 4-year
institutions that are engaged in coordinated continuous quality improve-
ment efforts. Because students directly experience courses and programs
of study, the committee focused on aspects of courses and programs that
signal the continuous improvement process of clearly articulated goals with
BOX 3-4 Continuous Improvement
BOX 3-5 Establishing Learning Goals and Assessing Student Progress
strong alignment of the course goals, formative assessments, and evaluative as-
sessments in [a] course. Because assessments inform students of what they need to
know and do, new learning opportunities need to be aligned well with assessments
to be successful.
Simon and Taylor (2009) found that “explicit learning goals provide a valuable
aid to guide students in their learning” and suggested best practices for the use
of learning goals.
Engineers have long used an objectives- or outcomes-focused design pro-
cess to solve a variety of human problems. The translation of these basic design
principles to solve learning and teaching problems led to early work such as basic
principles of curriculum and instruction (Tyler, 1949), Bloom and colleagues’ work
on evaluating student learning (Bloom, Hastings, and Madaus, 1971), “backward
design” (Wiggins and McTighe, 2005), and the integrated course design approach
described by Fink (2003). These approaches, typically used to redesign single
courses, can also be used to connect courses across a curriculum to constitute a
program of study and connect programs across an entire institution.
These structured approaches to course, curriculum, and program design
lend themselves to iterative testing and improvement on the basis of evidence
as suggested by Fulcher and colleagues (2017). These authors suggest that in-
structors identify specific student learning outcomes and gather data on students’
knowledge, skills, or abilities related to the targeted outcomes before and after
any changes in instruction, using assessments that yield reliable and valid scores.
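
For illustration only, the short Python sketch below applies this pre/post logic to hypothetical assessment scores, summarizing change with the normalized gain, a statistic commonly used in discipline-based education research; neither the data nor the choice of statistic comes from Fulcher and colleagues (2017) or from the committee.

from statistics import mean

MAX_SCORE = 100  # assumed maximum score on the hypothetical assessment

# Hypothetical (pre, post) percent-correct scores for one course section.
scores = [(45, 70), (60, 80), (30, 55), (75, 90), (50, 65)]

def normalized_gain(pre, post):
    # Fraction of the possible improvement that a student actually realized.
    return (post - pre) / (MAX_SCORE - pre)

gains = [normalized_gain(pre, post) for pre, post in scores]
print(f"mean pre-test score:  {mean(p for p, _ in scores):.1f}")
print(f"mean post-test score: {mean(q for _, q in scores):.1f}")
print(f"mean normalized gain: {mean(gains):.2f}")

In practice, such a computation would sit inside a larger analysis that also checks the reliability and validity of the assessment scores, as the text above emphasizes.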
student learning goals and assessment results. For example, the Senior Col-
lege and University Commission of the Western Association of Schools and
Colleges asks institutions to clearly state student learning outcomes and
standards of performance at the course, program, and institution level and
to engage faculty in developing and widely sharing these student learning
outcomes.4
4 See https://www.wscuc.org/resources/handbook-accreditation-2013/part-ii-core-commitments-
and-standards-accreditation/wasc-standards-accreditation-2013/standard-2-achieving-educational-
objectives-through-core-functions [November 2017].
REFERENCES
Anderson, L.W., Krathwohl, D.R., and Bloom, B.S. (2001). A Taxonomy for Learning, Teach-
ing, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives. New
York: Longman.
Anderson, W.A., Banerjee, U., Drennan, C.L., Elgin, S.C.R., Epstein, I.R., Handelsman, J.,
Hatfull, G.F., Losick, R., O’Dowd, D.K., Olivera, B.M., Strobel, S.A., Walker, G.C., and
Warner, I.M. (2011). Changing the culture of science education at research universities.
Science, 331(6014), 152–153.
Arum, R., Roksa, J., and Cook, A. (2016). Improving Quality in American Higher Education:
Learning Outcomes and Assessments for the 21st Century. Hoboken, NJ: John Wiley
& Sons.
Association of American Colleges & Universities. (2007). College Learning for the New
Global Century. Washington, DC: Author. Available: https://www.aacu.org/sites/default/
files/files/LEAP/GlobalCentury_final.pdf [June 2016].
Association of American Universities. (2017). Progress Toward Achieving Systemic Change:
A Five-Year Status Report on the AAU Undergraduate STEM Education Initiative.
Washington, DC. Available: https://www.aau.edu/sites/default/files/AAU-Files/STEM-
Education-Initiative/STEM-Status-Report.pdf [October 2017].
Austin, A. (2011). Promoting Evidence-Based Change in Undergraduate Science Educa-
tion. Paper commissioned by the Board on Science Education. Available: http://sites.
nationalacademies.org/cs/groups/dbassesite/documents/webpage/dbasse_072578.pdf
[June 2016].
Bahr, P.R. (2008). Cooling out in the community college: What is the effect of academic ad-
vising on students’ chances of success? Research in Higher Education, 49(8), 704–732.
Bailey, T., Jaggars, S.S., and Jenkins, D. (2015). What We Know About Guided Pathways.
New York: Columbia University, Teachers College, Community College Research Center.
Baker, V.L., and Griffin, K.A. (2010). Beyond mentoring and advising: Toward understand-
ing the role of faculty “developers” in student success. About Campus: Enhancing the
Student Learning Experience, 14(6), 2–8.
Beach, A.L., Sorcinelli, M.D., Austin, A.E., and Rivard, J.K. (2016). Faculty Development in
the Age of Evidence: Current Practices, Future Imperatives. Sterling, VA: Stylus.
Berk, R.A. (2005). Survey of 12 strategies for measuring teaching effectiveness. International
Journal on Teaching and Learning in Higher Education, 17(1), 48–62.
Black, P., and Wiliam, D. (1998). Inside the black box: Raising standards through classroom
assessment. Phi Delta Kappan, 80(2), 139–148. Available: www.pdkintl.org/kappan/
kbla9810.htm [July 2017].
Blaich, C.F., and Wise, K.S. (2010). Moving from assessment to institutional improvement.
New Directions for Institutional Research, 2010(S2), 67–78.
Bloom, B.S., Krathwohl, D.R., and Masia, B.B. (1964). Taxonomy of Educational Objectives.
1. Cognitive Domain. New York: Longman.
Bloom, B.S., Hastings, J.H., and Madaus, G.F. (1971). Handbook on Formative and Summa-
tive Evaluation of Student Learning. New York: McGraw-Hill.
Bonwell, C., and Eison, J. (1991). Active Learning: Creating Excitement in the Classroom
(ASHE-ERIC Higher Education Report No. 1). Washington, DC: George Washington
University. Available: http://www.ed.gov/databases/ERIC_Digests/ed340272.html [July
2017].
Borrego, M., Cutler, S., Prince, M., Henderson, C., and Froyd, J.E. (2013). Fidelity of im-
plementation of Research-Based Instructional Strategies (RBIS) in engineering science
courses. Journal of Engineering Education, 102(3), 394–425. doi:10.1002/jee.20020.
Bradforth, S.E., Miller, E.R., Dichtel, W.R., Leibovich, A.K., Feig, A.L., Martin, J.D., Bjorkman,
K.S., Schultz, Z.D., and Smith, T.L. (2015). Improve undergraduate science education.
Nature, 523, 282–284. Available: https://www.nature.com/polopoly_fs/1.17954!/menu/
main/topColumns/topLeftColumn/pdf/523282a.pdf [May 2017].
Brancaccio-Taras, L., Pape-Lindstrom, P., Peteroy-Kelly, M., Aguirre, K., Awong-Taylor, J.,
Balser, R., Cahill, M.J., Frey, R.G., Jack, R., Kelrick, M., Marley, K., Miller, K.G.,
Osgood, M., Romano, S., Uzman, J.A., and Zhao, J. (2016). The PULSE Vision &
Change Rubrics, version 1.0: A valid and equitable tool to measure transformation of life
sciences departments at all institution types. CBE-Life Sciences Education, 15(4), ar60.
Available: http://www.lifescied.org/content/15/4/ar60.full [March 2017].
Brewer, C., and Smith, D. (Eds.). (2011). Vision and Change in Undergraduate Biology Educa-
tion. Washington, DC: American Association for the Advancement of Science.
Brower, A.M., and Inkelas, K.K. (2010). Living-learning programs: One high-impact educa-
tional practice we know a lot about. Liberal Education, 96(2), 36–43.
Brownell, J.E., and Swaner, L.E. (2010). Five High-Impact Practices: Research on Learning
Outcomes, Completion, and Quality. Washington, DC: Association of American Colleges
& Universities.
Casagrand, J., and Semsar, K. (2017). Redesigning a course to help students achieve higher-
order cognitive thinking skills: From goals and mechanics to student outcomes. Advances
in Physiology Education, 41(2), 194–202. doi:10.1152/advan.00102.2016.
Chen, X. (2013). STEM Attrition: College Students’ Paths Into and Out of STEM Fields.
Washington, DC: National Center for Education Statistics, Institute of Education Sci-
ences, U.S. Department of Education.
Chism, N.V.N., and Stanley, C.A. (1999). Peer Review of Teaching: A Sourcebook. Bolton,
MA: Anker.
Council of Scientific Society Presidents. (2013). The Role of Scientific Societies in STEM
Faculty Workshops. Available: http://www.aapt.org/Conferences/newfaculty/upload/
STEM_REPORT-2.pdf [July 2017].
Crisp, G., and Cruz, I. (2009). Mentoring college students: A critical review of the literature
between 1990 and 2007. Research in Higher Education, 50(6), 525–545.
Crouch, C.H., and Mazur, E. (2001). Peer instruction: Ten years of experience and results.
American Journal of Physics, 69(9), 970–977.
Denison, D.R. (1996). What is the difference between organizational culture and organi-
zational climate? A native’s point of view on a decade of paradigm wars. Academy of
Management Review, 21(3), 619–654.
Dolan, E.L., Lepage, G.P., Peacock, S.M., Simmons, E.H., Sweeder, R., and Wieman, C.
(2016). Improving Undergraduate STEM at Research Universities: A Collection of Case
Studies. Tucson, AZ: Research Corporation for Science Advancement. Available: https://
www.aau.edu/sites/default/files/STEM%20Scholarship/RCSA2016.pdf [June 2017].
Drake, J.K. (2011). The role of academic advising in student retention and persistence. About
Campus, 16(3), 8–12.
Eagan, M.K. (2013). Understanding Undergraduate Interventions in STEM: Insights from a
National Study. Presented to the Committee on Barriers and Opportunities in Complet-
ing 2- and 4-Year STEM Degrees. Available: http://sites.nationalacademies.org/cs/groups/
dbassesite/documents/webpage/dbasse_085900.pdf [July 2017].
Eagan, K. (2016). Becoming More Student-Centered? An Examination of Faculty Teaching
Practices Across STEM and Non-STEM Disciplines Between 2004 and 2014. A report
prepared for the Alfred P. Sloan Foundation. Available: https://sloan.org/storage/app/
media/files/STEM_Higher_Ed/STEM_Faculty_Teaching_Practices.pdf [May 2017].
Ebert-May, D., Derting, T.L., Hodder, J., Momsen, J.L., Long, T.M., and Jardeleza, S.E.
(2011). What we say is not what we do: Effective evaluation of faculty professional
development programs. Bioscience, 61(7), 550–558.
Elrod, S., and Kezar, A. (2015). Increasing Student Success in STEM: A Guide to Systemic
Institutional Change. Washington, DC: Association of American Colleges & Universities.
Available: https://www.aacu.org/peerreview/2015/spring/elrod-kezar [May 2017].
Elrod, S., and Kezar, A. (2016a). Increasing Student Success in STEM: A Guide to Systemic
Institutional Change. Washington, DC: Association of American Colleges & Universities.
Elrod, S., and Kezar, A. (2016b). Increasing student success in STEM: An overview of a new
guide to systemic institutional change. In G.C. Weaver, W.D. Burgess, A.L. Childress,
and L. Slakey (Eds.), Transforming Institutions: 21st Century STEM Education. West
Lafayette, IN: Purdue University Press.
Emery, C.R., Kramer, T.R., and Tian, R.G. (2003). Return to academic standards: A critique
of student evaluations of teaching effectiveness. Quality Assurance in Education, 11(1),
37–46.
Estrada, M. (2014). Ingredients for Improving the Culture of STEM Degree Attainment with
Cocurricular Supports for Underrepresented Minority Students. Paper prepared for the
Committee on Barriers and Opportunities in Completing 2- and 4-Year STEM Degrees.
Available: http://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/
dbasse_088832.pdf [July 2017].
Fairweather, J. (2005). Beyond the rhetoric: Trends in the relative value of teaching and re-
search in faculty salaries. Journal of Higher Education, 76(4), 401–422.
Henderson, C., and Dancy, M. (2007). Barriers to the use of research-based instructional strat-
egies: The influence of both individual and situational characteristics. Physical Review
Special Topics: Physics Education Research, 3(2), 020102.
Henderson, C., and Dancy, M.H. (2009). Impact of physics education research on the teach-
ing of introductory quantitative physics in the United States. Physical Review Special
Topics—Physics Education Research, 5(2), 020107. Available: https://journals.aps.org/
prper/abstract/10.1103/PhysRevSTPER.5.020107 [February 2018].
Henderson, C., Beach, A., and Finkelstein, N. (2011). Facilitating change in undergraduate
STEM instructional practices: An analytic review of the literature. Journal of Research
in Science Teaching, 48(8), 952–984. doi:10.1002/tea.20439.
Henderson, C., Dancy, M., and Niewiadomska-Bugaj, M. (2012). Use of research-based
instructional strategies in introductory physics: Where do faculty leave the innovation-
decision process? Physical Review Special Topics—Physics Education Research, 8(2),
020104.
Henderson, C., Turpen, C., Dancy, M., and Chapman, T. (2014). Assessment of teaching
effectiveness: Lack of alignment between instructors, institutions, and research recom-
mendations. Physical Review Special Topics—Physics Education Research, 10, 010106.
Holland, J.M., Major, D.A., and Orvis, K.A. (2012). Understanding how peer mentoring and
capitalization link STEM students to their majors. The Career Development Quarterly,
60(4), 343–354.
Jha, S., Noori, H., and Michela, J. (1996). The dynamics of continuous improvement. Aligning
organizational attributes and activities for quality and productivity. International Journal
of Quality Science, 1(1), 19–47.
Kober, N. (2015). Reaching Students: What Research Says About Effective Instruction in Un-
dergraduate Science and Engineering. Board on Science Education, Division of Behavioral
and Social Sciences and Education. Washington, DC: The National Academies Press.
Kozlowski, S.W.J., and Ilgen, D.R. (2006). Enhancing the effectiveness of work groups and
teams. Psychological Science in the Public Interest, 7(3), 77–124.
Kuh, G.D. (2008). High-Impact Educational Practices: What They Are, Who Has Access to
Them, and Why They Matter. Washington, DC: Association of American Colleges &
Universities.
Kuh, G.D., and O’Donnell, K. (2013). Ensuring Quality and Taking High-Impact Practices to
Scale. Washington, DC: Association of American Colleges & Universities.
Lasry, N., Mazur, E., and Watkins, J. (2008). Peer instruction: From Harvard to the two-year
college. American Journal of Physics, 76(11), 1066–1069.
Laursen, S., Hunter, A.B., Seymour, E., Thiry, H., and Melton, G. (2010). Undergraduate
Research in the Sciences: Engaging Students in Real Science. Hoboken, NJ: John Wiley
& Sons.
Lichtenstein, G., Chen, H.L., Smith, K.A., and Maldonado, T.A. (2014). Retention and persis-
tence of women and minorities along the engineering pathway in the United States. In A.
Johri and B.M. Olds (Eds.), Cambridge Handbook of Engineering Education Research.
New York: Cambridge University Press.
Light, R.J. (2004). Making the Most of College. Boston, MA: Harvard University Press.
Manduca, C.A., Iverson, E.R., Luxenberg, M., Macdonald, R.H., McConnell, D.A., Mogk,
D.W., and Tewksbury, B.J. (2017). Improving undergraduate STEM education: The ef-
ficacy of discipline-based professional development. Science Advances, 3(2), 1–15.
Mayhew, M.J., Rockenbach, A.N., Bowman, N.A., Seifert, T.A.D., Wolniak, G.C., Pascarella,
E.T., and Terenzini, P.T. (2016). How College Affects Students: 21st Century Evidence
that Higher Education Works (vol. 3). San Francisco, CA: Jossey-Bass.
Mazur, E. (1997). Peer Instruction: A User’s Manual. Upper Saddle River, NJ: Prentice Hall.
McConnell, D.A., Steer, D.N., Owens, K.D., Knott, J.R., et al. (2006). Using ConcepTests to
assess and improve student conceptual understanding in introductory geoscience courses.
Journal of Geoscience Education, 54(1), 61–68.
Metzner, B.S. (1989). Perceived quality of academic advising: The effect on freshman attrition.
American Educational Research Journal, 26(3), 422–442.
Miller, E., and Trapani, J. (2016). AAU Undergraduate STEM Initiative: Measuring Progress.
Presentation to the Committee on Developing Indicators for Undergraduate STEM Edu-
cation, Washington, DC, April 1. Available: http://sites.nationalacademies.org/cs/groups/
dbassesite/documents/webpage/dbasse_183497.pdf [December 2017].
National Academies of Sciences, Engineering, and Medicine. (2016). Barriers and Opportuni-
ties for 2-Year and 4-Year STEM Degrees: Systemic Change to Support Students’ Diverse
Pathways. Washington, DC: The National Academies Press. Available: http://www.nap.
edu/catalog/21739/barriers-and-opportunities-for-2-year-and-4-year-stem-degrees [June
2016].
National Academies of Sciences, Engineering, and Medicine. (2017). Undergraduate Research
Experiences for STEM Students: Successes, Challenges, and Opportunities. Washington,
DC: The National Academies Press.
National Research Council. (2001). Testing Teacher Candidates: The Role of Licensure Tests
in Improving Teacher Quality. Washington, DC: National Academy Press.
National Research Council. (2012a). Discipline-Based Education Research: Understanding
and Improving Learning in Undergraduate Science and Engineering. Washington, DC:
The National Academies Press.
National Research Council. (2012b). Education for Life and Work: Developing Transferable
Knowledge and Skills in the 21st Century. Washington, DC: The National Academies
Press.
National Science Foundation. (2016). FY 2017 Budget Request. Arlington, VA: Author. Avail-
able: http://www.nsf.gov/about/budget/fy2017 [June 2016].
National Science and Technology Council. (2013). Federal STEM Education 5-Year Strategic
Plan. Available: https://obamawhitehouse.archives.gov/sites/default/files/microsites/ostp/
stem_stratplan_2013.pdf [March 2017].
Ory, J.C., and Ryan, K. (2001). How do student ratings measure up to a new validity frame-
work? New Directions for Institutional Research, 2001(109), 27–44.
Packard, B.W. (2016). Successful STEM Mentoring Initiatives for Underrepresented Students:
A Research-Based Guide for Faculty and Administrators. Sterling, VA: Stylus.
Park, S., Hironaka, S., Carver, P., and Nordstrum, L. (2013). Continuous Improvement
in Education. Stanford, CA: Carnegie Foundation for the Advancement of Teaching.
Available: https://www.carnegiefoundation.org/wp-content/uploads/2014/09/carnegie-
foundation_continuous-improvement_2013.05.pdf [June 2017].
Parker, L.C., Adedokun, O., and Weaver, G.C. (2016). Culture policy and resources: Barriers
reported by faculty implementing course reform. In G.C. Weaver, W.D. Burgess, A.L.
Childress, and L. Slakey (eds.). Transforming Institutions: Undergraduate STEM Educa-
tion for the 21st Century. West Lafayette, IN: Purdue University Press.
Pascarella, E.T., and Terenzini, P.T. (2005). How College Affects Students (vol. 2). K.A.
Feldman (ed.). San Francisco, CA: Jossey-Bass.
Pascarella, E.T., Martin, G.L., Hanson, J.M., Trolian, T.L., Gillig, B., and Blaich, C. (2014).
Effects of diversity experiences on critical thinking skills over 4 years of college. Journal
of College Student Development, 55(1), 86–92. Available: http://aquila.usm.edu/cgi/
viewcontent.cgi?article=9211&context=fac_pubs [May 2017].
Peterson, M.W., and Spencer, M.G. (1990). Understanding academic culture and climate.
In W.G. Tierney (Ed.), Assessing Academic Cultures and Climates: New Directions for
Institutional Research (pp. 3-18). San Francisco, CA: Jossey-Bass.
Pfund, C., Miller, S., Brenner, K., Bruns, P., Chang, A., Ebert-May, D., Fagen, A.P., Gentile,
J., Gossens, S., Khan, I.M, Labov, J.B., Pribbenow, C.M., Susman, M., Tong, L., Wright,
R., Yuan, R.T., Wood, W.B., and Handelsman, J. (2009). Professional development:
Summer institute to improve university science teaching. Science, 324(5926):470–471.
Available: http://science.sciencemag.org/content/324/5926/470.long [May 2017].
Pizzolato, J.E. (2008). Advisor, teacher, partner: Using the learning partnerships model to
reshape academic advising. About Campus, 13(1), 18–25.
President’s Council of Advisors on Science and Technology. (2012). Engage to Excel: Produc-
ing One Million Additional College Graduates with Degrees in Science, Technology,
Engineering, and Mathematics. Available: https://obamawhitehouse.archives.gov/sites/
default/files/microsites/ostp/pcast-engage-to-excel-final_2-25-12.pdf [July 2017].
Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering
Education, 93(3), 223–231.
PULSE Fellows. (2016). The PULSE Vision and Change Rubrics Version 2.0. Available:
http://api.ning.com/files/Kfu*MfW7V8MYZfU7LNGdOnG4MnryzUgUpC2IxdtUmucn
B4QNCdLaOwWGoMoULSeKw8hF9jiFdh75tlzuv1nqtfCuM11hNPp3/PULSERubrics
Packetv2_0_FINALVERSION.pdf [May 2017].
Roediger, H.L., Agarwal, P.K., McDaniel, M.A., and McDermott, K.B. (2011). Test-enhanced
learning in the classroom: Long-term improvements from quizzing. Journal of Experi-
mental Psychology: Applied, 17(4), 382–395.
Savage, A.F. (2014). Science literacy: A key to unlocking a fully-engaged citizenry. Diver-
sity and Democracy, 17(3). Available: https://www.aacu.org/diversitydemocracy/2014/
summer/savage [March 2017].
Savage, A.F., and Jude, B.A. (2014). Starting small: Using microbiology to foster scientific
literacy. Trends in Microbiology. http://dx.doi.org/10.1016/j.tim.2014.04.005.
Saxe, K., and Braddy, L. (2016). A Common Vision for Undergraduate Mathematical Sciences
Programs in 2025. Washington, DC: Mathematical Association of America.
Schein, E.H. (2004). Organizational Culture and Leadership (3rd ed.). San Francisco, CA:
Jossey-Bass.
Schein, E.H. (2010). Organizational Culture and Leadership (4th ed.). Hoboken, NJ: John
Wiley & Sons, Inc.
Schneider, B. (1975). Organizational climates: An essay. Personnel Psychology, 28(4), 447–479.
Schneider, B., and Reichers, A.E. (1983). On the etiology of climates. Personnel Psychology,
36(1), 19–39.
Schneider, B., Ehrhart, M.G., and Macey, W.H. (2013). Organizational climate and culture.
Annual Review of Psychology, 64, 361–388.
Seidman, A. (1991). The evaluation of a pre/post admissions/counseling process at a suburban
community college: Impact on student satisfaction with the faculty and the institution,
retention, and academic performance. College and University, 66(4), 223–232.
Simon, B., and Taylor, J. (2009). What is the value of course-specific learning goals? Journal
of College Science Teaching, 39(2), 52–57.
Sipple, S., and Lightner, R. (2013). Developing Faculty Learning Communities at Two-Year
Colleges: Collaborative Models to Improve Teaching and Learning. Sterling, VA: Stylus.
Smith, D. (2013). Describing and Measuring Undergraduate Teaching Practices. Wash-
ington, DC: American Association for the Advancement of Science. Available: http://
ccliconference.org/files/2013/11/Measuring-STEM-Teaching-Practices.pdf [May 2017].
Strayhorn, T.L. (2011). Bridging the pipeline: Increasing underrepresented students’ prepara-
tion for college through a summer bridge program. American Behavioral Scientist, 55(2),
142–159. doi:10.1177/0002764210381871.
Tewksbury, B.J., and MacDonald, R.H. (2005). Designing Effective and Innovative Courses.
Available: http://wp.stolaf.edu/cila/files/2014/08/Assignment_Part_1.2.pdf [July 2017].
Titus, M.A. (2004). An examination of the influence of institutional context on student per-
sistence at 4-year colleges and universities: A multilevel approach. Research in Higher
Education, 45(7), 673–699.
Trujillo, G., Aguinaldo, P., Anderson, C., Busamante, J., Gelsinger, D., Pastor, M., Wright,
J., Marquez-Magana, L., and Riggs, B. (2015). Near-peer STEM mentoring offers unex-
pected benefits for mentors from traditionally underrepresented backgrounds. Perspec-
tives on Undergraduate Research and Mentoring, 4.1. Available: http://blogs.elon.edu/
purm/2015/11/11/near-peer-stem-mentoring-offers-unexpected-benefits-for-mentors [July
2016].
Tyler, R.W. (1949). Basic Principles of Curriculum and Instruction. Chicago, IL: University
of Chicago Press.
Weaver, G.C., Burgess, W.D., Childress, A.L., and Slakey, L. (2016). Transforming Institu-
tions: Undergraduate STEM Education for the 21st Century. West Lafayette, IN: Purdue
University Press.
Wieman, C., Perkins, K., and Gilbert, S. (2010). Transforming science education at large
research universities: A case study in progress. Change: The Magazine of Higher Learn-
ing, 42(2), 7–14.
Wiggins, G., and McTighe, J. (2005). Understanding by Design (second ed.). Alexandria, VA:
Association for Supervision and Curriculum Development.
Zabaleta, F. (2007). The use and misuse of student evaluations of teaching. Teaching in Higher
Education, 12(1), 55–76.
Equity, diversity, and inclusion are distinct concepts, and all three
are critically important to ensuring that the undergraduate STEM
educational system meets the nation’s needs and serves all people
(Witham et al., 2015). For quite some time, there has been ongoing discus-
sion about the compatibility of equity and excellence in STEM education
(e.g., Association of American Colleges & Universities, 2015; Gates, 1995;
Howard Hughes Medical Institute, 2016; Malcom et al., 1984). There is
growing recognition that in order to achieve excellence and effectiveness,
the STEM educational system needs to serve all students well (e.g., National
Academies of Sciences, Engineering, and Medicine, 2016). Therefore, the
committee’s second goal is for STEM undergraduate education to be equi-
table, diverse, and inclusive.
To be considered equitable, institutions and STEM departments would
provide enrolled students with adequate support to enter, persist, and suc-
cessfully complete STEM general education coursework or STEM degrees,
by engaging all students in evidence-based STEM educational practices
and programs.1 To be considered diverse, the national pool of students
participating and succeeding in undergraduate STEM education would be
representative of the demographics of the U.S. college student population.
STEM instructors, including faculty and graduate student educators, would
also reflect the national pool of individuals eligible to teach in undergradu-
ate STEM education. Finally, to be inclusive, undergraduate STEM learning
environments would need to effectively engage and educate diverse learners.
The major sections of this chapter address the committee’s four objec-
tives for Goal 2:
Objective 2.2: Representational diversity among STEM credential earners
  Indicator 2.2.1: Diversity of STEM degree and certificate earners in comparison with diversity of degree and certificate earners in all fields
Proposed Indicators
Developing and maintaining high levels of interest in STEM for all en-
tering college students are necessary conditions for advancing equity and di-
versity in undergraduate STEM education. While first-year college students
likely develop such interest over the course of their precollege educational
experiences, colleges and universities also play a role in promoting and
maintaining their interest in pursuing STEM degrees. For example, institu-
tions may be able to bolster interest in STEM among first-year students by
providing early exposure to the range of STEM program offerings, offer-
ing opportunities for career exploration, and providing both entering and
first-year students with individualized advising and degree planning (see
discussion of advising and mentoring in Chapter 3).
Achieving greater equity and diversity in undergraduate STEM edu-
cation also requires that students who express an interest in STEM early
in their college experiences actually enter into and persist in STEM de-
gree programs. Indicator 2.1.2 (above) also includes measures of student
persistence—the extent to which students who enter into a STEM degree
program maintain enrollment in that program. Again, disaggregating by
key student characteristics and STEM discipline would indicate whether
certain student populations are more likely to leave STEM fields than
others.
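
As a purely hypothetical sketch of such disaggregation (the data, group labels, and computation below are illustrative assumptions, not part of the proposed indicator or any existing data source), a year-to-year persistence rate by student group could be tallied as follows in Python:

from collections import defaultdict

# Each hypothetical record: (demographic group, persisted to year two?)
# for students who entered a STEM degree program in year one.
records = [
    ("group A", True), ("group A", False), ("group A", True),
    ("group B", True), ("group B", True), ("group B", False),
]

entered = defaultdict(int)    # denominator: entrants to the program
persisted = defaultdict(int)  # numerator: still enrolled the next year
for group, stayed in records:
    entered[group] += 1
    if stayed:
        persisted[group] += 1

for group in sorted(entered):
    rate = persisted[group] / entered[group]
    print(f"{group}: {rate:.0%} persisted ({persisted[group]}/{entered[group]})")

Comparing the resulting rates across groups, and across STEM disciplines, is what would reveal whether certain populations leave STEM at higher rates.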
ing their interest in STEM fields, strengthening their STEM identities, and
cultivating graduate school aspirations (Eagan et al., 2013). There is also
some evidence that these educational practices and programs are par-
ticularly effective in increasing learning and retention among historically
underrepresented students in STEM, making access to and participation
in such practices vital to advancing equity in STEM outcomes and degree
attainment.
This indicator would provide information about the patterns of ac-
cess to and participation in evidence-based STEM educational practices
and co-curricular programs. It would be disaggregated across student de-
mographic groups (race and ethnicity, gender, socioeconomic status, and
ability status), institutional type (e.g., research university, liberal arts col-
lege, 2-year college), and STEM discipline. Students’ engagement in such
practices and programs has been shown to promote mastery of STEM
concepts and skills and retention in STEM majors (e.g., National Research
Council, 2012). In addition, there is growing evidence that participation in
certain evidence-based programs outside the classroom (e.g., undergradu-
ate research, mentoring, bridge programs) can boost STEM retention and
career and graduate school aspirations (Eagan et al., 2013; Estrada, 2014;
Gilmer, 2007; Lenaburg et al., 2012; Packard, 2016). For example, a recent
review of research related to undergraduate research experiences (National
Academies of Sciences, Engineering, and Medicine, 2017) concluded that
participation in this type of evidence-based educational practice is beneficial
for all students and, for students from historically underrepresented groups,
improves their persistence in STEM and helps to validate their disciplinary
identity.
Only limited research is available on the extent to which different stu-
dent groups participate in these valuable evidence-based educational prac-
tices. For example, data from the National Survey of Student Engagement
(NSSE) indicate that certain student groups that are historically under-
represented in STEM fields, including Blacks, Hispanics, Native Ameri-
cans, low-income students, and first-generation students, are less likely
than other students to participate in undergraduate research and five other
high-impact practices, though these data do not provide insight into the
patterns of participation specifically for STEM majors2 (Finley and McNair,
2013; National Survey of Student Engagement, 2016). However, the recent
National Academies (2017) study of undergraduate research experiences
found that data on who participates in these experiences overall or at
specific types of institutions have not been collected systematically, and
recommended that institutions collect those data.
Proposed Indicators
not exactly mirror the patterns from non-STEM fields, which provide the
comparison group in this indicator. Nevertheless, the presence of large
equity gaps in STEM degree attainment at the national level would indi-
cate that the undergraduate STEM education system is not functioning as
intended and is failing to meet the needs of all students.
The committee notes that comparisons with other base populations
(e.g., all college students, national adult population) may also be informa-
tive. For example, a recent report on the status of women, minorities, and
persons with disabilities in science and engineering (National Science Foun-
dation, 2017) presents data on the proportion of STEM degrees earned
by historically underrepresented populations along with the demographic
breakdown of the U.S. noninstitutionalized adult population. Using this
broader base population as the comparison group shows larger equity
gaps because of the compounding of inequalities in the rates of high school
completion, college entrance, and entry to STEM degree programs expe-
rienced by historically underrepresented populations. In order to ensure
that any inequities identified from the committee’s proposed indicator are
attributable only to the undergraduate STEM educational system, our pro-
posed comparison group is all undergraduate degree and certificate earners.
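
To make the comparison concrete, the following Python sketch uses entirely hypothetical counts to contrast each group’s share of STEM credentials with its share of credentials in all fields, the comparison group proposed here; a parity ratio near 1.0 would indicate no gap.

# Hypothetical counts of credential earners by demographic group.
stem_earners = {"group A": 120, "group B": 60, "group C": 20}
all_earners = {"group A": 500, "group B": 400, "group C": 100}

stem_total = sum(stem_earners.values())
all_total = sum(all_earners.values())

for group in stem_earners:
    stem_share = stem_earners[group] / stem_total
    overall_share = all_earners[group] / all_total
    # A parity ratio near 1.0 means the group is represented among STEM
    # credential earners as it is among credential earners in all fields.
    print(f"{group}: STEM share {stem_share:.1%}, all-fields share "
          f"{overall_share:.1%}, parity ratio {stem_share / overall_share:.2f}")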
Proposed Indicators
Although instructor diversity is a concern across all fields, the represen-
tation of historically underrepresented groups among STEM instructors is
even lower than in other fields. Thus, the committee proposes the following
two indicators that can be used to monitor progress toward representa-
tional diversity among STEM instructors. The first indicator focuses on
faculty, whether tenured, untenured, adjunct, full time, or part time. The
second focuses on graduate student instructors. We present them together,
and the discussion that follows covers both.
tion. Currently, the diversity of STEM faculty does not reflect the diversity
of STEM graduate degree holders. We propose that these indicators will
allow for monitoring change in the diversity of STEM educators.
for engaging with students inside and outside the classroom (Gasiewski et
al., 2012).
Instructors’ perceptions of their departmental and institutional climates
may affect their willingness to experiment with new teaching strategies,
their engagement with colleagues and students, and their desire to continue
working at their current institution. Certain groups of instruc-
tors—primarily women and minorities—report higher levels of stress due to
discrimination or more hostile climates on campus (Turner and González,
2011). These factors tend to reduce overall job satisfaction (Sanderson,
Phua, and Herda, 2000) and thus increase the chances of their leaving their
academic appointments (Ponjuan, 2006; Rosser, 2004).
Proposed Indicators
Because the first two indicators of this objective are closely related, they
are presented together, and the discussion that follows covers both.
diverse, inclusive, and equitable. Such practices and policies include those
that aim to engage diverse learners in and outside of STEM classrooms in
order to produce more equitable outcomes. Though related to the indica-
tor above on institutional structures, policies, and practices that strengthen
levels of STEM readiness for entering college students (Indicator 2.1.1),
this indicator is distinct because it focuses on the experiences of students
enrolled in STEM academic programs, not on practices intended to increase
student readiness for such programs.
Students who are historically underrepresented in STEM fields of-
ten face unique challenges (stereotypes, stereotype threat, implicit bias)
that negatively affect their ability to enter, persist, and succeed in STEM
fields (Fries-Britt and Griffin, 2007; Godsil, 2016; Martin, 2009; McGee
and Martin, 2011). Culturally responsive pedagogies and instructional
approaches include those that are interactive and asset based and that focus
on building student identities. These practices and approaches have been dem-
onstrated to build more equitable learning environments in some educa-
tional contexts (Gay, 2000; Ladson-Billings, 1995; Moses et al., 1989;
Nasir et al., 2014; Paris, 2012), but they have not been widely imple-
mented in STEM at the postsecondary level (Davis, Hauk, and Latiolais,
2009). For example, most mathematics classrooms use standard lecture
formats (Eagan, 2016). Though additional research is needed, a growing
evidence base suggests that developing diverse students’ sense of belong-
ing and improving student-faculty interactions are conducive to broadening
participation in STEM fields. The committee envisions that this indicator
would allow for the monitoring of data about the prevalence of culturally
responsive educational approaches used in STEM departments. Recruiting
and retaining a diverse STEM faculty is also critical to inclusive educa-
tional environments (as discussed above). Thus, the indicator would include
measures about the use of search and hiring practices that are effective in
diversifying STEM faculty (e.g., implicit bias training).
REFERENCES
American Association for the Advancement of Science. (2011). Vision and Change in Un-
dergraduate Biology Education: A Call to Action. Washington, DC: Author. Available:
http://visionandchange.org/finalreport [July 2017].
American Association for the Advancement of Science. (2015). Vision and Change in Under-
graduate Biology Education: Chronicling Change, Inspiring the Future. Washington, DC:
Author. Available: http://visionandchange.org/files/2015/07/VISchange2015_webFin.pdf
[July 2017].
American Association of Community Colleges. (2014). Datapoints: High Paying Occupations.
Washington, DC: Author. Available: http://www.aacc.nche.edu/Publications/datapoints/
Documents/HighOccupations_10.14.pdf [July 2017].
An, B.P. (2013). The impact of dual enrollment on college degree attainment: Do low-SES
students benefit? Educational Evaluation and Policy Analysis, 35(1), 57–75.
Eagan, M.K., Hurtado, S., Figueroa, T., and Hughes, B. (2014). Examining STEM Pathways
among Students Who Begin College at Four-Year Institutions. Paper prepared for the
Committee on Barriers and Opportunities in Completing 2- and 4-Year STEM Degrees.
Available: http://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/
dbasse_088834.pdf [July 2017].
Elrod, S., and Kezar, A. (2015). Increasing student success in STEM. Peer Review, 17(2). Avail-
able: https://www.aacu.org/peerreview/2015/spring/elrod-kezar [July 2017].
Elrod, S., and Kezar, A. (2016). Increasing student success in STEM: An overview of a new
guide to systemic institutional change. In G.C. Weaver, W.D. Burgess, A.L. Childress, and
L. Slakey (Eds.), Transforming Institutions: Undergraduate STEM Education for the 21st
Century. West Lafayette, IN: Purdue University Press.
Estrada, M. (2014). Ingredients for Improving the Culture of STEM Degree Attainment with
Co-curricular Supports for Underrepresented Minority Students. Paper prepared for the
Committee on Barriers and Opportunities in Completing 2- and 4-Year STEM Degrees.
Available: http://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/
dbasse_088832.pdf [July 2017].
Finley, A., and McNair, T. (2013). Assessing Underserved Students’ Engagement in High-
Impact Practices. Washington, DC: Association of American Colleges & Universities.
Fries-Britt, S., and Griffin, K. (2007). The Black box: How high-achieving Blacks resist stereo-
types about Black Americans. Journal of College Student Development, 48(5), 509–524.
Gasiewski, J.A., Eagan, M.K., Garcia, G.A., Hurtado, S., and Chang, M.J. (2012). From gate-
keeping to engagement: A multicontextual, mixed method study of student academic en-
gagement in introductory STEM courses. Research in Higher Education, 53(2), 229–261.
Gates, J. (1995). Equity vs. excellence: A false dichotomy in science and society. The Scientist,
9(14), 12.
Gay, G. (2000). Culturally Responsive Teaching: Theory, Practice and Research. New York:
Teachers College Press.
Gilmer, T.C. (2007). An understanding of the improved grades, retention and graduation rates
of STEM majors at the Academic Investment in Math and Science (AIMS) Program of
Bowling Green State University (BGSU). Journal of STEM Education, 8(1), 11–21.
Godsil, R.D. (2016). Why race matters in physics class. UCLA Law Review Discourse, 64, 40.
Gurin, P., Dey, E., Hurtado, S., and Gurin, G. (2002). Diversity and higher education: Theory
and impact on educational outcomes. Harvard Educational Review, 72(3), 330–367.
Howard Hughes Medical Institute. (2016). Inclusive Excellence: Engaging All Students in
Science. Washington, DC: Howard Hughes Medical Institute. Available: https://www.
hhmi.org/sites/default/files/Programs/Inclusive/Inclusive-Excellence-2018-Program-
Announcement.pdf [September 2017].
Hurtado, S. (2001). Linking diversity and educational purpose: How diversity affects the class-
room environment and student development. In G. Orfield (Ed.), Diversity Challenged:
Evidence on the Impact of Affirmative Action (pp. 187–203). Cambridge, MA: Harvard
Education Publishing Group.
Hurtado, S., Han, J.C., Sáenz, V.B., Espinosa, L.L., Cabrera, N.L., and Cerna, O.S. (2007).
Predicting transition and adjustment to college: Biomedical and behavioral science aspi-
rants’ and minority students’ first year of college. Research in Higher Education, 48(7),
841–887.
Hurtado, S., Alvarez, C. L., Guillermo-Wann, C., Cuellar, M., and Arellano, L. (2012). A
model for diverse learning environments. In Higher Education: Handbook of Theory
and Research (pp. 41–122). Netherlands: Springer.
Ibarra, R.A. (2001). Beyond Affirmative Action: Reframing the Context of Higher Education.
Madison: University of Wisconsin Press.
National Research Council. (2013). Monitoring Progress Toward Successful K-12 STEM
Education: A Nation Advancing? Washington, DC: The National Academies Press. doi:
10.17226/13509.
National Science Foundation. (2016). Science and Engineering Indicators 2016. Arlington, VA:
Author. Available: https://www.nsf.gov/statistics/2016/nsb20161/#/ [July 2017].
National Science Foundation. (2017). Women, Minorities, and Persons with Disabilities in
Science and Engineering: 2017 (NSF 17-310). Arlington, VA: Author. Available: www.
nsf.gov/statistics/wmpd [July 2017].
National Survey of Student Engagement. (2016). NSSE 2016 High-Impact Practices: U.S.
Summary Percentages by Student Characteristics. Bloomington, IN: National Survey of
Student Engagement. Available: http://nsse.indiana.edu/2016_institutional_report/pdf/
HIPTables/HIP.pdf [July 2017].
Packard, B.W. (2016). Successful STEM Mentoring Initiatives for Underrepresented Students:
A Research-Based Guide for Faculty and Administrators. Sterling, VA: Stylus.
Paris, D. (2012). Culturally sustaining pedagogy: A needed change in stance, terminology, and
practice. Educational Researcher, 41(3), 93–97.
Ponjuan, L. (2006). A national study of job satisfaction of faculty of color in doctoral institu-
tions. Journal of the Professoriate, 1(1), 45–70.
President’s Council of Advisors on Science and Technology. (2012). Engage to Excel: Produc-
ing One Million Additional College Graduates with Degrees in Science, Technology,
Engineering, and Mathematics. Washington, DC: Author. Available: https://obamawhite
house.archives.gov/sites/default/files/microsites/ostp/pcast-engage-to-excel-final_2-25-12.
pdf [July 2017].
Rosser, V.J. (2004). Faculty members’ intentions to leave: A national study on their work life
and satisfaction. Research in Higher Education, 45(3), 285–309.
Rutschow, E.Z., and Diamond, J. (2015). Laying the Foundations: Early Findings from the
New Math Ways Project. New York: MDRC.
Salzman, H., and Van Noy, M. (2014). Crossing the Boundaries: STEM Students in Four-Year
and Community Colleges. Paper prepared for the Committee on Barriers and Opportuni-
ties in Completing 2- and 4-Year STEM Degrees. Available: http://sites.nationalacademies.
org/cs/groups/dbassesite/documents/webpage/dbasse_089924.pdf [July 2017].
Sanderson, A., Phua, V.C., and Herda, D. (2000). The American Faculty Poll. Chicago, IL:
National Opinion Research Center.
Smith, D.G. (2015). Diversity’s Promise for Higher Education: Making It Work. Baltimore,
MD: Johns Hopkins University Press.
Speroni, C. (2011). Determinants of Students’ Success: The Role of Advanced Placement and
Dual Enrollment Programs (NCPR Working Paper). New York: National Center for
Postsecondary Research.
Turner, C.S.V., and González, J.C. (2011). Faculty women of color: The critical nexus of race
and gender. Journal of Diversity in Higher Education, 4(4), 199–211.
Umbach, P.D. (2006). The contribution of faculty of color to undergraduate education. Re-
search in Higher Education, 47(3), 317–345.
Van Noy, M., and Zeidenberg, M. (2014). Hidden STEM Knowledge Producers: Community
Colleges’ Multiple Contributions to STEM Education and Workforce Development.
Paper prepared for the Committee on Barriers and Opportunities in Completing 2- and
4-Year STEM Degrees. Available: http://sites.nationalacademies.org/cs/groups/dbassesite/
documents/webpage/dbasse_088831.pdf [July 2017].
Wang, X. (2013). Modeling entrance into STEM fields of study among students beginning
at community colleges and four-year institutions. Research in Higher Education, 54(6),
664–692.
Wang, X. (2015). Pathway to a baccalaureate in STEM fields: Are community colleges a vi-
able route and does early STEM momentum matter? Educational Evaluation and Policy
Analysis, 37(3), 376–393. doi: 10.3102/0162373714552561.
Williams, D.A., Berger, J.B., and McClendon, S.A. (2005). Toward a Model of Inclusive
Excellence and Change in Postsecondary Institutions. Washington, DC: Association of
American Colleges & Universities.
Witham, K., Malcom-Piqueux, L.E., Dowd, A.C., and Bensimon, E.M. (2015). America’s
Unmet Promise: The Imperative for Equity in Higher Education. Washington, DC: As-
sociation of American Colleges & Universities.
The committee recognizes that it is not possible to specify a target “ad-
equate number” of STEM professionals, given the varying demands
across the different STEM disciplines and types of occupations now
and in the future. However, the committee is nevertheless committed to
ensuring that the nation has a robust, highly talented STEM workforce.
Advancing Goal 3 will require progress toward Goal 1 (increasing
students’ mastery of STEM concepts and skills through engagement in
evidence-based STEM educational practices and programs) and Goal 2
(striving for equity, diversity, and inclusion). Progress toward Goals 1 and
2 will increase the numbers of students entering and persisting in STEM
fields and ultimately earning STEM credentials. However, achieving Goal 3
would not necessarily require that all of these increased numbers of STEM
graduates enter STEM professions. Rather, graduates would apply their
STEM knowledge, skills, and ways of thinking at work in diverse STEM
and non-STEM occupations and through civic participation, helping to
address the grand challenges facing society (see “Vision” in Chapter 1).
As a first step toward developing indicators of progress toward this
goal, the committee identified three specific objectives for advancing the
goal:
Objective 3.2: Successful navigation into and through STEM programs of study
  Indicator 3.2.1: Retention in STEM programs, course to course and year to year
Proposed Indicator
Proposed Indicators
and Alfonso, 2011). For example, Bowen, Chingos, and McPherson (2009)
found that 2-year students who transferred to a public flagship university
were as likely to graduate as those who started there, and those who trans-
ferred to less selective public 4-year institutions had a greater chance of
graduating than native students.
Different types of 4-year institutions vary in their acceptance of transfer
credits, with public institutions accepting the most credits. Simone (2014)
found that transfer students who entered private nonprofit institutions
transferred 21 percent fewer credits than those who entered public institu-
tions, and those who entered private for-profit institutions transferred 52
percent fewer credits. This variation, in turn, influences completion rates:
transfer students’ completion of bachelor’s degrees is highest (65%) at
public institutions, followed by private nonprofit institutions (60%), and
lowest at private for-profit institutions (35%).
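
A small worked example makes these figures concrete; the 60-credit baseline below is a hypothetical assumption, while the percentage reductions and completion rates are those reported above.

# Hypothetical baseline: students entering public institutions transfer
# 60 credits on average. Only the percentage reductions and completion
# rates below come from Simone (2014) as summarized in the text.
public_credits = 60.0
nonprofit_credits = public_credits * (1 - 0.21)  # 21 percent fewer
forprofit_credits = public_credits * (1 - 0.52)  # 52 percent fewer
completion = {"public": 0.65, "private nonprofit": 0.60,
              "private for-profit": 0.35}

print(f"public: {public_credits:.0f} credits, "
      f"completion {completion['public']:.0%}")
print(f"private nonprofit: {nonprofit_credits:.1f} credits, "
      f"completion {completion['private nonprofit']:.0%}")
print(f"private for-profit: {forprofit_credits:.1f} credits, "
      f"completion {completion['private for-profit']:.0%}")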
Proposed Indicator
REFERENCES
Accreditation Board for Engineering and Technology, Inc. (2016). Criteria for Accrediting En-
gineering Programs. Baltimore, MD: Author. Available: http://www.abet.org/wp-content/
uploads/2015/10/E001-16-17-EAC-Criteria-10-20-15.pdf [July 2016].
Association of Public and Land-grant Universities. (2017). Getting FIU Students Chem-
istry Ready. Available: http://www.aplu.org/projects-and-initiatives/accountability-and-
transparency/using-data-to-increase-student-success/APLU_WhitePaper_Florida_C.pdf
[September 2017].
Astin, A.W., and Astin, H.S. (1992). Undergraduate Science Education: The Impact of Dif-
ferent College Environments on the Educational Pipeline in the Sciences. Final Report.
Washington, DC: National Science Foundation.
Bailey, A.L., and Carroll, P.E. (2015). Assessment of English language learners in era of new
academic content standards. Review of Research in Education, 39(1), 253–294.
Bailey, T. (2009). Rethinking developmental education in community college. Community
College Research Center Brief, 40. Available: http://ccrc.tc.columbia.edu/media/k2/
attachments/rethinking-developmental-education-in-community-college-brief.pdf [July
2017].
Bailey, T., Jeong, D.W., and Cho, S.-W. (2010). Referral, enrollment, and completion in devel-
opmental education sequences in community colleges. Economics of Education Review,
29(2), 255–270.
Bailey, T., Bashford, J., Boatman, A., Squires, J., Weiss, M., Doyle, W., Valentine, J.C., LaSota,
R., Polanin, J.R., Spinney, E., Wilson, W., Yeide, M., and Young, S.H. (2016). Strategies
for Postsecondary Students in Developmental Education: A Practice Guide for College
and University Administrators, Advisors, and Faculty. Washington, DC: Institute of
Education Sciences, What Works Clearinghouse.
Bailey, T.R., Jaggars, S.S., and Jenkins, D. (2015). Redesigning America’s Community Colleges:
A Clearer Path to Student Success. Cambridge, MA: Harvard University Press.
Barr, D.A., Gonzalez, M.E., and Wanat, S.F. (2008). The leaky pipeline: Factors associated
with early decline in interest in premedical studies among underrepresented minority
undergraduate students. Academic Medicine, 83(5), 503–511.
Bettinger, E. (2010). To be or not to be: Major choices in budding scientists. In C.T. Clotfelter
(Ed.), American Universities in a Global Market (pp. 69–98). Chicago, IL: University of
Chicago Press.
Bowen, W.G., Chingos, M.M., and McPherson, M.S. (2009). Crossing the Finish Line: Com-
pleting College at America’s Public Universities. Princeton, NJ: Princeton University
Press.
Bressoud, D., Mesa, V., and Rasmussen, C. (2015). Insights and Recommendations from
the MAA National Study of College Calculus. Washington, DC: Mathematical Associa-
tion of America Press. Available: http://www.maa.org/sites/default/files/pdf/cspcc/Insights
andRecommendations.pdf [April 2016].
Carlan, P.E., and Byxbe, F.R. (2000). Community colleges under the microscope: An analysis
of performance predictors for native and transfer students. Community College Review,
28(2), 27–42.
Chen, X. (2009). Students Who Study Science, Technology, Engineering, and Mathematics
(STEM) in Postsecondary Education. (NCES 2009-161). Washington, DC: U.S. Depart-
ment of Education, National Center for Education Statistics. Available: https://nces.
ed.gov/pubs2009/2009161.pdf [July 2017].
Correll, S.J. (2001). Gender and the career choice process: The role of biased self-assessments.
American Journal of Sociology, 106 (6), 1691–1730.
Crisp, G., Nora, A., and Taggart, A. (2009). Student characteristics, pre-college, college, and
environmental factors as predictors of majoring in and earning a STEM degree: An analy-
sis of students attending a Hispanic serving institution. American Educational Research
Journal, 46(4), 924–942.
Director, S.W., Khosla, P.K., Rohrer, R.A., and Rutenbar, R.A. (1995). Reengineering the cur-
riculum: Design and analysis of a new undergraduate electrical and computer engineer-
ing degree at Carnegie Mellon University. Proceedings of the Institute of Electrical and
Electronics Engineers, 83(9), 1246–1269.
Eagan, K., Herrera, F.A., Garibay, J.C., Hurtado, S., and Chang, M. (2011). Becoming STEM
Protégés: Factors Predicting the Access and Development of Meaningful Faculty–Student
Relationships. Presented at Association for Institutional Research Annual Forum, To-
ronto, Ontario, May 24.
Eagan, K., Hurtado, S., Figueroa, T., and Hughes, B. (2014). Examining STEM Pathways
among Students Who Begin College at Four-Year Institutions. Paper prepared for the
Committee on Barriers and Opportunities in Completing 2- and 4-Year STEM Degrees.
Available: http://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/
dbasse_088834.pdf [July 2017].
Ellis, J., Fosdick, B.K., and Rasmussen, C. (2016). Women 1.5 times more likely to leave STEM
pipeline after calculus compared to men: Lack of mathematical confidence a potential cul-
prit. PLOS ONE, 11(7). Available: http://journals.plos.org/plosone/article?id=10.1371/
journal.pone.0157447 [May 2016].
Glass, J.C., and Harrington, A.R. (2002). Academic performance of community college trans-
fer students and “native” students at a large state university. Journal of Research and
Practice, 26, 415–430.
Grant, H., and Dweck, C.S. (2003). Clarifying achievement goals and their impact. Journal
of Personality and Social Psychology, 85(3), 541–553.
Hills, J. (1965). Transfer shock: The academic performance of the transfer student. The Journal
of Experimental Education, 33(3). (ERIC Document Reproduction Service No. ED 010 740).
Jenkins, D., and Fink, J. (2015). What We Know about Transfer. New York, NY: Columbia
University, Teachers College, Community College Research Center. Available: https://
ccrc.tc.columbia.edu/media/k2/attachments/what-we-know-about-transfer.pdf [October
2017].
Jenkins, D., and Weiss, M.J. (2011). Charting Pathways to Completion for Low-Income
Community College Students. (CCRC Working Paper No. 34). New York: Community
College Research Center.
Jenkins, D., Jaggars, S.S., Roksa, J., Zeidenberg, M., and Cho, S.-W. (2009). Strategies for
Promoting Gatekeeper Course Success Among Students Needing Remediation: Research
Report for the Virginia Community College System. New York: Columbia University,
Teachers College, Community College Research Center.
Logue, A.W., Watanabe-Rose, M., and Douglas, D. (2016). Should students assessed as need-
ing remedial mathematics take college-level quantitative courses instead? A randomized
controlled trial. Educational Evaluation and Policy Analysis, 38(3), 1–21.
Melguizo, T., Kienzl, G.S., and Alfonso, M. (2011). Comparing the educational attainment of
community college transfer students and four-year college rising juniors using propensity
score matching methods. Journal of Higher Education, 82(3), 265–291.
Mervis, J. (2010). Better intro courses seen as key to reducing attrition of STEM majors. Sci-
ence, 330(6002), 306.
Miller, C., and Settle, A. (2011). When practice doesn't make perfect: Effects of task goals on
learning computing concepts. Association for Computing Machinery Transactions on
Computing Education, 11(4).
Monaghan, D.B., and Attewell, P. (2015). The community college route to the bachelor's degree.
Educational Evaluation and Policy Analysis, 37(1), 70–91. Available: http://journals.
sagepub.com/doi/pdf/10.3102/0162373714521865 [September 2017].
National Academies of Sciences, Engineering, and Medicine. (2016). Barriers and Opportuni-
ties for 2-Year and 4-Year STEM Degrees: Systemic Change to Support Diverse Student
Pathways. Washington, DC: The National Academies Press. doi:10.17226/21739.
National Research Council. (1999). Being Fluent with Information Technology. Washington,
DC: National Academy Press. Available: http://www.nap.edu/openbook.php?record_
id=6482 [July 2017].
National Science Foundation. (2015). Revisiting the STEM Workforce: A Companion to Sci-
ence and Engineering Indicators 2014. Arlington, VA: Author. Available: http://www.nsf.
gov/pubs/2015/nsb201510/nsb201510.pdf [March 2016].
November, N., and Day, K. (2012). Using undergraduates’ digital literacy skills to improve
their discipline-specific writing: A dialog. International Journal for the Scholarship of
Teaching and Learning, 6(2), Article 5. Available: http://digitalcommons.georgiasouthern.
edu/ij-sotl/vol6/iss2/5 [July 2017].
OECD. (2013). Skilled for Life? Key Findings from the Survey of Adult Skills. Paris: Author.
Available: http://www.oecd.org/site/piaac/SkillsOutlook_2013_ebook.pdf [May 2016].
Ohland, M.W., Sheppard, S.D., Lichtenstein, G., Eris, O., and Chachra, D. (2008). Persistence,
engagement, and migration in engineering programs. Journal of Engineering Education,
97(3), 259–278.
Ost, B. (2010). The role of peers and grades in determining major persistence in the sciences.
Economics of Education Review, 29(6), 923–934.
Parker, L.C., Adedokun, O., and Weaver, G.C. (2016). Culture, policy and resources: Barriers
reported by faculty implementing course reform. In G.C. Weaver, W.D. Burgess, A.L.
Childress, and L. Slakey, (eds.), Transforming Institutions: Undergraduate STEM Educa-
tion for the 21st Century. West Lafayette, IN: Purdue University Press.
President’s Council of Advisors on Science and Technology. (2012). Engage to Excel: Producing
One Million Additional College Graduates with Degrees in Science, Technology, Engi-
neering and Mathematics. Washington, DC: Author. Available: https://obamawhitehouse.
archives.gov/sites/default/files/microsites/ostp/pcast-engage-to-excel-final_2-25-12.pdf [July
2017].
Pyburn, D.T., Pazicni, S., Benassi, V.A., and Tappin, E.E. (2013). Assessing the relationship
between language comprehension and performance in general chemistry. Chemistry
Education Research and Practice, 14, 524–541.
Radford, A.W., and Horn, L. (2012). An Overview of Classes Taken and Credits Earned
by Beginning Postsecondary Students (Web Tables, NCES 2013–151rev). Washington,
DC: U.S. Department of Education, National Center for Education Statistics. Available:
https://nces.ed.gov/pubs2013/2013151rev.pdf [July 2017].
Rask, K. (2010). Attrition in STEM fields at a liberal arts college: The importance of grades
and pre-collegiate preferences. Economics of Education Review, 29(6), 892−900.
Sardone, N.B. (2011). Developing information technology (IT) fluency in college students: An
investigation of learning environments and learner characteristics. Journal of Information
Technology Education, 10, 101–122.
Saxe, K., and Braddy, L. (2016). A Common Vision for Undergraduate Mathematical Sciences
Programs in 2025. Washington, DC: Mathematical Association of America. Available:
http://www.maa.org/sites/default/files/pdf/CommonVisionFinal.pdf [May 2017].
Scott-Clayton, J. (2011). The Shapeless River: Does a Lack of Structure Inhibit Students’
Progress at Community Colleges? (CCRC Working Paper No. 25). New York: Columbia
University, Teachers College, Community College Research Center.
Seymour, E. (2001). Tracking the processes of change in U.S. undergraduate education in
science, mathematics, engineering, and technology. Science Education, 86(1), 79–105.
Seymour, E., and Hewitt, N. (1997). Talking About Leaving: Why Undergraduates Leave the
Sciences. Boulder, CO: Westview Press.
Simone, S.A. (2014). Transferability of Postsecondary Credit Following Student Transfer or
Coenrollment. (NCES 2014-163). Washington, DC: U.S. Department of Education, In-
stitute of Education Sciences, National Center for Education Statistics. Available: https://
nces.ed.gov/pubs2014/2014163.pdf [September 2017].
Sparks, D., and Malkus, N. (2013). First-year undergraduate remedial coursetaking:
1999−2000, 2003−04, 2007−08. Statistics in Brief. Washington, DC: Institute of Educa-
tion Statistics, U.S. Department of Education, National Center for Education Statistics.
Available: http://nces.ed.gov/pubs2013/2013013.pdf [June 2016].
Stinebrickner, R., and Stinebrickner, T.R. (2013). A Major in Science? Initial Beliefs and Final
Outcomes for College Major and Dropout. (Centre for Human Capital and Productiv-
ity Working Papers, 2013–2014). London, ON: Department of Economics, University
of Western Ontario. Available: http://ir.lib.uwo.ca/cgi/viewcontent.cgi?article=1093and
context=economicscibc [July 2016].
Thompson, P.W., Castillo-Chavez, C., Culbertson, R.J., Flores, A., Greeley, R., Haag, S., Rose,
S.D., and Rutowski, R.L. (2007). Failing the Future: Problems of Persistence and Reten-
tion in Science, Technology, Engineering, and Mathematics (STEM) Majors at Arizona
State University. Phoenix: Arizona State University.
Tobias, S. (1990). They’re Not Dumb, They’re Different: Stalking the Second Tier. Tucson,
AZ: Research Corporation.
U.S. Department of Education. (2014). Percentage of First-Year Undergraduate Students Who
Reported Taking Remedial Education Courses, by Selected Student and Institution Char-
acteristics: 2003–04, 2007–08, and 2011–12 [Data file]. Institute of Education Sciences,
National Center for Education Statistics. Available: https://nces.ed.gov/programs/digest/
d15/tables/dt15_311.40.asp [October 2017].
Valentine, J.C., Konstatopoulos, S., and Goldrick-Rab, S. (2017). What happens to students
placed in developmental education? A meta-analysis of regression discontinuity studies.
Review of Educational Research, 87(4), 806–833. Available: http://journals.sagepub.com/
doi/pdf/10.3102/0034654317709237 [October 2017].
Vaz, R. (2004). The promise of computer literacy. Liberal Education, 90(4). Available: https://
www.aacu.org/publications-research/periodicals/promise-computer-literacy [July 2017].
Xie, Y., and Killewald, A.A. (2012). Is American Science in Decline? Cambridge, MA: Harvard
University Press.
Yamada, H., Bohannen, A., and Grunow, A. (2016). Assessing the Effectiveness of Quant-
way: A Multilevel Model with Propensity Score Matching. Carnegie Math Pathways
Technical Report. Stanford, CA: Carnegie Foundation for the Advancement of Teaching.
Available: https://www.carnegiefoundation.org/wp-content/uploads/2016/10/Quantway_
propensity_score_matching_10-2016.pdf [February 2018].
This chapter addresses the committee's charge to review existing sys-
tems for monitoring undergraduate STEM education. The first sec-
tion provides an overview of currently available data on higher
education in STEM fields. The next two sections review public and propri-
etary data sources, respectively. The fourth section discusses existing moni-
toring systems that contain elements related to the committee’s proposed
indicators. The final section focuses directly on the committee’s indicators,
summarizing for each indicator current data sources, potential new data
sources, and the research and data development that would be required
to tap those potential sources for the purpose of ongoing monitoring of
undergraduate STEM education.
OVERVIEW
Although many different postsecondary education data sources are
available, they are limited in their ability to track students’ progress into
and through STEM programs and monitor the status of the committee’s
goals for undergraduate STEM.
The various public and proprietary data sources currently available are
summarized in Table 6-1. These data sources rely primarily on three types
of data: (1) student and faculty unit record administrative data, (2) aggre-
gated institution-level data, and (3) surveys of individual students and
instructors (see Box 6-1).
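The distinctions among these three types of data can be made concrete with a minimal sketch; the structures and field names below are illustrative inventions, not drawn from any actual data dictionary.

```python
from dataclasses import dataclass

# (1) Student unit record: one row per student, held by an institution
#     or state system; supports longitudinal tracking and disaggregation.
@dataclass
class StudentUnitRecord:
    student_id: str
    institution_id: str
    term: str
    major_cip_code: str       # field of study (CIP taxonomy)
    credits_attempted: float
    credits_earned: float
    race_ethnicity: str
    gender: str
    pell_recipient: bool

# (2) Aggregated institution-level record: one row per institution, as
#     reported to a system such as IPEDS; individual students are no
#     longer identifiable, so cross-institution tracking is lost.
@dataclass
class InstitutionAggregate:
    institution_id: str
    year: int
    stem_enrollment: int
    stem_completions: int

# (3) Survey response: self-reported data from a sampled student or
#     instructor, typically weighted to represent a national population.
@dataclass
class SurveyResponse:
    respondent_id: str
    sampling_weight: float
    answers: dict
```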
TABLE 6-1 Continued (Source; Frequency; Coverage and Representativeness; Feasibility of Disaggregation)

Proprietary sources:

National Student Clearinghouse — Frequency: annual. Coverage and representativeness: 98% of institutions represented, but institutions do not always provide students' demographic characteristics, disciplines, and degree programs. Feasibility of disaggregation: limited for student characteristics.

HERI Faculty Survey — Frequency: every 3 years. Coverage and representativeness: strong coverage among 4-year nonprofit institutions; nationally representative of full-time faculty at 4-year institutions. Feasibility of disaggregation: strong for faculty at 4-year institutions.
BOX 6-1
Types of Data on Postsecondary Education
1 IPEDS has recently expanded its data collections to include part-time students and transfer students.
Every institution that participates in federal student financial aid programs is required under Title IV of the Higher Education Act as amended in 1992 (P.L. 102-325) to provide data annually.
Because of this requirement, response rates are very high. For example, in
the spring 2010 data collection, the response rate for each of the survey
components was more than 99 percent (Knapp, Kelly-Reid, and Ginder,
2012). According to the NCES Handbook of Survey Methods (Burns,
Wang, and Henning, 2011), IPEDS includes the universe of postsecondary
institutions participating in federal student financial aid programs.
In 2014, about 7,300 institutions complied with the mandate to re-
spond, and an additional 200 institutions that did not participate in federal
financial aid programs voluntarily provided data (National Center for Edu-
cation Statistics, 2014). Individual institutions, or in some cases, the state
higher education systems responding on behalf of multiple institutions, pro-
vide data describing their institutional characteristics, enrollments, comple-
tions and completers, graduation rates and other outcome measures, faculty
and staff, finances, institutional prices, student financial aid, admissions,
and academic libraries. To do so, institutional research staff or administra-
tors aggregate internal administrative records (i.e., student unit record data)
to create institution-level data files and submit them to IPEDS.
IPEDS data are collected and released three times each year and are
made publicly accessible in two online platforms: the College Navigator, which can be used by students, families, educational policy makers, and others,2 and the IPEDS Data Center.3 To ensure data quality, the NCES
Statistical Standards Program publishes statistical standards and provides
methodological and statistical support to assist NCES staff and contractors
in meeting the standards, with the goal of providing high-quality, reliable,
and useful statistical information to policy makers and the public (National
Center for Education Statistics, 2012). Several data elements in IPEDS (see
National Center for Education Statistics, 2014) are relevant to the committee's
proposed indicators.
12-Month Enrollment
Data on 12-month enrollment for undergraduate and graduate students
are collected in the fall. The data include unduplicated headcounts and
instructional activity in contact or credit hours. Instructional activity is
used to compute a standardized, 12-month, full-time-equivalent enrollment.
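A minimal sketch of that computation follows. The divisor constants reflect the common IPEDS convention of dividing annual undergraduate credit hours by 30 for semester calendars (45 for quarter calendars); treat them here as assumptions to verify against current IPEDS instructions.

```python
# Illustrative 12-month FTE computation from instructional activity.
# Divisors follow the common IPEDS convention for undergraduate credit
# hours; verify against current IPEDS instructions before relying on them.
CREDIT_HOUR_DIVISOR = {"semester": 30, "quarter": 45}

def twelve_month_fte(credit_hours: float, calendar: str) -> float:
    """Convert a year's undergraduate credit hours to FTE enrollment."""
    return credit_hours / CREDIT_HOUR_DIVISOR[calendar]

# Example: 450,000 credit hours at a semester-calendar institution
# corresponds to 15,000 full-time-equivalent undergraduates.
print(twelve_month_fte(450_000, "semester"))  # 15000.0
```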
Completions
Completion data covering all degrees (associate's, bachelor's, master's, and doctoral) and sub-baccalaureate awards are collected in the fall. These
data are disaggregated by race and ethnicity, gender, and field of study.
They include all STEM degrees and awards received by students, both those
who began at the reporting institution and those who transferred to that
institution.
Graduation Rates
The graduation data cover the initial cohort of full-time, first-time,
degree- and certificate-seeking undergraduate students at 2-year and 4-year
institutions; the number of those students who complete their degrees or
certificates within 150 percent of the normal time (i.e., 3 years or 6 years);
and the number of those students who transferred to other institutions.
Data are reported by race and ethnicity, gender, and field of study. The data
also include 100 percent graduation rates: 4-year bachelor’s degree rates
have been reported since 1997; 2-year certificate and degree rates have been
reported since 2008–2009.
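A minimal sketch of the 150-percent-of-normal-time calculation follows, using hypothetical unit records; the cohort, names, and dates are invented for illustration.

```python
from datetime import date

# Hypothetical cohort: full-time, first-time students entering fall 2010.
# completed = None means no credential was earned at this institution.
cohort = [
    {"id": "A", "entry": date(2010, 9, 1), "completed": date(2014, 5, 15)},
    {"id": "B", "entry": date(2010, 9, 1), "completed": date(2017, 5, 15)},
    {"id": "C", "entry": date(2010, 9, 1), "completed": None},  # transferred out
]

def grad_rate_150(cohort, normal_years=4):
    """Share of the cohort completing within 150% of normal time
    (6 years for a 4-year program, 3 years for a 2-year program)."""
    window_days = 1.5 * normal_years * 365.25
    completers = sum(
        1 for s in cohort
        if s["completed"] is not None
        and (s["completed"] - s["entry"]).days <= window_days
    )
    return completers / len(cohort)

# Only student A finishes within the 6-year window; student C, who
# transferred out and may have graduated elsewhere, counts against
# the rate -- the limitation discussed in the text below.
print(grad_rate_150(cohort))  # 0.333...
```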
It is important to note that these data do not include part-time students, students who transfer to the reporting institution, and students who transfer
out and later graduate from another institution. Given the high rates of
student “swirl” in STEM fields, these data do not accurately capture STEM
graduation rates.
The Beginning Postsecondary Students Longitudinal Study (BPS) follows nationally representative cohorts of students from their entry into postsecondary education onward (Wine, Janson, and Wheeless, 2011). The full BPS 04/09 dataset provides
rich information on students’ course histories, enrollment and matricula-
tion pathways, college experiences and perceptions, and retention and
graduation outcomes. Data are disaggregated by race and ethnicity, gender,
socioeconomic status, enrollment status, disability status, field of study, and
institution type. However, the sample sizes do not allow simultaneous disaggregation by demographic characteristics, field of study, and institution type.
The current cohort, BPS 12/17, began college in 2012, was followed
up in 2014, and was followed up again in 2017; the data are not yet avail-
able. Continuing these regular cycles will be critical for informing some of
the proposed indicators. In addition, more frequent data collection would
allow the indicators to be updated annually, rather than only once every
3 years.
4 As defined in Title IV of the Higher Education Act, such financial aid includes loans under
the Federal Family Education Loan Program or William D. Ford Federal Direct Loan (Direct
Loan) Program, as well as Perkins Loans, Pell Grants, Teacher Education Assistance for Col-
lege and Higher Education Grants, Academic Competitiveness Grants or Science and Math
Access to Retain Talent Grants, and Parent PLUS loans.
Federal grant support has been essential, and not all states have provided state funding to maintain
these systems after federal grants expired.
Freshman Survey
The HERI freshman survey gathers data from incoming first-time full-
time college students on their educational attitudes and aspirations. More
than 1,900 four-year institutions have participated in the survey since
1966. In 2015, HERI identified 1,574 institutions that offer baccalaureate degrees that were included in IPEDS and invited them to participate.
This national population of institutions was divided into 26 stratification
groups, based on institutional race, type (e.g., university, 4-year college,
2-year college), control (e.g., public, private nonsectarian, Roman Catholic,
other religious), and selectivity. Of the 1,574 institutions, 308 institu-
tions responded, a 19 percent response rate. Generally, the response rate
among students in the institutions that participated has been high, averag-
ing 75 percent. Since 2011, data from institutions have been included in
the “national norms sample” only if there was a response rate of at least
65 percent among incoming full-time first-year students. Data from in-
stitutions just below this cutoff are included if the survey administration
methods showed no systematic biases in freshman class coverage. In 2015,
data from 199 institutions, representing 141,189 student responses, met
these criteria and were included in the national norms sample (Eagan et
al., 2016).
In 2015, the survey data were weighted by a two-step procedure. The
first weight was designed to adjust for response bias within institutions,
and the second weight was designed to compensate for nonresponding
institutions within each stratification group by gender. The weighted data
are nationally representative of first-time, full-time freshmen in nonprofit
4-year colleges and universities in the United States (Eagan et al., 2016;
National Academies of Sciences, Engineering, and Medicine, 2016). Reflect-
ing the high quality of these data, NSF relies on them for the undergradu-
ate education section of the National Science Board’s biennial Science and
Engineering Indicators report (National Science Foundation, 2016).
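The logic of this two-step weighting can be sketched as follows; the cells, counts, and adjustment factors are hypothetical, intended only to show how a within-institution nonresponse adjustment composes with a poststratification adjustment by stratification group and gender.

```python
# Hypothetical two-step survey weighting, mirroring the logic described
# above: step 1 adjusts for nonresponse within each institution; step 2
# ratio-adjusts each (stratification group x gender) cell to national counts.

# Step 1: within-institution weight = eligible freshmen / respondents.
institution = {"eligible": 2000, "respondents": 1500}
w1 = institution["eligible"] / institution["respondents"]  # ~1.33

# Step 2: scale the weighted respondent total in each cell to the
# national population count for that cell.
national_count = {("public_university", "women"): 250_000}
weighted_sample = {("public_university", "women"): 50_000}  # sum of w1 values

cell = ("public_university", "women")
w2 = national_count[cell] / weighted_sample[cell]  # 5.0

# Final weight applied to each respondent in this cell.
final_weight = w1 * w2
print(round(final_weight, 2))  # 6.67
```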
Because the HERI Freshman Survey does not include 2-year institutions and undersamples part-time students, it does not provide data that are nationally representative of the U.S. population of 2-year and 4-year students. In terms
of disaggregation, although the data include the proportion of entering stu-
dents who stated an intention to major in a STEM field and later completed
a degree in that field, they do not measure students’ actual selections of
major field (e.g., switching into STEM majors). The data are disaggregated
by demographic characteristics.
Faculty Survey
With the suspension of the NCES Survey of Postsecondary Faculty in
2004 (see above), researchers and policy makers have increasingly relied
on the HERI Faculty Survey. Although this survey is designed to include
full- and part-time faculty members, most participating institutions choose
to sample only full-time faculty. In addition, although both 2-year and
4-year institutions are invited to participate, 4-year nonprofit institutions
predominate in the survey. The survey includes questions about working
conditions and activities and teaching approaches.
In 2014, HERI identified a national population of 1,505 institutions
that grant baccalaureate degrees that had responded to the IPEDS 2012–
2013 human resources survey and invited them to participate in the HERI
Faculty Survey. The national population was divided into 20 stratifica-
tion groups based on type, control, and selectivity. Of those invited, 148
institutions participated, a 9 percent response rate. HERI also developed
a supplemental sample of 67 institutions to enhance the number of respon-
dents from types of institutions that participated at a lower rate than others
to create a normative national sample of institutions (Eagan et al., 2014b).
To be included in the normative national sample, colleges were required
to have responses from at least 35 percent of full-time undergraduate teaching faculty.
MONITORING SYSTEMS
The committee was not able to locate any existing systems that are
designed specifically for monitoring the status and quality of undergradu-
ate STEM education. However, it did identify existing monitoring systems
that include elements relevant to undergraduate STEM, as discussed below.
Enrollment
The levels and flows of enrollment in STEM show how the different
STEM fields are changing over time and so can inform decision makers
charged with directing resources to undergraduate education. For post-
secondary education, the enrollment data include the number of students enrolled in STEM relative to other fields; change over time in the number of
undergraduate degrees conferred; demographic characteristics of students
enrolled in STEM fields, including citizenship status; and the number of stu-
dents enrolled in 2-year institutions by demographic characteristics. These
statistics are tabulated from the IPEDS fall enrollment survey.
Science and Engineering Indicators (National Science Foundation, 2016) presents the number and growth rates
of associate and baccalaureate degrees awarded in STEM, by demographic
characteristics, drawing on the IPEDS completion survey.
Data Gaps
A recent review of SEI (National Research Council, 2014) noted that,
although it provides a bevy of statistics on 4-year and postgraduate enroll-
ments and degrees, it needs improved information on 2-year students who
later earn higher degrees in STEM. Noting an increase in students who at-
tend 2-year institutions as part of their 4-year STEM education, the review
recommended that NCSES track graduates of 2-year institutions in STEM
fields and publish data on these students’ persistence at different levels of
education (National Research Council, 2014, p. 18).
BOX 6-2
Example Measure of Completion of Foundational Courses
Indicator 1.1.1:
Use of Evidence-Based STEM Educational Practices
in Course Development and Delivery
Summary of current data sources, needed additions, and further development for selected indicators (table excerpt):

Indicator 2.2.1, Diversity of STEM Degree and Certificate Earners in Comparison with Diversity of Degree and Certificate Earners in All Fields — Source: IPEDS. Needed additions: include items on students' Pell (socioeconomic) status and disability status in data provided by institutions. Further development: none.

Indicator 2.2.2, Diversity of Transfers from 2- to 4-Year STEM Programs in Comparison with Diversity of Students in 2-Year STEM Programs — Source: NSC. Needed additions: add student attributes (gender, race and ethnicity, Pell status, disability status) to the data voluntarily submitted by institutions to NSC. Further development: more comprehensive participation among, and coverage of, all types of postsecondary institutions.

Indicator 2.2.3, Time to Degree for Students in STEM Academic Programs — Source: NSC. Needed additions: same as above. Further development: increase coverage of students' academic programs in the data voluntarily submitted by institutions to NSC.

Indicator 3.1.1, Completion of Foundational Courses, Including Developmental Education Courses, to Ensure STEM Program Readiness — Source: BPS, including developmental and foundational courses. Further development: expand the number of institutions and students sampled to allow more granular disaggregation.

Indicator 3.2.1, Retention in STEM Programs, Course to Course and Year to Year — Sources: BPS 04/09 (needed additions: none; further development: expand the number of institutions and students sampled to allow more granular disaggregation) and HERI Freshman Survey with NSC (needed additions: none; further development: incorporate 2-year institutions in the Freshman Survey and increase coverage of students' academic programs in NSC data provided by institutions).

Indicator 3.2.2, Transfers from 2-Year to 4-Year STEM Programs — Source: NSC. Needed additions: none. Further development: increase coverage of students' academic programs in the data voluntarily submitted by institutions to NSC.
More generally, the validity of data from any self-report survey can be
threatened by faking, cheating, and motivation. Respondents may be mo-
tivated by social desirability (the tendency to think of and present oneself
in a favorable light) and therefore not respond accurately. For example,
Manduca and colleagues (2017) recently found that instructors reported
frequently using evidence-based teaching practices on self-report surveys,
but observational methods (discussed below) indicated that these instruc-
tors rarely did so. If self-report surveys are used for high-stakes purposes
(e.g., to inform decisions about promotion and tenure), such use can create additional incentives to tailor one's responses to present oneself in the best
possible light (Sackett, 2012).
Measuring teaching is difficult, and different measurement methods
(e.g., self-report surveys, interviews, observations) have varying strengths,
weaknesses, and costs (William T. Grant Foundation, Spencer Foundation,
and Bill & Melinda Gates Foundation, 2014). Observational methods,
such as the Reformed Teaching Observation Protocol (Piburn and Sawada,
2000) and the more recent Classroom Observation Protocol for Under-
graduate STEM (Smith et al., 2013) require trained experts who analyze
either videotapes or actual instruction using protocols that describe vari-
ous teaching practices. These methods provide high-quality data, but they
are time-consuming and expensive to implement even for small groups of
instructors. For practical reasons, and to capture information on teaching
practices among larger samples of instructors, development of self-report
surveys is continuing (e.g., Wieman and Gilbert, 2014). For these same
practical reasons, self-report surveys of instructors would be the most likely
source of national data for indicators to monitor the use of evidence-based
practices in and outside the classroom.
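Such protocols typically reduce interval-by-interval observation codes to the share of class time spent in each activity. The sketch below illustrates that reduction with invented data; the two-letter codes are in the spirit of COPUS but should not be read as its official code set or procedure.

```python
from collections import Counter

# Hypothetical codes from one 50-minute class observation: one code per
# 2-minute interval (real protocols allow multiple simultaneous codes,
# omitted here for simplicity). "Lec" = lecturing, "CQ" = clicker
# question, "GW" = group work, "AnQ" = answering a student question.
intervals = ["Lec", "Lec", "Lec", "CQ", "GW", "GW", "Lec", "AnQ",
             "Lec", "Lec", "GW", "GW", "Lec", "Lec", "Lec", "CQ",
             "Lec", "Lec", "Lec", "Lec", "Lec", "Lec", "GW", "Lec", "Lec"]

def time_shares(codes):
    """Fraction of observed intervals spent in each coded activity."""
    counts = Counter(codes)
    return {code: n / len(codes) for code, n in counts.items()}

shares = time_shares(intervals)
print(shares["Lec"])  # 0.68: share of intervals coded as lecture
```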
Since 2004, when NCES last administered the National Survey of
Postsecondary Faculty (NSOPF), HERI has conducted the only com-
prehensive, nationally representative survey of faculty, and this survey
covers only faculty at 4-year colleges and universities. The HERI Faculty
Survey includes questions about instructors’ activities related to research,
teaching, and service, as well as their perceptions of students, campus ad-
ministration, and workplace stressors. It also invites respondents to report
on their participation in a range of professional development opportuni-
ties. The resulting (weighted) dataset represents the national population of
full-time faculty with responsibility for teaching undergraduates at 4-year
colleges and universities. However, the HERI Faculty Survey data have four
significant limitations as indicators of use of evidence-based educational
practices. First, the data are self-reports, which as noted above sometimes
do not correspond with more direct measures of instruction, such as ob-
servational protocols. Second, the HERI data are collected primarily from
full-time instructors at 4-year institutions, missing the growing numbers of part-time instructors and those who teach at 2-year institutions.
Indicator 1.1.2:
Use of Evidence-Based Practices Outside the Classroom
Indicator 1.2.1:
Extent of Instructors’ Involvement in Professional Development
Indicator 1.2.2:
Availability of Support or Incentives for Evidence-
Based Course Development or Course Redesign
Research is needed to identify the relevant dimensions of such support, and additional dimensions may be needed. For example, the PULSE rubrics ad-
dress several dimensions of support, including support for teaching/learning
needs in STEM and faculty mentoring for the teaching role (PULSE Fellows,
2016). Such research is the first step toward developing new survey ques-
tions that might be included in existing surveys of institutions (e.g., IPEDS)
or in a revived National Survey of Postsecondary Faculty.
Indicator 1.3.1:
Use of Valid Measures of Teaching Effectiveness
Indicator 1.3.2:
Consideration of Evidence-Based Teaching in Personnel
Decisions by Departments and Institutions
Indicator 2.1.1:
Institutional Structures, Policies, and Practices That Strengthen
STEM Readiness for Entering and Enrolled College Students
Indicator 2.1.2:
Entrance to and Persistence in STEM Academic Programs
Indicator 2.1.3:
Equitable Student Participation in Evidence-Based
STEM Educational Programs and Experiences
Indicator 2.2.1:
Diversity of STEM Degree and Certificate Earners in Comparison
with Diversity of Degree and Certificate Earners in All Fields
Indicator 2.2.2:
Diversity of Transfers from 2-Year to 4-Year STEM Programs in
Comparison with Diversity of Students in 2-Year STEM Programs
Indicator 2.2.3:
Time-to-Degree for Students in STEM Academic Programs
8 Because Pell grants are given to low-income students, data on their recipients can be used as a proxy for students' socioeconomic status.
Indicator 2.3.1:
Diversity of STEM Instructors in Comparison with the
Diversity of STEM Graduate Degree Holders
Indicator 2.3.2:
Diversity of STEM Graduate Student Instructors in Comparison
with the Diversity of STEM Graduate Students
Indicator 2.4.1:
Students Pursuing STEM Credentials Feel Included and
Supported in Their Academic Programs and Departments
Indicator 2.4.2:
Instructors Teaching Courses in STEM Disciplines Feel
Included and Supported in Their Departments
9 This survey is administered by the HERI and the Culturally Engaging Campus Environments survey.
Indicator 2.4.3:
Institutional Practices Are Culturally Responsive,
Inclusive, and Consistent across the Institution
Indicator 3.1.1:
Completion of Foundational Courses, Including Developmental
Education Courses, to Ensure STEM Program Readiness
Indicator 3.2.1:
Retention in STEM Degree or Certificate Programs,
Course to Course and Year to Year
Indicator 3.2.2:
Transfers from 2-Year to 4-Year STEM Programs in
Comparison with Transfers to All 4-Year Programs
Indicator 3.3.1:
Percentage of Students Who Attain STEM Credentials
over Time, Disaggregated by Institution Type, Transfer
Status, and Demographic Characteristics
11 As noted above, almost all U.S. institutions receive Title IV funds.
12 See https://nces.ed.gov/ipeds/cipcode/Default.aspx?y=55 [August 2017].
The committee found that IPEDS and other federal data sources gen-
erally allow data to be disaggregated by students’ race and ethnicity and
gender. However, conceptions of diversity have broadened to include ad-
ditional student groups that bring unique strengths to undergraduate STEM
education and may also encounter unique challenges. To fully support the
indicators, federal data systems will need to include additional student
characteristics.
The committee also reviewed the many new, proprietary data sources
that have been developed over the past two decades in response to growing
accountability pressures in higher education. Although not always nation-
ally representative of 2-year and 4-year public and private institutions,
some of these sources include large samples of institutions and address the
committee’s goals and objectives.
Based on its review of existing public and proprietary data sources,
the committee considered research needs and data availability for each of
the 21 proposed indicators. It found that, for some indicators, further re-
search is needed to develop clear definitions and measurement approaches,
and overall, the availability of data for the indicators is limited. For some
indicators, nationally representative datasets are available, but when these
data are disaggregated, first to focus on STEM students and then to focus
on specific groups of STEM students, the sample sizes become too small to yield statistically reliable estimates. For other indicators, no data are available from
either public or proprietary sources.
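A hypothetical illustration of this successive-disaggregation problem follows: each filter leaves only a fraction of the previous cell, and the final cell falls below a minimum reportable size. The 20 percent, 30 percent, 1 percent, and threshold-of-30 figures are arbitrary illustrations, not NCES rules.

```python
# Why successive disaggregation exhausts a sample: each filter keeps
# only a fraction of the previous cell.
MIN_CELL = 30  # illustrative suppression threshold, not an NCES rule

sample = 25_000                  # total students in a hypothetical survey
stem = int(sample * 0.20)        # ~20% in STEM fields -> 5,000
stem_2yr = int(stem * 0.30)      # at 2-year institutions -> 1,500
subgroup = int(stem_2yr * 0.01)  # a 1% demographic subgroup -> 15

for label, n in [("all students", sample), ("STEM", stem),
                 ("2-year STEM", stem_2yr), ("subgroup", subgroup)]:
    flag = "OK" if n >= MIN_CELL else "SUPPRESS (too small to report)"
    print(f"{label}: n={n} -> {flag}")
```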
REFERENCES
Armstrong, J., and Zaback, K. (2016). Assessing and Improving State Postsecondary Data
Systems. Washington, DC: Institute for Higher Education Policy. Available: http://www.
ihep.org/sites/default/files/uploads/postsecdata/docs/resources/state_postsecondary_data_
systems-executive_summary.pdf [June 2016].
Berk, R.A. (2005). Survey of 12 strategies for measuring teaching effectiveness. International
Journal on Teaching and Learning in Higher Education, 17(1), 48–62.
Brancaccio-Taras, L., Pape-Lindstrom, P., Peteroy-Kelly, M., Aguirre, K., Awong-Taylor, J.,
Balser, R., Cahill, M.J., Frey, R.G., Jack, R., Kelrick, M., Marley, K., Miller, K.G.,
Osgood, M., Romano, S., Uzman, J.A., and Zhao, J. (2016). The PULSE vision & change
rubrics, version 1.0: A valid and equitable tool to measure transformation of life sciences
departments at all institution types. CBE-Life Sciences Education, 15(4), art 60. Avail-
able: http://www.lifescied.org/content/15/4/ar60.full [March 2017].
Brick, J.M., and Williams, D. (2013). Explaining rising nonresponse rates in cross-sectional
surveys. The ANNALS of the American Academy of Political and Social Science, 645(1),
36–59.
Burns, S., Wang, X., and Henning, A. (Eds.). (2011). NCES Handbook of Survey Methods. (NCES
2011-609). Washington, DC: U.S. Department of Education, National Center for Edu-
cation Statistics. Available: https://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2011609
[July 2017].
Campbell, C.M., and Cabrera, A.F. (2014). Making the mark: Are grades and deep learning
related? Research in Higher Education, 55(5), 494–507.
Cataldi, E.F., Fahimi, M., and Bradburn, E.M. (2005). 2004 National Study of Postsecond-
ary Faculty (NSOPF:04) Report on Faculty and Instructional Staff in Fall 2003. (NCES
2005-172). Washington, DC: U.S. Department of Education, National Center for Educa-
tion Statistics. Available: http://nces.ed.gov/pubs2005/2005172.pdf [July 2016].
Center for Postsecondary Research. (2017a). About: FSSE. Bloomington: Indiana University
Center for Postsecondary Research. Available: http://fsse.indiana.edu/html/about.cfm
[June 2017].
Center for Postsecondary Research. (2017b). About: NSSE. Bloomington: Indiana University
Center for Postsecondary Research. Available: http://nsse.indiana.edu/html/about.cfm
[June 2017].
Chen, X. (2016). Remedial Coursetaking at U.S. Public 2- and 4-Year Institutions: Scope,
Experiences, and Outcomes. (NCES 2016-405). Washington, DC: U.S. Department
of Education, National Center for Education Statistics. Available: https://nces.ed.gov/
pubs2016/2016405.pdf [July 2017].
Chen, X., and Soldner, M. (2013). STEM Attrition: College Students’ Paths Into and Out of
STEM Fields. Washington, DC: U.S. Department of Education.
Complete College America. (2014). Four-Year Myth: Make College More Affordable, Restore
the Promise of Graduating on Time. Indianapolis, IN: Author.
Cunningham, A.F., and Milam, J. (2005). Feasibility of a Student Unit Record System Within
the Integrated Postsecondary Education Data System. (NCES 2005-160). Washington,
DC: U.S. Department of Education, National Center for Education Statistics.
Drinkwater, M.J., Matthews, K.E., and Seiler, J. (2017). How is science being taught? Mea-
suring evidence-based teaching practices across undergraduate science departments.
CBE Life Sciences Education, 16(1), ar18, 1-11. Available: http://www.lifescied.org/
content/16/1/ar18.full.pdf [October 2017].
Dynarski, S.M., Hemelt, S.W., and Hyman, J.M. (2013). The Missing Manual: Using National
Student Clearinghouse Data to Track Postsecondary Outcomes. (Working Paper No.
19552). Cambridge, MA: National Bureau of Economic Research. Available: http://
www.nber.org/papers/w19552 [September 2017].
Eagan, K., Hurtado, S., Figueroa, T., and Hughes, B. (2014a). Examining STEM Pathways
Among Students Who Begin College at Four-Year Institutions. Paper prepared for the
Committee on Barriers and Opportunities in Completing 2- and 4-Year STEM De-
grees. Washington, DC. Available: http://sites.nationalacademies.org/cs/groups/dbassesite/
documents/webpage/dbasse_088834.pdf [April 2015].
Eagan, M.K., Stolzenberg, E.B., Berdan Lozano, J., Aragon, M.C., Suchard, M.R., and
Hurtado, S. (2014b). Undergraduate Teaching Faculty: The 2013–2014 HERI Faculty
Survey. Los Angeles: Higher Education Research Institute, University of California,
Los Angeles. Available: http://heri.ucla.edu/monographs/HERI-FAC2014-monograph.
pdf [August 2016].
Eagan, M.K., Stolzenberg, E.B., Ramirez, J.J., Aragon, M.C., Suchard, M.R., and Rios-Aguilar,
C. (2016). The American Freshman: Fifty-Year Trends, 1966–2015. Los Angeles: Higher
Education Research Institute, University of California, Los Angeles. Available: http://
www.heri.ucla.edu/monographs/50YearTrendsMonograph2016.pdf [August 2016].
Estrada, M. (2014). Ingredients for Improving the Culture of STEM Degree Attainment with
Co-curricular Supports for Underrepresented Minority Students. Paper prepared for the
Committee on Barriers and Opportunities in Completing 2- and 4-Year STEM Degrees.
Available: http://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/
dbasse_088832.pdf [July 2017].
Executive Office of the President. (2015). Using Federal Data to Measure and Improve the
Performance of U.S. Institutions of Higher Education. Washington, DC: Executive
Office of the President. Available: https://collegescorecard.ed.gov/assets/UsingFederalData
ToMeasureAndImprovePerformance.pdf [February 2018].
Fosnacht, K., Sarraf, S., Howe, E., and Peck, L. (2017). How important are high response rates
for college surveys? Review of Higher Education, 40(2), 245–265.
Ginder, S., Kelly-Reid, J.E., and Mann, F.B. (2017). Graduation Rates for Selected Cohorts,
2008-2013; Outcome Measures for Cohort Year 2008; Student Financial Aid, Academic
Year 2015-2016; and Admissions in Postsecondary Institutions, Fall 2016: First Look
(Preliminary Data). (NCES 2017-150). Washington, DC: U.S. Department of Education,
National Center for Education Statistics. Available: https://nces.ed.gov/pubsearch/
pubsinfo.asp?pubid=2017150 [November 2017].
Handelsman, J., and Ferrini-Mundy, J. (2016). STEM Education: Cross-Agency Priority Goal
Quarterly Progress Update, FY2016 Quarter 1. Washington, DC: Office of Science and
Technology Policy.
HCM Strategists. (2013). A Better Higher Education Data and Information Framework for Informing
Policy: The Voluntary Institutional Metrics Project. Washington, DC: HCM Strate-
gists. Available: http://hcmstrategists.com/wp-content/themes/hcmstrategists/docs/gates_
metrics_report_v9.pdf [January 2017].
Jenkins, D., and Fink, J. (2016). Tracking Transfer: New Measures of Institutional and State
Effectiveness in Helping Community College Students Attain Bachelor’s Degrees. New
York: Community College Research Center, Columbia University. Available: http://ccrc.
tc.columbia.edu/media/k2/attachments/tracking-transfer-institutional-state-effectiveness.
pdf [September 2017].
Khan, B. (2016). Overview of Science and Engineering Indicators 2016. Presentation to the
Committee on Developing Indicators for Undergraduate STEM Education, February 22.
Available: http://sites.nationalacademies.org/cs/groups/dbassesite/documents/webpage/
dbasse_171321.pdf [June 2016].
Knapp, L.G., Kelly-Reid, J.E., and Ginder, S.A. (2012). Enrollment in Postsecondary Institu-
tions, Fall 2011; Financial Statistics, Fiscal Year 2011; and Graduation Rates, Selected
Cohorts, 2003–2008: First Look. (Provisional data, NCES 2012-174). Washington, DC:
U.S. Department of Education, National Center for Education Statistics.
Manduca, C.A., Iverson, E.R., Luxenberg, M., Macdonald, R.H., McConnell, D.A., Mogk,
D.W., and Tewksbury, B.J. (2017). Improving undergraduate STEM education: The ef-
ficacy of discipline-based professional development. Science Advances, 3(2), 1–15.
Miller, B. (2016). Building a Student-level Data System. Washington, DC: Institute for Higher
Education Policy. Available: http://www.ihep.org/sites/default/files/uploads/postsecdata/
docs/resources/building_a_student-level_data_system.pdf [June 2017].
National Academies of Sciences, Engineering, and Medicine. (2016). Barriers and Opportuni-
ties for 2-Year and 4-Year STEM Degrees: Systemic Change to Support Students’ Diverse
Pathways. Washington, DC: The National Academies Press.
National Academies of Sciences, Engineering, and Medicine. (2017). Undergraduate Research
Experiences for STEM Students: Successes, Challenges, and Opportunities. Washington,
DC: The National Academies Press.
National Center for Education Statistics. (2012). 2012 Revision of NCES Statistical Standards:
Final. Washington, DC: National Center for Education Statistics. Available: https://nces.
ed.gov/statprog/2012/ [September 2017].
National Center for Education Statistics. (2014). Integrated Postsecondary Education Data
System (IPEDS). Available: http://nces.ed.gov/statprog/handbook/pdf/ipeds.pdf [June
2016].
National Center for Education Statistics. (2015). Table 326.20. Digest of Educational Sta-
tistics. Available: http://nces.ed.gov/programs/digest/d14/tables/dt14_326.20.asp [June
2016].
National Research Council. (2012). Discipline-Based Education Research: Understanding and
Improving Learning in Undergraduate Science and Engineering. Washington, DC: The
National Academies Press.
National Research Council. (2014). Capturing Change in Science, Technology, and Innovation:
Improving Indicators to Inform Policy. Washington, DC: The National Academies Press.
Available: http://www.nap.edu/catalog/18606/capturing-change-in-science-technology-
and-innovation-improving-indicators-to [June 2016].
National Science Foundation. (2016). Science and Engineering Indicators 2016. Arlington,
Virginia: National Science Foundation. Available: https://www.nsf.gov/statistics/2016/
nsb20161/#/ [February 2018].
National Student Clearinghouse. (2016a). Notes from the Field #3: Research Center Notes on
Letter of Sara Goldrick-Rab and Douglas N. Harris, University of Wisconsin-Madison.
Herndon, VA: Author. Available: https://nscresearchcenter.org/workingwithourdata/
notesfromthefield-3 [July 2016].
National Student Clearinghouse. (2016b). Who We Are. Herndon, VA: Author. Available:
http://www.studentclearinghouse.org/about [June 2016].
Piburn, M., and Sawada, D. (2000). Reformed Teaching Observation Protocol (RTOP) Refer-
ence Manual. Available: http://files.eric.ed.gov/fulltext/ED447205.pdf [September 2017].
Porter, S.R. (2013). Self-reported learning gains: A theory and a test of college student survey
response. Research in Higher Education, 54(2), 201–226.
PULSE Fellows. (2016). The PULSE Vision and Change Rubrics Version 2.0. Available:
http://api.ning.com/files/Kfu*MfW7V8MYZfU7LNGdOnG4MnryzUgUpC2IxdtUmucn
B4QNCdLaOwWGoMoULSeKw8hF9jiFdh75tlzuv1nqtfCuM11hNPp3/PULSERubrics-
Packetv2_0_FINALVERSION.pdf [May 2017].
Sackett, P. (2012). Faking in personality assessments: Where do we stand? In M. Ziegler, C.
MacCann, and R.D. Roberts (Eds.), New Perspectives on Faking in Personality Assessment
(pp. 330–344). New York: Oxford University Press.
Seymour, E., Wiese, D., Hunter, A., and Daffinrud, S.M. (2000). Creating a Better Mousetrap:
On-line Student Assessment of Their Learning Gains. Paper presentation at the National
Meeting of the American Chemical Society, San Francisco, CA, March 27.
Smith, M.K., Jones, F.H.M., Gilbert, S.L., and Wieman, C.E. (2013). The classroom obser-
vation protocol for undergraduate STEM (COPUS): A new instrument to characterize
university STEM classroom practices. CBE-Life Sciences Education, 12(4), 618–627. Available: http://www.lifescied.
org/content/12/4/618.full [June 2016].
Van Noy, M., and Zeidenberg, M. (2014). Hidden STEM Knowledge Producers: Community
Colleges’ Multiple Contributions to STEM Education and Workforce Development.
Paper Prepared for the Committee on Barriers and Opportunities in Completing 2- and
4-Year STEM Degrees. Available: http://sites.nationalacademies.org/cs/groups/dbassesite/
documents/webpage/dbasse_088831.pdf [June 2017].
Walter, E.M., Beach, A.L., Henderson, C., and Williams, C.T. (2016). Describing instruc-
tional practice and climate: Two new instruments. In G.C. Weaver, W.D. Burgess, A.L.
Childress, and L. Slakey (Eds.), Transforming Institutions: Undergraduate STEM Educa-
tion for the 21st Century. West Lafayette, IN: Purdue University Press.
Whitfield, C., and Armstrong, J. (2016). The State of State Postsecondary Data Systems:
Strong Foundations 2016. Boulder, CO: State Higher Education Executive Officers. Avail-
able: http://www.sheeo.org/sites/default/files/publications/SHEEO_StrongFoundations
2016_FINAL.pdf [June 2016].
Wieman, C., and Gilbert, S. (2014). The teaching practices inventory: A new tool for charac-
terizing college and university teaching in mathematics and science. CBE-Life Sciences
Education, 13(3), 552-569.
William T. Grant Foundation, Spencer Foundation, and Bill & Melinda Gates Foundation
(2014). Measuring Instruction in Higher Education: Summary of a Convening. New
York: William T. Grant Foundation. Available: http://wtgrantfoundation.org/library/
uploads/2015/11/Measuring-Instruction-in-Higher-Education.pdf [October 2017].
Wine, J., Janson, N., and Wheeless, S. (2011). 2004/09 Beginning Postsecondary Students Lon-
gitudinal Study (BPS:04/09) Full-scale Methodology Report. (NCES 2012-246). Wash-
ington, DC: National Center for Education Statistics, Institute of Education Sciences,
U.S. Department of Education. Available: https://nces.ed.gov/pubs2012/2012246_1.pdf
[June 2017].
As detailed in Chapter 6, nationally representative data are not cur-
rently available from public or proprietary sources for most of the
committee’s proposed indicators. This limits policy makers’ ability
to track progress toward the committee’s goals of (1) increasing students’
mastery of STEM concepts and skills; (2) striving for equity, diversity, and
inclusion; and (3) ensuring adequate numbers of STEM professionals. That
chapter outlines steps toward developing each of the 21 indicators by revis-
ing various public and proprietary data sources to provide the data needed
for each one.
This chapter aims to reduce the complexity of implementing the indica-
tor system by presenting three options for obtaining the data required for
all of the indicators: (a) creating a national student unit record data system;
(b) expanding National Center for Education Statistics (NCES) data col-
lections; and (c) combining existing data from nonfederal sources. It also
discusses new data collection and analysis systems that could potentially be
used in the future to support the proposed indicator system. The chapter
ends with the committee’s conclusion about moving forward, including a
caution about the intended use of the proposed indicator system.
1 The National Center for Education Statistics has added new survey components to begin
Creating such a system is currently prohibited by a ban in the Higher Education Act (see Chapter 6). At this time, however, there are bipartisan bills
in Congress (H.R. 2434 and S. 1121, the College Transparency Act) that
would amend the Higher Education Act to repeal the current ban on a
national student unit record data system and direct the NCES to create such
a system. If the bills became law, NCES, when creating the new system,
could take advantage of the lessons learned from the many state higher
education systems and multi-institution education reform consortia that
have successfully collected and used unit record student data to monitor
undergraduate education.
Creating a national database of student unit records appears to be both
technically and financially feasible and could reduce institutions’ current
burden of reporting IPEDS data, as shown in two feasibility studies.
In 2005, NCES commissioned the first study, to examine the feasibility
of creating a student unit record data system that could replace the ag-
gregated institution-level data included in IPEDS. The study (Cunningham
and Milam, 2005) presented three findings. First, the authors found that
NCES had at the time most of the computing hardware and software neces-
sary to implement such a system, including equipment for web-based data
collection and servers for storing large amounts of student data. However,
to ensure the security and confidentiality of the data, NCES would have to
create a new, permanent database storage system, protected by physical and
software firewalls; the authors did not estimate how much these modifica-
tions would have cost.
Second, the authors found that implementing the new system at that
time would present colleges and universities with technical challenges,
requiring expenditures for new technology, training in the use of the new
reporting system, and personnel. Cunningham and Milam (2005) gathered
estimates of implementation costs from hundreds of people from a variety
of individual institutions, state higher education agencies, and higher educa-
tion associations. The cost estimates varied widely, depending on whether
an institution was currently participating in a state-level student unit record
data system (see Chapter 6) and its information technology and institu-
tional research capabilities. Another key factor was whether an institution
was already uploading student data to the National Student Loan Data
System (NSLDS; see Chapter 6); at the time, nearly all institutions were
doing so. Given these complex factors influencing costs, Cunningham and
Milam (2005) did not estimate an average per-institution cost, but noted
the possibility of providing federal support to defray these costs.
Third, the authors found that institutional costs would eventually de-
cline, partly because some IPEDS reporting would be eliminated.
Cunningham and Milam (2005) concluded that it was technically fea-
sible for most institutions to report student data to a national student unit
record data system, given time for transition. They did not address, how-
ever, whether this new reporting would be financially feasible for participat-
ing colleges and universities.
More recently, Miller (2016) analyzed various approaches to develop-
ing a national student unit record data system to be overseen by the Depart-
ment of Education. Like Cunningham and Milam (2005), the author noted
that nearly all institutions were already reporting data to NSLDS, which
included much of the data needed to track the progress of students receiving
financial aid over time (e.g., enrollments, transfers, field of study, comple-
tions). Because, on average, 70 percent of all students receive financial aid
(Executive Office of the President of the United States, 2017), NSLDS al-
ready includes much of the data needed for a national system. Miller thus
proposed that a national data system could best be created by building on
the existing capability of NSLDS.
Expanding NSLDS to include all data from all students appears tech-
nically feasible, based on the system’s recent history of adding 17 percent
more student records between February 2010 and July 2013. Miller (2016)
estimated that the programming changes to accommodate this growth
would cost around $1 million. Miller cautioned, however, that NSLDS
already has significant technical limitations and a history of poor process-
ing speeds. Adding millions of additional records on students who do not
receive financial aid could slow the system’s ability to perform its core
function of ensuring students receive financial aid and repay their loans.
To address this problem, Miller (2016) proposed a complete modernization
of NSLDS, which would require additional funding; he did not provide a
cost estimate.
In this proposal, NCES would handle access to the student unit record
data system by policy makers, researchers, and the public (Miller, 2016).
At least once a year, a data extract would be transmitted to NCES, which
would be responsible for generating public reports on higher education and
populating IPEDS with data no longer being reported to it. NCES would
also establish and implement protocols for allowing access to the database
while maintaining the privacy and confidentiality of individual student
records.
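One common technique for reconciling such linkage with privacy protection is to replace direct identifiers with keyed one-way hashes before records leave the source system. The sketch below shows the idea; the choice of HMAC-SHA-256 and the field layout are illustrative assumptions, not a description of NSLDS or NCES practice.

```python
import hashlib
import hmac

# A shared secret key held by the data steward; records hashed with the
# same key can be linked across institutions without transmitting raw
# identifiers. Illustrative only -- not actual NCES/NSLDS practice.
LINKAGE_KEY = b"replace-with-securely-managed-secret"

def pseudonymize(student_id: str) -> str:
    """Replace a direct identifier with a keyed one-way hash."""
    return hmac.new(LINKAGE_KEY, student_id.encode(), hashlib.sha256).hexdigest()

# Two institutions report the same student; the hashes match, so the
# records can be joined, but the raw identifier is never shared.
record_a = {"pseudo_id": pseudonymize("123-45-6789"), "institution": "A"}
record_b = {"pseudo_id": pseudonymize("123-45-6789"), "institution": "B"}
print(record_a["pseudo_id"] == record_b["pseudo_id"])  # True
```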
Miller (2016) argues that moving to the system he proposes would not
burden most institutions with massive, costly changes in their reporting, for
two reasons. First, NSLDS requires institutions to report data on only those
students who receive federal financial aid. Yet many institutions submit data
on all of their students to the National Student Clearinghouse (NSC), which
in turn reports on financial aid recipients to NSLDS on behalf of these
institutions. For this large group of public and private institutions (more
than 3,600 according to NSC; see Chapter 6), moving to a student unit
record system would simply mean passing along data they are already as-
sembling. Second, the student unit record data system would replace seven of the existing IPEDS survey components.
TABLE 7-1 Objectives, Indicators, and Proposed Data Sources (excerpt)

Objective 1.2: Existence and use of supports that help STEM instructors use evidence-based learning experiences
— Indicator 1.2.1: Extent of instructors' involvement in professional development. Proposed data source: renewed and expanded NSOPF.
— Indicator 1.2.2: Availability of support or incentives for evidence-based course development or course redesign. Proposed data source: renewed and expanded NSOPF.
2 As noted above, the committee uses the term “instructor” to refer to all individuals who
teach undergraduates, including tenured and tenure-track faculty, part-time and adjunct in-
structors, and graduate student instructors.
TABLE 7-1 Continued
Objective Indicator Proposed Data Source
1.3 An institutional climate 1.3.1 Use of valid measures Renewed and expanded
that values undergraduate of teaching effectiveness NSOPF
STEM instruction
1.3.2 Consideration of Renewed and expanded
evidence-based teaching NSOPF
in personnel decisions by
departments and institutions
2.1 Equity of access to 2.1.1 Institutional structures, Extended and expanded BPS
high-quality undergraduate policies, and practices that
STEM educational strengthen STEM readiness
programs and experiences for entering and enrolled
college students
2.2 Representational equity 2.2.1 Diversity of STEM Unit record data system
among STEM credential degree and certificate earners
earners in comparison with diversity
degree and certificate earners
in all fields
TABLE 7-1 Continued
Objective Indicator Proposed Data Source
2.3.2 Diversity of STEM Revised IPEDS Human
graduate student instructors Resources Survey
in comparison with diversity
of STEM graduate students
2.4 Inclusive environments 2.4.1 Students pursuing Extended and expanded BPS
in institutions and STEM STEM credentials feel
departments included and supported in
their academic programs and
departments
3.2 Successful navigation 3.2.1 Retention in STEM Unit record data system
into and through STEM programs, course to course
programs of study and year to year
3.3 STEM credential 3.3.1 Number of students Unit record data system
attainment who attain STEM credentials
over time, disaggregated
by institution type, transfer
status, and demographic
characteristics
NOTES: BPS, Beginning Postsecondary Students Longitudinal Study; IPEDS, Integrated Post-
secondary Education Data System; NSOPF, National Study of Postsecondary Faculty.
The IPEDS data do not fully support these two indicators because they
do not permit the disaggregation of STEM credentials by disability status
and Pell status (a proxy for socioeconomic status).3 Fortunately, states and
voluntary multi-institution data initiatives have developed and implemented
an expanded range of measures that could fill some of the gaps in IPEDS
(see below).
Under this option, NCES (with support from the National Science
Foundation [NSF]) would expand IPEDS institutional surveys to include
institution-level measures of student progress toward degrees and certifi-
cates. In addition, NCES would extend existing student surveys and revive
a major faculty survey. Table 7-2 shows how each indicator would be sup-
ported under this option.
In comparison with the first option, this option would place a greater
burden on institutions of higher education. In Option 1, institutional re-
search staff would only have to upload student unit record data to the
national student unit record system. In this option, institutional research
staff would be required to collect additional data (beyond their current col-
lections) and use them to calculate additional measures for reporting to IPEDS.
3 Beginning in 2017–2018, NCES will gather information on students’ Pell grant status as
TABLE 7-2 Continued

Objective: 1.2 Existence and use of supports that help STEM instructors use evidence-based learning experiences
Indicator: 1.2.1 Extent of instructors' involvement in professional development
Proposed data source: Renewed and expanded NSOPF

Indicator: 1.2.2 Availability of support or incentives for evidence-based course development or course redesign
Proposed data source: Renewed and expanded NSOPF

Objective: 1.3 An institutional climate that values undergraduate STEM instruction
Indicator: 1.3.1 Use of valid measures of teaching effectiveness
Proposed data source: Renewed and expanded NSOPF

Indicator: 1.3.2 Consideration of evidence-based teaching in personnel decisions by departments and institutions
Proposed data source: Renewed and expanded NSOPF

Objective: 2.1 Equity of access to high-quality undergraduate STEM educational programs and experiences
Indicator: 2.1.1 Institutional structures, policies, and practices that strengthen STEM readiness for entering and enrolled college students
Proposed data source: Extended and expanded BPS

Objective: 2.2 Representational equity among STEM credential earners
Indicator: 2.2.1 Diversity of STEM degree and certificate earners in comparison with diversity of degree and certificate earners in all fields
Proposed data source: Revised and expanded IPEDS

Objective: 2.4 Inclusive environments in institutions and STEM departments
Indicator: 2.4.1 Students pursuing STEM credentials feel included and supported in their academic programs and departments
Proposed data source: Extended and expanded BPS

Objective: 3.1 Foundational preparation for STEM for all students
Indicator: 3.1.1 Completion of foundational courses, including developmental education courses, to ensure STEM program readiness
Proposed data source: Revised and expanded IPEDS

Objective: 3.2 Successful navigation into and through STEM programs of study
Indicator: 3.2.1 Retention in STEM programs, course to course and year to year
Proposed data source: Revised and expanded IPEDS

Objective: 3.3 STEM credential attainment
Indicator: 3.3.1 Number of students who attain STEM credentials over time (disaggregated by institution type, transfer status, and students' demographic characteristics)
Proposed data source: Revised and expanded IPEDS

NOTES: BPS, Beginning Postsecondary Students Longitudinal Study; IPEDS, Integrated Postsecondary Education Data System; NSOPF, National Study of Postsecondary Faculty.
Overall, this option would require NCES to change its data collection in
three ways: expanding IPEDS, expanding the Beginning Postsecondary Stu-
dents Longitudinal Study (BPS), and renewing and expanding the National
Study of Postsecondary Faculty.
Expanding IPEDS
Under this option, IPEDS surveys would be expanded to require institu-
tions to report on several measures that have been developed and tested in
voluntary data collections by states and higher education reform consortia:
see Box 7-1. These new measures, like the committee’s proposed indicators,
are designed to represent important dimensions of undergraduate education
in a readily understandable form.
Specifically, NCES would expand the IPEDS surveys to include the fol-
lowing measures, as defined by Janice and Voight (2016, p. iv):
BOX 7-1
New Measures in Higher Education
4 Title IV includes students with loans under the Federal Family Education Loan Program
or William D. Ford Federal Direct Loan (Direct Loan) Program, as well as students who have
Federal Perkins Loans, students who have received Federal Pell Grants, Teacher Education
Assistance for College and Higher Education Grants, Academic Competitiveness Grants or
Science and Math Access to Retain Talent Grants, and students on whose behalf parents bor-
rowed Parent PLUS loans.
This option, however, would lack one benefit of the first option: if a national student unit record data system is
created, the burden of calculating the proposed indicators related to STEM
students’ progress and completion would fall on NCES rather than on
institutions.
Second, the surveys would be expanded to include nationally representative samples of all types of 2-year and 4-year institutions and of STEM students. Third, given the decline in survey
response rates, the survey organizations would be provided with support
for incentives or other mechanisms to boost response rates. Table 7-3 sum-
marizes how the indicators could be supported in this option.
TABLE 7-3 Continued

Objective: 1.2 Existence and use of supports that help STEM instructors use evidence-based learning experiences
Indicator: 1.2.1 Extent of instructors' involvement in professional development
Data source: Revised and expanded proprietary surveys to include a nationally representative sample of all types of 2-year and 4-year institutions

Objective: 1.3 An institutional climate that values undergraduate STEM instruction
Indicator: 1.3.1 Use of valid measures of teaching effectiveness
Data source: Revised and expanded proprietary surveys to include a nationally representative sample of all types of 2-year and 4-year institutions

Objective: 2.3 Representational diversity among STEM instructors
Indicator: 2.3.1 Diversity of STEM instructors in comparison with diversity of STEM graduate degree holders
Data source: Nationally representative sample of institutions drawn from appropriate voluntary reform initiatives
CONCLUSIONS
CONCLUSION 6 Three options would provide the data needed for
the proposed indicator system:
Option 1 would provide the most accurate, complete, and useful data
to implement the proposed indicators of students’ progress through STEM
education. As noted above, legislation has been introduced in Congress
to repeal the current ban on a student unit record data system and direct
NCES to create it. Although creating a national student unit record data
system would require investment of federal resources, the system would
provide valuable information to policy makers about the status and quality
of undergraduate education generally, not only in STEM fields. Institutions
would be required to share their student unit record data with the federal
government, but they would not be required to gather any additional data
or make any additional calculations beyond what they already provide
to IPEDS. This option for implementing the indicator system would also
require regular surveys of students and faculty for data not covered by a
student unit record data system.
Option 2 would take advantage of the well-developed system of insti-
tutional surveys that NCES uses to obtain IPEDS data annually from the
vast majority of 2-year and 4-year institutions. Under this option, NCES
would add to these surveys some of the new measures of student progress
developed by higher education reform consortia, which include part-time
and transfer students. Some of the measures are closely related to the
proposed indicators. Like the first option, this option would also require
investment of federal resources, but it would draw on the strengths of the
well-established system of institutional reporting for IPEDS. In comparison
with Option 1, this option would increase institutions’ burden for IPEDS
reporting, requiring them to calculate additional measures based on their
internal student unit record data. The additional measures would provide
much of the student data needed for the indicator system, but the system
would also require data from regular surveys of students and faculty.
Option 3 could be carried out by the federal government or another
entity (e.g., a higher education association). It would take advantage of
the rapid growth of higher education data collection and analysis by state
higher education systems and education reform consortia across the country
and require little or no federal investment. As noted above, some of these
new measures of student progress are similar to the committee’s indicators.
As in Options 1 and 2, additional data from surveys would be needed to
support the indicators.
A Note of Caution
The proposed indicator system would create a picture of the cur-
rent status of undergraduate STEM education and allow policy makers to
monitor change over time, including movement toward the three goals that
underlie the indicator system. Although individual institutions or consortia
of institutions may wish to adopt some or all of these indicators to monitor
their own STEM educational programs, the indicator system is not intended
to support ranking systems or inter-institutional comparisons. Many of the
indicators are influenced by the socioeconomic status, parental education,
and high school preparation of potential STEM students, long before these students enter college.
REFERENCES
Cunningham, A.F., and Milam, J. (2005). Feasibility of a Student Unit Record System Within
the Integrated Postsecondary Education Data System. (NCES 2005–160). U.S. Depart-
ment of Education, National Center for Education Statistics. Washington, DC: U.S.
Government Printing Office.
Engle, J. (2016). Answering the Call: Institutions and States Lead the Way Toward Better
Measures of Postsecondary Performance. Seattle, WA: Bill & Melinda Gates Founda-
tion. Available: http://postsecondary.gatesfoundation.org/wp-content/uploads/2016/02/
AnsweringtheCall.pdf [June 2017].
Executive Office of the President of the United States. (2017). Using Federal Data to Measure
and Improve the Performance of Institutions of Higher Education. Washington, DC:
Author. Available: https://collegescorecard.ed.gov/assets/UsingFederalDataToMeasure
AndImprovePerformance.pdf [September 2017].
HCM Strategists. (2013). The Voluntary Institutional Metrics Project: A Better Higher Educa-
tion Data and Information Framework for Informing Policy. Washington, DC: Author.
Available: https://www.luminafoundation.org/resources/a-better-higher-education-data-
and-information-framework-for-informing-policy [July 2017].
Janice, A., and Voight, M. (2016). Toward Convergence: A Technical Guide for the Postsec-
ondary Metrics Framework. Washington, DC: The Institute for Higher Education Policy.
Available: http://www.ihep.org/research/publications/toward-convergence-technical-
guide-postsecondary-metrics-framework [July 2017].
Marist College. (2017). Learning Analytics Project Wins Innovation Award. Available: http://
www.marist.edu/publicaffairs/eduventuresaward2015.html [July 2017].
Miller, B. (2016). Building a Student-level Data System. Washington, DC: Institute for Higher
Education Policy. Available: http://www.ihep.org/sites/default/files/uploads/postsecdata/
docs/resources/building_a_student-level_data_system.pdf [June 2017].
National Academies of Sciences, Engineering, and Medicine. (2017). Supporting Students’
College Success: The Role of Assessment of Intrapersonal and Interpersonal Com-
petencies. Washington, DC: The National Academies Press. Available: https://www.
nap.edu/catalog/24697/supporting-students-college-success-the-role-of-assessment-of-
intrapersonal [October 2017].
Sclater, N., and Peasgood, A. (2016). Learning Analytics in Higher Education: A Review
of UK and International Practice. Available: https://www.jisc.ac.uk/reports/learning-
analytics-in-higher-education [July 2017].
University of Maryland, Baltimore County. (2017). Division of Information Technology Ana-
lytics. Available: http://doit.umbc.edu/analytics [July 2017].
Appendix A
In order to obtain broad input into its work, the committee publicly
released a draft report for comment in August 2016 after completing
Phase I of the study. This draft report was intended to elicit feedback
from the interested public in order to ensure that the committee was com-
prehensively covering the relevant terrain and also proposing reasonable
goals and objectives that could be monitored over time without imposing
undue data collection burdens. The interim report was available on the
committee’s website, with a 7-week period for comment.
Public comments were sought to obtain perspectives and insights from
researchers and practitioners knowledgeable about undergraduate STEM
reform and education statistics.
The public comment draft included a conceptual framework for the
indicator system, identified goals and objectives for improving undergradu-
ate STEM education at both 2-year and 4-year institutions, and reviewed
existing systems for monitoring undergraduate STEM education. Table A-1
shows the draft goals and objectives on which the committee sought com-
ment. Based on the committee’s consideration of what information from the
public would be most useful for the second phase of the study, the report
included a series of questions for readers to respond to, as follows:
OVERARCHING ISSUES
Several themes emerged across the comments received through the
website, in letters, and at the October meeting:
COMMITTEE RESPONSE
In response to these comments, the committee made several revisions
to the interim goals and objectives shown in Table A-1:
TABLE A-1 Continued

• Career development/advising
• Evidence-based instructional practices

Process objective 1.3: Appropriate general education experiences for STEM students' foundational preparation
Examples: core proficiency in math, language and communication, and digital fluency/computational thinking

Goal 3: Evidence-Based (EB) Education
Process objective 3.1: Use of evidence-based STEM educational practices both in and out of classrooms
Examples:
• Active learning instructional strategies
• Formative assessment
• Advising and mentoring
• Co-curricular opportunities/experiences
• Internships
• Engage in relevant interdisciplinary big questions
• Authentic practice
• Backward design of courses and programs
• Aligned assessments
• Data-driven course and program improvements

Process objective 3.2: Equitable access to evidence-based STEM educational practices both in and out of classrooms
Examples:
• Mentoring and advising
• Diversity of instructional staff
• Numbers of students experiencing evidence-based practices
Appendix B
This appendix presents measurement approaches and formulas that
could potentially be used to calculate some of the committee’s pro-
posed indicators. Given the complexity of the phenomena the indica-
tors are designed to measure and the limited data available, the committee
does not propose an approach or formula for every indicator. Table B-1
lists only selected indicators.
TABLE B-1

Indicator: 1.3.1 Use of valid measures of teaching effectiveness
Possible formula: Percentage of departments that use validated measures other than typical student evaluations to measure instructional quality (such as validated observation protocols, teaching portfolios, validated self-report tools)

Indicator: 2.1.1 Institutional structures, policies, and practices that strengthen STEM readiness for entering and enrolled college students
Possible formula: Curricular practices that strengthen levels of STEM readiness for entering and enrolled students (e.g., accelerated developmental mathematics course sequences); assessment and placement practices that strengthen levels of STEM readiness for entering and enrolled students (e.g., multiple measures for mathematics placement); academic program structures that promote coherence in STEM course taking and timely degree completion (e.g., guided pathways); institutional structures that enhance access to STEM courses (e.g., dual enrollment)

Indicator: 2.1.2 Entrance to and persistence in STEM educational programs
Possible formula: Percentage of entering college students who state an intention to major in STEM, disaggregated by race and ethnicity, gender, socioeconomic status, first-generation status, and ability status; persistence rates for STEM aspirants, disaggregated by race and ethnicity, gender, socioeconomic status, first-generation status, and ability status

Indicator: 2.2.1 Diversity of STEM degree and certificate earners in comparison with the diversity of degree and certificate earners in all fields
Possible formula: Ratio of the share of STEM undergraduate degrees earned to the share of all undergraduate degrees earned, disaggregated by race and ethnicity, gender, socioeconomic status, first-generation status, and ability status

Indicator: 2.2.2 Diversity of students who transfer from 2-year to 4-year STEM programs in comparison with diversity of students in 2-year STEM programs
Possible formula: Ratio of the share of 2-year college transfer students entering 4-year STEM degree programs to the share of all 2-year college students in STEM programs, disaggregated by race and ethnicity, gender, socioeconomic status, first-generation status, and ability status

Indicator: 2.2.3 Time to degree for students in STEM academic programs
Possible formula: 3-year graduation rates for students in 2-year STEM programs, disaggregated by race and ethnicity, gender, socioeconomic status, first-generation status, and ability status; 4-year and 6-year graduation rates for students in 4-year STEM programs, disaggregated by race and ethnicity, gender, socioeconomic status, first-generation status, and ability status; average time to degree of students earning bachelor's degrees; average time to degree of students earning associate degrees; average academic terms (semesters or quarters) to degree of students earning bachelor's degrees; average academic terms (semesters or quarters) to degree of students earning associate degrees

Indicator: 2.3.1 Diversity of STEM instructors in comparison with diversity of STEM graduate degree holders
Possible formula: Ratio of the share of STEM instructors to the share of all STEM graduate degree holders, disaggregated by race and ethnicity, gender, socioeconomic status, first-generation status, and ability status and by STEM discipline and institutional type

Indicator: 2.3.2 Diversity of STEM graduate student instructors in comparison with diversity of STEM graduate students
Possible formula: Ratio of the share of STEM teaching assistants to the share of all STEM graduate students, disaggregated by race and ethnicity, gender, socioeconomic status, first-generation status, and ability status and by STEM discipline and institutional type

Indicator: 2.4.2 Instructors who teach courses in STEM disciplines feel supported and included in their departments
Possible formula: Proportion of STEM faculty expressing satisfaction with the collegiality among faculty in their departments, disaggregated by race and ethnicity, gender, rank, and employment status and by STEM discipline and institutional type; proportion of STEM faculty experiencing stress due to discrimination, disaggregated by race and ethnicity, gender, rank, and employment status and by STEM discipline and institutional type
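Several of the ratio indicators in Table B-1 (2.2.1, 2.2.2, 2.3.1, and 2.3.2) share one mathematical form. As a sketch of that form, using notation introduced here for illustration rather than taken from the table, indicator 2.2.1 for a demographic group g could be written as:

```latex
R_g = \frac{s_g / S}{d_g / D}
```

where s_g is the number of STEM undergraduate credentials earned by members of group g, S is the total number of STEM credentials awarded, d_g is the number of credentials in all fields earned by group g, and D is the total number of credentials awarded in all fields. On this reading, R_g = 1 would indicate proportional representation and R_g < 1 would indicate underrepresentation of group g among STEM credential earners; the other ratio indicators substitute the relevant populations (e.g., STEM teaching assistants and all STEM graduate students for indicator 2.3.2).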
Appendix C
AGENDA
Open Sessions: Workshop on Developing Indicators for Undergraduate
STEM
AGENDA
Open Sessions: Meeting for Public Comment
Appendix D
Biographical Sketches of
Committee Members and Staff
Education. She has a B.A. from Ithaca College and a Ph.D. in neuroscience
from the University of Miami.
CHARLES BLAICH is the director of the Center of Inquiry and the Higher
Education Data Sharing Consortium at Wabash College. He collaborated
with researchers at other universities to design and implement the Wabash
National Study of Liberal Arts Education, which has involved 49 colleges
and universities as participants in the longitudinal research project on the
practices and conditions that support student learning. In his academic
research on auditory communication in zebra finches, undergraduate
students were his collaborators as well as coauthors on all of his papers and
conference presentations. He serves on an advisory panel for a project in
the California State University system to enhance access to STEM education
for underrepresented minorities. He has received teaching awards from the
University of Connecticut, Eastern Illinois University, and Wabash College.
He has a B.S. in psychology, an M.A. in experimental psychology, and a
Ph.D. in developmental psychology, all from the University of Connecticut.
president for academic affairs and senior advisor to the president. He also
held a number of academic positions at Drexel University, the University
of Michigan, the University of Florida, and Carnegie Mellon University. He
is a fellow of the Institute of Electrical and Electronics Engineers (IEEE)
and of the American Society of Engineering Education (ASEE). He has
received numerous awards for his research and educational contributions,
including the ASEE Benjamin Garver Lamme Award, the IEEE Millennium
Medal, the IEEE Education Medal, and the Aristotle Award from the Semiconductor Research Corporation. He is a member of the National Academy of Engineering. He has a B.S. from the State University of New York at
Stony Brook and an M.S. and a Ph.D. in electrical engineering from the
University of California, Berkeley.
California State University at Chico and a Ph.D. in genetics from the Uni-
versity of California, Davis.
instructors and staff, improving the educational system, and fostering edu-
cational innovation and discovery, all in service of removing disparities in
undergraduate student outcomes while maximizing learning. His work
has included STEM educational and training programs for middle school
through college, as well as undergraduate teaching. He has also been active
in creating and leading applications of technology for instruction, scientific
visualization and simulation, tools for evidence-based instructional actions,
curriculum development and evaluation, and science exhibits for students
from elementary school through graduate school and for the general public.
He is the founder of the Tools for Evidence-based Actions community, a
group of researchers and administrators from more than 100 universities
dedicated to sharing tools and methodologies that encourage evidence-
based instructional actions. He has a B.S. in biophysics and chemistry
from Wayne State University and a Ph.D. in biophysical chemistry from the
University of California, Berkeley.