Assessment Resources

The Office of Assessment in Trinity College is part of a rich community of practice in the United States and beyond. We are grateful to be able to share the insights, best practices, and visionary thinking of our colleagues at Duke and at other institutions of higher education. The Office of Assessment cannot take responsibility for changes to content on linked websites, though we do review externally linked materials annually. Readers who discover errors are asked to notify us at assessment@duke.edu, and we invite our colleagues at Duke and elsewhere to recommend readings and resources to continue building the list.

The following resources address many aspects of assessment, program evaluation, and institutional research. These materials are not exhaustive; rather, they are a starting point for deeper discovery in the practice of learning outcomes assessment. Faculty members who are new to the role of assessment liaison, and those who would like to learn more about the principles and operation of assessment in Trinity College, are encouraged to consult our Assessment Handbook for Program Officers. Please email the team if you need assistance finding or using information for your work. The Office of Assessment assigns staff liaisons to each A&S department for continuity of consultations and collaborations. A&S programs' primary and/or secondary liaisons are listed here.

If you use specific data to inform student learning outcomes in your program, please consult our Policy on Data Dissemination, including the data request form.

Last reviewed July 2022


Table of Contents

Assessment Fundamentals

  • What are assessment and the assessment cycle?
  • Assessment of Learning in General Education
  • Assessment of Learning in a Program of Study
  • Assessment of Learning in a Course
  • Writing Good, Relevant Learning Outcomes
  • Designing or Choosing Measures or Instruments
  • Analysis of Evidence
  • Communication and Use of Findings

Assessment in Specific Learning Contexts

  • Planning for a Self-Study Prior to External Review
  • High-impact Practices or Experiences
  • Examples of Disciplinary Orientations

Faculty and Student Engagement in Assessment

  • Diversity, Equity, and Inclusion
  • Students as Essential Partners
  • Faculty Support and Development

Accreditation

Data Sources

Assessment Technologies and Platforms

  • Lists of Supportive Technologies
  • Technology Usage
  • Select Emerging Innovations

Ethical Conduct of Assessment

  • Privacy and Confidentiality
  • Data Security
  • Fairness and Equity

Theoretical Foundations of and Key Publications in Learning Outcomes Assessment

  • What is Action Research?
  • Common Conceptual Frameworks Used in Learning Outcomes Assessment
  • Select Perspectives on the State of our Practice
  • Additional References
  • Prominent Journals in Learning Outcomes Assessment

Other Connections

  • Duke Resources and Partners
  • External Organizations

Assessment Fundamentals

What are assessment and the assessment cycle?

A quick Google search will uncover dozens if not hundreds of common-sense definitions of “assessment.”  In Trinity College, we think about assessment as the ongoing process of rigorous self-study that:

  • documents good educational practice,
  • helps faculty and staff create, revise, or enhance learning opportunities for students,
  • informs students’ own understandings of their development,
  • enables rich discussions of our mission and values as a learning community, and
  • provides evidentiary support for external reports, including accreditation requirements and funding proposals. 

The assessment cycle can be illustrated in a variety of ways (see these Google search results), but these visualizations are all based on the same idea: the assessment process starts with the articulation of objectives and moves through the collection, interpretation, and discussion of evidence before using findings to make informed plans for future teaching and learning.  The process is iterative and introspective.

Institutional Examples of the Assessment Cycle
  • George Washington University (LINK)
  • James Madison University (LINK)
  • Penn State University (LINK)
Examples of Assessment Timelines
  • Cal Poly (LINK)
  • Yavapai College (LINK)
  • Rochester Institute of Technology (LINK)
  • Accreditation Board for Engineering and Technology (ABET) (LINK)
General Glossaries of Assessment Terms
  • Houghton Mifflin Harcourt Assessments (LINK)
  • Mohawk Valley Community College (LINK)
  • Carnegie Mellon University (LINK)

Assessment of Learning in General Education

What do we mean by "General Education"?

A college’s general education is the foundational work all students complete toward the baccalaureate degree. Students typically complete additional courses or experiences in their undergraduate majors, minors, or certificate programs. Trinity College’s general education overlays and intersects with the requirements of academic plans, operationalized through codes for Areas of Knowledge, Modes of Inquiry, and Small Group Learning Experiences (summarized here).

Guides for Assessment in General Education
  • Assessing General Education Learning Outcomes (AAC&U) (LINK)
  • GWU Guide to General Education Assessment (LINK)
  • Beauchman, M., & Waldenberger, S. (2017, September). Assessing general education: Identifying outcomes, data analysis, and improvements. (Assessment in Practice). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).  (LINK)
Examples of General Education
  • Clemson University (LINK)
  • Georgia Tech (LINK)
  • Rice University (LINK)
  • Stanford University (LINK)
  • University of Notre Dame (LINK)
  • University of Virginia (LINK)
  • Vanderbilt University (LINK)
Where and How Do Course-, Program-, and Institution-level Assessment Activities Intersect?
  • Miller, R., & Leskes, A. (2005).  Levels of assessment.  Association of American Colleges and Universities.  (LINK)
  • Capsim blog post: The Five Levels of Assessment in Higher Education (LINK)
  • Millet, I., & Weinstein, S. (2015). Grading by objectives: A matrix method for course assessment. Quality Approaches in Higher Education, 6(2), 3-19. (LINK)

Assessment of Learning in a Program of Study

See also “disciplinary orientations” in a later section.

What is Program Assessment?
Developing a Program Assessment Plan
  • University of Cincinnati (LINK)
  • UW-Madison (LINK)
  • University of Oklahoma (LINK)
  • Duke A&S curriculum review process and template (restricted access) (LINK)  
Department Assessment Portfolio (DAP)
  • The Office of Assessment uses the online DAP to structure departments’ and programs’ annual assessment activities and reports. 
  • Launch the DAP (restricted access): open mail.duke.edu, navigate to OneDrive, and open the Department Assessment Portfolio folder in the Shared folder.
  • Instructions and support documentation for Duke A&S personnel (LINK)
  • Sample timelines for new and returning faculty users (COMING SOON)

Assessment of Learning in a Course

Basics
  • Video from the National Council on Measurement in Education (LINK)
  • Cornell University (LINK)
  • Tufts University (LINK)
  • Northwestern University (LINK)
  • Duke Learning Innovation  (LINK)
Distinguishing Assessment from Grading
  • Brookhart, S. M., Guskey, T. R., Bowers, A. J., McMillan, J. H., Smith, J. K., Smith, L. F., ... & Welsh, M. E. (2016). A century of grading research: Meaning and value in the most common educational measure. Review of Educational Research, 86(4), 803-848.  (LINK)
  • Sadler, D. R. (2005). Interpretations of criteria‐based assessment and grading in higher education. Assessment & Evaluation in Higher Education, 30(2), 175-194.  (LINK)
  • Walvoord, B. E., & Anderson, V. J. (2011). Effective grading: A tool for learning and assessment in college. John Wiley & Sons. (LINK)
     
  • Mount Holyoke College (LINK)
  • Carnegie Mellon (LINK)
  • Villanova (LINK)
  • University of Dayton  (LINK)
  • Carleton (LINK)
  • Iowa State (LINK)
  • Vanderbilt (LINK)
  • The IDEA Center (LINK)

Writing Good, Relevant Learning Outcomes

Well-conceived and well-worded outcomes are the foundation of an effective assessment plan.  They should represent and operationalize the program’s mission in clear, measurable statements of students’ attainment and learning progress.

How-to Guides for Learning Outcomes
  • Duke University (LINK)
  • University of Wisconsin-Madison (LINK)
  • James Madison University (LINKLINK)
  • IUPUI (LINK)
  • Cal Poly (LINK)
  • Academic Learning Compacts (Florida State System) (LINK)
  • Checklist for good learning objectives (JMU)  https://www.jmu.edu/assessment/_files/checklist.doc
     
  • Adelman, C. (2015, February). To imagine a verb: The language and syntax of learning outcomes statements (Occasional Paper No. 24). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).  (LINK)
Developing a Coherent Curriculum Map

A curriculum map or matrix is an illustration of student learning outcomes across a learning experience (whether a topic, a course, or a program of study); a brief schematic sketch follows the resource links below. The curriculum map can be used to:

  • Understand the learning journey across experiences,
  • Facilitate discussion among stakeholders,
  • Select and schedule appropriate assessment tasks.
     
  • Summary (LINK)
  • National Institute for Learning Outcomes Assessment: (LINKLINK)
  • University of Illinois (LINK)
  • University of Cincinnati (LINK)
  • University of Massachusetts (LINK)
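
To make the structure concrete, here is a minimal, hypothetical sketch in Python (the course numbers and outcomes are invented for illustration, not drawn from any Duke program). It represents a curriculum map as a simple outcomes-by-courses matrix, using the common convention that an outcome is Introduced (I), Reinforced (R), or Mastered (M) at different points in the curriculum:

    # Hypothetical curriculum map: rows are program learning outcomes,
    # columns are courses, and each cell records where the outcome is
    # Introduced (I), Reinforced (R), or Mastered (M).
    COURSES = ["101", "210", "350", "495 (capstone)"]

    CURRICULUM_MAP = {
        "Disciplinary knowledge": {"101": "I", "210": "R", "495 (capstone)": "M"},
        "Research methods":       {"210": "I", "350": "R", "495 (capstone)": "M"},
        "Communication":          {"101": "I", "350": "R"},
    }

    def print_map() -> None:
        """Render the matrix and flag outcomes that never reach mastery."""
        print(f"{'Outcome':<25}" + "".join(f"{c:>17}" for c in COURSES))
        for outcome, cells in CURRICULUM_MAP.items():
            row = "".join(f"{cells.get(c, '-'):>17}" for c in COURSES)
            print(f"{outcome:<25}{row}")
            if "M" not in cells.values():
                print(f"  gap: '{outcome}' is never brought to mastery")

    print_map()

Reading across a row shows where evidence for an outcome can be collected (and reveals gaps, as with "Communication" above); reading down a column shows how much assessment work any one course is being asked to carry.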

Designing or Choosing Measures or Instruments

Key Attributes

It is also helpful to understand some of the terms assessment experts use to characterize and evaluate the suitability of an assessment measure.

Variations

There are many types of learning measures available to instructors and academic programs. The following sites illustrate and describe the options for measurement and the collection of evidence.

Item Analysis and Discrimination
  • Penn State University (LINK)
  • University of Illinois (LINK)
  • Buffalo State College (LINK)
     
  • Clauser, J. C., & Hambleton, R. K. (2017). Item analysis for classroom assessments in higher education. In Handbook on measurement, assessment, and evaluation in higher education (pp. 355-369). Routledge. (LINK)
  • D'Sa, J. L., & Visbal-Dionaldo, M. L. (2017). Analysis of multiple choice questions: Item difficulty, discrimination index and distractor efficiency. International Journal of Nursing Education, 9(3).  DOI: 10.5958/0974-9357.2017.00079.4 (LINK)
  • DeMars, C. (2010). Item Response Theory. New York, NY: Oxford University Press.  (LINK)
  • Haladyna, T. M., Downing, S. M., & Rodriguez, M. C. (2002). A review of multiple-choice item-writing guidelines for classroom assessment. Applied Measurement in Education, 15(3), 309-334.  (LINK)
  • Quaigrain, K., & Arhin, A. K. (2017). Using reliability and item analysis to evaluate a teacher-developed test in educational measurement and evaluation. Cogent Education, 4(1), 1301013.  DOI: 10.1080/2331186X.2017.1301013
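
The two statistics at the heart of these readings are straightforward to compute. The following is a minimal, illustrative Python sketch (not an official Duke tool) using classical test theory conventions: item difficulty as the proportion of examinees answering an item correctly, and a simple discrimination index as the difference in that proportion between the top- and bottom-scoring groups (the conventional upper and lower 27%):

    # Illustrative classical item analysis for dichotomously scored items.
    # scores: one row per examinee, one 0/1 entry per item.
    def item_analysis(scores: list[list[int]]) -> list[dict]:
        n = len(scores)
        k = max(1, round(0.27 * n))                 # upper/lower group size
        order = sorted(range(n), key=lambda i: sum(scores[i]))
        lower, upper = order[:k], order[-k:]
        results = []
        for item in range(len(scores[0])):
            p = sum(row[item] for row in scores) / n            # difficulty
            p_upper = sum(scores[i][item] for i in upper) / k
            p_lower = sum(scores[i][item] for i in lower) / k
            results.append({"item": item + 1,
                            "difficulty": round(p, 2),
                            "discrimination": round(p_upper - p_lower, 2)})
        return results

    # Five examinees, three items (1 = correct, 0 = incorrect)
    for row in item_analysis([[1, 1, 0], [1, 0, 0], [1, 1, 1], [0, 0, 0], [1, 1, 1]]):
        print(row)

An item that nearly everyone answers correctly (difficulty near 1.0) or that almost no one answers correctly (near 0.0) says little about differences among students, and a discrimination index near zero or negative flags an item worth reviewing.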

Standardized Tests in Use in Trinity College

Type: In-class Tests or Quizzes
  • Asking good test questions (Cornell LINK; University of Illinois LINK)
  • Selecting and designing instruments (James Madison University) (LINK)
  • Where to look for pre-existing instruments (James Madison University) (LINK)
  • Multiple-choice exams (LINK)
  • An annotated bibliography of test development (LINK)
  • Benjamin, R., Miller, M. A., Rhodes, T. L., Banta, T. W., Pike, G. R., & Davies, G. (2012, September). The seven red herrings about standardized assessments in higher education. (Occasional Paper No. 15). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA). (LINK)
Type: Essays and Manuscripts
Type: Projects
  • Toolkits for project-based learning (LINKLINK)
  • Carleton College service learning projects (LINK)
  • Evaluating capstone projects (University of Hawaii LINK; Univ. of New South Wales LINK)
  • University of Warwick (LINK)
  • University of Nebraska (LINK)
Type: Presentations and Performances
  • University of Warwick (LINKLINK)
  • Texas A&M University  (LINK)
  • Carleton College (LINKLINK)
  • University of New South Wales (LINK)
Type: Using Course Evaluations to Understand Student Learning

There are many peer-reviewed articles and other publications supporting and critiquing course evaluation processes. The following selections focus on the use of evaluation results to inform teaching practice and program-level assessment.

  • University of Wisconsin-Madison (LINK)
  • Iowa State University (LINK)
  • Vanderbilt University (LINK)
  • IUPUI (LINK)
  • Duquesne University (LINK)
  • Cornell University (LINK)
Type: Surveys
  • Understanding advantages and disadvantages of surveys (LINKLINKLINK)
  • Swarthmore College (LINK)
  • Cornell University (LINK)
  • Carleton College on knowledge surveys (LINK)
  • NASPA Foundation (LINK)   
  • Duke University survey policies (LINKLINK)
  • Duke Initiative on Survey Methodology (LINK)
  • Qualtrics at Duke (LINK)
Guidance for Assessment in Large Courses
  • Carleton College (LINK)
  • University of Calgary (LINK)
  • Yale University (LINK)
  • University of Illinois (LINK)
  • University of New South Wales (LINK)
  • Duke Learning Innovation regularly works with Duke faculty to design and plan for learning assessments in large courses.
Alternative Ways to Understand the Student Learning Experience
Although exams and essays have traditionally been the most commonly used assessment techniques, there are a variety of other ways to collect useful feedback about teaching and evidence of student learning.

  • Summary from Indiana University (LINK)
  • Summary from University of Missouri system (LINK)
Inviting a colleague to observe your class:
  • Duke Learning Innovation (LINK)
  • The Duke Graduate School (LINK)
  • Cornell University (LINK)
  • University of Hawai'i-West O'ahu (LINK)
  • Bowdoin College (LINK)
Peer assessments and group work:
  • Cornell University (LINK)
  • Carnegie Mellon University (LINK)
  • Carleton College (LINK)
Student self-assessments:
  • McMillan, J. H., & Hearn, J. (2008). Student self-assessment: The key to stronger student motivation and higher achievement. Educational Horizons, 87(1), 40-49. (LINK)
  • Rice University (LINK)
  • Cornell University (LINK)
  • University of New South Wales (LINK)
  • Faculty Focus (LINK)
Feedback from internships or other co-curricular experiences:
  • Beard, D. F. (2007). Assessment of internship experiences and accounting core competencies. Accounting Education: An International Journal, 16(2), 207-220.  (LINK)
  • Bender, D. (2020). Internship assessment in professional program accreditation: A 10-year study. Education + Training.  (LINK)
  • University of Warwick (LINK)
  • Faculty Focus (LINK)
Concept maps: 
  • Carnegie Mellon University (LINK)
  • University of Warwick (LINK)
  • Carleton College (LINK)
  • Georgia Tech (LINK)
Class participation:
  • Faculty Focus (LINK)
  • University of Warwick (LINK)
  • University of New South Wales (LINK)
  • Boston University (LINK)
  • Inside Higher Ed (LINK)
  • Carnegie Mellon University (LINK)
Reflections and portfolios:
  • Stanford University (LINK)
  • Carleton College (LINK)
  • Bowling Green State University (LINK)
  • UC Berkeley (LINK)
  • Association of American Colleges and Universities (LINKLINK)
  • Inside Higher Ed (LINK)
  • Educause (LINK)
  • Wiley Publisher (LINK)
  • Association for Authentic, Experiential and Evidence-based Learning (LINK)
  • Cambridge, D. (2010). Eportfolios for lifelong learning and assessment. John Wiley & Sons.
  • Cambridge, D., Cambridge, B. L., & Yancey, K. B. (Eds.). (2009). Electronic portfolios 2.0: Emergent research on implementation and impact. Stylus Publishing, LLC.
  • Chen, H. L., & Black, T. C. (2010). Using e-portfolios to support an undergraduate learning career: An experiment with academic advising. Educause Quarterly Magazine, 33(4). 
  • Eynon, B., & Gambino, L. M. (2017). High-impact ePortfolio practice: A catalyst for student, faculty, and institutional learning. Stylus Publishing, LLC.
  • Penny-Light, T., Chen, H.L., Ittelson, J. (2012). Documenting Learning with ePortfolios: A Guide for College Instructors. San Francisco, CA: Jossey-Bass.
  • Yancey, K. B. (2009). Electronic portfolios a decade into the twenty-first century: What we know, what we need to know. Peer Review, 11(1), 28. (LINK)
Simulations, case studies, and case-based learning:
  • National Institute for Learning Outcomes Assessment (LINK)
  • Yale University (LINK)
  • University of Warwick (LINK)
  • University of New South Wales (LINKLINK)
  • London School of Economics (LINK)
Student exit interviews:
  • IUPUI  (LINK)
  • Lehigh University (LINK)
  • Ursinus College (LINK)
  • University of Wisconsin (LINKLINK)
  • Lee, R. D., Jr. (1991). The use of exit interviews in master's programs of public affairs and administration. The American Review of Public Administration, 21(3), 183-195.  (LINK)
  • Stewart, B. R., Martin, C. L., Jr., & Steedle, L. F. (2009). Accounting program assessment: Exit interviews of graduating seniors. American Journal of Business Education (AJBE), 2(7), 61-72.  (LINK)
Student discussion or focus groups:
  • Drexel University (LINK)
  • University of Kentucky (LINK)
  • University of San Francisco (LINK)
  • Franklin University (LINK)
  • Washington State University (LINK)
  • Gowdy, E. A. (1996). Effective student focus groups: The bright and early approach. Assessment & Evaluation in Higher Education, 21(2), 185-189.  (LINK)
  • Liamputtong, P. (2011). Focus group methodology: Principle and practice. Sage Publications.
  • McLafferty, I. (2004). Focus group interviews as a data collecting strategy. Journal of Advanced Nursing, 48(2), 187-194.  (LINK)
  • Varga-Atkins, T., McIsaac, J., & Willis, I. (2017). Focus group meets nominal group technique: An effective combination for student evaluation? Innovations in Education and Teaching International, 54(4), 289-300.  (LINK)

Analysis of Evidence

Introductions to and Summaries of Assessment Analysis
  • Illinois State University (LINK)
  • Washington State University (LINK)
  • University of Virginia (LINK)
  • James Madison University (LINK)
     
  • Data visualization (LINKLINK)
  • See also Duke Center for Data and Visualization Sciences (LINK)
     
  • The National Council on Measurement in Education (NCME) also developed videos on the “anatomy of measurement”  (LINK).
  • Murphy, S. A. (2015). How data visualization supports academic library assessment: Three examples from The Ohio State University Libraries using Tableau. College & Research Libraries News, 76(9), 482-486.  (LINK)
  • Zilvinskis, J., & Michalski, G. V. (2016). Mining text data: Making sense of what students tell us. Professional File. Article 139, Fall 2016. Association for Institutional Research.  (LINK)
Rubrics
Why and when to use a rubric?
  • Cornell University (LINK)
  • Arizona State University (LINK)
  • UC Berkeley (LINK)
  • University of Oklahoma (LINK)
     
  • Andrade, H. (2005). Teaching with rubrics: The good, the bad, and the ugly. College Teaching, 53(1), 27-30.  (LINK)
  • Reddy, Y. M., & Andrade, H. (2010). A review of rubric use in higher education. Assessment & Evaluation in Higher Education, 35(4), 435-448.  (LINK)
  • Stevens, D. D., & Levi, A. (2005). Introduction to Rubrics: An Assessment Tool to Save Grading Time, Convey Effective Feedback, and Promote Student Learning. Sterling, VA: Stylus Publishing.
  • Stanny, C.J. & Nilson, L.B. (2014). Specifications Grading: Restoring Rigor, Motivating Students, and Saving Faculty Time. Sterling, VA: Stylus Publishing.
Types of rubrics explained
  • DePaul University (LINK)
  • Southwestern University (LINK)
  • University of Wisconsin-Madison (LINK)
Examples of rubrics

The Duke University Writing Studio and Duke Learning Innovation can assist Duke faculty, staff, and students with the development of effective rubrics. 

The AAC&U VALUE project (LINK) makes available to the general public 16 detailed rubrics (LINK) to guide teaching and evaluation. Users are encouraged to extend, adapt, or blend the rubrics as relevant to their assessment goals. The following examples follow the VALUE project schema; a brief schematic sketch of the underlying rubric structure appears after these lists.

Intellectual and practical skills

  1. Inquiry and analysis  (LINK)
  2. Critical thinking  (LINKLINKLINK)
  3. Creative thinking  (LINKLINKLINK)
  4. Written communication  (LINKLINKLINK)
  5. Oral communication  (LINKLINKLINK)
  6. Reading  (LINKLINK)
  7. Quantitative literacy  (LINKLINKLINK)
  8. Information literacy  (LINKLINKLINK)
  9. Teamwork  (LINKLINKLINK)
  10. Problem solving  (LINKLINKLINK)

Personal and social responsibility

  1. Civic engagement—local and global  (LINKLINKLINKLINK)
  2. Intercultural knowledge and competence  (LINKLINK)
  3. Ethical reasoning  (LINKLINKLINK)
  4. Foundations and skills for lifelong learning  (LINK)
  5. Global learning  (LINKLINK)

Integrative and applied learning

  1. Integrative learning  (LINK)

Process-oriented guided-inquiry learning (POGIL) (LINK)
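
Whatever their source, analytic rubrics share one structure: a matrix of criteria crossed with performance levels. The sketch below is a hypothetical Python illustration of that structure (the criteria are invented, and the level names only loosely follow the VALUE convention of Benchmark, Milestones, and Capstone); it is not an implementation of any actual VALUE rubric:

    # Hypothetical analytic rubric: criteria scored on a 1-4 scale.
    LEVELS = {1: "Benchmark", 2: "Milestone", 3: "Milestone", 4: "Capstone"}
    CRITERIA = ["Thesis", "Evidence", "Organization"]

    def score_submission(ratings: dict[str, int]) -> dict:
        """ratings maps each criterion to a level from 1 to 4."""
        missing = set(CRITERIA) - ratings.keys()
        if missing:
            raise ValueError(f"unrated criteria: {missing}")
        total = sum(ratings[c] for c in CRITERIA)
        return {"total": total,
                "mean": round(total / len(CRITERIA), 2),
                "profile": {c: f"{ratings[c]} ({LEVELS[ratings[c]]})" for c in CRITERIA}}

    print(score_submission({"Thesis": 3, "Evidence": 2, "Organization": 4}))

Because every criterion is reported separately, an analytic rubric of this kind yields a diagnostic profile rather than a single holistic grade, which is what makes it useful for program-level aggregation.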


Communication and Use of Findings

Understanding terms:  “Closing the Loop” (LINKLINK)

Many assessment practitioners use the term to describe the resolution of an assessment process, where the study of evidence leads to well-reasoned changes in curriculum or educational practice. This stage usually involves the sharing of written reports and/or presentations with specific recommendations for action. The term is a bit of a misnomer, however, because the loop never really closes. As we make evidence-guided adjustments to our work, we restart the process with new or revised learning outcomes and updated targets for student learning. 

Research Utilization
  • Research Utilization: An annotated bibliography (LINK)
  • Cummings, G. G., Estabrooks, C. A., Midodzi, W. K., Wallin, L., & Hayduk, L. (2007). Influence of organizational characteristics and context on research utilization. Nursing Research, 56(4), S24-S39.  (LINK)
  • Estabrooks, C. A., Floyd, J. A., Scott‐Findlay, S., O'Leary, K. A., & Gushta, M. (2003). Individual determinants of research utilization: A systematic review. Journal of Advanced Nursing, 43(5), 506-520.  (LINK)
  • Weiss, C. H. (1979). The many meanings of research utilization. Public Administration Review, 39(5), 426-431. (LINK)
  • Weiss, C. H. (1993). Where politics and evaluation research meet. Evaluation Practice, 14(1), 93-106.  (LINK)
Assessment Reporting
  • Examples of effective use of assessment results (LINK)
  • Diery, A., Vogel, F., Knogler, M., & Seidel, T. (2020, June). Evidence-Based Practice in Higher Education: Teacher Educators' Attitudes, Challenges, and Uses. In Frontiers in Education (Vol. 5, p. 62). Frontiers.  (LINK)
  • Fulcher, K. H., Smith, K. L., Sanchez, E. R., & Sanders, C. B. (2017). Needle in a Haystack: Finding Learning Improvement in Assessment Reports. Professional File. Article 141, Summer 2017. Association for Institutional Research. (LINK)
  • Huberman, M. (1994). Research utilization: The state of the art. Knowledge and Policy, 7(4), 13-33.  (LINK)
  • Jankowski, N. (2021, January). Evidence-based storytelling in assessment. (Occasional Paper No. 50). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.  (LINK)

Assessment in Specific Learning Contexts

Planning for a Self-Study Prior to External Review

Trinity College of Arts & Sciences (LINK)

High-impact Practices or Experiences

  • Civic engagement (LINKLINK)
  • Student leadership (LINKLINK)
  • Living-learning communities (LINKLINKLINK)
  • Undergraduate research (LINKLINKLINK)
  • Study abroad or away (LINKLINK)
     
  • Finley, A. (2019, November). A comprehensive approach to assessment of high-impact practices (Occasional Paper No. 41). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).  (LINK)

Examples of Disciplinary Orientations

  • STEM (LINKLINK)
  • Studio and performing arts (LINK)
  • Humanities (LINKLINK)
  • Social sciences and interpretive social sciences (LINKLINK)
  • Mentorship and advising (LINKLINK)
  • Student affairs and student development (LINKLINK)
  • Libraries (LINKLINK)
     
  • Talman, K., Vierula, J., Kanerva, A.-M., Virkki, O., Koivisto, J.-M., & Haavisto, E. (2021). Instruments for assessing reasoning skills in higher education: A scoping review. Assessment & Evaluation in Higher Education, 46(3), 376–392.  (LINK)
  • Gilchrist, D., & Oakleaf, M. (2012, April). An essential partner: The librarian’s role in student learning assessment. (Occasional Paper No. 14). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA). (LINK)

Faculty and Student Engagement in Assessment

Diversity, Equity, and Inclusion

Anti-racist Assessment and Educational Planning
  • Anti-racism at Duke (LINK)
  • Update on AIR Activities to Advance Diversity, Equity, and Inclusion (LINK)
  • Be Part of the Solution: Antiracism in Institutional Research (LINK)
  • Centering Racial Equity Throughout Data Integration (LINK)
  • 5 Steps to Take as an Antiracist Data Scientist (LINK)  
  • Truth, Racial Healing & Transformation (TRHT) AAC&U Campus Centers (LINK)
     
  • MacKinnon, D., & Manathunga, C. (2003). Going global with assessment: What to do when the dominant culture's literacy drives assessment. Higher Education Research & Development, 22(2), 131-144.  (LINK)
  • McNair, T. B. (2020). We Hold These Truths: Dismantling Racial Hierarchies, Building Equitable Communities. Washington, DC: Association of American Colleges and Universities.  (LINK)
  • Montenegro, E., & Jankowski, N. A. (2020, January). A new decade for assessment: Embedding equity into assessment praxis (Occasional Paper No. 42). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA). (LINK)
Supporting Inclusivity of Gender Identity and Expression
  • Aguillon, S. M., Siegmund, G. F., Petipas, R. H., Drake, A. G., Cotner, S., & Ballen, C. J. (2020). Gender differences in student participation in an active-learning classroom. CBE—Life Sciences Education, 19(2), ar12.  (LINK)
  • Elwood, J. (2006). Gender issues in testing and assessment. The Sage handbook of gender and education, 262-278.
  • Garvey, J. C., Hart, J., Metcalfe, A. S., & Fellabaum-Toston, J. (2019). Methodological troubles with gender and sex in higher education survey research. The Review of Higher Education, 43(1), 1-24.  (LINK)
  • Johnson, E. A., Subasic, A., Beemyn, G., Martin, C., Rankin, S., & Tubbs, N. J. (2011). Promising practices for inclusion of gender identity/gender expression in higher education. The Pennsylvania State University LGBTA.
  • MacNell, L., Driscoll, A., & Hunt, A. N. (2015). What’s in a name: Exposing gender bias in student ratings of teaching. Innovative Higher Education, 40(4), 291-303.  (LINK)
  • Seifert, T. A., Wells, R. S., Saunders, D. B., & Gopaul, B. (2013). Unrealized educational expectations: A growing or diminishing gender gap? It depends on your definition. Professional File. Article 134, Fall 2013. Association for Institutional Research.  (LINK)
  • Vantieghem, W., Vermeersch, H., & Van Houtte, M. (2014). Transcending the gender dichotomy in educational gender gap research: The association between gender identity and academic self-efficacy. Contemporary Educational Psychology, 39(4), 369-378.  (LINK)
Cultural Responsiveness
  • Montenegro, E., & Jankowski, N. A. (2017, January). Equity and assessment: Moving towards culturally responsive assessment. (Occasional Paper No. 29). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA). (LINK)
  • Manathunga, C. (2009). Research as an intercultural ‘contact zone’. Discourse: Studies in the Cultural Politics of Education, 30(2), 165-177.  (LINK)
Accessibility
  • Duke’s Student Disability Access Office (LINK)
  • Duke Testing Center (LINK)
  • Inclusive assessment at Tufts University (LINK)
  • Inclusive teaching and assessment at Cornell University (LINK)

Students as Essential Partners

  • Students’ perceptions about assessment in higher education (LINK)
  • Involving students in the assessment process (LINKLINK)
  • Finney, S. J., Sundre, D. L., Swain, M.S., & Williams, L. M. (2016). The validity of value-added estimates from low-stakes testing contexts: The impact of change in test-taking motivation and test consequences. Educational Assessment.  (LINK)
  • Turos, J. M. (2020, March). Actively engaging undergraduate students in the assessment process. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA). (LINK)
  • Truncale, N. P., Chalk, E. D., Pellegrino, C., & Kemmerling, J. (2018, March). Implementing a student assessment scholars program: Students engaging in continuous improvement. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA). (LINK)
  • Wise, S. L., & DeMars, C. E. (2005). Low examinee effort in low-stakes assessment: Problems and potential solutions. Educational Assessment, 10(1), 1-17.  (LINK)
  • Planning learning outcomes through student employment opportunities (LINKLINK)

Faculty Support and Development

  • What is the Scholarship of Teaching & Learning (SoTL)?  (LINKLINK)
  • Duke Learning Networks (LINK)
  • Duke Learning Innovation event calendar (LINK)
  • Duke Office of Faculty Advancement (LINK)
  • Duke Faculty Affairs (LINK)
  • Duke Graduate School Preparing Future Faculty program (LINK)
  • Duke Graduate School Certificate in College Teaching (LINK)
  • Duke Graduate School Emerging Leaders Institute (LINK)
     
  • Banta, T. W. (Ed.). (2002). Building a scholarship of assessment. John Wiley & Sons.
  • Cain, T. R. (2014, November). Assessment and academic freedom: In concert, not conflict (Occasional Paper No. 22). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).  (LINK)
  • Hutchings, P. (2010, April). Opening doors to faculty involvement in assessment. (Occasional Paper No. 4). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).  (LINK)
  • Gold, L., Rhoades, G., Smith, M., & Kuh, G. (2011, May). What faculty unions say about student learning outcomes assessment. (Occasional Paper No. 9). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).  (LINK)
  • Polychronopoulos, G. B., & Leaderman, E. C. (2019, July). Strengths-based assessment practice: Constructing our professional identities through reflection. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).  (LINK)
  • Stanny, C. J. (2019, July). Promoting an improvement culture. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).  (LINK)

Accreditation

The Southern Association of Colleges and Schools Commission on Colleges (SACSCOC) is the regional accreditor for institutions of higher education in the southeastern United States. It is recognized by the Council for Higher Education Accreditation. Accreditation signifies that Duke maintains clearly specified educational objectives that are consistent with its mission and appropriate to the degrees it offers, has the physical, financial, and human resources to fulfill that mission, and documents systematic methods for studying and evaluating its objectives.

  • Index of SACSCOC resources (LINK)
  • Resource Manual for the 2018 Principles of Accreditation (LINK)
  • Quality Enhancement Plan (LINK)
  • Council for Higher Education Accreditation (LINKLINK)
  • Congressional Research Service:  An overview of accreditation of higher education in the U.S. (LINK)
  • AAUP:  Accreditation and the federal future of Higher Education (LINK)
  • SACSCOC position statement on diversity, equity, and inclusion (LINK)

Data Sources

The Office of Assessment and the office of the Director of Academic Services and Systems recognize that much of the information we collect and use is of a fundamentally private nature. We make both implicit and explicit pledges of confidentiality to students and faculty. The Office of Assessment supports collaboration between and among the various university offices charged with the assessment of student learning outcomes or program evaluation. Assessment personnel are empowered to share data between offices, subject to standards of review. The full Policy on Data Dissemination is available online. (LINK)

Individuals whose primary affiliation or appointment is in Arts & Sciences should consult Dr. Jennifer Hill, Dr. Alessandra Dinin, or Mr. Stephen Katsaounis to discuss their informational needs and submit a Trinity College data request form.  Although the Office of the University Registrar (OUR) has its own request form (here), the OUR refers those inquiries to the Office of Assessment for the following reasons:

  • It helps maintain consistency in the operationalization of variables and in sample/cohort selection relevant to studies of undergraduate learning.
  • It reduces the risk of non-matching data reports due to different querying methodologies.
  • Assessment personnel can advise researchers how their projects contribute to or complement broader assessment activities underway within the department or program.

Leaders of A&S departments and undergraduate certificate programs have access to suites of data reports in Tableau Online. Access Tableau here (restricted access), or visit a descriptive inventory here on the Office of Assessment website.  As a member of the Consortium on Financing Higher Education (COFHE), Duke’s Office of Institutional Research administers surveys at specific stages of students’ undergraduate experiences. Duke does not participate in the NSSE. This PDF briefly summarizes the COFHE survey administration; note that some embedded links are non-operational.  Additional data resources are available on the OIR website (LINK).

Researchers interested in institutional data are encouraged to consult IPEDS.

Assessment Technologies and Platforms

Lists of Supportive Technologies

Duke Learning Innovation regularly supports faculty as they make decisions about suitable assessment tools and platforms in their courses and programs. Search software.duke.edu (restricted access) for licenses and downloads.

The National Council on Measurement in Education (NCME) and the National Institute for Learning Outcomes Assessment (NILOA) host lists of technologies that support assessment and evaluation. (LINKLINK)  Other examples include: 

  • U.S. Department of Education (LINK)
  • Cornell University (LINK)
  • Univ. of New South Wales (LINK)

Technology Usage

  • The Intentional Use of Technology model (LINKLINK)
  • Bruff, D. (2019). Intentional tech: Principles to guide the use of educational technology in college teaching. West Virginia University Press.
  • Deeley, S. J. (2018). Using technology to facilitate effective assessment for learning and feedback in higher education. Assessment & Evaluation in Higher Education, 43(3), 439-448.  (LINK)
  • Harrison, J. M., & Braxton, S. N. (2018, September). Technology solutions to support assessment. (Occasional Paper No. 35). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).  (LINK)
  • Spector, J. M., & Kim, C. (2014). Technologies for intentional learning: Beyond a cognitive perspective. Australian Journal of Education, 58(1), 9-22.  (LINK)
  • Spurlin, J. E. (2006). Technology and learning: Defining what you want to assess.  (LINK)

Select Emerging Innovations

Gaming
  • Duke Gaming Lab (LINK)
  • McClarty, K. L., Orr, A., Frey, P. M., Dolan, R. P., Vassileva, V., & McVay, A. (2012). A literature review of gaming in education. Gaming in education, 1-35.  (LINK)
  • Ifenthaler, D., Eseryel, D., & Ge, X. (2012). Assessment for game-based learning. In Assessment in game-based learning (pp. 1-8). Springer, New York, NY.  (LINK)
  • Tobias S., Fletcher J.D., Wind A.P. (2014) Game-Based Learning. In: Spector J., Merrill M., Elen J., Bishop M. (eds) Handbook of Research on Educational Communications and Technology. Springer, New York, NY.  (LINK)
Badging
  • About badging (LINKLINK)
  • Parker, H.E. (2015, April). Digital badges as effective assessment tools. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).  (LINK)
  • Stefaniak, J., & Carey, K. (2019). Instilling purpose and value in the implementation of digital badges in higher education. International Journal of Educational Technology in Higher Education, 16(1), 1-21.  (LINK)
The Comprehensive Learner Record
  • About the comprehensive learner record (LINKLINK)
  • Baker, G. R., & Jankowski, N. A. (2020, June). Documenting learning: The comprehensive learner record. (Occasional Paper No. 46). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.  (LINK)
  • Hope, J. (2021). Consider added value of a comprehensive learner record. The Successful Registrar, 21(2), 8.  (LINK)
Pandemic-era Modalities

Hong, R. C., & Moloney, K. (2020, October). There is no return to normal: Harnessing chaos to create our new assessment future. (Occasional Paper No. 49). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.  (LINK)

Muller, K., Gradel, K., Deane, S., Forte, M., McCabe, R., Pickett, A. M., Piorkowski, R., Scalzo, K., & Sullivan, R. (2019, October). Assessing student learning in the online modality (Occasional Paper No. 40). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).  (LINK)

Ethical Conduct of Assessment

Privacy and Confidentiality

Data Security

Fairness and Equity

  • Code of Fair Testing Practices in Education (LINK)
  • 1999 Standards for Educational and Psychological Testing (LINK)
  • Council for the Advancement of Standards in Higher Education (CAS) Statement of Shared Ethical Principles (LINK)

Theoretical Foundations of and Key Publications in Learning Outcomes Assessment

What is Action Research?

Action research is a disciplined process of inquiry conducted by and for those taking the action. The primary reason for engaging in action research is to assist the “actor” in improving and/or refining his or her actions. (Excerpted from ASCD) See also:

  • Association for Supervision and Curriculum Development (ASCD) (LINK)
  • Brown University (LINK)
  • Avison, D. E., Lau, F., Myers, M. D., & Nielsen, P. A. (1999). Action research. Communications of the ACM, 42(1), 94-97.  (LINK)
  • Brydon‐Miller, M., & Maguire, P. (2009). Participatory action research: Contributions to the development of practitioner inquiry in education. Educational Action Research, 17(1), 79-93.  (LINK)
  • Cohen, L., Manion, L., & Morrison, K. (2017). Action research. In Research methods in education (pp. 440-456). Routledge.  (LINK)
  • Goldkuhl, G. (2008). Practical inquiry as action research and beyond.  (LINK)
  • Koshy, V. (2009). Action research for improving educational practice: A step-by-step guide. Sage.  (LINK)
  • Stringer, E. T. (2008). Action research in education. Upper Saddle River, NJ: Pearson Prentice Hall.
  • Stringer, E. T., & Aragón, A. O. (2020). Action research. Sage Publications.

Common Conceptual Frameworks Used in Learning Outcomes Assessment

  • Astin, A. W. (1991). Assessment for excellence. American Council on Education/Macmillan Series on Higher Education.
  • Atkinson, R. C., & Shiffrin, R. M. (1968). Human memory: A proposed system and its control processes. In Psychology of learning and motivation (Vol. 2, pp. 89-195). Academic Press.  (LINK)
  • Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory. Englewood Cliffs, NJ: Prentice-Hall.
  • Dweck, C. S. (2008). Mindset: The new psychology of success. Random House Digital, Inc.
  • Erikson, E. H. (1994). Identity: Youth and crisis (No. 7). WW Norton & Company.
  • Flavell, J. H. (1976). Metacognitive aspects of problem solving. The nature of intelligence, 12, 231-235.
  • Gardner, H., & Hatch, T. (1989). Educational implications of the theory of multiple intelligences. Educational Researcher, 18(8), 4-10.  (LINK)
  • Krathwohl, D. R. (2002). A revision of Bloom's taxonomy: An overview. Theory Into Practice, 41(4), 212-218.  (LINKLINK)
  • Lave, J., & Wenger, E. (1990). Situated learning: Legitimate peripheral participation. Cambridge, UK: Cambridge University Press.
  • Mezirow, J. (1997). Transformative learning: Theory to practice. New Directions for Adult and Continuing Education, 1997(74), 5-12.  (LINK)
  • Mossberger, K., Tolbert, C. J., & McNeal, R. S. (2007). Digital citizenship: The Internet, society, and participation. MIT Press.
  • Piaget, J. (1952). The origins of intelligence in children. New York: International Universities Press.  (LINK)
  • Soloway, E., Guzdial, M., & Hay, K. E. (1994). Learner-centered design: The challenge for HCI in the 21st century. Interactions, 1(2), 36-48.  (LINK)
  • Twigg, C. A. (2000). Institutional readiness criteria. EDUCAUSE Review, 35(2), 42-48.  (LINKLINK)
  • Vygotsky, L. S. (1980). Mind in society: The development of higher psychological processes. Harvard University Press.
  • Weiss, C. H. (1997). Theory-based evaluation: Past, present, and future. New Directions for Evaluation, 76, 41-55.  (LINK)
  • Wiggins, G., & McTighe, J. (2005). Understanding by design. ASCD.  (LINK)

Select Perspectives on the State of our Practice

  • Building Capacity, Fostering Institutionalization: A Study of Assessment at Independent Colleges in the United States.  (LINK)
  • Hundley, S. P., & Kahn, S. (2019, November). Meta-themes and meta-trends in assessment: Enduring issues, emerging ideas. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA). (LINK)
  • Hutchings, P., Ewell, P., & Banta, T. (2012, May). AAHE principles of good practice: Aging nicely. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).  (LINK)
  • Miller, M. A. (2012, January). From denial to acceptance: The stages of assessment. (Occasional Paper No. 13). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).  (LINK)
  • Singer-Freeman, K., & Robinson, C. (2020, November). Grand challenges in assessment: Collective issues in need of solutions (Occasional Paper No. 47). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.  (LINK)
  • The 10 meta-trends in higher education assessment (LINK)

Additional References

Astin, A. W. (1991). Assessment for excellence. American Council on Education/Macmillan Series on Higher Education.

Banta, T. W., Jones, E. A., & Black, K. E. (2009). Designing effective assessment: principles and profiles of good practice. San Francisco, CA: Jossey-Bass.

Banta, T. W., & Palomba, C. A. (2014). Assessment essentials: Planning, implementing, and improving assessment in higher education. John Wiley & Sons.

Blaich, C. F., & Wise, K. S. (2011, January). From gathering to using assessment results: Lessons from the Wabash National Study. (Occasional Paper No. 8). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA). (LINK)

Braskamp, L. & Engberg, M. (2014, February). Guidelines to consider in being strategic about assessment. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).  (LINK)

Creswell, J. W. (1998). Qualitative inquiry and research design: choosing among five traditions. Thousand Oaks, CA: Sage.

Cumming, T., & Ewell, P. (2017). Introduction: History and conceptual basis of assessment in higher education.  (LINK)

Ewell, P. T. (2009, November). Assessment, accountability, and improvement: Revisiting the tension. (Occasional Paper No. 1). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).  (LINK)

Finney, S. J., Wells, J. B., & Henning, G. W. (2021, March). The need for program theory and implementation fidelity in assessment practice and standards (Occasional Paper No. 52). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).  (LINK)

Fulcher, K. H., Smith, K. L., Sanchez, E. R. H., Ames, A. J., & Meixner, C. (2017). Return of the pig: Standards for learning improvement. Research & Practice in Assessment, 11, 10-40.  (LINK)

Funnell, S. C., & Rogers, P. J. (2011). Purposeful program theory: Effective use of theories of change and logic models (Vol. 31). John Wiley & Sons.

Hersh, R. H., & Keeling, R. P. (2013, February). Changing institutional culture to promote assessment of higher learning. (Occasional Paper No. 17). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA). (LINK)

Kuh, G. D., Kinzie, J., Schuh, J. H., & Whitt, E. J. (2011). Student success in college: Creating conditions that matter. John Wiley & Sons.

Kuh, G. D., Ikenberry, S. O., Jankowski, N. A., Cain, T. R., Ewell, P. T., Hutchings, P., & Kinzie, J. (2015). Using evidence of student learning to improve higher education. San Francisco, CA: Jossey-Bass.

Maki, P. L. (2012). Assessing for learning: Building a sustainable commitment across the institution. Stylus Publishing, LLC.

Patton, M.Q. (2002). Qualitative research and evaluation methods. Thousand Oaks, CA: Sage.

Suskie, L. (2018). Assessing student learning: A common sense guide. San Francisco, CA: Jossey-Bass.

Upcraft, M.L. & Schuh, J.H. (1996). Assessment in student affairs: A guide for practitioners. San Francisco: Jossey-Bass.

Walvoord, B. E. (2010). Assessment clear and simple: A practical guide for institutions, departments, and general education. John Wiley & Sons.

Weiss, C. H. (1979). The many meanings of research utilization. Public Administration Review, 39(5), 426-431.  (LINK)

Prominent Journals in Learning Outcomes Assessment

  • Assessment & Evaluation in Higher Education  (LINK)
  • American Journal of Evaluation  (LINK)
  • Assessment in Education: Principles, Policy & Practice  (LINK)
  • Assessment Update  (LINK)
  • Educational Action Research (LINK)
  • Educational Assessment, Evaluation and Accountability (LINK)
  • Evaluation and Program Planning  (LINK)
  • Evaluation Review (LINK)
  • International Journal of Innovative Teaching and Learning in Higher Education (LINK)
  • Journal of Assessment and Institutional Effectiveness (LINK)
  • Journal of Educational Measurement (LINK)
  • Journal of Higher Education (LINK)
  • New Directions for Evaluation (LINK)
  • New Directions for Institutional Research (LINK)
  • New Directions for Teaching and Learning (LINK)
  • Research & Practice in Assessment (LINK)
  • Research in Higher Education (LINK)

Other Connections

Duke Resources and Partners

  • Campus IRB (LINK)
  • Center for Data and Visualization Sciences, Duke Libraries (LINK)
  • Curriculum review process and template (restricted access) (LINK)
  • Duke Learning Innovation (LINK)
  • Office of Institutional Research (LINK)
  • Student Affairs Assessment (LINK)

External Organizations

  • American Educational Research Association (LINK)
  • American Psychological Association (LINK)
  • Association for Authentic, Experiential, & Evidence-Based Learning (LINK)
  • Association for Institutional Research (LINK)
  • Association for the Assessment of Learning in Higher Education (LINK)
  • Association for the Study of Higher Education (LINK)
  • Association of American Colleges and Universities (LINK)
  • Council for the Advancement of Standards in Higher Education (CAS) (LINK)
  • Department of Education, Office of Postsecondary Education (LINK)
  • Education Resources Information Center (ERIC) (LINK)
  • Educause (LINK)
  • Higher Education Research Institute (LINK)
  • Institute for Higher Education Policy (LINK)
  • Integrated Postsecondary Education Data System (IPEDS) (LINK)
  • James Madison University Center for Assessment & Research Studies (LINK)
  • Liberal Education and America's Promise (LEAP) (LINK)
  • Lumina Foundation (LINK)
  • National Center for Education Statistics (LINK)
  • National Council on Measurement in Education (LINK)
  • National Institute for Learning Outcomes Assessment (LINK)
  • North Carolina Association for Institutional Research (LINK)
  • North Carolina Independent Colleges and Universities (LINK)
  • Project Kaleidoscope (LINK)
  • Spencer Foundation (LINK)
  • Student Experience in the Research University (Center for Studies in Higher Education) (LINK)
  • Teagle Foundation (LINK)
  • Truth, Racial Healing & Transformation (TRHT) Campus Centers (LINK)
  • Valid Assessment of Learning in Undergraduate Education (VALUE) (LINK)