Sustainability Literacy Assessment

Credit Language

AC 6: Immersive Experience – version 2.2

Frequently Asked Questions

How has this credit changed between STARS 2.1 and 2.2?

The changes are minor: the credit now recognizes assessments for which a “substantive portion” focuses on student knowledge. A comprehensive list of changes can be found in the 2.2 Summary of Changes.

What kind of questions address sustainability literacy?

Sustainability literacy questions typically have right/wrong answers. For example, “Which of the following correctly identifies the Brundtland definition of sustainable development?” has a single correct answer.

Are there pre-made sustainability literacy assessments that can be used for this credit?

Yes, see the guide on Sustainability Literacy/Culture Assessment Tools for a list of approved assessment tools.

Can we earn points if we include sustainability literacy questions on another assessment?

Yes. The assessment may combine sustainability literacy with culture/behavior/engagement questions, or the literacy questions may be included in other assessments needed for STARS (e.g., a transportation survey). Some institutions include sustainability literacy questions in broader surveys not specifically related to sustainability, such as a first-year student survey. To count, a substantive portion of the assessment (e.g., at least 10 questions or a third of the questions) must evaluate sustainability knowledge (e.g., climate change, social and economic justice, biodiversity loss, global poverty, resource depletion, and so on).

Can a post-assessment that has not yet occurred count as the follow-up assessment?

Yes. A structured pre- and post-assessment for which the pre-assessment has been conducted and the post-assessment has been scheduled may count.

What constitutes a representative sample?

A representative sample is a subset of a statistical population that accurately reflects the members of the entire population. A representative sample should be an unbiased indication of what the entire population is like. For example, in a student population of 1000 students in which 25 percent of the students are enrolled in a business school, 50 percent are enrolled in humanities programs, and 25 percent are enrolled in science programs, a representative sample might include 200 students: 50 business students, 100 humanities students, and 50 science students. Likewise, a representative sample of purchases should accurately reflect the institution’s total purchases, accounting for seasonal and other variations in product availability and purchasing.
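The proportional allocation described above can be sketched in a few lines of code. This is an illustrative example only (not part of STARS guidance); the function name and the enrollment shares are hypothetical, taken from the FAQ's example population.

```python
# Hypothetical sketch of proportional (stratified) sampling: given each
# group's share of the student population, compute how many students from
# each group belong in a representative sample of a chosen size.

def representative_sample_sizes(population_shares, sample_size):
    """Return the number of students to sample from each group,
    proportional to that group's share of the population."""
    return {group: round(share * sample_size)
            for group, share in population_shares.items()}

# The FAQ's example: 1,000 students (25% business, 50% humanities,
# 25% science) and a target sample of 200 students.
shares = {"business": 0.25, "humanities": 0.50, "science": 0.25}
print(representative_sample_sizes(shares, 200))
# {'business': 50, 'humanities': 100, 'science': 50}
```

Note that for unequal shares, rounding can make the group counts sum to slightly more or less than the target sample size; in practice the remainder is usually assigned to the largest group.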

What’s the difference between a literacy assessment and a cultural assessment?

An assessment that addresses student knowledge of sustainability topics and challenges (e.g., climate change, social and economic justice, biodiversity loss, global poverty, resource depletion, and so on) is required. Assessments of cultural aspects (e.g., perceptions, beliefs, dispositions, behaviors, and awareness of campus sustainability initiatives) are recognized in the Assessing Sustainability Culture credit. Literacy assessment questions typically have right/wrong answers whereas culture assessment questions generally do not have a right/wrong answer.

Resources, Templates & Tools

Example Responses

  • College of Charleston – Good reporting example of an institution that has adopted the Sulitest.
  • Emory University – A good example of an assessment that captures both sustainability literacy and sustainability culture (see EN 6). The assessment was recently expanded to include all employees as well as students.
  • Florida Gulf Coast University – Solid assessment methodology. The assessment consists of both direct and indirect measures of student achievement. 
  • Pennsylvania State University – Comprehensive methodology, with a supplemental upload of literacy frequencies by grade level. The assessment addresses literacy as well as behavior/culture/engagement.
  • University of California, Irvine – Good example of a pre- and post-test survey that was administered to a subset of students enrolled in a sustainability course.
  • University of New Brunswick, Fredericton – Good detail on how the assessment was developed. The uploaded spreadsheet not only highlights the questions asked, but includes the survey results. 
  • University of New Hampshire – Good detail on assessment methodology, post-testing, and approach for achieving a representative sample. A detailed assessment report is uploaded.
  • University of St. Thomas – Descriptive responses include detailed information on how a representative sample was achieved. The survey was sent to faculty and staff as well as students, and included an invitation from the president.
  • University of Vermont – Good example of a survey that asks for student perspectives of their own sustainability literacy.

Common Issues Identified During Review

  • Assessment must cover sustainability literacy rather than sustainability-related values, behaviors or beliefs. An institution may use a single instrument that addresses sustainability literacy, culture, and/or engagement to meet the criteria for this credit if at least 10 questions or a third of the assessment focuses on student knowledge of sustainability topics and challenges.
  • If “The entire student body or, at minimum, to the institution’s predominant student body” is selected, descriptive information must explain how a representative sample was achieved. If there is an indication that a non-representative sample was assessed (e.g., only one class participated), the response should be “A subset of students….”
  • If “Pre- and post-assessment to the same cohort of students or to representative samples…” is selected, there must be some mention of a follow-up assessment (a scheduled post-assessment that has not yet occurred may count). If this support is not provided, the response should be “Standalone evaluation without a follow-up assessment….”