1.
Clark WC, Dickson NM. Sustainability science: The emerging research program. Proceedings of the National Academy of Sciences of the United States of America. 2003;100(14):8059–8061. [Crossref]
2.
Kates RW, Clark WC, Corell R, Hall JM, Jaeger CC, Lowe I, et al. Sustainability Science. Science. 2001;292(5517):641–642. [Crossref]
3.
Anderies J, Janssen M, Ostrom E. A framework to analyze the robustness of social-ecological systems from an institutional perspective. Ecology and Society. 2004;9(1). [Crossref]
4.
Ostrom E, Janssen MA, Anderies JM. Going beyond panaceas. Proceedings of the National Academy of Sciences of the United States of America. 2007;104(39):15176–15178. [Crossref]
5.
Komiyama H, Takeuchi K. Sustainability science: building a new discipline. Sustainability Science. 2006;1(1):1–6. [Crossref]
6.
Kates RW, Parris TM. Long-term trends and a sustainability transition. Proceedings of the National Academy of Sciences of the United States of America. 2003;100(14):8062–8067. [Crossref]
7.
Parris TM, Kates RW. Characterizing a sustainability transition: goals, targets, trends, and driving forces. Proceedings of the National Academy of Sciences of the United States of America. 2003;100:8068–8073. [Crossref]
8.
Martens P. Sustainability: Science or Fiction? IEEE Engineering Management Review. 2007;35(3):70. [Crossref]
9.
Griggs D, Stafford-Smith M, Gaffney O, Rockström J, Öhman MC, Shyamsundar P, et al. Sustainable development goals for people and planet. Nature. 2013;495(7441):305–307. [Crossref]
10.
Clark WC. Sustainability science: a room of its own. Proceedings of the National Academy of Sciences of the United States of America. 2007;104(6):1737–1738. [Crossref]
11.
Miller TR, Wiek A, Sarewitz D, Robinson J, Olsson L, Kriebel D, et al. The future of sustainability science: A solutions-oriented research agenda. Sustainability Science. 2014;9(2):239–246. [Crossref]
12.
Spangenberg JH. Sustainability science: a review, an analysis and some empirical lessons. Environmental Conservation. 2011;38(03):275–287. [Crossref]
13.
Kajikawa Y. Research core and framework of sustainability science. Sustainability Science. 2008;3(2):215–239. [Crossref]
14.
Kauffman J, Arico S. New directions in sustainability science: promoting integration and cooperation. Sustainability Science. 2014;9(4):413–418. [Crossref]
15.
Brandt P, Ernst A, Gralla F, Luederitz C, Lang DJ, Newig J, et al. A review of transdisciplinary research in sustainability science. Ecological Economics. 2013;92:1–15. [Crossref]
16.
Wiek A, Farioli F, Fukushi K, Yarime M. Sustainability science: Bridging the gap between science and society. Sustainability Science. 2012;7(SUPPL. 1):1–4. [Crossref]
17.
Trencher G, Yarime M, McCormick KB, Doll CNH, Kraines SB. Beyond the third mission: Exploring the emerging university function of co-creation for sustainability. Science and Public Policy. 2013;41(2):151–179. [Crossref]
18.
Bettencourt LMA, Kaur J. Evolution and structure of sustainability science. Proceedings of the National Academy of Sciences of the United States of America. 2011;108(49):19540–19545. [Crossref]
19.
Kates RW. What kind of a science is sustainability science? Proceedings of the National Academy of Sciences of the United States of America. 2011;108(49):19449–19450. [Crossref]
20.
Kajikawa Y, Ohno J, Takeda Y, Matsushima K, Komiyama H. Creating an academic landscape of sustainability science: an analysis of the citation network. Sustainability Science. 2007;2(2):221–231. [Crossref]
21.
SSPP is a peer-reviewed, open-access journal focusing on sustainability science research. The journal also provides an international network of sustainability research and education.
22.
Sustainability: Science, Practice, & Policy. Academic Programs in Sustainability; 2016. Available from: http://sspp.proquest.com/sspp_institutions/display/universityprograms#.
23.
Stock P, Burton RJF. Defining terms for integrated (multi-inter-transdisciplinary) sustainability research. Sustainability. 2011;3(8):1090–1113. [Crossref]
24.
Schweizer-Ries P, Perkins DD. Sustainability Science: Transdisciplinarity, Transepistemology, and Action Research. Introduction to the Special Issue. Umweltpsychologie. 2012;16(1):6–10.
25.
Frisk E, Larson KL. Education for sustainability: Competencies & practices for transformative action. Journal of Sustainability Education. 2011;2.
26.
Onuki M, Mino T. Sustainability education and a new master’s degree, the master of sustainability science: the Graduate Program in Sustainability Science (GPSS) at the University of Tokyo. Sustainability Science. 2009;4(1):55–59. [Crossref]
27.
Tamura M, Uegaki T. Development of an educational model for sustainability science: Challenges in the Mind-Skills-Knowledge education at Ibaraki University. Sustainability Science. 2012;7(2):253–265. [Crossref]
28.
Sterling S, Orr D. Sustainable Education: Re-visioning Learning and Change. Devon, UK: Green Books; 2001.
29.
Wiek A, Withycombe L, Redman C. Key competencies in sustainability: a reference framework for academic program development. Sustainability Science. 2011;6(2):203–218. [Crossref]
30.
Remington-Doucette S, Connell K, Armstrong C, Musgrove S. Assessing sustainability education in a transdisciplinary undergraduate course focused on real world problem solving: A case for disciplinary grounding. International Journal of Sustainability in Higher Education. 2013;14(4):404–433. [Crossref]
31.
A list of past field courses in GPSS-GLI is available at http://www.sustainability.k.u-tokyo.ac.jp/exercises.
32.
San Carlos RO, Tyunina O, Yoshida Y, Mori A, Sioen GB, Yang J. Assessment of Fieldwork Methodologies for Educational Purposes in Sustainability Science: Exercise on Resilience, Tohoku Unit 2015. In: Esteban M, Akiyama T, Chiahsin C, Ikeda I, editors. Springer International Publishing; 2016. pp. 67–91. [Crossref]
33.
Tumilba V, Kudo S, Yarime M. Review and assessment of academic activities, student competencies, research themes and practice of sustainability principles in the Graduate Program in Sustainability Science. In: The 19th International Sustainable Development Research Conference. Stellenbosch: ISDRC 19; 2013.
34.
Bonwell CC, Eison JA. Active Learning: Creating Excitement in the Classroom. Washington; 1991.
35.
Chi MTH. Active-Constructive-Interactive: A Conceptual Framework for Differentiating Learning Activities. Topics in Cognitive Science. 2009;1(1):73–105. [Crossref]
36.
Graduate Program in Sustainability Science, Global Leadership Initiative, The University of Tokyo, Tokyo, Japan.
37.
Barth M, Godemann J, Rieckmann M, Stoltenberg U. Developing key competencies for sustainable development in higher education. International Journal of Sustainability in Higher Education. 2007;8(4):416–430. [Crossref]
38.
Waas T, Hugé J, Ceulemans K, Lambrechts W, Vandenabeele J, Lozano R, et al. Sustainable Higher Education. Understanding and Moving Forward. Brussels; 2012.
39.
Lozano R. Incorporation and institutionalization of SD into universities: breaking through barriers to change. Journal of Cleaner Production. 2006;14(9-11):787–796. [Crossref]
40.
Rieckmann M. Future-oriented higher education: Which key competencies should be fostered through university teaching and learning? Futures. 2012;44(2):127–135. [Crossref]
41.
Shephard K. Higher education for sustainability: seeking affective learning outcomes. International Journal of Sustainability in Higher Education. 2008;9(1):87–98. [Crossref]
42.
Akiyama T, Li J. Environmental leadership education for tackling water environmental issues in arid regions. In: Environmental Leadership Capacity Building in Higher Education. Springer Japan; 2013. pp. 81–92.
43.
San Carlos RO, Teah HY, Akiyama T, Li J. Designing Field Exercises with the Integral Approach for Sustainability Science: A Case Study of the Heihe River Basin, China. In: Esteban M, Akiyama T, Chiahsin C, Ikeda I, editors. 1st ed.; 2016. pp. 23–39. [Crossref]
Open Access
Research article

Fostering the Next Generation of Sustainability Professionals— Assessing Field Courses in a Sustainability Science Graduate Program

Ricardo San Carlos*,
Yuki Yoshida,
Shogo Kudo
Graduate Program in Sustainability Science—Global Leadership Initiative, Graduate School of Frontier Sciences, The University of Tokyo, Tokyo, Japan
Challenges in Sustainability
|
Volume 5, Issue 1, 2017
|
Pages 52-61
Received: 02-27-2016,
Revised: 07-02-2016,
Accepted: 08-01-2016,
Available online: 03-26-2017

Abstract:

A growing number of educational programs in sustainability science has paralleled the rise of the field itself. The educational approach of these programs follows the field's problem-driven, interdisciplinary, and transdisciplinary nature. However, the effectiveness of this approach has yet to be systematically evaluated. Similarly, while ad-hoc evaluation schemes have attempted to monitor the quality of these educational programs, there is no standard method that accounts for the particularities of sustainability science programs. This study thus addresses the need for an assessment of the problem-driven approach of educational programs in sustainability science. We conducted student self-assessments of field courses in the Graduate Program in Sustainability Science (GPSS-GLI) at The University of Tokyo, which positions its field courses at the center of its curriculum. The self-assessments were based on five key competencies identified as particularly important for sustainability professionals. Workshops and questionnaires engaged students in a reflection on the six field courses and on their own personal development through the activities offered. Our questionnaire results indicate that the majority of participants were satisfied with how the courses furthered their personal development. While some participants expressed frustration at being unable to sufficiently address the respective field's sustainability challenges due to time constraints, students generally recognized the five key competencies as important for addressing sustainability issues after participating in the courses. Moreover, participants attributed much of their learning to their active engagement in planned field research activities, rather than to passive learning. Variations in results across different course units provide material for further analysis and development of the curriculum.
This study is an initial attempt at assessment, with room for ongoing improvement and further research to address additional requirements for fostering the next generation of sustainability professionals.

Keywords: Curriculum development, Fieldwork evaluation, Higher education, Competencies, Sustainability professional, Sustainability science

1. Introduction

Sustainability science has been promoted actively both in research and education as a vibrant response to emerging sustainability challenges such as climate change, environmental degradation, food security, energy provision, and inequality. The main disciplinary foci of sustainability science are to understand the complex interactions between natural and social systems [1], [2], [3], [4], [5], and to create knowledge for sustainable development [6], [7], [8], [9]. Since challenges in sustainability generally require action to alter the status quo, the discipline's approach is problem-based and solution-oriented [10], [11], [12], [13]. Moreover, interconnected problems [14] require researchers to go beyond their discipline of training. To reflect findings from research in actual practice, it is also necessary to cross the potential divide between academics and practitioners. Accordingly, sustainability science combines an interdisciplinary approach that employs academic knowledge from the natural and social sciences to the humanities, with a transdisciplinary approach that promotes the co-design and co-creation of knowledge by diverse social stakeholders to address real-world sustainability challenges [15], [16], [17], [18].

While the research dimension of sustainability science has formed its own space and landscape within academia [10], [13], [19], [20], sustainability-related educational programs have also been promoted. According to a list from the journal Sustainability: Science, Practice, & Policy (SSPP) [21], there are more than 230 sustainability programs at the university level as of January 2016 [22]. Sustainability science education plays a key role in producing human resources with the literacy, knowledge, and skills required to actualize the recommendations of sustainability science research. Program curricula and implementation must therefore reflect the interdisciplinary and transdisciplinary aspects of sustainability science. Students should be encouraged and trained to develop an interdisciplinary and transdisciplinary mindset, as the problems they address define the types of knowledge and methods necessary to propose possible solutions. Given the field's problem-driven and solution-oriented approach, it is also critical to have linkages not only between research and stakeholders such as industry, government, and NGOs, but also between research and education for the continuous development of sustainability professionals. More collaborative and critical research approaches are necessary to guide social transformation towards sustainability [23].

Theoretical and applied literatures address the design of educational programs. The interdisciplinary approach of sustainability science brings together academic disciplines with diverging worldviews, and this in turn creates epistemological discussions. Such inter-paradigmatic collaboration and negotiation is considered a key characteristic of the field. In line with this epistemological discussion, the idea of 'trans-epistemology' was introduced [24] to better describe the dynamic integration of different methods in sustainability research. According to Schweizer-Ries and Perkins [24], trans-epistemology is the “cooperation between different personal knowledge systems”, with society as a whole as the “‘producer’ of shared and socially constructed understanding of the world” [24]. The idea of sustainability is fundamentally normative and carries specific cultural values. It may also differ from person to person, so sustainability science researchers must be able to imagine the diversity of views on a given topic and comprehend the interlinkages between the viewpoints of different stakeholders. Accordingly, an educational program in sustainability must have the flexibility to accommodate awareness and tolerance of multiple epistemological views [25].

Regarding the design and operation of an educational program in sustainability science, Onuki and Mino introduced three key components: (i) knowledge and concept-oriented courses, (ii) experiential learning and skills-oriented courses, and (iii) transdisciplinary thesis research [26]. Mino and his colleagues later added the transboundary framework to emphasize the full range of scales, from the individual to the global, in order to examine subjects and problems from multiple angles [27]. Tamura and Uegaki, who operate a sustainability science program at Ibaraki University, raise another core concept for designing a sustainability science program: the “Mind-Skills-Knowledge” model of sustainability education [28]. Drawing an analogy between sustainability science students and athletes, who must maintain their body, technique, and spirit, the framework stresses a balance of different types of knowledge. Others have suggested declarative, procedural, effectiveness, and social knowledge, as well as their effective interaction [25].

In terms of the evaluation of sustainability science education programs, one challenge is to develop a method for investigating whether students are effectively acquiring the competencies required to actualize their knowledge as concrete actions for sustainability [28]. The work of Wiek and his colleagues provides a comprehensive discussion of five key competencies within a problem-solving framework [29]. While the proposed key competencies have been applied to assess students' learning outcomes in a transdisciplinary course [30], a general need for research on pedagogy and evaluation in sustainability science programs remains.

To address this gap in the evaluation of sustainability science programs, this study aims to examine the problem-driven approach of sustainability science through student self-assessments of six field courses at the Graduate Program in Sustainability Science (GPSS-GLI), The University of Tokyo. Field courses in GPSS-GLI are designed for students to engage in collaborative research and to address real-world sustainability challenges in various topical cases. So far, field courses have covered countries in Africa (Kenya, Nigeria, and South Africa), Asia (China, Japan, Thailand), Europe (Denmark and Sweden), and Latin America (Costa Rica), and topics such as renewable energy, biodiversity conservation, and urban informal settlements [31].

These courses also aim to equip students with practical skills such as workshop facilitation, coordination with local resource persons, and field methodologies that can be applied to their thesis research. The general structure of each field course is designed by one or two faculty members who specialize in the given topic. One unit normally accommodates a cohort of four to ten students, and one doctoral student takes a leading role in the planning. Six field courses implemented during the academic year 2014–2015 are evaluated in this study (see Table 1 for an overview of the units). Four of these are Global Field Exercises and two are Resilience Exercises, but the two types carry equal weight and significance in the curriculum and are treated as identical in this analysis.

Table 1. Description of field course units and assessment participation rates

| Unit name | Minamata | Tohoku | Oasis | Costa Rica | Bangkok | Nairobi |
|---|---|---|---|---|---|---|
| Type of course | Resilience Exercise | Resilience Exercise | Global Field Exercise | Global Field Exercise | Global Field Exercise | Global Field Exercise |
| Main location | Minamata, Japan | Otsuchi, Japan | Zhangye City, China | Guanacaste, Costa Rica | Bangkok, Thailand | Nairobi, Kenya |
| Duration of field exercise | 6 days | 7 days | 14 days | 7 days | 13 days | 14 days |
| Workshop participants / GPSS-GLI students in unit | 8/10 (80% workshop participation) | 8/8 (100%) | 5/5 (100%) | 5/5 (100%) | 3/3 (100%) | 7/8 (88%) |
| Focus / Objective | Educational / understanding issues regarding the Minamata Disease | Educational / current situation of the tsunami-affected area and applying the concept of resilience | Research / sustainable water management in a semi-arid region of China | Research / additionality of payments for ecosystem services for agroforestry | Educational / urban health issues (focus differed by group) | Educational / sustainability challenges and research methods in urban Africa |
| Primary field activities | Lectures, field visits, group work | Lectures, field visits, interviews | Field visits, interviews, survey | Interviews, field visits | Lectures, field visits, group work | Lectures, field research, group work |
| Major characteristics | Organized by faculty; output of educational material | Organized by faculty | Student-led; participants from multiple institutions | Student-led; first-time unit | Organized by faculty; participants from multiple institutions | Organized with participation of local students |

2. Methods

The assessment began with the development of a conceptual framework and methodology, implemented systematically in a subsequent phase. The first phase took place in the Tohoku Resilience Exercise, one of six workshops assessed in this study. While the exercise itself had an educational focus of having students grapple with the complexity of issues surrounding the regional reconstruction after the Tohoku Earthquake and Tsunami of March 2011, students simultaneously engaged in a reflective analysis that laid the foundation of this assessment effort [32].

This developmental phase began with a review and analysis of the conceptual framework of key competencies for sustainability science professionals [29] that had been used in a previous assessment of the said program [33]. The chosen framework was deemed appropriate for this assessment as a focus on real-world problems is characteristic of GPSS-GLI, and the five key competencies were identified for their relevance to sustainability science research and problem solving [29, 32]. Collectively, students reviewed this pre-existing framework and adapted the original definitions for use within an educational context [32]. The group then analyzed the linkages between each of the competencies and the activities and issues within the Resilience Exercise.

In order to hold pointed discussions about how different components of the field course activities contributed to participants' personal development, there was a need to distinguish between active learning, passive learning, and the recognition of a competence as important to the sustainability science profession. As discussed by San Carlos and colleagues [33], a review of academic literature revealed a lack of consensus and clarity on the definitions of active and passive learning [34, 35]. For practical purposes, our understanding is that active learning involves active engagement of students with the planned field course activities. In other words, active learning is learning by doing, such as designing and conducting original field surveys and interacting firsthand with stakeholders in the research topic. In contrast, passive learning is the unidirectional transmission of information through lectures and other methodologies that do not require active student engagement [35].

At the end of the week-long Exercise course, with its daily reflective discussions, each student's personal experiences were quantified for analysis using a questionnaire built around the concepts discussed. This questionnaire was used throughout the subsequent assessment. The questionnaire assumed that the respondent would have received some explanation of the competencies prior to assessment, but listed definitions as shown in Table 2. Students were asked to rate the unit's effectiveness in facilitating personal development of the respective competence beyond their baseline level. The assessment of each competence was threefold: for passive learning, for active learning, and for “recogni[tion]/agree[ment] about the importance of the competence for research on sustainability issues” (hereafter: “Recognition”). Responses were indicated on a 5-point Likert scale (1: very ineffective (no influence); 2: ineffective; 3: satisfactory; 4: effective; 5: very effective). Open space was provided at the end of the questionnaire, with prompts encouraging comments on respondents' personal experiences or feedback on the assessment itself.
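The threefold scoring described above reduces to simple descriptive statistics per competence and learning type. The sketch below is a minimal illustration with hypothetical response data (the values and labels are ours, not the study's dataset), showing how a unit's mean and sample standard deviation would be computed from 5-point Likert responses:

```python
from statistics import mean, stdev

# Hypothetical 5-point Likert responses (1-5) from one unit's students,
# for a single competence, keyed by type of learning. Illustrative only.
responses = {
    "Passive": [4, 4, 3, 4],
    "Active": [5, 4, 5, 4],
    "Recognition": [5, 5, 4, 4],
}

def summarize(scores):
    """Return (mean, sample SD), rounded to two decimals."""
    return round(mean(scores), 2), round(stdev(scores), 2)

for learning_type, scores in responses.items():
    m, sd = summarize(scores)
    print(f"{learning_type}: M = {m}, SD = {sd}")
```

With four to ten students per unit (Table 1), per-unit means are sensitive to individual responses, which is one reason standard deviations are reported alongside means in the results.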

All subsequent assessments were conducted after the completion of the field courses according to the following procedure. The authors contacted student participants of the respective course unit using e-mail and/or social media to schedule a course workshop. This correspondence involved all GPSS-GLI students who had participated in the course, with one exception where the student had already graduated and left the country.

Workshops were facilitated by at least one of the authors. A brief introduction of the assessment project was followed by inquiry into the unit's educational and/or research objectives. Using a whiteboard or projected computer screen, students were then asked to list the unit's activities. Next, the competence framework was introduced using the definitions in Table 2, and students were asked to identify linkages between the competencies and the listed activities. At the end, the questionnaire was handed out either electronically or on paper for students to complete individually. The total duration of the workshops averaged about 90 minutes, and all workshops were conducted between September and November of 2015.

Table 2. Original and applied definitions of Key Competencies in Sustainability (adapted from San Carlos et al [32])

| Competence | Original Definition [29] | Our Operationalization |
|---|---|---|
| Systems-thinking competence | Ability to collectively analyze complex systems across different domains (society, environment, economy, etc.) and across different scales (local to global) | Competency to organize and understand the complex constellation of sustainability issues |
| Anticipatory competence | An ability to collectively analyze, evaluate, and craft rich pictures of the future related to sustainability issues and sustainability problem-solving frameworks | Competency to visualize future scenarios, including non-intervention and alternative sustainability visions |
| Normative competence | An ability to collectively map, specify, apply, reconcile, and negotiate sustainability values, principles, goals, and targets | Competency to understand the range of different values that could lead to different sustainability visions |
| Strategic competence | Ability to collectively design and implement interventions, transitions, and transformative governance strategies toward sustainability | Competency to design and implement strategies to achieve a particular sustainability vision |
| Interpersonal competence | An ability to motivate, enable, and facilitate collaborative and participatory sustainability research and problem solving | Competency to communicate, coordinate, negotiate, or lead |

Subsequent consultations with faculty and affiliated staff members supplemented the above process as a means to consider the appropriateness of the completed assessment. To date, this process has consisted of an e-mail with a semi-structured questionnaire sent to faculty and staff members associated with each field course. The e-mail included a summary of the student assessments for the respective unit, as well as cross-unit average scores. Another document outlined the intent of the assessment and prompted for responses as follows: 1) explanations and interpretations of the results; 2) reflections on the exercise design; 3) comments and feedback on the assessment itself. As some unit-specific comments would be traceable to individual faculty members, the document asked faculty members to indicate their willingness to have their comments attributed to the unit in question.

3. Results

3.1 Student Workshop and Faculty Participation Rate

Field course units are referred to by their locations: Minamata (Japan), Tohoku (Japan), Oasis (China), Costa Rica, Bangkok (Thailand), and Nairobi (Kenya). As shown in Table 1, the assessment had a high rate of participation, with full participation for four of six units. E-mail inquiries to faculty and staff members were followed up with reminders and reached a response rate of 86% (n = 7). As only one staff member was contacted, faculty and staff will hereafter be referred to as “respondents” to ensure confidentiality.

3.2 Student Workshop and Questionnaire Results

Figure 3 shows the questionnaire results. Columns (A) to (F) show results in each course unit by competence. Rows (1) to (5) show the results for each competence by course unit. Mean scores and standard deviations (SD) for each competence are shown by type of learning. The last column and row represent the aggregate means by competence and unit, respectively.

3.3 Results by Competence and Type of Learning

Figure 1 shows the mean scores for the five competencies by type of learning. Results indicate overall student satisfaction with the field courses, as all five competencies obtained scores higher than 3.0 (“satisfactory”) for all types of learning. The highest scoring competence was Interpersonal Competence (M = 4.17). The lowest scoring competence was Strategic Competence (M = 3.38).

High scores on Recognition indicate that students generally agreed with the literature on the relevance of the key competencies for sustainability science research [29]. Recognition scored higher than the other types of learning on four of five competencies (Anticipatory Competence (M = 3.89); Normative Competence (M = 3.89); Strategic Competence (M = 3.79); Interpersonal Competence (M = 4.27)). For Systems Thinking Competence, Recognition (M = 3.86) scored only marginally below the competence's aggregate mean of 3.88.

Figure 1. Aggregate scores by competence and type of learning

Active Learning was evaluated more highly than Passive Learning for all competencies. This is intuitive, as the field courses are based on the concept of providing opportunities for active engagement in the field [36]. The difference was greatest for Interpersonal Competence, where the aggregate mean for Active Learning (M = 4.33) was 0.43 points greater than for Passive Learning (M = 3.90). In contrast, the gap between Passive (M = 3.17) and Active (M = 3.18) Learning was only 0.01 points for Strategic Competence.
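As a quick check of the gaps described above, the following sketch recomputes the Active-minus-Passive differences from the aggregate means quoted in this subsection; only the two competencies discussed are included, and the dictionary layout is our own illustration:

```python
# Aggregate mean scores quoted in the text (Figure 1), for the two
# competencies with the largest and smallest Active-Passive gaps.
means = {
    "Interpersonal": {"Active": 4.33, "Passive": 3.90},
    "Strategic": {"Active": 3.18, "Passive": 3.17},
}

def active_passive_gap(competence):
    """Difference between Active and Passive Learning means."""
    m = means[competence]
    return round(m["Active"] - m["Passive"], 2)

print(active_passive_gap("Interpersonal"))  # largest gap: 0.43
print(active_passive_gap("Strategic"))      # smallest gap: 0.01
```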

Other notable results are the high scores on Interpersonal Competence (M = 4.17) and low scores on Strategic Competence (M = 3.38). Effectiveness in developing Interpersonal Competence may be explained by GPSS-GLI students' diversity in cultural, academic, and professional backgrounds, as well as demographics [36]. Lower evaluations of Strategic Competence may be due to students' expectation and desire to have a tangible impact on the study area, despite the time and resource constraints that limit such impact in reality. Student and faculty respondents alike commented that courses focused on understanding past and current situations rather than on speculating about the future. This is understandable given the one- to two-week duration of the courses, and consistent with the interpretation regarding the courses' limited capacity to have tangible impact.

3.4 Results by Field Course and Competence

Figure 2 shows competence and mean scores for the six field courses assessed in this study. Mean scores for every course unit were also higher than satisfactory (3.0). The highest scoring course unit was the Bangkok Unit (M = 3.85). The Tohoku Unit (M = 3.46) received the lowest scores and had high inter-student variation in each competence, a result likely attributable to the extended and critical discussions unique to this unit [32].

Results showed varying tendencies across units in the student assessments' scores and standard deviations (Figure 3). The Minamata Unit obtained high scores and low standard deviations for all competencies (see column (A)). The Tohoku Unit yielded the lowest mean score (M = 3.46); its competencies scored similarly except Strategic Competence (M = 2.75), which fell below “satisfactory”. However, standard deviations were high for all competencies and almost all types of learning (see column (B)). One respondent took particular note of the contrast between the Minamata and Tohoku Units, as “both are designed as ‘experience-oriented’ [and with] similar concepts”.

The Oasis and Costa Rica Units were similar in their focus on research. However, evaluations by Oasis Unit students showed large variation (e.g. standard deviations above 1.0 for Strategic Competence (Passive (SD = 1.30); Active (SD = 1.22); Recognition (SD = 1.58))). Students in both units were consistent in their high evaluations of the course's impact on their Interpersonal Competence (Oasis (M = 4.87); Costa Rica (M = 4.20); see Figure 3, columns (C) and (D)). In particular, the Oasis Unit's evaluation of Interpersonal Competence was the highest of all units (aggregate M = 4.87). These outcomes may be attributed to the emphasis on student leadership noted by the faculty respondents affiliated with the two units. One stated that this emphasis might have been interpreted as a lack of strategic vision in the design of the unit, explaining the lower evaluation of Strategic Competence.

General scoring patterns for the Bangkok and Nairobi Units are comparable. However, responses for the Bangkok Unit had greater internal consistency (see Figure 3, columns (E) and (F), and Figure 2). The Nairobi Unit yielded relatively high variation amongst students for Passive and Active Learning. An affiliated respondent observed that these relatively high variations might indicate that “the exercise led to variable experiences for different students”. Another respondent, affiliated with the Bangkok Unit, expressed surprise at the lower scores on Recognition. Regarding Systems Thinking Competence, this respondent suggested that more attention be given to a “holistic view about the complex systems (economic, social etc.) relating to the environmental and health issues” addressed in the unit.

4. Discussion

This self-assessment of field courses in GPSS-GLI addresses both the needs of the program itself and the broader academic need to assess the development of competencies necessary for sustainability professionals [29, 37]. Building upon a previous assessment of GPSS-GLI curricular activities conducted six years ago [33], the present study provides a more detailed and in-depth assessment of field courses, a core activity in the program.

4.1 Contributions to GPSS-GLI

The self-assessment method was generated in the previous exploration of GPSS-GLI student perspectives on curricular activities and the development of their competencies [33]. Student participation in the assessment and development of GPSS-GLI is consistent with the program’s emphasis on developing student leadership skills [36] and educational practices in which students can participate [38].

Responses on the validity of the assessment are mixed, yet overall positive. Most faculty members considered the competence framework to be an appropriate assessment framework for GPSS-GLI. Five out of the six faculty members consulted considered the results insightful to varying degrees. Comments included, “the results seem accurate,” and “results are convincing”.

Nonetheless, some were skeptical of the framework and/or the fundamental approach of this assessment. One respondent considered the competence framework unfit for this assessment, another expressed that their understanding of the framework was insufficient to use it, and a third considered it necessary to differentiate between the two types of field courses (Global Field Exercise and Resilience Exercise) offered in the program. A further respondent cautioned that “the overall assessment has to be looked at with a question mark”.

4.2 Methodological Limitations

Indeed, limitations of this ongoing assessment must be carefully considered. First, students may have difficulty relating their field experiences to the development of their personal competencies. Moreover, the time between the field course and the assessment varied from unit to unit. Notwithstanding the high rates of participation in the workshops (Table 1), the validity of our results must be interpreted in light of the low number of participants per unit. Second, results depend on students’ comprehension of the competencies, and the relatively short workshops may have been insufficient to ensure adequate comprehension. One faculty member raised this issue and recommended incorporating an explanation of the competencies in the guidance before each field course.

Third, scores only reflect additional improvement of individual competencies that students considered attributable to the field courses. Accordingly, results are subject to variations in baseline levels. Individual experiences before, during and after the units play a great role in student assessment, and a respondent questioned “if [students] could really assess what outcome/impact they experienced for each key competence and by how much”.

Figure 2. Aggregate scores by course and competence
Figure 3. Questionnaire results

Fourth, field courses were designed with unique objectives, none of which explicitly involved the said competencies. Nonetheless, one respondent noted that the “assessment results show that this design was vindicated”, somewhat validating the methodology even though it differed from the original intentions.

A high or low score is not necessarily good or bad, but merely a reflection of the unit design. Results thus ought to be viewed in light of the respective unit. Alternatively, future assessments could incorporate the unit design into the assessment framework so as to more appropriately evaluate field courses designed with varying objectives in mind. For example, the intended objectives of each course unit could be integrated into the assessment framework to provide insights more relevant to the unit in question. However, condensing the main features of each unit design into the assessment framework would be extremely challenging. Instead, the authors believe that a post-assessment discussion with students and faculty could shed light on the results obtained and allow for an open discussion of the units’ design.

Additional qualitative data on students’ experience could offer a deeper insight into the results and how the competencies were developed in each unit. One respondent suggested “one would have to have qualitative expressions about their experiences” in order to better analyze the assessment outcome.

4.3 Fundamental Considerations

Lozano has suggested that most of the tools available for assessing sustainability do not seem adequate for immediate application to the university setting. In general, responses to this situation fall under either 1) modification of existing tools, or 2) creation of tools specific to universities [39]. The current assessment falls under the latter approach and was tailored to the characteristics of the field courses in GPSS-GLI. Any application of this competence-based assessment to other programs or universities should be conducted with care and upon fundamental reconsideration of the assessment approach. Within the program, faculty members must consider the appropriateness of the framework used in this assessment, as well as whether and how to incorporate its outcomes into the design of future course units.

While Wiek et al.’s framework was selected for its focus on sustainability science, Wiek and his colleagues specify that pedagogy was beyond the scope of their study [29]. Thus, the application of their framework to education is so far unique to this assessment project [32, 33], and the results must be interpreted with caution. For example, universal competencies other than those “key” to sustainability science have not been considered, and the list of five key competencies identified so far has yet to be finalized [29]. Other studies on sustainability in higher education suggest that attention should also be paid to competencies in domains such as the affective learning outcomes of educational initiatives [40, 41]. Further, alternative approaches to assessment, such as the Integral Framework employed by a GPSS-GLI faculty member in the design of one course unit [42, 43], could be taken into consideration, but these were beyond the scope of the present study.

More fundamentally, the objective and validity of an assessment need to be carefully examined. Most faculty members consulted in the assessment consider the development of an evaluation scheme for sustainability science education to be a necessary step in improving the program design. However, respondents shared the concern that an emphasis on assessment development may lead to program designs that excessively cater to evaluation. This concern is particularly relevant to the field courses, where, through direct exposure to the problems and through real-life interactions with residents of the field site, students’ learning outcomes extend beyond what was originally intended or anticipated. Field courses must thus maintain a certain degree of flexibility to encompass diverse forms of learning.

5. Conclusions

This study contributes to the development of a method to assess students’ learning outcomes in field courses of a sustainability science program. Through the case of six field courses in GPSS-GLI at The University of Tokyo, we address not only the development of the program itself, but also the academic need to assess the key competencies for sustainability professionals. The results of the self-assessment suggest that the majority of field course participants felt satisfied with the knowledge and skills they acquired, and gained the ambition to further explore the respective topic areas. Students also recognized the importance of key competencies for sustainability professionals after participating in the field courses. Although the authors do not suggest that all courses be aligned with the key competencies, the study suggests that such alignment could raise students’ awareness of the competencies they are acquiring from their program.

We expect to address the limitations of this study’s self-assessment method in future developments. In particular, the assessment framework may be altered to reflect the variety of field course designs. In terms of implementation, the framework’s concepts may be better standardized by building a common understanding of the methods and terminologies used among students and faculty. There should also be consensus within the graduate program on the role of the assessment and the appropriate level of effort dedicated to this task.

We believe that the results of this study provide sufficient evidence to support the usefulness and appropriateness of the deployed self-assessment. We consider the present study a successful and relevant step forward in the assessment of field exercise courses in the Graduate Program in Sustainability Science – Global Leadership Initiative of The University of Tokyo, and a contribution to the general development of mechanisms to assess key competencies for sustainability science research.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Acknowledgments

The authors would like to thank the Graduate Program in Sustainability Science – Global Leadership Initiative (GPSS-GLI) for providing the opportunity to conduct the assessment reported in this document. In particular, the authors thank the students and faculty who contributed comments and reflections on the results. Finally, acknowledgements go to all the students who actively collaborated in the workshops that made this assessment possible.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References
1.
Clark WC, Dickson NM. Sustainability science: The emerging research program. Proceedings of the National Academy of Sciences of the United States of America. 2003;100(14):8059–8061. [Crossref]
2.
Kates RW, Clark WC, Corell R, Hall JM, Jaeger CC, Lowe I, et al. Sustainability Science. Science. 2001;292(5517):641–642. [Crossref]
3.
Anderies J, Janssen M, Ostrom E. A framework to analyze the robustness of social-ecological systems from an institutional perspective. Ecology and Society. 2004;9(1). [Crossref]
4.
Ostrom E, Janssen MA, Anderies JM. Going beyond panaceas. Proceedings of the National Academy of Sciences of the United States of America. 2007;104(39):15176–15178. [Crossref]
5.
Komiyama H, Takeuchi K. Sustainability science: building a new discipline. Sustainability Science. 2006;1(1):1–6. [Crossref]
6.
Kates RW, Parris TM. Long-term trends and a sustainability transition. Proceedings of the National Academy of Sciences of the United States of America. 2003;100(14):8062–8067. [Crossref]
7.
Parris TM, Kates RW. Characterizing a sustainability transition: goals, targets, trends, and driving forces. Proceedings of the National Academy of Sciences of the United States of America. 2003;100:8068–8073. [Crossref]
8.
Martens P. Sustainability: Science or Fiction? IEEE Engineering Management Review. 2007;35(3):70. [Crossref]
9.
Griggs D, Stafford-Smith M, Gaffney O, Rockström J, Öhman MC, Shyamsundar P, et al. Sustainable development goals for people and planet. Nature. 2013;495(7441):305–307. [Crossref]
10.
Clark WC. Sustainability science: a room of its own. vol. 104; 2007. [Crossref]
11.
Miller TR, Wiek A, Sarewitz D, Robinson J, Olsson L, Kriebel D, et al. The future of sustainability science: A solutions-oriented research agenda. Sustainability Science. 2014;9(2):239–246. [Crossref]
12.
Spangenberg JH. Sustainability science: a review, an analysis and some empirical lessons. Environmental Conservation. 2011;38(03):275–287. [Crossref]
13.
Kajikawa Y. Research core and framework of sustainability science. Sustainability Science. 2008;3:215–239. [Crossref]
14.
Kauffman J, Arico S. New directions in sustainability science: promoting integration and cooperation. Sustainability Science. 2014;9(4):413–418. [Crossref]
15.
Brandt P, Ernst A, Gralla F, Luederitz C, Lang DJ, Newig J, et al. A review of transdisciplinary research in sustainability science. Ecological Economics. 2013;92:1–15. [Crossref]
16.
Wiek A, Farioli F, Fukushi K, Yarime M. Sustainability science: Bridging the gap between science and society. Sustainability Science. 2012;7(SUPPL. 1):1–4. [Crossref]
17.
Trencher G, Yarime M, McCormick KB, Doll CNH, Kraines SB. Beyond the third mission: Exploring the emerging university function of cocreation for sustainability. Science and Public Policy. 2013;41(2):151–179. [Crossref]
18.
Bettencourt LMA, Kaur J. Evolution and structure of sustainability science. PNAS. 2011;108(49):19540–19545. [Crossref]
19.
Kates RW. What kind of a science is sustainability science? Proceedings of the National Academy of Sciences. 2011;108(49):19449–19450. [Crossref]
20.
Kajikawa Y, Ohno J, Takeda Y, Matsushima K, Komiyama H. Creating an academic landscape of sustainability science: an analysis of the citation network. Sustainability Science. 2007;2(2):221–231. [Crossref]
21.
SSPP is a peer-reviewed, open-access journal focusing on sustainability science research. The journal also provides an international network of sustainability research and education.
22.
Sustainability: Science, Practice, & Policy. Academic Programs in Sustainability; 2016. Available from: http://sspp.proquest.com/ sspp institutions/display/universityprograms#.
23.
Stock P, Burton RJF. Defining terms for integrated (multi-inter-transdisciplinary) sustainability research. Sustainability. 2011;3(8):1090–1113. [Crossref]
24.
Schweizer-Ries P, Perkins DD. Sustainability Science: Transdisciplinarity, Transepistemology, and Action Research: Introduction to the Special Issue. Umweltpsychologie. 2012;16(1):6–10.
25.
Frisk E, Larson KL. Education for sustainability: Competencies & practices for transformative action. Journal of Sustainability Education. 2011;2.
26.
Onuki M, Mino T. Sustainability education and a new master’s degree, the master of sustainability science: the Graduate Program in Sustainability Science (GPSS) at the University of Tokyo. Sustainability Science. 2009;4(1):55–59. [Crossref]
27.
Tamura M, Uegaki T. Development of an educational model for sustainability science: Challenges in the Mind-Skills-Knowledge education at Ibaraki University. Sustainability Science. 2012;7(2):253–265. [Crossref]
28.
Sterling S, Orr D. Sustainable education, re-visioing learning and change. Devon, UK: Green Books; 2001.
29.
Wiek A, Withycombe L, Redman C. Key competencies in sustainability: a reference framework for academic program development. Sustainability Science. 2011;6(2):203–218. [Crossref]
30.
Remington-Doucette S, Connell K, Armstrong C, Musgrove S. Assessing sustainability education in a transdisciplinary undergraduate course focused on real world problem solving: A case for disciplinary grounding. International Journal of Sustainability in Higher Education. 2013;14(4):404–433. [Crossref]
31.
A list of past field courses in GPSS-GLI is available at http://www.sustainability.k.u-tokyo.ac.jp/exercises.
32.
San Carlos RO, Tyunina O, Yoshida Y, Mori A, Sioen GB, Yang J. Assessment of Fieldwork Methodologies for Educational Purposes in Sustainability Science: Exercise on Resilience, Tohoku Unit 2015. In: Esteban M, Akiyama T, Chiahsin C, Ikeda I, editors. Springer International Publishing; 2016. pp. 67–91. [Crossref]
33.
Tumilba V, Kudo S, Yarime M. Review and assessment of academic activities, student competencies, research themes and practice of sustainability principles in the Graduate Program in Sustainability Science. In: The 19th International Sustainable Development Research Conference. Stellenbosch: ISDRC 19; 2013.
34.
Bonwell CC, Eison JA. Active Learning: Creating Excitement in the Classroom. Washington; 1991.
35.
Chi MTH. Active-Constructive-Interactive: A Conceptual Framework for Differentiating Learning Activities. Topics in Cognitive Science. 2009;1(1):73–105. [Crossref]
36.
Graduate Program in Sustainability Science, Global Leadership Initiative, The University of Tokyo, Tokyo, Japan.
37.
Barth M, Godemann J, Rieckmann M, Stoltenberg U. Developing key competencies for sustainable development in higher education. International Journal of Sustainability in Higher Education. 2007;8(4):416–430. [Crossref]
38.
Waas T, Hugé J, Ceulemans K, Lambrechts W, Vandenabeele J, Lozano R, et al. Sustainable Higher Education. Understanding and Moving Forward. Brussels; 2012.
39.
Lozano R. Incorporation and institutionalization of SD into universities: breaking through barriers to change. Journal of Cleaner Production. 2006;14(9-11):787–796. [Crossref]
40.
Rieckmann M. Future-oriented higher education: Which key competencies should be fostered through university teaching and learning? Futures. 2012;44(2):127–135. [Crossref]
41.
Shephard K. Higher education for sustainability: seeking affective learning outcomes. International Journal of Sustainability in Higher Education. 2008;9(1):87–98. [Crossref]
42.
Akiyama T, Li J. Environmental leadership education for tackling water environmental issues in arid regions. In: Environmental Leadership Capacity Building in Higher Education. Springer Japan; 2013. pp. 81–92.
43.
San Carlos RO, Teah HY, Akiyama T, Li J. Designing Field Exercises with the Integral Approach for Sustainability Science: A Case Study of the Heihe River Basin, China. In: Esteban M, Akiyama T, Chiahsin C, Ikeda I, editors. 1st ed.; 2016. pp. 23–39. [Crossref]

Cite this:
Carlos, R. S., Yoshida, Y., & Kudo, S. (2017). Fostering the Next Generation of Sustainability Professionals—Assessing Field Courses in a Sustainability Science Graduate Program. Challenges in Sustainability, 5(1), 52–61. https://doi.org/10.12924/cis2017.05010052