Matches in SemOpenAlex for { <https://semopenalex.org/work/W2022638081> ?p ?o ?g. }
Showing items 1 to 57 of 57, with 100 items per page.
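The listing below can be reproduced with a query against SemOpenAlex. A minimal sketch, assuming the public SPARQL endpoint at https://semopenalex.org/sparql and matching the quad pattern `?p ?o ?g` shown above:

```sparql
# Sketch: retrieve every predicate/object pair (and its named graph)
# stored for this work. Endpoint URL is an assumption; the triple
# pattern mirrors the query shown in the header above.
SELECT ?p ?o ?g
WHERE {
  GRAPH ?g {
    <https://semopenalex.org/work/W2022638081> ?p ?o .
  }
}
LIMIT 100
```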
- W2022638081 endingPage "292" @default.
- W2022638081 startingPage "281" @default.
- W2022638081 abstract "In response to recent calls for reform in higher-education biology (AAAS 2011, National Research Council 2012), many college instructors are seeking ways to create interactive, student-centered classrooms, in which students explicitly share responsibility for effective learning with instructors. There is a substantial and expanding set of tools available to facilitate student-centered course design, including case, problem, and clicker question libraries; simulation modeling; authentic laboratory experiences and demonstrations; Socratic tutorials; peer instruction; and others (National Academies of Science 2011; Supplementary Resources). However, many instructors who try new methods later abandon them (Henderson et al. 2012). Likely causes are the lack of guidance in the literature on navigating the formidable shift from lecture-based to interactive classrooms, and "pushback" from colleagues or students. In this article, we offer three recommendations to increase your chances of success in student-centered courses. For this discussion, we define "success" as (1) students maintaining or increasing achievement on authentic assessments of concepts and skills, and (2) intellectually and emotionally rich experiences for students and instructors alike. Individually, each of us struggled to create successful student-centered courses—despite access to excellent resources like those mentioned above. In response, we formed a community of practice (Wenger 1999) focused on effective student-centered course design for our courses (Table 1). These recommendations emerged from regular (approximately biweekly), sustained discussions among the authors about acknowledging and understanding our failures, and about creating and sharing best practices in college teaching. For each recommendation, we elaborate on what we did and what our students did, and we provide an example of its implementation (Table 2). The present paper provides our best advice for success, grounded where possible in research on teaching in higher education. As instructors, we often begin course planning by making a list of topics to cover. As we've all discovered, the problem with this approach is that the list of topics grows, the time we invest in any given topic shrinks, our students seem to comprehend ever less, and they experience greater frustration in learning what to them are unrelated facts of questionable relevance. From the instructor's perspective, this approach to teaching may be less than satisfying. An alternative approach is to start with the big questions in your discipline, instead of starting with "what topics to cover." Identify what you'd be proud to have taught your students to be able to think about (concepts or big ideas) and do (skills or competencies) as a result of your course. We dubbed this the "five-year plan": in five years, when our students have forgotten the finer points of Lotka–Volterra predator–prey interactions or Fisher's fundamental theorem, can they look at an economic indicator graph in the newspaper and interpret it accurately? When they attend a community-planning meeting, can they evaluate arguments based on evidence? When they attend graduate school, do they take the initiative to look up the predator–prey equations or Fisher's fundamental theorem and persist in their efforts to understand and use them?
Big ideas matter in biology because we focus on systems with complex interactions that often vary across spatial and temporal scales. Novices like our students tend to focus on rudimentary lists of properties and characteristics, instead of the more expert-like models of relationships among essential concepts that characterize a sophisticated understanding of biological systems (Hogan and Thomas 2001, Knapp and D'Avanzo 2010, AAAS 2011). Initially, when all three of us committed to student-centered courses, we started with what we thought were well-defined content goals. Each of us individually chose case studies, problems, scenarios, clickers, and demonstrations to get students discussing how to approach and solve problems and dilemmas. We assembled our respective courses based on the content we wanted our students to learn and master. However, each of us in turn discovered that we had not yet thought about meaningful ways for our students to build competencies (AAAS 2011), such as reading and interpreting graphs, making evidence-based arguments, and learning to solve complex problems collaboratively. But competencies are the tools that allow our students to grapple with the content we are asking them to master! Based on this realization, we began to collaborate toward a common goal of fostering scientifically literate individuals who had mastered both the "big ideas" and competencies of science. As a result of our collaboration, we adopted five common learning goals for all our courses (Table 2). Over several iterations of our respective courses and many discussions among ourselves, each of us consolidated our existing course goals, discarding those that didn't correspond with a big idea in our fields, so that our students could focus on fewer concepts and their relationships—a more expert-like approach (Dunbar 2000, Knapp and D'Avanzo 2010). We referenced existing work to guide our choices for fewer, more fundamental concepts and skills (competencies) in biology generally (e.g., AAAS 2011) and principles of ecology particularly (e.g., Knapp and D'Avanzo 2010). We created, then iteratively revised, our course goals and unit goals so that our students regularly tapped into higher-order thinking skills, like being able to apply concepts, analyze data and evidence, or synthesize multiple ideas (Crowe et al. 2008). This iterative process of consolidating our course objectives within our community of practice helped each of us to pull out the narrative threads and themes that were meaningful within our disciplines and to us personally, which, in turn, helped our students construct meaningful narratives about our subjects. After two iterations of each of our courses and multiple discussions, we settled on a set of five course goals common to all our courses and based heavily on competencies (our Table 2; Table 2.1 in AAAS 2011). Concurrent with our ongoing process of course-goal refinement, we planned course activities and assessments (e.g., cases, problems, readings, homework, quizzes, and exams) based on whether and how each component supported students in mastering both concepts and competencies (Wiggins and McTighe 2005). We worked to sequence concepts and skills so that students progressed from simpler to more complex concepts and competencies, with multiple opportunities to practice new, more complex skills.
Once we developed a full suite of concept and skill objectives, we then clearly articulated these objectives, justified them to our students, and repeatedly referenced them throughout our courses. We welcomed students' questions like "Why are we doing this?" as opportunities to reinforce the content and skills required to be a successful scientist. These questions also allowed us to articulate our own expert narratives about "big ideas," and, more importantly, invited students to begin constructing and communicating their own meaning and narratives about the big ideas. As one example of implementing a big idea, all of us recognized that one important competency for our students to gain was graphical literacy—that is, the abilities both to read a figure and to create an appropriate figure. Since most students have no training in making or interpreting scientific figures, all of us focused early in each semester of our courses on the fundamentals of describing, then interpreting, figures (ESA 2005). We made frequent use of a variety of figures, from primary sources to those in the popular press. After students gained skill and confidence with describing axes, understanding scales, interpreting error bars, and summarizing overall trends, we chose figures that added layers of complexity appropriate to the course level. In an introductory course, this meant clustered histograms or graphs with two Y-axes; in upper-division courses, students grappled with multiple regressions, point clouds, and stacked histograms. We modeled the habit of sketching predictive graphs and asked our students to practice this as well. We chose cases and problems that emphasized students creating graphs and figures to communicate their results. Our assessments—exams, presentations, and posters—required students to create figures to support their claims or their research. By the end of our respective courses, students were competent at sketching predictive graphs, selecting figures appropriate to display data, and interpreting figures. Productive interactions help our students learn more than they do in traditional lectures alone (Hake 1998, PCAST 2013). In student-centered classrooms, the responsibility for constructing knowledge and learning is shared between the instructor and all students, necessarily requiring students to interact with one another. But what constitutes productive interactions? In our move toward interactive classrooms, each of us initially assumed that almost any student interaction helped learning. This was a mistake, and as a result, each of us spent considerable time solving interaction "problems" such as resistance to working in groups, disengagement from learning, or conflicts between students about expectations for doing the necessary work. Moreover, because the problems we study in ecology and evolution are complex, we underestimated how much time it takes to develop the effective collaboration needed to solve such problems (Knight and Wood 2005). Many problems our students will encounter in the future are complex, requiring cooperation among people of varying knowledge and skills. Just as students need repetition and practice with content learning, they need practice in the skills of doing science—including becoming good collaborators.
Realizing the importance of effective collaboration in our own professional development and teaching, we approached this problem using two sets of strategies: (1) explicitly teaching collaboration skills, and (2) structuring class activities that promoted collaboration. To increase effective collaboration in our classrooms, we familiarized ourselves with research-based features of high-functioning collaborative groups: positive interdependence, in which students share common goals and understand that their achievement is only possible by working together; quality in-person interactions; individual and collective accountability; and goal-checking and debriefing (Johnson et al. 1998, Tanner et al. 2003). So that students understood that we valued collaboration, we built a collaboration skill goal into every course's learning objectives, and we set aside time to teach our students how to collaborate. With our assistance, students designated the roles and responsibilities of group members. In introductory courses, where students have less experience working in collaborative groups, we defined group member roles. Examples of specific roles included a facilitator, to keep the group on track; a timekeeper, to monitor group progress; a recorder, to take thorough notes of the discussions and problem-solving steps; a synthesizer, to integrate multiple threads of thought and expression; and a reporter, to report out to the larger class. In smaller, upper-division courses, we invited students to designate roles and choose who would fill them. Group member roles can be adapted to an instructor's preferences, student level, course content and enrollment, and the scope of the problem at hand. For larger projects, such as the poster project described below, students drew up team contracts that described individual and group expectations, penalties for failing to meet deadlines, and benchmarks for completion of the main project and its components. The team contract empowered students to manage—and minimize—conflict and to accomplish the team's goals. We also recognized that the way we had previously constructed some of our group exercises did not promote effective collaboration. For example, many of the problems we had our students solve in groups could easily have been solved by any single individual, negating the need for positive interdependence. Over several iterations of our courses, we reconfigured in-class problems, cases, and assignments to align more closely with practices of effective collaboration. We redesigned problems to enhance collaboration by offering more complex or challenging problems (instead of, e.g., mere calculations or solution-finding); by constraining the time given to solve them; by drawing upon multiple, complementary skills such as researching, graphing, and writing; and by requiring authentic work products such as posters and oral reports. Students had designated class time for cooperative learning—that is, for solving challenging, multilayered problems and producing work products. Instead of taking time away from learning concepts, we leveraged cooperative learning to promote greater concept mastery at higher cognitive levels than was possible in our previous teacher-centered courses. After implementing these strategies, our students showed better concept mastery from well-structured cooperative work than from individual efforts.
For example, students in an ecology course scored higher on exams consisting of questions that asked them to apply, evaluate, and synthesize what they had learned (higher-order cognition; Bloom 1956) than students in a previous teacher-centered version of the course scored on exams testing only factual recall. This is consistent with other findings about the synergy of thoughtfully structured cooperative work (Hoskinson 2010, Ramaekers et al. 2011, Welsh 2012). Students also reported in anonymous surveys that collaboration helped them learn "much" or "very much," both while our courses were in progress and after they concluded (Barger, Hoskinson, Martin, unpublished data). As a specific example of promoting productive student interaction (Table 2), each of us required a substantial research project as a key learning activity in our courses. These projects were empirical (e.g., hypothesis-driven literature reviews and syntheses) or data-driven (students generating and analyzing data). Once we had some experience and success fostering collaboration at smaller scales (e.g., in-class problems and cases), and using the research-based principles of creating effective collaborations, we created a research project culminating in an original scientific poster presented at a poster session. Students worked in teams of 3–6 over the course of several weeks to fulfill relatively simple, very broad conceptual prompts: "Solve a biological problem using a mathematical model that you develop"; "Choose an international ecosystem management issue and analyze the major factors (e.g., ecological, social, economic, institutional) that have led either to success or failure of the management actions"; and "Carry out an original research project on any topic of your choice that uses the principles of evolution to formulate and test alternative hypotheses." We provided the sequencing for this project (identify a question, pose a hypothesis, review the literature, etc.) and required teams to turn in small products (hypothesis, annotated bibliography, model prototype, etc., as appropriate to the project's activities) at regular intervals for group accountability. We built in brief check-ins during our class periods to allow students with busy schedules to report on their progress to their teams and to plan next steps. During the poster session, students demonstrated individual accountability by giving a brief poster talk that demonstrated mastery of the project and course concepts. Over the two semesters of this larger-scale common research project, most students demonstrated thorough mastery of course content and, in some cases, quite sophisticated, expert-like understanding. We acknowledge that the scope of such a project may be beyond what many instructors are comfortable with, and it did involve a significant commitment of course resources. We gained both confidence and experience in structuring a cooperative-learning project of this scope by starting with individual learning modules such as class sessions and case studies. But this investment of time enhanced student mastery of big ideas and competencies, and contributed to students' development as apprentice scientists and effective collaborators. Our purpose for including this example is to show that building student collaboration into our courses meant no sacrifice in content mastery.
Indeed, some student teams showed higher achievement on team quizzes and the final project than their scores on individual assessments would have predicted (Hoskinson, unpublished data). Instructors favoring smaller-scale collaborative activities may find that building effective interactions into daily course activities often requires little reconfiguration of existing cases and problems, builds students' collaboration and problem-solving skills, and ultimately promotes their mastery of the big ideas and competencies. Sometimes, our college courses emphasize factual knowledge at the expense of students learning how to notice and regulate their thinking, a necessary skill for biology undergraduates if they are to "…learn how to integrate concepts across levels of organization and complexity and to synthesize and analyze information that connects conceptual domains…" (AAAS 2011: ix)—or, in other words, to think like scientists about big ideas (Recommendation 1). Our students have enormous reserves of knowledge and strategies. All too often, though, they do not use them. Metacognition, or "thinking about thinking," helps students learn by developing habits of planning, monitoring, and evaluating their own learning (Tanner 2012). We use metacognition all the time in our professional practices—whether we are aware of our own metacognitive processes or not (Dunbar 2000, Tanner 2012). Students who develop the habit of reflecting on their thinking solve harder problems, and solve them faster, than students who do not (Delclos and Harrington 1991). Reflective students learn by observing others (Huelser and Metcalfe 2011) and think more about their capabilities than merely their scores (Schraw 1998). They describe their solutions in terms of the skills and strategies they use (Zion et al. 2005). In other words, they acquire the scientific competencies and big ideas we identified in Recommendation 1. But if we don't think about how we teach, how can we expect students to think about how they learn? We have already related how important it was for us, as educators working in separate courses with similar goals for class innovation, to meet regularly to discuss the big ideas in our fields and to plan how we would implement our concept and competency goals. Questions we posed for ourselves, like those under Recommendations 1 and 2, helped us develop our own metacognition about teaching—although we were not aware of what to call this process a priori! In addition to planning, our discussions also focused on monitoring our own thinking and motivations in choosing course activities, and on evaluating what worked or didn't work. In our classrooms, we modeled the strategies important to our scientific practice. Most of us can identify moments in our own undergraduate or graduate education when our instructors were unable to describe how they solved a problem, got from one step to the next, or connected ideas and concepts. We learned to describe our processes of solving problems (e.g., understand the problem, understand the "target output," etc.), drawing graphs (sketch two perpendicular axes, put the dependent variable on the Y-axis and the independent variable on the X-axis, etc.), or understanding descriptions of experiments (identify the question, find the null and alternative hypotheses, etc.). We also learned to talk about when and why we chose particular strategies for problem-solving, graphing, reading scientific papers, evaluating alternative hypotheses, and other common scientific practices.
One important difference between experts and novices is that experts understand when and why to use different strategies or representations (e.g., what kind of data are represented by a bar graph vs. a continuous regression curve). We practiced organizing what we knew, presenting it in ways that were meaningful to our students, and communicating clearly and plainly with them. We asked for frequent feedback—a type of formative assessment—to check this and to inform our teaching. All three of us also made frequent use of metacognition ourselves, during and after class sessions, especially monitoring (e.g., "How well is this going? Where are students getting it, and where are they stuck? Should I intervene, or let them wrestle?") and evaluating ("How well did that work? What would I do differently next time?"). We built in multiple opportunities for our students to plan their learning, monitor their progress, and evaluate both themselves and their products. Two of us regularly used pre-class formative assessments (quizzes and surveys) to inform both individual students and instructors about potential areas of difficulty and to drive our teaching plans. Another of us used optional post-exam analysis, in which we gave students an opportunity to earn back a small number of points lost on an exam by finding a source for, and correcting, their incorrect responses. On a subsequent exam, these students performed better than students who did not analyze their results (Hoskinson, unpublished data). We all used student reflections to some degree, ranging from structured, low-investment "minute papers" (Cross and Angelo 1993) to open-ended "insights" in which students wrote about a course-related insight of their choosing. Two of us used weekly student reflections, and all of us used midterm and end-of-term student reflections (described below). At the beginning of individual class sessions, we all made regular, frequent use of prompts that probed students' planning (e.g., "Before you start working on this, talk with your small group about some possible strategies you can think of to solve this problem."); their monitoring (e.g., "What did you figure out? What do you still need to understand?"); and their evaluating (e.g., "Which do you think is a better explanation, and what evidence are you using to reach that conclusion?"). Students also practiced metacognition regularly throughout a class period's learning activities when we asked them to make comparisons (e.g., of possible pedigrees, alternative hypotheses, or graphs); to rate or rank answers; and to critique their own ideas or their classmates' ideas—an activity that also reinforces effective collaboration. Although each of us implemented this practice differently, both personally and in our classrooms, we believe this recommendation to be the most important and most fundamental to successful student-centered course design. We were most effective when we emphasized metacognitive practice for student learning and our own teaching. Conversely, we experienced some of our strongest self-doubts and student pushback when we neglected its importance in informing our teaching and our students' learning. Over time, our own metacognition allowed us to converge not only upon the big ideas and skills in our fields, but also upon the narrative threads meaningful in our disciplines and to us personally—that is, the stories we tell to support engagement, curiosity, and perseverance in understanding big ideas about complex biological systems.
To illustrate putting metacognition into practice: in all of our courses, we asked our students to complete regular written reflections on their learning. Depending on course enrollment and instructor preferences, these could be structured or unstructured. In structured reflections, students respond to one or more prompts, such as "How are you making progress in this class? How will you clarify any concepts that are still unclear to you?" In unstructured reflections, students could write about any course-related insight they had noticed that week. For a larger-enrollment course, two of the authors made regular use of SurveyMonkey to collect and analyze student responses to open-ended questions. We did not grade or score these reflections, but we did respond, individually or collectively, to their content or themes. To show that we valued them, we allocated a small proportion of the course grade (5%) to their completion. Student reflections have several benefits, both for students and for instructors. First, they are regular and ongoing invitations to students to be responsible for, and reflect deliberately upon, their learning. Second, metacognition, like many other skills, becomes easier when it is practiced regularly. Over a semester, our students increased the frequency and quality of the metacognitive statements in their reflections (Barger and Hoskinson, unpublished data). Third, when completed outside of class, reflections require no class meeting time beyond what is needed to introduce their format and purpose to students. Even when a course's enrollment made it impossible to respond to individual reflections, we could still respond either individually to a small subset, or collectively about broader patterns we noticed. Fourth, we used them as formative assessments, frequently adjusting our teaching when a few or more students mentioned struggling with a concept. Finally, we found that students appreciated the opportunity to provide regular feedback, both about their progress in the course and about our effectiveness as their instructors. From the perspective of understanding our roles as instructors and mentors, we often found these reflections inspiring, keenly insightful, funny, authentic, and humbling. Although curricular resources for student-centered courses abound, each of us found the process of transforming our own courses to be unexpectedly formidable. Each of us was confident, enthusiastic, and committed to the value and worth of student-centered teaching. But along the way, each of us—and many of our colleagues—experienced doubt, discouragement, pushback from students, and scrutiny from colleagues and administrators. Even though we teach different courses to diverse student populations, we discovered common features of our successful, high-functioning student-centered courses: they focus on big ideas and competencies; students learn by collaborating with one another; and students and instructors practice regular reflection about learning and teaching. Our three recommendations are therefore especially directed toward colleagues just beginning or about to begin their own process of designing student-centered courses, but they could be used by anyone seeking greater insight into their teaching. By focusing on the processes of instructional transformation rather than on content choices, we hope to broaden the conversation about student-centered teaching to colleagues across many disciplines.
We don't claim that our recommendations are comprehensive; rather, we offer them in the spirit of inquiry that drives our own scientific teaching, to initiate conversations that allow us to acknowledge what we are all attempting to do, and to do it a little better the next time. We thank our colleagues in the Department of Ecology and Evolutionary Biology for many productive conversations about teaching and learning. We are very grateful to J. H. Carpenter, A. Cotch, M. DeAntoni, C. D'Avanzo, B. Grasser, B. Krueger, and H. Sledge for many helpful comments and suggestions on early drafts of the manuscript, and to L. Middleton, J. Roach, R. Safran, and T. Tucker for comments and suggestions on the final draft. AMH was supported by the Science Education Initiative (SEI) in the Department of Ecology and Evolutionary Biology at the University of Colorado Boulder. National Center for Case Study Teaching in Science, http://sciencecases.lib.buffalo.edu/cs/. Contains >470 peer-reviewed cases across many science subjects, including ecology and evolution, from more-structured "clicker" cases to discussions, dilemmas, analyses, and public-hearing-style formats. Most cases are appropriate for lower-division undergraduates, and cases can be reviewed without password access. For full access to teaching notes and answer keys, create a free account. Teaching Issues and Experiments in Ecology (TIEE), http://tiee.esa.org. A peer-reviewed collection of cases, data sets, figures, teaching tips, and other materials derived mostly from recent research articles appropriate for teaching college ecology. The Teaching section may be particularly helpful to instructors. The site is no longer actively curated. Climate Literacy and Energy Awareness Network (CLEAN), http://cleanet.org/clean/educational_resources/index.html. A reviewed and curated collection of cases centered on climate change, with topics including systems perspectives for ecosystem ecology and biogeochemistry. BioQUEST Library Online, http://bioquest.org/BQLibrary//. A peer-reviewed collection of problems, simulations, software, datasets, and tools for learning quantitative aspects of biology. SEI at the University of Colorado Boulder, http://www.colorado.edu/sei/. Course materials and instructor resources, including the entire case-based ecology course (one of the three courses described in the article), clicker questions and first-day suggestions, as well as videos and tutorials for implementing peer instruction. Think Like a Biologist, http://biodqc.org. Uses vetted Diagnostic Question Clusters (DQCs) to develop students' ability to think like biologists using big ideas and sound reasoning. SERC, http://serc.carleton.edu/sp/process_of_science/browse_examples.html. Useful for teaching science competencies, especially those related to the process of science. Association for Biology Laboratory Education (ABLE), http://www.ableweb.org/. Resources for building inquiry-driven college biology laboratory experiences, including video tutorial and simulation software and links. HHMI BioInteractive, http://www.hhmi.org/biointeractive. Of particular interest to instructors who teach evolution and human evolution, this site is indexed by course, resource type, and subject material. It includes many visualization tools. BeSocratic, http://besocratic.colorado.edu. A series of graphical activities that can be used as formative assessments in helping students master science competencies. Students can draw and sketch (phylogenies, curves, etc.) and receive feedback that helps them modify their ideas. An account is required but free. SALG (Student Assessment of Learning Gains), http://www.salgsite.org. A free, validated tool for college instructors to gather student-reported perceptions of learning, for informing future teaching and curriculum choices. Instructors can use pre-validated questions or generate their own. SurveyMonkey, http://www.surveymonkey.com. Free software for online surveys." @default.
- W2022638081 created "2016-06-24" @default.
- W2022638081 creator A5026531225 @default.
- W2022638081 creator A5087547134 @default.
- W2022638081 creator A5089084488 @default.
- W2022638081 date "2014-07-01" @default.
- W2022638081 modified "2023-10-17" @default.
- W2022638081 title "Keys to a Successful Student-Centered Classroom: Three Recommendations" @default.
- W2022638081 cites W1487795549 @default.
- W2022638081 cites W1964861149 @default.
- W2022638081 cites W2000620354 @default.
- W2022638081 cites W2007694848 @default.
- W2022638081 cites W2008310925 @default.
- W2022638081 cites W2008785686 @default.
- W2022638081 cites W2029076930 @default.
- W2022638081 cites W2030576411 @default.
- W2022638081 cites W2111232558 @default.
- W2022638081 cites W2111761700 @default.
- W2022638081 cites W2127030547 @default.
- W2022638081 cites W2128734588 @default.
- W2022638081 cites W2138949691 @default.
- W2022638081 cites W4254991137 @default.
- W2022638081 doi "https://doi.org/10.1890/0012-9623-95.3.281" @default.
- W2022638081 hasPublicationYear "2014" @default.
- W2022638081 type Work @default.
- W2022638081 sameAs 2022638081 @default.
- W2022638081 citedByCount "2" @default.
- W2022638081 countsByYear W20226380812016 @default.
- W2022638081 countsByYear W20226380812022 @default.
- W2022638081 crossrefType "journal-article" @default.
- W2022638081 hasAuthorship W2022638081A5026531225 @default.
- W2022638081 hasAuthorship W2022638081A5087547134 @default.
- W2022638081 hasAuthorship W2022638081A5089084488 @default.
- W2022638081 hasBestOaLocation W20226380811 @default.
- W2022638081 hasConcept C41008148 @default.
- W2022638081 hasConceptScore W2022638081C41008148 @default.
- W2022638081 hasIssue "3" @default.
- W2022638081 hasLocation W20226380811 @default.
- W2022638081 hasOpenAccess W2022638081 @default.
- W2022638081 hasPrimaryLocation W20226380811 @default.
- W2022638081 hasRelatedWork W2096946506 @default.
- W2022638081 hasRelatedWork W2130043461 @default.
- W2022638081 hasRelatedWork W2350741829 @default.
- W2022638081 hasRelatedWork W2358668433 @default.
- W2022638081 hasRelatedWork W2376932109 @default.
- W2022638081 hasRelatedWork W2382290278 @default.
- W2022638081 hasRelatedWork W2390279801 @default.
- W2022638081 hasRelatedWork W2748952813 @default.
- W2022638081 hasRelatedWork W2899084033 @default.
- W2022638081 hasRelatedWork W3004735627 @default.
- W2022638081 hasVolume "95" @default.
- W2022638081 isParatext "false" @default.
- W2022638081 isRetracted "false" @default.
- W2022638081 magId "2022638081" @default.
- W2022638081 workType "article" @default.
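The object values in the listing (author IDs, the 14 `cites` work IDs, concept IDs) are themselves dereferenceable SemOpenAlex resources. A follow-up sketch that resolves the cited works to readable titles; the endpoint URL and the `cito:cites` / `dcterms:title` predicate IRIs are assumptions (the listing above abbreviates predicate names), so verify them against the live SemOpenAlex schema before relying on this:

```sparql
# Hypothetical follow-up query: list the titles of the works this
# article cites. Assumes SemOpenAlex models citations with cito:cites
# and titles with dcterms:title; OPTIONAL guards against missing titles.
PREFIX cito:    <http://purl.org/spar/cito/>
PREFIX dcterms: <http://purl.org/dc/terms/>

SELECT ?cited ?title
WHERE {
  <https://semopenalex.org/work/W2022638081> cito:cites ?cited .
  OPTIONAL { ?cited dcterms:title ?title . }
}
ORDER BY ?cited
```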