Adapting implementation science for higher education research: The systematic study of implementing evidence-based practices in college classrooms. (2020, November 5). Vol. 5, p. 54. ISSN 2365-7464.

Abstract:
Finding better ways to implement effective teaching and learning strategies in higher education is urgently needed to help address student outcomes such as retention rates, graduation rates, and learning. Psychologists contribute to the science and art of teaching and learning in higher education under many flags, including cognitive psychology, science of learning, educational psychology, scholarship of teaching and learning in psychology, discipline-based educational research in psychology, design-based implementation research, and learning sciences. Productive, rigorous collaboration among researchers and instructors helps. However, translational research and practice-based research alone have not closed the translation gap between the research laboratory and the college classroom. Fortunately, scientists and university faculty can draw on the insights of decades of research on the analogous science-to-practice gap in medicine and public health. Health researchers now add to their toolbox of translational and practice-based research the systematic study of the process of implementation in real work settings directly. In this article, we define implementation science for cognitive psychologists as well as educational psychologists, learning scientists, and others with an interest in use-inspired basic cognitive research, propose a novel model incorporating implementation science for translating cognitive science to classroom practice in higher education, and provide concrete recommendations for how use-inspired basic cognitive science researchers can better understand those factors that affect the uptake of their work with implementation science.
Keywords: Cognitive Science; Education, Professional; Evidence-Based Practice; Humans; Implementation Science; Translational Medical Research; Universities.
Authors: Soicher, Raechel N.; Becker-Blease, Kathryn A.; Bostwick, Keiko C. P.
URL: https://liberalarts.oregonstate.edu/biblio/adapting-implementation-science-higher-education-research-systematic-study-implementing-evidence-based-practices-college-classrooms

Assessing structure building in college classrooms at scale. (2020, May 1). Vol. 34, pp. 747-753. ISSN 0888-4080.

Abstract: Structure building refers to the way in which people construct meaning from incoming information by creating a foundation of mental nodes, mapping incoming information to the foundational structure, and shifting to a new structure when necessary. Structure building ability has been shown to moderate learning in both laboratory-based and classroom-based research (e.g., use of outlines for effective note-taking and course final grades, respectively). However, measurement of structure building can be resource intensive. The purpose of the present study was to evaluate a shortened, scalable measure of structure building (developed by a textbook publisher) in a real-world context. The results are consistent with the hypothesis that this tool, embedded in the online ancillary materials accompanying a textbook, can be used to measure a variable that is relevant to students' learning in introductory psychology courses.

Keywords: higher education; measurement; reader ability; structure building; translational science.
Authors: Soicher, Raechel N.; Becker-Blease, Kathryn A.
URL: https://doi.org/10.1002/acp.3643

Four empirically based reasons not to administer time-limited tests. (2020, June). Vol. 6, pp. 175-190. ISSN 2332-2136.

Abstract: For more than a century, measurement experts have distinguished between time-limited tests and untimed power tests, which are administered without time limits or with time limits so generous that all students are assured of completing all items. On untimed power tests, students can differ in their propensity to correctly respond to every item, and items should differ in how many correct responses they elicit. However, differences among students' speed of responding do not confound untimed power tests; therefore, untimed power tests ensure more accurate assessment. In this article, we present four empirically based reasons to administer untimed power tests rather than time-limited tests in educational settings. (1) Time-limited tests are less valid; students' test-taking pace is not a valid reflection of their knowledge and mastery. (2) Time-limited tests are less reliable; estimates of time-limited tests' reliability are artificially inflated due to artifactual consistency in students' rate of work rather than authentic consistency in students' level of knowledge. (3) Time-limited tests are less inclusive; time-limited tests exclude students with documented disabilities who, because they are legally allowed additional test-taking time, are often literally excluded from test-taking classrooms.
(4) Time-limited tests are less equitable; in addition to excluding students with documented disabilities, time-limited tests can also impede students who are learning English, students from underrepresented backgrounds, students who are older than average, and students with disabilities who encounter barriers (e.g., stigma and financial expense) in obtaining disability documentation and legally mandated accommodations. We conclude by offering recommendations for avoiding time-limited testing in higher educational assessment.
Authors: Gernsbacher, Morton Ann; Soicher, Raechel N.; Becker-Blease, Kathryn A.
URL: https://liberalarts.oregonstate.edu/biblio/four-empirically-based-reasons-not-administer-time-limited-tests

Testing the segmentation effect of multimedia learning in a biological system. (2020, December 1). Vol. 36, pp. 825-837. ISSN 0266-4909.

Abstract: Multimedia instruction, the combination of pictures and words to produce meaningful learning, involves attention, selection, organization, and integration of new information with previously learned information. Because there is a large, theory-based literature supporting the effectiveness of multimedia instruction, we proposed that multimedia instruction could be leveraged to address issues in health communication. The cognitive theory of multimedia learning outlines techniques to improve meaningful learning when the processing load of essential information exceeds the cognitive capacity of the learner (Mayer, 2014). Specifically, segmentation, or presentation of the material in a learner-paced fashion, results in deeper learning of the material than continuous presentation (Mayer & Chandler, 2001). We proposed a conceptual replication of the segmentation effect with multimedia materials relevant in a health communication context. We hypothesized that transfer of information from a multimedia presentation about kidney function would be improved in a segmented, versus continuous, condition. Additionally, we hypothesized that participants' perceived cognitive load during the learning task would be lower in the segmented, versus continuous, presentation condition.
We were unable to replicate either of these advantages for the use of segmentation with health-related materials.
Keywords: cognitive load; learning; multimedia; segmentation; self-paced.
Authors: Soicher, Raechel N.; Becker-Blease, Kathryn A.
URL: https://doi.org/10.1111/jcal.12485

Utility value interventions: Why and how instructors should use them in college psychology courses. (2020). No pagination specified. ISSN 2332-211X (electronic), 2332-2101 (print).

Abstract: According to expectancy-value models of achievement motivation, a core component of increasing student motivation is utility value. Utility value refers to the importance that a task has for one's future goals. Utility value interventions provide an opportunity for students to make explicit connections between course content and their own lives. A large body of literature suggests that utility value interventions are effective for a wide range of students (e.g., both adolescent and adult learners) in a variety of courses (e.g., introductory psychology, introductory biology, and physics). This review provides (1) an overview of an expectancy-value model of achievement motivation, (2) a comprehensive review of the experimental studies of utility value interventions in psychology, (3) concrete pedagogical recommendations based on the evidence from over 30 studies of the utility value intervention, and (4) suggestions for future research directions. After reading this review, college-level psychology instructors should be able to decide whether the utility value intervention is appropriate for their own course and, if so, implement the intervention effectively. (PsycInfo Database Record (c) 2020 APA, all rights reserved)
Keywords: College Students; Intervention; Psychology Education; Reading; Teachers; Experimenter Expectations; Motivation.
Authors: Soicher, Raechel N.; Becker-Blease, Kathryn A.
URL: https://liberalarts.oregonstate.edu/biblio/utility-value-interventions-why-and-how-instructors-should-use-them-college-psychology-courses