Learning Analytics in Higher Education
Description

Gain an overview of learning analytics technologies in higher education, including broad considerations and the barriers to introducing them. This volume features the work of practitioners who led some of the most notable implementations, including:

* the Open Learning Initiative, now at Stanford University,
* faculty-led projects at the University of Michigan, including ECoach and SLAM,
* the University of Maryland, Baltimore County's Check My Activity, and
* Indiana University's FLAGS early warning system and e-course advising initiatives.

Readers will glean from these experiences, as well as from a national project in Australia on innovative approaches for enhancing student experience, an informed description of the role of feedback within these technologies, and a thorough discussion of ethical and social justice issues related to the use of learning analytics, and why higher education institutions should approach such initiatives cautiously, intentionally, and collaboratively. This is the 179th volume of the Jossey-Bass quarterly report series New Directions for Higher Education. Addressed to presidents, vice presidents, deans, and other higher education decision makers on all kinds of campuses, it provides timely information and authoritative advice about major issues and administrative problems confronting every institution.




New Directions for Higher Education

Betsy O. Barefoot and Jillian L. Kinzie, Co-editors

Learning Analytics in Higher Education

John Zilvinskis and Victor Borden, Editors

Number 179 • Fall 2017

Jossey-Bass

Learning Analytics in Higher Education

John Zilvinskis and Victor Borden

New Directions for Higher Education, no. 179

Co‐editors: Betsy O. Barefoot and Jillian L. Kinzie

NEW DIRECTIONS FOR HIGHER EDUCATION, (Print ISSN: 0271‐0560; Online ISSN: 1536‐0741), is published quarterly by Wiley Subscription Services, Inc., a Wiley Company, 111 River St., Hoboken, NJ 07030‐5774 USA.

Postmaster: Send all address changes to NEW DIRECTIONS FOR HIGHER EDUCATION, John Wiley & Sons Inc., C/O The Sheridan Press, PO Box 465, Hanover, PA 17331 USA.

Copyright and Copying (in any format)

Copyright © 2017 Wiley Periodicals, Inc., a Wiley Company. All rights reserved. No part of this publication may be reproduced, stored or transmitted in any form or by any means without the prior permission in writing from the copyright holder. Authorization to copy items for internal and personal use is granted by the copyright holder for libraries and other users registered with their local Reproduction Rights Organisation (RRO), e.g. Copyright Clearance Center (CCC), 222 Rosewood Drive, Danvers, MA 01923, USA (www.copyright.com), provided the appropriate fee is paid directly to the RRO. This consent does not extend to other kinds of copying such as copying for general distribution, for advertising or promotional purposes, for republication, for creating new collective works or for resale. Permissions for such reuse can be obtained using the RightsLink “Request Permissions” link on Wiley Online Library. Special requests should be addressed to: [email protected]

Information for subscribers

New Directions for Higher Education is published in 4 issues per year. Institutional subscription prices for 2017 are:

Print & Online: US$454 (US), US$507 (Canada & Mexico), US$554 (Rest of World), €363 (Europe), £285 (UK). Prices are exclusive of tax. Asia‐Pacific GST, Canadian GST/HST and European VAT will be applied at the appropriate rates. For more information on current tax rates, please go to www.wileyonlinelibrary.com/tax-vat. The price includes online access to the current and all online back‐files to January 1st 2013, where available. For other pricing options, including access information and terms and conditions, please visit www.wileyonlinelibrary.com/access.

Delivery Terms and Legal Title

Where the subscription price includes print issues and delivery is to the recipient's address, delivery terms are Delivered at Place (DAP); the recipient is responsible for paying any import duty or taxes. Title to all issues transfers FOB our shipping point, freight prepaid. We will endeavour to fulfil claims for missing or damaged copies within six months of publication, within our reasonable discretion and subject to availability.

Back issues: Single issues from current and recent volumes are available at the current single issue price from cs‐[email protected].

Disclaimer

The Publisher and Editors cannot be held responsible for errors or any consequences arising from the use of information contained in this journal; the views and opinions expressed do not necessarily reflect those of the Publisher and Editors, neither does the publication of advertisements constitute any endorsement by the Publisher and Editors of the products advertised.

Publisher: New Directions for Higher Education is published by Wiley Periodicals, Inc., 350 Main St., Malden, MA 02148‐5020.

Journal Customer Services: For ordering information, claims and any enquiry concerning your journal subscription please go to www.wileycustomerhelp.com/ask or contact your nearest office.

Americas: Email: cs‐[email protected]; Tel: +1 781 388 8598 or +1 800 835 6770 (toll free in the USA & Canada).

Europe, Middle East and Africa: Email: cs‐[email protected]; Tel: +44 (0) 1865 778315.

Asia Pacific: Email: cs‐[email protected]; Tel: +65 6511 8000.

Japan: For Japanese speaking support, Email: [email protected].

Visit our Online Customer Help available in 7 languages at www.wileycustomerhelp.com/ask

Production Editor: Abha Mehta (email: [email protected]).

Wiley's Corporate Citizenship initiative seeks to address the environmental, social, economic, and ethical challenges faced in our business and which are important to our diverse stakeholder groups. Since launching the initiative, we have focused on sharing our content with those in need, enhancing community philanthropy, reducing our carbon impact, creating global guidelines and best practices for paper use, establishing a vendor code of ethics, and engaging our colleagues and other stakeholders in our efforts. Follow our progress at www.wiley.com/go/citizenship

View this journal online at wileyonlinelibrary.com/journal/he

Wiley is a founding member of the UN‐backed HINARI, AGORA, and OARE initiatives. They are now collectively known as Research4Life, making online scientific content available free or at nominal cost to researchers in developing countries. Please visit Wiley's Content Access ‐ Corporate Citizenship site: http://www.wiley.com/WileyCDA/Section/id-390082.html

Address for Editorial Correspondence: Co‐editors, Betsy Barefoot and Jillian L. Kinzie, New Directions for Higher Education, Email: [email protected]

Abstracting and Indexing Services

The Journal is indexed by Academic Search Alumni Edition (EBSCO Publishing); Higher Education Abstracts (Claremont Graduate University); MLA International Bibliography (MLA).

Cover design: Wiley

Cover Images: © Lava 4 images | Shutterstock

For submission instructions, subscription and all other information visit:

wileyonlinelibrary.com/journal/he

CONTENTS

Editors' Notes

1: An Overview of Learning Analytics

Learning Analytics Defined

Relationship to Existing Roles and Functions

Developing Learning Analytics Projects Cautiously, Intentionally, and Collaboratively

Conclusion

References

2: Incorporating Learning Analytics in the Classroom

Introduction

The Open Learning Initiative Approach to Learning Analytics

Creating Opportunities to Generate Meaningful Data

Creating Open and Accessible Predictive and Explanatory Models

Supporting Evidence-Informed Decision Making

Implementing a Continuous Improvement Cycle

Summary

References

3: Learning Analytics Across a Statewide System

Learning Analytics Implementation at Indiana University

Fostering Learning, Achievement, and Graduation Success: Implementation and Assessment of an Early Alert System

Vended Learning Analytics Systems: The Education Advisory Board and the Student Success Collaborative

What We Have Learned So Far

References

4: Learner Analytics and Student Success Interventions

The Right Information to the Right Students

The Roles of Feedback in Advising and Academic Achievement

Social Cognitive Theory

The Right Time in the Right Way

Too Much of a Good Thing?

Closing the Loop: Considerations for Implementation

References

5: Cultivating Institutional Capacities for Learning Analytics

The Origin of the Symposium on Learning Analytics at Michigan and the Learning Analytics Community

Establishing a Learning Analytics Task Force

University of Michigan–Funded Learning Analytics Grants

The Learning Analytics Fellows Program

New Metrics, Cross-Institutional Research, and Analytics-Driven Cultural Change

Finding New Homes for Continuing Learning Analytics Task Force Efforts

Conclusion

References

6: Using Analytics to Nudge Student Responsibility for Learning

Assumptions

A Case Study for Nudging Students at University of Maryland, Baltimore County

Other Notable Student-Facing Analytics Interventions

Discussion: Leveraging Information Technology–Facilitated Peer Pressure

Conclusion

References

7: Ethics and Justice in Learning Analytics

Ethical Questions in Learning Analytics

From Ethics to Justice

Justice in Analytics

Conclusion

References

8: Learning Analytics as a Counterpart to Surveys of Student Experience

The Student Survey Tradition

Shifting Attention From Groups to Individuals

Student Analytics and the Student Experience

Qualities and Analytics for the Successful Student Experience

Conclusions and Implications

References

9: Concluding Thoughts

Organizational Capacity Development

Engaged Collaboration

Student Agency and Responsibility for Learning

Optimizing Feedback

A Culture of Inquiry

Social Justice

Reference

ADVERT

Order Form

Index

End User License Agreement

List of Tables

Chapter 8

Table 8.1

Table 8.2

List of Illustrations

Chapter 2

Figure 2.1 Example learning objective and the number of students who are predicted to have mastered this objective

Figure 2.2 Example skill and the distribution of learners who have mastered and not mastered the skill by number of activity attempts (unique activities related to the skill)

Figure 2.3 Open Analytics Service (OARS) Components

Figure 2.4 Example assessment activity that connects to the skill (Compute Test Statistic) and learning objective (Carry out hypothesis testing for the population proportion and mean (when appropriate), and draw conclusions in context)

Chapter 8

Figure 8.1 Sample Student Success Report


Editors’ Notes

For the last few years, we have been thinking and talking about the learning analytics phenomenon with colleagues and each other, and watching passionate interest well up from disparate sources. We both attended several Learning Analytics & Knowledge (LAK) conferences hosted by the Society for Learning Analytics Research (SoLAR). When the conference was hosted in Indianapolis in 2014, a majority of the sessions featured computer scientists displaying successful predictive analytic programs, whereas at the 2016 conference in Edinburgh, Scotland, there was a push to emphasize the ways these projects measure the experience of learning. We have also noticed, at the last two or three annual forums of the Association for Institutional Research, a number of presentations regarding learning analytics, often with packed crowds of institutional research staff who have been tasked with contributing to the development of learning analytics on their campuses. We also noted that the Association for the Study of Higher Education Annual Conference has had a few sessions dedicated to learning analytics, attended by a small yet dedicated group.

In each of these experiences, we've noticed different takes on learning analytics from the perspectives of scientists, higher education administrative staff, and scholars. Motivating our work is a question: “Why haven't learning analytics become prevalent within higher education?” Considering the ways technology has been incorporated within the academy over the past two decades (e-mail, websites, learning management systems), learning analytics would seem a natural evolutionary step; however, this hasn't happened at the pace or to the extent that we might have expected. In our research on the topic, we've noticed that the complexity of higher education as an enterprise, matched with the resources and understanding required to implement learning analytics successfully, presents immense challenges to incorporating these technologies effectively and productively.

Therefore, we conceptualized an issue of New Directions for Higher Education that would inform campus leaders, faculty, and staff about the scope and type of activities and initiatives required to bring learning analytics to their campuses. The goal of this volume is to give the reader a basic understanding of learning analytics and the types of projects and initiatives that several leading practitioners have adopted and adapted, providing substantive examples of implementation and expert insights into some of the more nuanced issues related to this topic.

In the first chapter, we offer basic definitions of learning analytics, as well as an overview of who collaborates on learning analytics projects, raising several broad issues for the reader to consider while reading the rest of the book. In the second chapter, Candace Thille reports findings from her and her colleagues' work with the Open Learning Initiative at Stanford University, informing the reader of the nuances of measuring student learning in digital environments while describing the technology needed to achieve this feat and the assessment infrastructure required to improve teaching and learning in the digital environment.

Chapters 3 and 5 provide examples of developing learning analytics applications on the campuses of large universities. In Chapter 3, our colleagues from Indiana University (IU), Cathy Buyarski, Jim Murray, and Becky Torstrick, describe the implementation of learning analytics across the diverse campuses of IU, incorporating both an internally developed early warning system and externally developed (commercial) e-advising program. Steve Lonn, Timothy McKay, and Stephanie Teasley describe in Chapter 5 initiatives to create a culture of learning analytics at the University of Michigan through the development of symposia, grants, and faculty task forces.

In between these chapters, Matt Pistilli focuses on the key roles of feedback and intervention in virtually all types of learning analytics projects and initiatives. In Chapters 6 through 8, the authors explore several compelling issues related to learning analytics. John Fritz describes how and why including students, and promoting student responsibility for learning, is critical within learning analytics initiatives, based on his experiences at the University of Maryland, Baltimore County, as well as related efforts elsewhere. Jeffrey Johnson provides a rich analysis of issues related to ethics and justice that arise when working with student data generally, as well as specifically with the types of applications now prevalent in the learning analytics realm. Chapter 8 presents findings from a national Office for Learning and Teaching–funded project in Australia that develops new thinking about how to characterize the modern student experience, preserving the strong conceptual underpinnings that have historically guided research on student experience while using analytics to break the old molds and form new ones. In the final chapter, as the editors of the volume, we discuss some of the themes found within this issue, such as collaboration, interrogation, justice, and independence.

We believe that this volume can substantially inform readers who have been considering the implementation of learning analytics on their campus. However, we recognize that it is by no means comprehensive or complete, as new research and understandings are constantly emerging around this topic (just follow the comprehensive coverage by EDUCAUSE, SoLAR, and ACM); meanwhile, scholars within computer science, learning technologies, learning sciences, and education have contributed comprehensive understandings of learning analytics as a scholarly field (both fundamental definitions and cutting edge research are presented in the Journal of Learning Analytics). This issue of New Directions for Higher Education serves as an entrée into the world of learning analytics for those who are, or who will soon be, exposed to this topic and expected to make decisions in the coming weeks, months, and years about how to engage constructively in using technology to improve student learning and persistence to completion.

John Zilvinskis
Victor Borden
Editors

Dr. John Zilvinskis is an Assistant Professor of Student Affairs Administration at Binghamton University - State University of New York (SUNY).

Dr. Victor Borden is a Professor of Higher Education within the Department of Educational Leadership and Policy Studies at Indiana University Bloomington.

1

The purpose of this chapter is to provide administrators and faculty with an understanding of learning analytics and its relationship to existing roles and functions so better institutional decisions can be made about investments and activities related to these technologies.

An Overview of Learning Analytics

John Zilvinskis, James Willis, III, Victor M. H. Borden

Higher education administrators and leaders have a sense that there is some magic going on in other industries and sectors, where people appear to be taking better advantage of the seemingly endless and growing amount of data available to improve sales and target customer support. There is a sense that higher education is far behind in this endeavor, and so they are willing to invest significantly in new, expensive technologies. Campus leaders hear about cases where institutions are using these methods and technologies to gain competitive advantage; they are impressed and do not want to be left behind. Yet no one has figured out the magic formula, and even very well-known examples have not been sustained. For example, in 2011 The Chronicle of Higher Education (Parry, 2011) featured Rio Salado College for its ability to use online student behavior to predict student class performance; however, 1-year retention rates between fall 2014 and fall 2015 were only 38% for full-time students and 26% for part-time students (National Center for Education Statistics, 2017). These rates are among the lowest 2% and lowest 9%, respectively, of public 2-year colleges, which is not surprising for an open-access institution with large online programs, but certainly not industry leading.

The truth is, as documented in this volume, implementing and sustaining learning analytics initiatives is just not that simple, nor does doing so necessarily result in dramatic improvements. It's not that these new technologies and methods are unhelpful, but rather that they don't address the more complex aspects of higher education, including the incredible diversity and complexity of learning outcomes across the curriculum and the complex organizational arrangements. Even at institutions committed to “data-driven decision making,” stakeholders often make the mistake of thinking the data will tell them what to do, as opposed to realizing that data themselves do not drive decision making; it is the interpretation of the data that creates change. At the center of these issues are three important questions: (1) What are learning analytics? (2) Who should be involved in these projects? (3) What are some important, broad principles institutions should consider when developing learning analytics? The purpose of this chapter, and indeed the remainder of this volume, is to provide some answers to these important questions.

Learning Analytics Defined

Though the use of data in higher education for operational and decision support is certainly not new, the computational processes involved in modeling those data for prediction, intervention, and tracking have expanded exponentially in recent years. Consequently, there is a lack of maturity and consistency in the terms, naming conventions, and standards related to learning analytics. It is not uncommon to attend a higher education conference panel or sit in on a campus meeting where attendees use “analysis” and “analytics” interchangeably. Certainly, formal definitions and distinctions have been asserted (for example, between educational data mining and learning analytics), but there are notable differences in the formal literature, and the usage of terms in practice is much looser and less consistent. In their article, “Academic Analytics: A New Tool for a New Era,” Campbell, DeBlois, and Oblinger (2007) wrote, “Analytics marries large datasets, statistical techniques, and predictive modeling” (p. 42). Learning analytics uses traditional data (from student records, surveys, and so on) along with new types of data emanating from transactional systems such as learning management systems, online course platforms, and social networks. However, there are finer nuances in defining various types of analytics as one begins to consider the domain and types of learning analytics projects. For the purposes of this chapter, we define learning analytics as the process of using live data to predict student success, promote intervention or support based on those predictions, and monitor the influence of that action.
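
To make the three parts of that definition concrete, the sketch below walks through a predict, intervene, and monitor cycle in Python using the widely available scikit-learn library. It is a minimal illustration only: the feature names (LMS logins, assignments submitted, current grade), the data, the 0.5 threshold, and the choice of logistic regression are all our hypothetical assumptions, not a description of any system discussed in this volume.

```python
# A minimal sketch of the predict/intervene/monitor cycle defined above.
# All features, data, and thresholds here are hypothetical illustrations.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical historical records: weekly LMS logins, assignments
# submitted, and current grade (%) for 500 past students, plus whether
# each student ultimately succeeded in the course (1) or not (0).
X_hist = rng.normal(loc=[5.0, 8.0, 75.0], scale=[2.0, 3.0, 10.0], size=(500, 3))
y_hist = (X_hist @ np.array([0.2, 0.3, 0.05])
          + rng.normal(0.0, 1.0, 500) > 7.15).astype(int)

# 1. Predict: fit a model that estimates each student's chance of success.
model = LogisticRegression(max_iter=1000).fit(X_hist, y_hist)

# 2. Intervene: score currently enrolled students with "live" data and
#    flag those below a (hypothetical) probability threshold for outreach.
X_live = rng.normal(loc=[5.0, 8.0, 75.0], scale=[2.0, 3.0, 10.0], size=(50, 3))
p_success = model.predict_proba(X_live)[:, 1]
flagged = np.flatnonzero(p_success < 0.5)
print(f"{len(flagged)} of 50 students flagged for advising outreach")

# 3. Monitor: once interventions occur, record who was contacted and
#    compare subsequent outcomes against predictions to gauge influence.
```

The point of the sketch is the cycle, not the model: any of the homegrown or commercial systems described in later chapters could stand in for the toy logistic regression here, and the hardest step in practice is the third one, which requires the kind of cross-unit record keeping discussed below.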

Distinguishing Learning Analytics Projects by Level of Analysis

There are two separate but interrelated domains for this work: the work that engages faculty in improving student learning within individual classes and programmatic curriculum (learning analytics), and the more holistic student support applications that do not focus directly on learning but more on student progress, persistence, and completion (academic analytics, according to Long and Siemens, 2011). In their conceptual framework of analytics in higher education, van Barneveld, Arnold, and Campbell (2012) add that predictive analytics can be used in both of these domains to draw upon historical data to predict future outcomes that can guide intervention. Later in this volume, Pistilli and Wagner describe learner analytics as using historical data from student records to predict the outcomes of current students, with the aim of intervening for students who are predicted to have a low likelihood of success. Already in this paragraph, we've drawn distinctions between learning analytics, academic analytics, predictive analytics, and learner analytics, relating to the level of analysis (student vs. class vs. curriculum), chronological characteristics of predictors (historical vs. contemporaneous), and type of outcomes (learning/behavior/development vs. retention/graduation); therefore, it is easy to see why newcomers find themselves uncertain when trying to specify the type of analytic project they wish to implement.

Distinguishing Learning Analytics Projects by Intended User

Learning analytics projects are often distinguished by the intended user or recipient of information. Previous work in analytics has led to the creation of tools to assist faculty with examining data from individual classes (Campbell et al., 2007). Chapters 2 and 5 in this volume have this focus. Other learning analytics projects primarily inform the work of academic advisors and other support staff with student guidance and coaching (Aguilar, Lonn, & Teasley, 2014; Barber & Sharkey, 2012). Chapters 3 and 4 fall into this category. Still other projects provide data directly to learners (Baker, 2007). Chapter 6 considers this target audience, at least in part. Analytics are also used to provide senior managers with management information related to teaching, learning, and student success (Buerck, 2014), which is also noted in Chapters 3 and 8. Regardless of the intended user, the chapters of this volume demonstrate that numerous campus partners must collaborate to implement a successful learning analytics project.

Relationship to Existing Roles and Functions

Implementation of learning analytics projects requires not only an understanding of the domain or the type of analytic project, but also an understanding of the amount of work and types of expertise needed. It is not feasible for even the most dedicated educators to create their own analytic systems. In addition to involving faculty as domain experts, information providers, and end-users, analytic project development often relies on collaboration among staff from several support units, such as centers for teaching and learning, student support services, institutional research, and information technology. Indeed, the development of learning analytics projects can change how these units routinely operate by providing opportunities for collaboration with new partners and new end-users. This may also lead to changes in the types of skillsets required for staff in these units.

Centers for Teaching and Learning

Staff within a center for teaching and learning (CTL) or similarly named unit typically provide faculty development programs related to curriculum and course design and assessment. These units promote communities of practice around specific pedagogies (for example, service learning) and provide support for using new educational tools and technologies. CTL staff provide required expertise in instructional and pedagogical design for developing new capacity for learning analytics among faculty (Borden, Guan, & Zilvinskis, 2014).

Student Support Services

Educators serving in academic advising, student affairs, and supplemental instruction use traditional data sources (for example, advising records, student needs assessments, registrar information) to determine the most effective interventions to recommend to individual students, such as counseling, tutoring, and peer mentoring. These staff members typically adopt a holistic view of student life and development across the curriculum, cocurriculum, and extracurriculum. Student advising and support providers often have the most extensive experience in dealing with students directly as they formulate academic and life goals (Drake, 2011). When working in learning analytics, the expertise of these educators provides perspective on the complex relationship between in-class and out-of-class demands on student life, while also supplying an understanding of how learning analytics interventions can be coordinated with existing support systems and how those systems may need to be reshaped to accommodate learning analytics.

Institutional Research

Institutional research (IR) practitioners use theory-related concepts and knowledge of research design to frame research and guide interpretation, while providing aggregated information and analysis to senior managers. These professionals also help with the development of new reporting tools, some of which are dynamic reports or dashboards. When working in learning analytics, IR staff can apply their expertise by advancing interpretive models and conceptual frameworks for making sense of predictive analytics that often lack a theoretical basis. Working on learning analytics projects requires IR staff to engage with colleagues who tend to use information in operational and individualized contexts rather than the more strategic and aggregate uses to which they are accustomed. In doing so, IR professionals can inform their colleagues about how measurement error qualifies the use of predictive analytics, making them much less precise than is immediately apparent. Institutional researchers and other assessment professionals can also contribute by ensuring that learning analytics projects include mechanisms for tracking the actions taken by staff and students in response to analytics information, to assess whether those actions are productive or counterproductive.
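
One way to make that measurement-error point tangible is sketched below: refitting a model on bootstrap resamples of its training data shows how wide the plausible range around a single student's predicted "risk score" can be. Again, the data, the two predictors, the model, and the resample count are purely illustrative assumptions, not a recommended IR workflow.

```python
# A minimal sketch of how sampling error qualifies a predicted risk
# score. Data, features, and model here are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 300
X = rng.normal(size=(n, 2))                              # two hypothetical predictors
y = (X[:, 0] + rng.normal(0.0, 1.5, n) > 0).astype(int)  # noisy binary outcome

x_student = np.array([[0.3, -0.2]])  # one student's feature values

# Refit the model on 200 bootstrap resamples of the training data and
# collect the predicted success probability for the same student.
boot_preds = []
for _ in range(200):
    idx = rng.integers(0, n, size=n)
    m = LogisticRegression().fit(X[idx], y[idx])
    boot_preds.append(m.predict_proba(x_student)[0, 1])

point = LogisticRegression().fit(X, y).predict_proba(x_student)[0, 1]
lo, hi = np.percentile(boot_preds, [2.5, 97.5])
print(f"Point prediction: {point:.2f}; 95% bootstrap interval: [{lo:.2f}, {hi:.2f}]")
```

Reporting an interval rather than a bare score is one concrete way IR staff can keep advisors and students from over-trusting a single prediction.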

Information Technology