The focus of learning analytics is to provide actionable data that can improve teaching and learning.
As contextually defined for UW-Madison by the Learning Analytics Roadmap Committee, learning analytics “is the undertaking of activities that generate actionable data from the learning environment intended to improve student outcomes by informing structure, content, delivery or support of the learning environment.”
UW-Madison has piloted various learning analytics tools and approaches over the past five years. To learn more about these projects, as well as current learning analytics projects, please refer to DoIT Academic Technology’s Evaluation Design and Analysis.
Learning Analytics Functional Taxonomy
This learning analytics functional taxonomy* describes six common approaches that help provide context for how learning analytics may be used in educational settings.
- Access learning behavior – For example, an instructor would be able to see when, how long, and how often a student accesses different activity types in Canvas.
- Evaluate social learning – For example, an instructor might apply social network analysis to their online discussions to identify students who bridge groups (“knowledge shepherds”) or students who connect with others less than expected.
- Improve learning materials and tools – For example, learning analytics might show that a large percentage of students in a course struggle with a newly introduced topic based on quiz answers.
- Individualize learning – For example, if an instructor tests students on three topics and a student shows mastery of two but not the third, a program may deliver additional material on the unmastered topic rather than further material and practice questions on concepts the student has already grasped.
- Predict student performance – For example, if a student’s behavior and performance in a course suggest that they are struggling, an instructor has an opportunity to intervene. Predictive analytics can also help instructors identify students who are doing OK but may need some additional motivation to do better in the course (for example, a C student who could be a B student).
- Visualize learning activities – For example, a learning analytics tool may help a student see how much time she is spending on certain activity types compared to her peers, and how that might relate to performance measures.
*This taxonomy (Nguyen, Gardner, and Sheridan, 2017) was initially explored and adapted by UW-Madison faculty and instructors who participated in a Blended Learning Fellowship on Evidence-Based Teaching during the spring 2018 semester.
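To make the social network analysis approach above concrete, here is a minimal, hypothetical sketch. It assumes discussion data can be exported as (student, student) reply pairs, and it flags “bridge” students as articulation points — students whose removal disconnects the reply network. This is a deliberately simple stand-in for the centrality measures a real social network analysis would typically use; all names and edges are invented.

```python
from collections import defaultdict, deque

def is_connected(adj, nodes):
    """BFS check: do `nodes` form a single connected component?"""
    nodes = set(nodes)
    if not nodes:
        return True
    start = next(iter(nodes))
    seen, queue = {start}, deque([start])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v in nodes and v not in seen:
                seen.add(v)
                queue.append(v)
    return seen == nodes

def bridge_students(replies):
    """Return students whose removal disconnects the reply network.

    `replies` is a list of (student, student) pairs, one per discussion
    reply. Articulation points serve here as a crude proxy for students
    who bridge otherwise separate discussion groups.
    """
    adj = defaultdict(set)
    for a, b in replies:
        adj[a].add(b)
        adj[b].add(a)
    students = set(adj)
    return [s for s in sorted(students)
            if not is_connected(adj, students - {s})]

# Invented example: two discussion clusters joined only through "morgan".
replies = [
    ("ana", "ben"), ("ana", "cai"), ("ben", "cai"),  # cluster 1
    ("eli", "fay"), ("eli", "gus"), ("fay", "gus"),  # cluster 2
    ("morgan", "ben"), ("morgan", "cai"),            # bridge ties
    ("morgan", "eli"), ("morgan", "fay"),
]
print(bridge_students(replies))  # ['morgan']
```

In practice, this kind of analysis is usually done with graph metrics such as betweenness centrality from a dedicated network library, which also surfaces students with unexpectedly few connections.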
Governance & Oversight of Learning Analytics
Students are real and diverse individuals, not just their data. These principles (beneficence, transparency, privacy and confidentiality, and minimization of adverse impacts) aim to uphold the dignity of students while ensuring learning analytics are used to improve educational outcomes.
- Learning analytics should be used for the benefit of students, and central to its use should be student success, improved learning experiences for the student, and improved outcomes for students.
- It is up to instructors and programs to determine the best way to deliver and implement courses and other learning experiences.
- Because of the potential human cost and harm of misapplication of learning analytics predictions, any implementation of learning analytics tools should first be subject to a rigorous evaluation of quality.
- All learning analytics activities should adhere to established campus policy for access and use of data (View: Guiding Principles for Using Learning Analytics Data).
As learning analytics practices expand on campus, it will be important to continue to increase awareness around analytics and student privacy, and to remain in compliance with campus policies, guidelines, and recommendations.
Learning Analytics Roadmap Committee (LARC)
The Learning Analytics Roadmap Committee (LARC), convened by the vice provost for teaching and learning, is responsible for general alignment, oversight, and monitoring of and communication about learning analytics at UW-Madison, particularly activities that support the operationalization of learning analytics as a sustainable educational practice. LARC serves as a central resource and point of contact for learning analytics, which is at the intersection of five major data domains on campus (academic structure, student domain, assessment, advising, and teaching and learning).
Other Campus Advisory Groups
In addition to LARC, other working and advisory groups have been formed, and will continue to be created on an ad hoc basis, to ensure learning analytics are being used responsibly and ethically at UW-Madison. For example, the Data Stewardship Council (DSC) created the Learning Analytics Data Use Subcommittee (LADUS) to create principles for appropriate use of data for learning analytics purposes. It is important to continue to have widespread campus participation and different perspectives involved as learning analytics practices continue to evolve on campus.