The concept and use of learning analytics have been around, in evolving forms, for decades. Three DMU scholars, however, have applied learning analytics in a way that has a real impact, in real time, on both students and faculty.
In a paper published in the Journal of Applied Research in Higher Education, Devrim Ozdemir, Ph.D., Heather Opseth, M.P.A.S., PA-C, and Holland Taylor, M.S.P.A.S.’07, PA-C, describe the curriculum mapping process that began in DMU’s Master of Physician Assistant (PA) Studies Program five years ago, including the use of learning analytics to improve advising, student reflection, remediation and curriculum evaluation on an ongoing basis. This approach may benefit learners of all ages, disciplines, levels and degrees of content mastery, not just PA students.
“Students can directly see their progress on course objectives in a way that’s different from grades or how they did on an exam or an assignment,” says Dr. Ozdemir, instructional design coordinator for the College of Health Sciences.
“Using learning analytics is valuable for students to identify their strengths and areas for improvement, and for faculty to better evaluate curriculum,” adds Ms. Opseth, assistant professor in the PA program.
While higher education institutions have collected data on student performance for years, that information has been used primarily for high-level curriculum planning and research on student learning. In their paper, Dr. Ozdemir, Ms. Opseth and Ms. Taylor show that when those data are put directly in the hands of students and faculty, both can track progress, change behavior as needed and gauge actual learning.
The authors accomplished this using two technologies: Desire2Learn Brightspace and the Desire2Learn Insights Portal. “I believe we are one of the few institutions to use these D2L technologies to this capacity,” Dr. Ozdemir says.
In addition to grades, the tools monitor how often students log in for their courses, time spent on teaching and learning activities, and participation in online course discussions. Data are highly detailed on student performance, including on categorized types of exam questions and specific grading criteria linked to course learning objectives.
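To illustrate how item-level performance data can roll up into objective-level progress of the kind the article describes, here is a minimal Python sketch. The data shapes, field names and numbers are hypothetical illustrations, not D2L’s actual data model or API.

```python
from collections import defaultdict

# Hypothetical records: each graded item (exam question, rubric criterion)
# is tagged with the course learning objective (CLO) it assesses.
graded_items = [
    {"student": "A", "objective": "CLO1", "earned": 8, "possible": 10},
    {"student": "A", "objective": "CLO1", "earned": 9, "possible": 10},
    {"student": "A", "objective": "CLO2", "earned": 5, "possible": 10},
    {"student": "B", "objective": "CLO1", "earned": 6, "possible": 10},
    {"student": "B", "objective": "CLO2", "earned": 9, "possible": 10},
]

def objective_progress(items):
    """Percent mastery per (student, objective), aggregated across items."""
    earned = defaultdict(float)
    possible = defaultdict(float)
    for item in items:
        key = (item["student"], item["objective"])
        earned[key] += item["earned"]
        possible[key] += item["possible"]
    return {key: round(100 * earned[key] / possible[key], 1) for key in earned}

progress = objective_progress(graded_items)
# In this toy data, student A shows strong mastery of CLO1 (85.0%) but a
# possible remediation need on CLO2 (50.0%) -- visible before final grades.
```

A view like this is what lets a student see progress “in a way that’s different from grades,” since it is organized by objective rather than by assignment.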
“The data are live and ongoing. Students can see how they’re doing as they’re moving through the curriculum, and we can remediate on specific areas with students who are struggling. Helping students identify those areas and get help sooner is critical with a curriculum that is so packed,” Ms. Taylor says of DMU’s 25-month PA program. She is the program’s director and department chair.
Aggregated data from each course are also used to evaluate the curriculum as a whole and, if needed, to improve it. That’s invaluable both for internal review of courses and for ensuring they meet accreditation standards.
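The same per-student, per-objective data can be averaged across a cohort to flag objectives that may need curriculum attention. The sketch below is a hypothetical illustration of that aggregation step; the mastery figures and the 70% threshold are invented for the example.

```python
from collections import defaultdict

# Hypothetical per-student mastery percentages for each course
# learning objective (CLO) in one course.
cohort = {
    ("A", "CLO1"): 85.0, ("A", "CLO2"): 50.0,
    ("B", "CLO1"): 60.0, ("B", "CLO2"): 90.0,
    ("C", "CLO1"): 55.0, ("C", "CLO2"): 80.0,
}

def flag_for_review(mastery, threshold=70.0):
    """Average mastery per objective across the cohort; return objectives
    whose cohort average falls below the threshold, for curriculum review."""
    by_objective = defaultdict(list)
    for (_student, objective), pct in mastery.items():
        by_objective[objective].append(pct)
    return {
        obj: round(sum(vals) / len(vals), 1)
        for obj, vals in by_objective.items()
        if sum(vals) / len(vals) < threshold
    }

flagged = flag_for_review(cohort)
# CLO1 averages (85 + 60 + 55) / 3 = 66.7, below threshold -> flagged;
# CLO2 averages 73.3 -> not flagged.
```

A consistently low cohort average on one objective points at the course design rather than at individual students, which is the kind of signal useful for internal review and accreditation reporting.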
The learning analytics approach, now also used by DMU’s master’s degree programs in Health Care Administration and Public Health, began with the faculty first undertaking a “backward curriculum design process.” That entails constructing each course around measurable, specific course learning objectives aligned with program-level competencies, rather than around favored textbooks and traditional course activities.
“It took a lot of work to structure and maintain and a few years to tweak, but now we have the structure,” Ms. Taylor says. “Now we’re starting to reap the benefits.”
Because using learning analytics requires additional time and technology training for faculty and early introduction to students, in their paper the authors advocate for larger-scale studies of faculty and student attitudes and behaviors. “Curriculum evaluation data will become very critical during continuous quality improvement efforts,” they write.
“We don’t want this to be the only publication in this area, which is untapped,” says Dr. Ozdemir. “We intend to continue our scholarship of teaching and learning in learning analytics, and sharing our findings has been a great experience so far.”