At a major research university in the Midwest, we recently concluded an Experience Analysis and Design (EAD) on student and faculty LMS use. During the qualitative research phase, faculty, staff, and students alike told us the technology was rife with problems, that it should be discarded, and that, at the very least, its UI needed a major update. Fingers were being pointed every which way.
As many know, increasing LMS adoption brings numerous benefits to an institution, not the least of which is a more consistent student experience. Yet changing an LMS can be expensive and time-consuming. Our client wanted objective and thorough research about LMS use. They also wanted data that would lead to a list of solutions prioritized by methodological scoring and implementation cost. Finally, they wanted an inclusive process that engaged departments from across the institution.
And that’s exactly how the EAD methodology works. It’s objective and anonymous, avoiding blame and internal politics.
Perhaps the biggest surprise we uncovered was that many of those who complained about the LMS had either never used it or had used only its most basic elements. Many were simply repeating what others had told them, or were unaware of the many features the LMS offered. We also uncovered a rich story involving support structure and staffing, pedagogical approaches, faculty and student usage behaviors, and college-level adoption strategies.
As I write this, a prioritized set of solutions, with buy-in from across the campus, is being phased in.
And nobody’s pointing fingers at anyone else. At least for now.