Academic vs. Administrative Computing: Bridging the Gap - "The Impact of Data on Student Success" (Article #2)

This article is the second in the series, "Academic versus Administrative Computing: Bridging the Gap." Read the first article here: "Separate Evolution of Two Systems."

If academic (LMS) and administrative (SIS) systems can coexist, each serving its own unique purpose, what's wrong with having different systems at the end of the day? There is certainly a good reason to maintain a separation of concerns - why, for example, should a learning system have to track a student's financial record or incoming test scores? Conversely, why should a back-end student information system (SIS) invest in keeping tabs on the student's learning experience? Over the last several years, however, it has become increasingly clear that this separation creates several problems:

  1. Data is duplicated, and seamless integration becomes a must-have
  2. It is difficult to assemble a holistic picture of the student
  3. Arguably, education cannot take full advantage of technology because of gaps in the data and delays before systems receive real-time updates

When thinking about how to solve these problems, institutions need to take into consideration how they define student success. For example, if student success is defined by the measurement of goals such as retention and engagement, institutions can determine the type of data they need from academic and administrative systems to measure and support those goals.

New Theories and Models for Student Success

Students have a wide variety of choices when it comes to their education path, starting with their underlying educational goals. Choices include how they will pay for college-level courses, how many classes they can take concurrently given family or work obligations, whether they live on campus, and so on. What's more, the circumstances in place when a student first starts their higher education path may not remain the same. In fact, significant changes in those circumstances are usually what lead students to pause, delay, or step off their path entirely.

All of these diverse choices and circumstances lead to natural variations in student behavior. It is in the institution's best interest to understand major factors that influence the student's ability to continue on their stated course of study.

There is a saying, "the best laid plans" - even the institution that set up a program or course may not fully realize the best conditions that lead to success in their program. The practice of observing behavior over time and detecting patterns that lead to both successful and unsuccessful outcomes is critical. This information is best used to help advisors and counselors identify students who appear to be at-risk for non-successful outcomes, and then recommend specific interventions. This is all based on data gathered from multiple sources to help correlate the wide variety of factors involved.

For many institutions, goals around student success (including retention and engagement) are embedded in their mission. In the quest to understand the best ways to propel students through their programs, institutions have realized that the data generated by systems students interact with might lead to a better understanding of which circumstances produce successful outcomes. Most of the time, the systems that house data about student interactions were not originally designed for this purpose - it might be a meal card swipe system, residence hall card key access, library usage, health care visits, etc. And in many cases, it is not clear how information on student behavior can be used to effect change in the student's path.
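Correlating these disparate sources starts with consolidating per-student interaction data into a single view. A minimal sketch in Python - with invented system names and event records standing in for real meal card, card-key, and library feeds - might aggregate events like this:

```python
from collections import defaultdict

# Hypothetical event stream from disparate campus systems.
# Each record is (student_id, system); in practice these would be
# exported from meal card, card-key, library, and similar systems.
events = [
    ("s1", "meal_card"), ("s1", "library"), ("s1", "card_key"),
    ("s2", "meal_card"),
    ("s3", "library"), ("s3", "library"),
]

def build_student_profiles(events):
    """Aggregate per-student interaction counts across systems
    into one holistic view keyed by student ID."""
    profiles = defaultdict(lambda: defaultdict(int))
    for student_id, system in events:
        profiles[student_id][system] += 1
    # Convert nested defaultdicts to plain dicts for safe lookups.
    return {sid: dict(counts) for sid, counts in profiles.items()}

profiles = build_student_profiles(events)
# profiles["s3"] → {"library": 2}
```

A real integration would key these feeds on a common student identifier maintained by the SIS, which is exactly where the data duplication problem described earlier makes consolidation hard.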

Student success is a term that is used frequently, but how do institutions measure it consistently? NSSR 2016 keynote speaker Jillian Kinzie facilitated a discussion highlighting the fact that schools may define student success in various terms (typically based on programming aligned to the institution's mission); however, measuring attainment against those definitions remains a challenge.

Dr. Kinzie goes on to propose a definition of student success: Increased numbers of diverse student groups participating in high-quality educational experiences, earning high quality credentials[1].

Another great example of outlining the aspects of student success comes from New Jersey City University (NJCU)[2]:

Success for NJCU undergraduates includes:

  • Developed expertise in a discipline
  • General education competencies
  • Experiential learning achievements
  • Global citizenship skills and orientation
  • Timely graduation with minimal debt
  • Career placement or graduate study

Julia F. Freeland, a research fellow for the Clayton Christensen Institute for Disruptive Innovation, a San Mateo, California-based think tank that supports new digital learning models, says, "Right now, a funny mix of digital tools and humans take the data that is coming out of disparate online learning programs, analyze it, and give it back to teachers in an actionable format."[3]

In thinking about ways to improve student success, an institution has several factors within its control. The first is the ability to influence the instructional content for students. The second is the guidance it can give students about where to focus and which factors contribute to their success. According to a 2014 ECAR Research Bulletin, administrative and academic leaders are increasingly factoring the LMS's "integral educational role and real-time student data into emerging educational models, student success initiatives, and institutional objectives."[4]

The University System of Maryland, in notes from an Education Policy and Student Life committee meeting from September 2015[5], summarized a vision for what could be done with student data that is current and consolidated to target interventions at just the right time:

"While it is true that colleges and universities have been compiling numbers and generating reports for decades, the new kinds and volume of information means that rather than merely reporting, we can utilize the data in planning and decision making. Leading university systems are using these predictive analytics to understand what drives both their institutional performance as well as their students' success, as is increasingly expected by the public and funders. So, rather than looking in the rear-view mirror in order to describe what happened in the past to the students who left our institutions between the 1st and 2nd years, what if, instead, we could pull together all of the data we have on our students over time and use that information to help us track and even begin to predict which students are likely to leave... before they go? What might having that information make possible in terms of targeted interventions aimed specifically at addressing individual student needs at precisely the right times?" 

Gateway Course Redesign

Individual instructors are using data to observe the effectiveness of course content, assessments, sequencing, and other factors in student outcomes. One biology professor, Kelly Hogan of the University of North Carolina at Chapel Hill, observed that 70% of African-American students and 40% of Latino students were failing her large introductory biology class. She realized that for all her students to be successful, she needed to change the way she taught her course.

She made a series of revisions to increase student engagement:

  • Introduce new material through assignments
  • Start each class with a question
  • Think-Pair-Share (TPS)
  • E-polling to test for understanding
  • Quizzes, challenge games

These changes made a difference for her; the statistics for "who is successful?" in her course have evened out[1].

Student Success and Engagement

A key finding from the National Survey of Student Engagement (NSSE) states:

"Students who engage more frequently in educationally purposeful activities both in and outside the classroom get better grades, are more satisfied, and are more likely to persist and graduate."

NSSE studied the relationship between engagement scores and institutional retention and graduation rates. It determined that the dimensions most correlated with retention and graduation are collaborative learning and the number of hours spent studying per week[1].
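The kind of correlation analysis NSSE performed can be sketched with a plain Pearson correlation. The institution-level numbers below are invented purely for illustration; the pattern (a strongly positive r between an engagement dimension and retention) is what such an analysis looks for:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length
    numeric sequences (no external libraries required)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: collaborative-learning engagement scores for
# five institutions alongside their retention rates (invented values).
collab_learning = [40, 55, 60, 70, 80]
retention_rate = [0.70, 0.78, 0.80, 0.86, 0.90]

r = pearson_r(collab_learning, retention_rate)
# r is close to 1.0 for this fabricated, strongly linear data
```

In practice an analyst would run this across every engagement dimension and rank them by |r| to find which behaviors track retention most closely, as NSSE did.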

Knowing this information, and having access to accurate student data, can help educators focus their student success efforts where they make the most difference. Consistent student data from the LMS and SIS is crucial in giving educators a holistic view of each student, their current education path, and the changes institutions need to make to help them succeed. The next article in this series will recommend possible solutions to the problems created by the separate evolution of academic and administrative systems. These solutions can have a positive impact on student success by bridging data gaps through seamless integration, giving educators access to consistent and accurate student data.

Useful Reading:


[1] Dr. Jillian Kinzie, Associate Director of the Center for Postsecondary Research and the National Survey of Student Engagement (NSSE) Institute at the Indiana University School of Education. "Student Success in Colleges & Universities: Advancing Know-What and Know-How"

[2] New Jersey City University, 2016. Student Success Overview

[3] Education Week: Tearing Down the Walls Between Software Silos, Herold, 2014

[4] ECAR Research Bulletin: Learning Management System Evolution, Lang & Pirani, 2014

[5] University System of Maryland, 2015. Efficiency and Effectiveness (E&E 2.0), Analytics, and Student Success

Linda Feng

Software Architect
Linda Feng is a Principal Software Architect at Unicon, Inc., a leading provider of education technology consulting and digital services. Linda has deep experience in student information system (SIS) integration, open standards, and big data/learning analytics, most recently as Senior Product Manager for Canvas SIS Integrations and Canvas Data at Instructure. Prior to Instructure, Linda was a software architect for Oracle's Student Products Division. In the last several years, she served as co-chair of the IMS Global Learning Consortium's Learning Information Services and Privacy Working Groups, helping to bring a new enterprise interoperability standard to market.