Trends Unpacked: Technical Challenges and Learning Analytics (Part 5)

Published on: November 14, 2017
Lindsay Pineda, Senior Project Manager, Unicon, Inc.
Amanda Mason, Senior Business Manager, Unicon, Inc.

What are the technical challenges your institution faces when implementing learning analytics? Do you have concerns regarding competing technical priorities? Do you have the resources needed for a successful implementation and project/product launch?  These are questions that face all institutions when embarking on a learning analytics journey.  

Back in May 2017, we started an article series called "Trends Unpacked", where we documented the realities of learning analytics implementations as observed during multiple consulting engagements.  We introduced several trending technical challenges, and we'll expand on those challenges, as well as offer recommendations, in this final installment of the Trends Unpacked series.  

When consulting with institutions on getting started with learning analytics, there are several topics that participants frequently raise with regard to technical implementation considerations.  Three of these topics are:

  • Ease of integration with existing infrastructure/systems
  • Maintenance 
  • Resource support 

Ease of Integration with Existing Infrastructure/Systems

Several of the challenges and trends we observed were related to the ease of integration with existing infrastructure/systems:

  • Many institutions struggled with having several different systems, each with its own data sets, and reported that integrating across those systems was a challenge. We found an average of 8-10 different systems in regular use, both within and across departments.
  • Quite a few institutions had also previously purchased commercial systems whose proprietary restrictions created integration challenges with learning analytics technology.
  • Institutions expressed concern about their technical teams' expertise and their ability to meet the requirements of a learning analytics implementation (e.g., xAPI and Universal Data Definitions requirements).

The following examples illustrate the types of integration/system challenges most often expressed at the institutions:

  • What do we do with the data? – This was a common theme expressed at many of the institutions we visited. At one institution we were told, "How is this different than what we already do? We already have loads of data, we just don't do anything with it." This is a consistent statement among institutions. Most do have loads of data they have been collecting for decades, but volume is not the difficulty. The difficulty lies in where the data is located: in several different systems, in different departments, and often only in the minds of tenured individuals who are sought out for advice in specific situations. Some viewed this wealth of data as a positive because so much information was being captured about students. However, the fact remains that a lot of data does not equal a lot of knowledge about how to use it. As one institution pointed out, "There is the perception that having loads of data is a good thing, but we're not sure if it's at all useful or valuable."
  • We need a systematic way to collect data – At other institutions, technical staff voiced frustration regarding the sheer volume of data collected. They found it difficult to determine how to collect the right data, in the right format, for the right practices, from all of the varying systems currently in place. This same group advised us, "We need to be very clear about the data we collect currently and how this is different than what we are going to collect in the future. There needs to be a structure in place to systematically collect data moving forward."
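The "structure to systematically collect data" that this group called for is often addressed with a shared statement format such as xAPI (mentioned earlier as an implementation requirement). As a minimal, illustrative sketch, assuming an entirely hypothetical LMS event and example URLs, an xAPI-style actor/verb/object statement might be built like this:

```python
import json

def make_xapi_statement(actor_email, verb_id, verb_display, object_id, object_name):
    """Build a minimal xAPI-style statement (actor/verb/object)."""
    return {
        "actor": {"mbox": f"mailto:{actor_email}", "objectType": "Agent"},
        "verb": {"id": verb_id, "display": {"en-US": verb_display}},
        "object": {
            "id": object_id,
            "definition": {"name": {"en-US": object_name}},
            "objectType": "Activity",
        },
    }

# Hypothetical event: a student views a course module in the LMS.
stmt = make_xapi_statement(
    "student@example.edu",
    "http://adlnet.gov/expapi/verbs/experienced",
    "experienced",
    "https://lms.example.edu/course/101/module/3",
    "Module 3: Introduction to Statistics",
)
print(json.dumps(stmt, indent=2))
```

Recording every system's events in one agreed-upon shape like this is what makes it possible to separate "what we collect now" from "what we will collect going forward."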

Institutions also described challenges specific to the number and variety of systems in place:

  • So many systems – An institution advised us that, "our individual systems don't 'talk' to each other currently. A lot of our information has to be tracked manually (e.g. Excel) and is often duplicated in many areas." This was not uncommon among institutions. For example, an institution might house documentation and processes in three-ring binders instead of making them available electronically. A majority of institutions also felt too many systems were in use for any given task, and that "navigating which systems have what information can be very time consuming."
  • Systems integration concerns – Several institutions we visited had purchased commercial systems, for example learning management systems (LMS) and/or student information systems (SIS). Due to the proprietary nature of some of them, integration with other proprietary (or non-proprietary) systems presented a challenge. An institution advised us, "Some of the major challenges we have are figuring out what data can be extracted from our commercial LMS and SIS, how we are going to work with our provider to do so, and how much it's going to cost us to get that data." 

Institutions shared some ideas regarding potential solutions and recommendations that they feel would be beneficial:

  • Define system purpose – "I suppose we need to just sit down and figure out which systems are for what purpose? Which need to be involved in the learning analytics initiative? Who is in charge of them?" This was the conclusion one institution came to after discussing their readiness for a learning analytics initiative with us. There is no easy way to integrate several systems with multiple types of data. Most institutions advised that they would likely start by automating some of their manual data entry processes to ensure all data is stored electronically, so it can eventually be integrated with a centralized data warehouse. Another suggestion is to determine what data is actually relevant to your institution and identify which system holds it, in collaboration with stakeholders in the relevant departments. For example, mapping the data relevant to a student's academic journey using process mapping techniques can be a valuable way to illustrate which data is important to consider.
  • Document everything – "We have got to figure out where to put all of this new information so others can easily access it." This was a common statement made by many institutions while we were onsite. We believe documenting the integration processes for future reference, and to help new technical team staff members, is extremely important. Institutions supported this and shared with us that change is constant, which can be frustrating for many staff members and students. One institution shared with us, "things get changed around here all the time, but no one knows about it until they are told they aren't doing it." Some of this confusion can be eliminated by documenting each system's purpose, what data it holds, what data is needed for a learning analytics initiative, and who is in charge of those systems and that data. Ensuring the documentation is housed in a centralized location for easy access is also necessary for success.
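The documentation these institutions describe (each system's purpose, the data it holds, and who owns it) can start as something as simple as a machine-readable inventory. A minimal sketch, with entirely hypothetical system names, data items, and owners:

```python
# A simple system inventory of the kind described above: each entry
# documents a system's purpose, the data it holds, and its owner.
# All names here are hypothetical examples, not real systems.
inventory = [
    {"system": "LMS", "purpose": "course delivery",
     "data": ["grades", "logins", "submissions"], "owner": "Academic Technology"},
    {"system": "SIS", "purpose": "student records",
     "data": ["enrollment", "demographics", "transcripts"], "owner": "Registrar"},
    {"system": "Advising notes", "purpose": "advising",
     "data": ["meeting notes", "alerts"], "owner": "Advising Office"},
]

def systems_holding(data_item):
    """Return which systems (and their owners) hold a given kind of data."""
    return [(s["system"], s["owner"]) for s in inventory if data_item in s["data"]]

print(systems_holding("grades"))  # [('LMS', 'Academic Technology')]
```

Even a lightweight inventory like this answers the questions the institutions raised: which system is for what purpose, what data it holds, and who is in charge of it.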

Maintenance

  • Maintenance of learning analytics systems and technology was a major concern for the technical teams at every institution. Institutions lacked the breadth of skill sets needed to manage a learning analytics implementation in-house, as well as the expertise to maintain the technology over time. For example, data scientist, data engineer, business analyst, quality assurance, and other technical roles were not found at any of the institutions.
  • Many institutions would need ongoing support during and after the implementation of the technology.

The following examples illustrate the types of maintenance challenges most often expressed at the institutions:

  • Skill set gaps – "Where are we going to find a data scientist?" was a legitimate question asked at one institution when discussing the skill set gaps among their staff. Supporting a new initiative such as learning analytics can be overwhelming for institutions that do not have staff members with the skill sets needed to maintain the new technology. An institution told us, "We have very limited resources who have varying levels of skill. It's going to be important that we identify what skills they have, what skills they don't and how to bridge that gap."

Institutions shared with us some ideas regarding potential solutions and recommendations that they feel would be beneficial:

  • Training opportunities – One way to bridge the skill set gap is to identify which skills are lacking among the technical staff that will be responsible for implementing a learning analytics initiative. It is important to note that some roles, skills, and responsibilities will become more familiar as the initiative progresses; it may not always be immediately clear which skill sets are needed. Once initial skill gaps have been identified, providing training for the missing skills is necessary. Some of this training can be done internally, while some may need to be sought out off campus; for example, bringing in an experienced facilitator to help guide training efforts and create materials for future use. It is vital that institutions document any and all training efforts for future reference and new staff. The institutions we visited were unanimous in supporting this approach.
  • Hire additional technology-proficient individuals – Each institution varied in the skill sets it needed, but each was lacking some key roles for successfully implementing a learning analytics initiative. The topic of hiring additional staff produced a lively discussion within the institutions we visited. Several felt it would be beneficial and cost effective to do so. One institution suggested an idea worth considering: "Initially, we could hire on some of the roles needed as contractors. They could train our technical staff and then we can do it on our own from there." To identify which solution will work for your institution, it is important to weigh the benefits of hiring additional staff against extensively training existing staff in new skill areas.

Resource Support 

  • Some institutions had smaller technical teams with concerns about resource allocation, added workload, and time management. We did, however, see significant buy-in for the implementation of learning analytics within the technical teams.
  • Most institutions also expressed a desire for explicit prioritization of learning analytics initiatives from leadership, in an attempt to help the technical teams direct current resources and work efforts.

The following examples illustrate the types of resource challenges most often expressed at the institutions with whom we spoke:

  • Competing and unclear priorities – An individual at an institution told us, "We have so many different and competing priorities all the time. I never know which are the ones I need to focus on." Several institutions expressed some frustration around this topic. "It can be frustrating to put in the time and effort required on something only to have it change in priority a few weeks later." This is not uncommon and we found the same to be true across many other institutions we visited.
  • Communication challenges – "Transparency of information is something we struggle with quite a bit." Many of the institutions we visited appeared to struggle with understanding which departments needed what kind of data and why they needed it. There were several instances when different departments didn't know what the others did, let alone what data might be of value to them. An individual shared with us, "I have worked here for years and now I truly understand what these other departments do and how the data I collect actually affects them."

Institutions shared their ideas with us regarding potential solutions and recommendations that they feel would be beneficial:

  • Define priorities (top down) – One institution informed us that, "Whether they know it or not, the leadership here sets the tone for everything we do. They show us the importance of something by how much they communicate, show up at meetings and allocate money to support it." It seems increasingly important for leadership to set clear expectations about priorities from the start. This helps eliminate prioritization questions later. Other articles in the "Trends Unpacked" series have discussed the importance of clearly defining strategic priorities from the top down. It would benefit leadership to outline resource, budget, and project priorities/allocations. This information needs to be shared with everyone involved with the initiative. It is also beneficial to include leadership members across departments when defining priorities for the institution and when developing a communication strategy.
  • Collaborate – Learning analytics initiatives do not only affect technical teams. They impact students, staff, faculty, leadership, advisors, and many others. We observed that collaboration among these departments was the key to a successful implementation. As one individual reported, "If we communicate regularly and are not afraid that someone else is trying to do something malicious to us, then we can really work with each other to get things done much faster." This seemed to be a common conclusion reached by departmental individuals after spending a few days together openly and transparently discussing how others affect them. We agree with this conclusion; collaboration and communication are central to any successful initiative. An example of how collaboration can be successful is to include relevant stakeholders from all departments in meetings, communications, strategy development and decision-making. When everyone feels they have a voice and their needs are being considered, they are more likely to take ownership of the initiative.

Both technical and organizational challenges are common when implementing a learning analytics initiative, and significant effort is required from everyone involved. Learning analytics will touch the majority of individuals, departments, processes, procedures, practices, and strategic plans within an institution; in that sense, everyone involved is a stakeholder. Our message to institutions, from our experience, is to address these challenges one at a time based on the priority your department/institution assigns to them. Do what is needed to get the project started and continue to solve the systemic challenges that arise along the way. Attempting to solve every challenge upfront is unrealistic and can be frustrating for an institution. Remember, there is no "fix-all" solution that will resolve the entirety of an institution's challenges as they relate to learning analytics. Learning analytics is a marathon, not a sprint; understanding this upfront is key.

This article brings our "Trends Unpacked" series to a close. It has been a pleasure to document our observations and we hope they have provided value to your learning analytics initiatives. 

Here's to continuous growth and improvement!

Lindsay Pineda

Senior Project Manager

Lindsay Pineda is a Senior Project Manager and a Senior Implementation Consultant for Unicon with a rich background in learning/predictive analytics. In her previous position, she focused on helping to develop, implement, and execute a proprietary predictive modeling technology that has proven to be successful in predicting student course persistence on a week-to-week basis. Lindsay has immersed herself in learning/predictive analytics research, practical applications, and implementation. Since coming to Unicon, she has been working with institutions to provide Learning Analytics solutions, both technical and nontechnical, and has had a focus on visiting institutions onsite to provide Readiness Assessments. She helps institutions work through issues of change management, resourcing, challenges, and concerns relating to the adoption of Learning Analytics.

Amanda Mason

Senior Business Manager

Amanda has over 14 years of business and technical analysis experience, including management experience, with an emphasis on being a liaison between the business and the technical team to ensure constant and clear communication, as well as ensuring all business and technical requirements are established and met. Over her career, Amanda has worked on a plethora of different project types and sizes worldwide. In recent years, Amanda's work has been focused on international projects that involve in depth business, technical, and system requirements. Amanda has defined business and technical requirements within Learning Analytics.