Evaluation Report
RATIONALES
SEAtS Software promotes itself as a student success platform, appearing to combine a Student
Information System (SIS) and a Learning Management System (LMS) into one. Slade & Prinsloo
(2013) identify the importance of students as agents who can "voluntarily collaborate in providing data
and access to data to allow learning analytics to serve their learning and development" (p. 1529).
Consequently, I actively searched for a Learning Analytics (LA) tool that would provide this
affordance. Another factor that led me to select SEAtS Software was that the company
highlighted how multiple stakeholders (students, faculty, and administration) could use and benefit from
the software, and it also boasted several reputable customers across Canada, the United States, and the United
Kingdom, including Athabasca University. There was also a lack of online reviews from current users
of this software. For these reasons, I decided to evaluate the learning
analytics of SEAtS Software.
My learning analytics evaluation framework is modelled on the frameworks of Cooper (2012)
and Scheffel et al. (2014). It was developed for K-12 educators and administrators to
support their search for, and evaluation of, a learning analytics tool. Cooper's framework stepped
away from "soft factors", subjective elements such as cultural context that featured in
the original framework by Greller & Drachsler; the rationale was to focus on hard factors first
and debate the emerging soft factors later. Scheffel et al. (2014) categorized outlier statements
into a cluster called Acceptance and Uptake because of their inability to fit clearly into any other
cluster, which led to an ad hoc criterion called organizational aspects featuring some of these
soft factors. My belief is that the soft factors should not be delayed or ignored but must instead be
intentionally woven into the framework: although they do not directly contribute to the
selection and analysis of a learning analytics tool, they must be in place for the search to be
initiated successfully. This resulted in the creation of a checklist to be used before
commencing the evaluation of a learning analytics tool with the reworked framework.
Before commencing the evaluation of learning analytics tools, a high school
should first identify which LA software packages are available and what they cost, to determine whether the school
budget could cover implementation should they be selected. Next, the school's infrastructure would need to
be analyzed, specifically the school network. Most schools operate on Local Area Networks (LANs),
so connectivity should be inspected first. There would then need to be a check of which devices are
available to download or access the selected learning analytics software or browser-based program.
Access and compatibility for all intended users would need to be assessed.
Before searching for and implementing LA software, acceptance from a variety of stakeholders
should be sought. Scanning the educational culture to ensure that a growth mindset is present
would afford mass organizational change. Moreover, stakeholders would need to be convinced of the
benefits of LA in order to be motivated to use it once implemented.
Lastly, a check that LA could be implemented in the high school would be necessary. Stakeholders
would need to be provided both time and training on how to use the new LA tool. To achieve this
transition, a team or individuals would be needed to provide IT support for network, software, and device
issues, as well as a team or individuals to provide LA support on how to use the new software.
Once the checklist is complete, the school would be ready to commence its search for an LA
tool. Once that search has begun, a framework is needed to help evaluate and
differentiate the available LA tools. Below is the proposed framework.
This framework is a condensed version of the Cooper framework, which provides the context for
evaluation. The key difference is that this framework includes the Scheffel et al. (2014) quality
indicators of both importance and feasibility. It is proposed that after the Analysis and Data Origin of an
LA tool have been evaluated, the tool also receive a rating on the three identified quality indicators, on a
scale of one to three, one being weak and three being strong.
• Transparency - refers both to what information is provided to all stakeholders and to how
accessible that information is (i.e., its presentation).
• Data Ownership - refers to who has access to, and control over, how data is created,
recorded, collected, and monitored. It would also look at any potential data outsourcing
or transfers to third parties.
• Privacy and Ethics - refers to what kinds of data are collected and where they are stored
(for example, Canadian versus American data centers), and whether the tool complies with
the Freedom of Information and Protection of Privacy Act (FIPPA); for example, what kinds
of de-identification tools are available, and whether stakeholders have the ability to
provide consent or opt out.
After the Orientation & Objectives and the Technical Approach of an LA tool have been evaluated, the
following would be rated on the same three-point scale:
• Actionable Insights - the computational methods used to identify at-risk students, and both the
computer- and human-generated recommendations that result from these detections.
• Student Agency - refers to the motivation of students to use the LA tool for their own regulation
and accountability. (Students are loosely classified as Analysis Clients because they can be part
of the alert Analysis system and consequently could adjust their own learning without further
external intervention.)
Ratings (scale of 1-3):
• Transparency: 1 2 3
• Data Ownership: 1 2 3
• Privacy & Ethics: 1 2 3
Orientation & Objectives: EWS alerts are sent via e-mail and SMS to both academic and
administrative staff about populations of students who may be 'at risk', such as those
displaying disengagement with the course. EWS alerts can also be expanded to include the
'at risk' student as well.
Technical Approach: Traditional. A descriptive model that uses rules-based analytics to generate
data sets. Data is harvested from existing SISs, VLEs, and LMSs. Tolerance levels can be set by
individual institutions for different educational contexts, and breaches of tolerance can create
and assign calls to action for both staff and students.
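The rules-based, tolerance-breach approach described above can be illustrated with a short sketch. This is a hypothetical example only; the field names, tolerance values, and alert format are invented for illustration and are not SEAtS Software's actual implementation:

```python
# Hypothetical sketch of a rules-based early-warning check: each student
# record is compared against institution-set tolerances, and any breach
# produces an alert. Fields and thresholds are illustrative assumptions.

def flag_at_risk(students, attendance_tolerance=0.75, missed_submission_tolerance=2):
    """Return alerts for students who breach an institution-set tolerance."""
    alerts = []
    for s in students:
        reasons = []
        if s["attendance_rate"] < attendance_tolerance:
            reasons.append("low attendance")
        if s["missed_submissions"] > missed_submission_tolerance:
            reasons.append("missed submissions")
        if reasons:
            # A real system would e-mail/SMS staff (and optionally the student).
            alerts.append({"student": s["name"], "reasons": reasons})
    return alerts

roster = [
    {"name": "A", "attendance_rate": 0.60, "missed_submissions": 3},
    {"name": "B", "attendance_rate": 0.95, "missed_submissions": 0},
]
print(flag_at_risk(roster))
```

The design choice worth noting is that the rules are transparent and institution-configurable, which is what distinguishes this descriptive, rules-based style from opaque predictive models.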
Ratings (scale of 1-3):
• Actionable Insights: 1 2 3
• Student Agency: 1 2 3
• Transformative: 1 2 3
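The ratings above can be combined into a simple per-tool total for side-by-side comparison of candidates. The sketch below is a hypothetical illustration; the function, tool, and scores are invented and not part of the framework itself:

```python
# Minimal sketch of totalling the framework's one-to-three quality-indicator
# ratings to compare candidate LA tools. Scores below are illustrative only.

def total_rating(ratings):
    """Sum a tool's quality-indicator ratings, validating the 1-3 scale."""
    for indicator, score in ratings.items():
        if not 1 <= score <= 3:
            raise ValueError(f"{indicator}: rating must be 1-3, got {score}")
    return sum(ratings.values())

# Hypothetical scores for one candidate tool on a subset of the indicators.
tool_a = {
    "Transparency": 2,
    "Data Ownership": 3,
    "Privacy & Ethics": 3,
    "Actionable Insights": 1,
    "Student Agency": 2,
}
print(total_rating(tool_a))  # prints 11
```

A simple sum treats every indicator as equally important; an evaluator who weighs, say, Privacy & Ethics more heavily could multiply individual scores by weights before summing.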
…suggests which data they will need to collect and store in their data
repositories.
This company makes bold claims but provides few specifics on how its
software works. It repeatedly highlights the different types of
educational data it utilizes and how its LA could support increased
student performance, but leaves several key questions unanswered.
RECOMMENDATION
I would not recommend this tool for myself or my colleagues because there are too many
unanswered questions. My key hesitation is the lack of grounding in learning theory, as well as
not knowing which predictive-analytics methods are being used (simple? linear? logistic? etc.).
The SEAtS Software platform is promising and has an appealing UI/UX design, and it could
transform current teaching and administrative practices should its claims prove sound. I would
recommend further investigation, and perhaps a trial of the software, before purchasing.
REFERENCES
Cooper, A. (2012). A framework of characteristics for analytics. CETIS Analytics Series, 1(7).
Ifenthaler, D. (2016). Recorded for the MOOC Analytics for the Classroom Teacher offered by
SEAtS Software | Student Success Platform. (n.d.). Retrieved February 22, 2019, from
https://www.seatssoftware.com/
Scheffel, M., Drachsler, H., Stoyanov, S., & Specht, M. (2014). Quality indicators for learning
analytics. Educational Technology & Society, 17(4), 117-132.
Slade, S., & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American
Behavioral Scientist, 57(10), 1510-1529.