An innovative method to evaluate usability and learner experience

Alice Gasparini

Abstract

The contribution presented here offers a reflection on digital learning environments from the point of view of usability and learner experience in a context of L2 acquisition. It also presents an innovative application of non-pedagogical tracking software used to evaluate students’ behaviour within linguistic learning environments. The study is part of a PhD project carried out at the University G.D’Annunzio of Chieti and Pescara and the University for Foreigners of Siena.

Keywords: Learning Environment, Usability, Learner Experience, Learning Analytics, Italian as L2

Introduction

The 2020 pandemic brought distance education to everybody’s attention and raised multiple questions. It made clear the need for teachers and students to adapt their usual ways of teaching and learning to a distance context and to become familiar with new tools. This contribution, based on a PhD project carried out at the University G.D’Annunzio of Chieti and Pescara and the University for Foreigners of Siena, focuses on digital learning environments and tries to answer this main research question: how can good usability enhance and promote a better learner experience?

A well-built learning environment, like a comfortable classroom, helps to create the best conditions to learn because it contributes to reducing the frustration, fears, and anxieties caused by a lack of familiarity with the technological system. Frustration and anxiety, as Krashen (1983) maintained, can raise the affective filter and be an obstacle to real acquisition. How can a teacher or designer monitor their digital classroom and get the best out of it? The study reports an application of non-pedagogical tracking software used to evaluate the usability and learner experience of linguistic digital spaces. First, we will outline the basic concepts on which the study rests: usability, learner experience and Learning Analytics.

Usability

The idea of usability has a long history, dating back to Henry Ford, who at the beginning of the last century started to reflect on the relationship between machines and the workers in his factories. His objective was to increase industrial production, and he understood that machines needed to be “human-tailored” to make their use easier and to produce more. This is considered the beginning of Ergonomics, the discipline that arranges workplaces around the workers who use them. The design of professional spaces and objects became user-centric, and this approach was later applied to all kinds of products made for people. The main idea was to design and create machines or objects for their potential users.

This principle was then applied to personal computers and websites in the Nineties. Jakob Nielsen[1], who is considered “usability’s guru”, describes it in terms of whether “a system is easy to learn, efficient to use, pleasant” (Nielsen, 1994:26). Usability evaluates whether the system is easy to learn to use the first time, whether its interface can easily be remembered after a period of non-use, how effectively it supports users in carrying out their tasks, and whether it is pleasant and satisfying to use.

The concept of usability described here is used as a benchmark in the field. Nevertheless, its elaboration refers to standard websites with commercial and informative objectives; it does not cover e-learning systems. Further studies on the topic (Quinn 1996; Preece 2001; Reeves 2002; Zaharias 2004, 2009) pointed out the need to consider the pedagogical dimension that Nielsen’s definition does not encompass. The pedagogical purpose of these environments must be taken into consideration because it influences the development and structure of the contents. In order to design an effective learning space, both the technical and the pedagogical aspects need to be considered. The concept of pedagogical usability was therefore formulated; it describes whether the system contributes to the learning process (Nokelainen, 2006:180). The technical aspect is generally the ground for pedagogical effectiveness, and both determine the learner experience.

User Experience and Learner Experience

User Experience is a more recent notion. The term was coined by Donald Norman[2], an American engineer and psychologist who worked for Apple. He describes it as “everything that touches the experience with the product” (Norman, 2011). User Experience (henceforth UX) has a broader perspective than usability since it includes both emotional and technical features; usability is now considered an attribute of UX.

The concept of learner experience is modelled on that of UX. As stated by Norman, it includes everything connected to the learning experience: it begins before the first access to the platform and lasts beyond the end of the course. User and learner experience integrate subjective and objective features; their evaluation requires a combination of quantitative and qualitative data. The data collection methods usually include monitoring software to track users’ behaviour within the systems, observations of users’ performance, interviews with them, and questionnaires. Each method focuses on a specific aspect and helps to build the most complete picture possible.

Web Analytics and Learning Analytics

The aforementioned monitoring software collects analytics data, best known as Web Analytics, described as “the measurement, collection, analysis, and reporting of internet data for understanding and optimizing Web usage” (Järvinen & Karjaluoto, 2015:118). When such analytics refer to actions taken within learning environments, they are known as Learning Analytics (henceforth, LA). According to Long and Siemens (2011), LA is defined as “the measurement, collection, analysis, and reporting of data about learners in their context, for purposes of understanding and optimizing learning and the environments in which it occurs” (p. 34). These data allow teachers to monitor learners within the environment, support the learning process, and help students to regulate their own learning. The analytics offered by the software allow observing how learners move within the system, which paths and contents they prefer, and whether they encounter problems or difficulties. This information can be used to design a real data-driven improvement of digital environments. LA are not used for mere statistics or description but to generate changes in the pedagogical design of contents and learning spaces (Ferguson, 2012; De Waal, 2017; Kukulska-Hulme, 2020).
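To make the idea concrete, the following minimal Python sketch turns raw page-view events, as they might be exported from a tracking tool, into simple per-learner engagement indicators. The export file and its column names (visitor_id, page, seconds) are hypothetical and not part of the study.

```python
# Minimal Learning-Analytics sketch: aggregate raw page-view events into
# per-learner indicators. The export file and its columns are hypothetical.
import csv
from collections import defaultdict

pages_opened = defaultdict(int)    # number of content pages opened per visitor
time_spent = defaultdict(float)    # total seconds spent per visitor

with open("pageviews_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        visitor = row["visitor_id"]            # anonymised id, no personal data
        pages_opened[visitor] += 1
        time_spent[visitor] += float(row["seconds"])

# A first descriptive picture of engagement: how much of the environment each
# learner explored and how long they stayed on the contents.
for visitor in pages_opened:
    print(visitor, pages_opened[visitor], round(time_spent[visitor] / 60, 1), "min")
```

Indicators of this kind only describe behaviour; as discussed below, they still need observations, interviews, and questionnaires in order to be interpreted.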

The study

This work intends to follow this recent direction of the LA sector, analysing the usability of learning environments in a linguistic context to improve the learner experience. In order to achieve this objective, the study included in the PhD project mentioned in the introduction proposes the application of tracking software to evaluate usability and UX.

The research, resting on the premises just described, involves the comparison of two different learning systems from a usability point of view: a CMS (Content Management System), WordPress, and an LMS (Learning Management System), Moodle. They represent the two most popular types of learning environments and embody two different conceptions of learning.

CMSs were inspired by connectivism and Web 2.0: learning is a social act and the connections among individuals create knowledge and learning (Siemens, 2010; Downes, 2005). Therefore, they were developed to give everyone the chance to create digital resources. They were not originally designed to be learning environments, but their flexibility allows the designers to build a personalized environment. CMSs are generally expressions of informal learning.

LMSs are learning platforms that contain pedagogical events. They are generally described as “closed” systems because learners need authentication. They follow constructivist postulates and are usually associated with formal learning; as a matter of fact, they are widely used by universities, schools, and companies for distance education.

The same course of Italian as L2 was hosted within the two digital spaces. It was created for self-study, at an A2-B1 level of Italian. The linguistic resources were organized into five main units, each focusing on the areas necessary for the development of communicative competence: communication, lexis, grammar, and culture. The resources were developed for university students and for adults wishing to come to Italy to work.

The innovative feature of the study is the application of non-pedagogical software to a learning system. Matomo, the program chosen, is used to track students’ behaviour, with a focus on evaluating usability and UX. It pays strong attention to users’ privacy: all data are anonymous and are collected and treated in compliance with the GDPR. It allows observing learner behaviour within the learning environments from an external point of view, giving a clear “picture” of the interaction between students and resources. The analytics are often arranged in infographics which convey a visual and immediate understanding.
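By way of illustration (this is a sketch, not the study’s published setup), reports can also be pulled programmatically from a Matomo instance through its HTTP Reporting API. The instance URL, site id and authentication token below are placeholders, while VisitsSummary.get and Actions.getPageUrls are standard Matomo report methods.

```python
# Sketch: querying a Matomo instance for usage reports via the HTTP Reporting API.
# The URL, site id and token are placeholders to be replaced with real values.
import requests

MATOMO_URL = "https://analytics.example.org/index.php"   # placeholder instance
BASE_PARAMS = {
    "module": "API",
    "idSite": 1,                 # id of the tracked learning environment
    "period": "month",
    "date": "today",
    "format": "JSON",
    "token_auth": "REPLACE_ME",  # personal auth token, kept out of public code
}

def get_report(method, **extra):
    """Fetch one Matomo report, e.g. the visit summary or per-page statistics."""
    response = requests.get(MATOMO_URL, params={**BASE_PARAMS, "method": method, **extra})
    response.raise_for_status()
    return response.json()

# Overall engagement of the environment (visits, time on site, bounce rate, ...).
summary = get_report("VisitsSummary.get")
print(summary)

# Per-page statistics: which units and activities learners actually open.
pages = get_report("Actions.getPageUrls", flat=1)
for page in sorted(pages, key=lambda p: p.get("nb_hits", 0), reverse=True)[:10]:
    print(page.get("label"), page.get("nb_hits"), page.get("avg_time_on_page"))
```

The same reports are available as visual dashboards within Matomo itself, which is how they are most commonly read in practice.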

Learning environments usually include their own monitoring systems to track students. Moodle, for example, has a very complex set of functions that provide very precise tracking; nevertheless, it does not offer features specifically developed to evaluate usability or learner experience (Fenu et al., 2017). The main point of employing such software in a learning environment is to know what students do within the system and to be aware of their preferred paths and most visited contents.

As mentioned before, collecting analytics data is not enough to evaluate usability and learner experience. The software offers a photograph of users’ flows and interactions, but it does not explain the reasons behind their choices and preferences. Therefore, the study also included observations of performance, interviews, and profiling and evaluation questionnaires. Observations and interviews were carried out live with foreign students who studied Italian as L2 in the courses offered by the University G.D’Annunzio. In addition, the two systems were tested online by students learning Italian language and culture abroad, especially in the United States, Germany, France, and Poland.

The data gathered by the different tools were analysed following a thematic qualitative approach aimed at finding patterns and recurrent behaviours of the users (Braun & Clarke, 2006). First, the data coming from each system were examined separately and afterwards compared. Thanks to the data analysis, it was possible to describe the profile of the students interested in studying Italian as L2 and to determine the engagement generated by the two systems, as outlined by the students’ interaction with the overall organization of the environments and with the individual pages and contents. The data also helped to identify how the learners explored the two learning environments and which criteria they followed to satisfy their linguistic and communicative needs. From a pedagogical point of view, the statistics showed which units, linguistic contents, and activities were most viewed and visited, giving a clear indication of what the learners found most engaging, interesting, and close to their linguistic needs.
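As a purely illustrative sketch of this kind of cross-system comparison — the codes and excerpts below are hypothetical and do not reproduce the study’s actual coding scheme — tallying manually assigned thematic codes per system is one simple way to surface the recurrent behaviours the analysis looks for.

```python
# Illustrative only: tally researcher-assigned thematic codes per system to
# compare recurrent behaviours. Codes and excerpts are hypothetical examples.
from collections import Counter

coded_excerpts = [            # (system, code assigned during thematic analysis)
    ("WordPress", "free exploration"),
    ("WordPress", "navigation by topic of interest"),
    ("Moodle", "linear path through units"),
    ("Moodle", "difficulty locating activities"),
    ("Moodle", "linear path through units"),
]

by_system = {}
for system, code in coded_excerpts:
    by_system.setdefault(system, Counter())[code] += 1

for system, counts in by_system.items():
    print(system, counts.most_common(3))
```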

The overall results gave relevant information about usability and learner experience, and they can be used to adapt the learning environments to real students’ needs. Good usability prevents students from wasting valuable cognitive energy, which should be employed to learn the language rather than to work out the functioning of the system. In addition, the data offered useful suggestions about students’ approach to a linguistic platform for individual and self-regulated study.

Thanks to these data, it was also possible to identify subjective and personal perceptions of the general pedagogical usability and the linguistic pedagogical usability of the systems.

Conclusion

Usability and user experience are fundamental concepts to consider in order to build learner-centric digital environments. Technology is only a tool, and teachers and instructional designers always need to focus on students if they wish to keep them motivated and interested. In distance education, educators often have fewer chances to observe learners’ behaviour and feedback than in the classroom. The study presented here offers a possible way to gather more data about the overall experience of students in digital learning environments and to make data-driven decisions.

References:

Braun, V., Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3 (2). 77-101.    

De Waal, P. (2017). Learning Analytics: i sistemi dinamici di supporto alla decisione per il miglioramento continuo dei processi di insegnamento e apprendimento. Formazione & Insegnamento XV, 2, 2017. Lecce, Pensa Multimedia Editore.

Downes, S. (2010). Connectivism and Connective Knowledge. National Research Council Canada. Retrieved February 14th, 2021, from https://www.downes.ca/files/books/Connective_Knowledge-19May2012.pdf

Fenu, G., Marras, M., Meles, M. (2017). A Learning Analytics Tool for Usability Assessment in Moodle Environments. Journal of e-Learning and Knowledge Society, v.13, n.3, 23-34.

Ferguson, R. (2012). Learning analytics: drivers, developments and challenges. International Journal of Technology Enhanced Learning, 4(5/6), 304–317.

Ferguson, R., Hoel, T., Scheffel, M., Drachsler, H. (2016). Guest editorial: Ethics and privacy in learning analytics. Journal of Learning Analytics, 3 (1), 5–15.

Järvinen, J., & Karjaluoto, H. (2015). The use of Web analytics for digital marketing performance measurement. Industrial Marketing Management, 50, 117–127. Retrieved on February 14th, 2021, from https://doi.org/10.1016/J.INDMARMAN.2015.04.009

Krashen, S. D. (1983). Principles and Practice in Second Language Acquisition. Oxford, Pergamon.

Kukulska-Hulme A., Shield L. (2004). Usability and Pedagogical Design: are Language Learning Websites Special?. ED-MEDIA 2004, Lugano, Switzerland. 4235-4242.

Lim, C.J., & Lee, S. (2007), Pedagogical Usability Checklist for ESL/EFL E-learning Websites, Journal of Convergence Information Technology, 2(3), 67- 76.

Long, P., Siemens G. (2011). Penetrating the Fog: Analytics in Learning and Education. Educause Review, 46 (5), 31-40.

Mishra, P., Koehler, M. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge, Teachers College Record, 108, 1017–1054.

Muir, A., Shield, L., Kukulska-Hulme, A. (2003). Report on the Usability Roadshow to Region 06, Internal OU Report.

Nielsen, J. (1994). Usability Engineering. New York, Morgan Kaufmann.

Norman D. (2001). The definition of User Experience. Retrieved on 11th March 2021 from: https://www.nngroup.com/articles/definition-user-experience/.

Nokelainen, P. (2006). An empirical assessment of pedagogical usability criteria for digital learning material with elementary school students. Educational Technology & Society, 9 (2), 178-197.

Pardo A., Ellis R., Calvo R. (2015). Combining observational and experiential data to inform the redesign of learning activities. Proceedings of the fifth international conference on learning analytics and knowledge—LAK ’15, 305–309.

Quinn, C.N. (1996). Pragmatic evaluation: lessons from usability. Proceedings of 13th Annual Conference of the Australasian Society for Computers in Learning in Tertiary Education.

Preece, J. (2001). Online communities: Designing usability and supporting sociability. New York, Wiley.

Reeves, T., Benson, L., Elliott, D., Grant, M., Holschuh, D., Kim, B., Kim, H., Lauber, E. and Loh, S. (2002). Usability and instructional design heuristics for e-learning evaluation. Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications AACE. Charlottesville, pp. 1615-21.

Siemens, G. (2010). Connectivism: Learning as network-creation. ASTD Learning News, 10(1). Chicago.

Zaharias, P. (2004). Usability and e-Learning: The road towards integration. ACM eLearn Magazine, Vol.2004 (6).

Zaharias, P., Poylymenakou, A. (2009). Developing a Usability Evaluation Method for e-Learning Applications: Beyond Functional Usability. International Journal of Human-Computer Interaction, 25(1), 75-98.

About the author

Alice Gasparini is a PhD candidate at the University G.D’Annunzio, Pescara, Italy, and a tutor of Italian as L2. She has taught Italian as L2 in Italy (at the University for Foreigners of Siena and at the University G.D’Annunzio), in Spain, and in Russia. Her research interests are Second Language Acquisition and technology, motivation and effectiveness of Technology-Enhanced Language Learning, and self-regulated study in digital environments.


[1] Jakob Nielsen is a Danish engineer, famous for his studies on usability. Together with Donald A. Norman he co-founded the Nielsen Norman Group, an American user interface and user experience consulting firm. He has invented several usability methods, including heuristic evaluation. For more information: https://www.nngroup.com/people/jakob-nielsen/

[2] Donald Norman is an American psychologist, engineer, researcher, professor, and author. Much of his work involves the advocacy of user-centered design and user experience. He is the co-founder of the Nielsen Norman Group. For more information: https://www.nngroup.com/people/don-norman/.

