Video: How does going digital help with the ability to monitor learning?
THE ABILITY TO MONITOR LEARNING
- Digital methods of assessing learning can increase student motivation and commitment: activities are often engaging and interactive, and they offer something new and entertaining.
- Learners can track their learning performance, for example on a dashboard or in an app, and often receive immediate and even personalized feedback on their progress, empowering them to take control of their own learning.
- Digital methods allow teachers and course designers to observe and measure students' learning behavior, so that they can provide interventions for especially challenging activities and tailor learning pathways to individual needs, offering the potential for deeper and more effective learning.
- There is evidence that the use of learning analytics leads to better learning outcomes.
- Feedback can be provided in writing, verbally, or through pictures and video, exploiting the multimodality that digital means make readily possible.
- Technologies are emerging that enable automated feedback on aspects of learner linguistic output that have traditionally required a great deal of teachers' time.
- Social online networks can provide an infrastructure in which peer review becomes a compelling and integral part of collaboration.
- Through digital opportunities, students discover ways of monitoring their learning that they can use outside formal classes, standing them in good stead to continue learning in real-world situations throughout their lives.
- Learning analytics tools allow us to collect data about students in order to explore learning patterns and gauge the effectiveness of interactions. This raises ethical concerns, such as privacy issues and the risk of bias in the data.
- Teachers and curriculum designers may lack the competence to interpret the data collected about student activity, risking false conclusions and unhelpful interventions.
- Measurements of learner behavior typically provide information about students without any deeper understanding of their individual backgrounds and circumstances. For this reason, learning metrics can encourage teachers to plan interventions without regard for an individual’s current situation or preferred approach to learning. 
- On the other hand, when monitoring of a student's learning behavior does enable personalized learning pathways, there is a risk that students are steered toward a narrow range of resources and activities apparently suited to their needs, at the expense of broad exposure to the diverse materials and activities available to the whole group.
- Approaches which track students' digital activity tend towards behaviorist interpretations, excluding deeper understanding of what is going on in the students' minds.
- When students' activity and progress are being tracked, there is a risk that they are distracted from genuine learning, focusing instead on dashboard metrics such as the score achieved or the number of activities completed.
- Language teachers are rarely able to master learning analytics by themselves: experts from a variety of disciplines are required, and this is expensive for institutions of higher education.
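The "intervention trigger" idea described above can be sketched in a few lines of code. The following Python is purely illustrative: the activity-log format, the score scale, and the threshold are invented for this example and are not taken from any system cited here.

```python
from statistics import mean

# Hypothetical activity log: one record per completed activity attempt,
# with scores normalized to the range 0.0-1.0 (an assumed convention).
activity_log = [
    {"student": "s1", "activity": "quiz1", "score": 0.9},
    {"student": "s1", "activity": "quiz2", "score": 0.4},
    {"student": "s2", "activity": "quiz1", "score": 0.3},
]

def flag_for_intervention(log, threshold=0.5):
    """Group scores by student and flag those whose mean score falls
    below the threshold -- the simplest kind of analytics-driven
    intervention trigger."""
    by_student = {}
    for record in log:
        by_student.setdefault(record["student"], []).append(record["score"])
    return sorted(student for student, scores in by_student.items()
                  if mean(scores) < threshold)

print(flag_for_intervention(activity_log))  # → ['s2']
```

Real learning-analytics platforms work with far richer signals (time on task, clickstreams, forum activity), but the pattern is the same: aggregate per-student behavior, then compare it against a rule to decide who may need support.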
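The automated feedback on learner linguistic output mentioned above can likewise be illustrated with a toy rule-based checker. Real automated writing evaluation systems rely on NLP models; the two regex rules here are invented stand-ins meant only to show the shape of such feedback.

```python
import re

# Toy feedback rules -- drastically simplified stand-ins for automated
# writing evaluation. Each rule pairs a pattern with a feedback message.
RULES = [
    (re.compile(r"\b(\w+)\s+\1\b", re.IGNORECASE), "repeated word"),
    (re.compile(r"\ba\s+(?=[aeiouAEIOU])"), "use 'an' before a vowel sound"),
]

def feedback(text):
    """Return (matched_text, message) pairs for each rule violation found."""
    hits = []
    for pattern, message in RULES:
        for match in pattern.finditer(text):
            hits.append((match.group(0).strip(), message))
    return hits

print(feedback("She wrote a essay about the the internet."))
```

Even this crude version shows why such tools save teachers' time: the feedback is instant, consistent, and available to every student at once, although (as the bullets above caution) it captures surface patterns rather than what is going on in the learner's mind.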
 M. Kuuskorpi, T. Kuuskorpi, K. Sipilä, J. Heikkinen, and R. Tamminen, "Oppimismotivaation muutokset opetustila- ja oppimateriaaliuudistusten yhteydessä" [Changes in learning motivation in connection with reforms of teaching facilities and learning materials], in Digitaalinen oppiminen ja oppimisympäristöt [Digital Learning and Learning Environments], M. Kuuskorpi, Ed. Tampere: Juvenes Print – Suomen Yliopistopaino Oy, 2015, pp. 107-108.
 C.A. Chapelle and S. Sauro, Eds., The Handbook of Technology and Second Language Teaching and Learning. Hoboken, NJ: Wiley, 2017.
 N. Sclater, A. Peasgood, and J. Mullan, Quick Guide – Learning Analytics in Higher Education. JISC, Apr. 2016. [Online]. Available: https://www.jisc.ac.uk/reports/learning-analytics-in-higher-education
 D. Clow, "An overview of learning analytics," Teaching in Higher Education, vol. 18, no. 6, pp. 683-695, Aug. 2013. doi: 10.1080/13562517.2013.827653
 M.D. Pistilli, "Learner Analytics and Student Success Interventions," New Directions for Higher Education, vol. 179, Sept. 2017. doi: 10.1002/he.20242
 G. Kessler and P. Hubbard, "Language Teacher Education and Technology," in The Handbook of Technology and Second Language Teaching and Learning, C.A. Chapelle and S. Sauro, Eds. Hoboken, NJ: Wiley, 2017, pp. 278-292.
 E. Cotos, "Potential of Automated Writing Evaluation Feedback," CALICO Journal, vol. 28, no. 2, pp. 420-459, Jan. 2011. [Online]. Available: https://search.proquest.com/docview/869637846?accountid=13031
 V. Hegelheimer, A. Dursun, and Z. Li, "Automated Writing Evaluation in Language Teaching: Theory, Development, and Application," CALICO Journal, vol. 33, no. 1, pp. i-v, Jan. 2016. doi: 10.1558/cj.v33i1.29251
 A. Lee and J.R. Glass, "Context-dependent pronunciation error pattern discovery with limited annotations," in Proc. INTERSPEECH 2014, 2014, pp. 2877-2881.
 S.M. Witt, "Automatic error detection in pronunciation training: Where we are and where we need to go," in Proc. IS ADEPT, 2012.
 N. Arnold and T. Paulus, "Using a social networking site for experiential learning: Appropriating, lurking, modeling and community building," Internet and Higher Education, vol. 13, no. 4, pp. 188-196, Dec. 2010. doi: 10.1016/j.iheduc.2010.04.002
 J.A. Johnson, "Ethics and Justice in Learning Analytics," in Learning Analytics in Higher Education: New Directions for Higher Education, no. 179, J. Zilvinskis and V.M.H. Borden, Eds. Malden, MA: Wiley, 2017, pp. 77-86.
 A. Wilson, C. Watson, T.L. Thompson, V. Drew, and S. Doyle, "Learning analytics: challenges and limitations," Teaching in Higher Education, vol. 22, no. 8, pp. 991-1007, May 2017. doi: 10.1080/13562517.2017.1332026
 J. Zilvinskis and V.M.H. Borden, Eds., Learning Analytics in Higher Education: New Directions for Higher Education, no. 179. Malden, MA: Wiley, 2017.