The ATLAS tool is designed to support student-centred teaching by allowing learners to (anonymously) communicate trouble understanding or following a lecture via mouse click, touch, or keypress. The information from all students is then aggregated and provided to the teacher using an ambient display: brightening or dimming the teacher's screen or physical environment. This allows the teacher to get a global sense of the level of understanding in the room without having to piece together individual bits of information.

From Problem to Solution

When the COVID-19 pandemic hit the world, many students and lecturers alike struggled with the new situation. Digital teaching and learning formats arose to compensate for some of the problems, but challenges remained. Strikingly, even formats that do not normally rely on much student-teacher (and student-student) interaction, like synchronous online lectures, proved much less effective.

As psychologists working in (digital) learning and instruction, we quickly understood why we struggled to teach as effectively as before: we were not aware of our students' needs and could not adapt our communication. Lectures are essentially a form of communication and, like all communication, they go two ways. Even monologues are addressed to an audience, and even books are written with an audience in mind. Thus, when lectures went online, the usual feedback channels we used to tailor teaching to our students' needs were cut off, making it harder to tell if and what our students understood. We lecturers are used to taking in the room at a glance, getting a "feel" for our students' level of understanding. Not each and every one of them, but the class as a whole. Not fail-proof, not perfect, but mostly sufficient (and actually not a "feeling" at all, but a cognitive process that integrates bits of information from the environment with the knowledge we have about teaching situations into something we call "group awareness" – a salient perception of relevant aspects of the social situation).

Thus, when this feedback system broke down, we compensated. We asked time and time again: "Is that clear?", "Did you understand this?", "Are you keeping up?", or even "Is this going too fast? Am I repeating myself for no reason?". This was exhausting and largely fruitless, as in larger lectures no student ever turned on their mic and said, "Well, no, thanks for asking, dear teacher, but I really can't keep up. Could you slow down a notch? Lovely, thanks!". Talking into a void is frustrating, and exhausting. Is anyone listening? Have I lost them? Am I talking too fast? Or are they rolling their eyes at me because I explain everything to them as if they were stupid? Are they even at their computer right now?

At this point, the Hochschulforum Digitalisierung promoted the online hackathon #Semesterhack, and we proposed a "challenge" to work on a solution to this problem. By then a group of three learning scientists with a mixed background in psychology and bits of computer science, we stumbled upon another group of researchers, lecturers and students with a similar challenge (or rather, they stumbled upon us): with "Bored or Excited?", they also targeted the feedback channels during online lectures. Together, in this very heterogeneous, interdisciplinary team, we discussed, thought, discussed, designed, discussed, and coded, and in the end we came up with an idea: using gestures and head movements to assess relevant student states and feeding them back to the teacher via a teacher dashboard. While recognising students' facial expressions or even just head movements is hard to translate into practice for various reasons (*cough* privacy *cough*), this fun project (Nick-o-Meter and Shaker) received the award for the most creative solution proposed at #Semesterhack. (ConceptPaper.pdf, Interview ARD-alpha)

So, we did have a good time and worked on solution ideas on both the teacher and learner sides of things, but we did not yet have a working solution. When #Semesterhack 2.0 came about, we decided to give it another go. With a highly motivated group of university students, we set up another challenge, this time focusing more on a concrete solution idea. And we did! Complemented by a lecturer from Bratislava, our student team and we (alias: Iapetus design crew) came up with a doable solution. The concept uses intentional keypresses or touch events from individual students to gather data about their current comprehension-related problems, aggregates this information, and feeds it back to the teacher using an intuitive (and persuasive) design that lights up or dims the teacher's physical environment. The concept thus addresses requirements on both the student side (accessible, avoiding distraction from the lecture as much as possible while still giving learners ultimate control over the information they provide) and the teacher side (receiving information in a way that supports processing without distracting from teaching). ATLAS – Ambient Teacher-Student Awareness Solution was born! Less high-tech, but realisable. (Submission to #SemesterHack 2.0 - 2020)
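The core loop described above – collect anonymous trouble signals, aggregate them over a recent time window, map the result to an ambient brightness level – can be sketched in a few lines of Python. This is a minimal illustration, not the ATLAS implementation: the 60-second window and the linear mapping from "share of students in trouble" to brightness are assumptions made for the example.

```python
import time
from dataclasses import dataclass, field

@dataclass
class SignalAggregator:
    """Aggregates anonymous 'I'm lost' signals into one ambient brightness value.

    Sketch only: window_s and the linear brightness mapping are illustrative
    assumptions, not the actual ATLAS design parameters.
    """
    window_s: float = 60.0                      # only count recent signals
    events: list = field(default_factory=list)  # timestamps of trouble signals

    def signal(self, timestamp: float) -> None:
        """Record one anonymous keypress/touch trouble signal."""
        self.events.append(timestamp)

    def trouble_ratio(self, now: float, class_size: int) -> float:
        """Fraction of the class that signalled trouble within the window."""
        recent = [t for t in self.events if now - t <= self.window_s]
        return min(len(recent) / max(class_size, 1), 1.0)

    def brightness(self, now: float, class_size: int) -> int:
        """Map trouble ratio to a 0-100 brightness (more trouble = dimmer)."""
        return round(100 * (1 - self.trouble_ratio(now, class_size)))
```

With a class of 30 and six signals in the last minute, the environment would dim to 80% brightness; as signals age out of the window, it brightens again without the teacher having to count anything.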

The German Academic Exchange Service (DAAD) thought so, too: they declared our idea an "outstanding project idea", and we received funding from the German Federal Ministry of Education and Research (BMBF) for a short four-month project to implement and test our solution. With two highly motivated junior research associates on the payroll (one from psychology, one from computer science), complemented by no less motivated research assistants, we made a plan and we made it work! While some goals aimed a little too high (lecturers simply do not have time to take part in psychological experiments on a large scale), others were overachieved, and we now have a functioning basic web-based solution and a Python-based integration with SmartLights.
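For the SmartLights side, a Python integration could look roughly like the following sketch. It assumes a Hue-style REST bridge; the bridge address, the endpoint path, and the light ID are hypothetical placeholders (the `bri` range of 1-254 follows the Philips Hue convention), so treat this as an illustration of the idea rather than the project's actual code.

```python
import json
from urllib import request

BRIDGE_URL = "http://192.168.0.2/api/atlas"  # hypothetical bridge address

def to_hue_bri(percent: int) -> int:
    """Convert a 0-100 % brightness to the Hue 'bri' range (1-254)."""
    return max(1, round(percent / 100 * 254))

def set_light(light_id: int, percent: int) -> None:
    """Send a brightness update to one SmartLight via a Hue-style REST call."""
    body = json.dumps({"on": True, "bri": to_hue_bri(percent)}).encode()
    req = request.Request(
        f"{BRIDGE_URL}/lights/{light_id}/state", data=body, method="PUT"
    )
    request.urlopen(req)  # fire-and-forget; real code would handle errors
```

In use, the aggregated comprehension score from the web tool would periodically be pushed through `set_light`, so the lamp in the lecturer's room dims as more students signal trouble.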

So, if you want to try out the tool in your classes, adjust the code, implement new features, add another SmartDevice, give us feedback, evaluate the tool, or just want a chat: contact us!

Lenka Schnaubert & Jule Krüger


Krüger, J. M., Vogel, F., & Schnaubert, L. (2020). Synchronous online lectures in emergency remote teaching: the role of immersion, social scripts and group awareness. In H. J. So, M. M. Rodrigo, J. Mason, & A. Mitrovic (Eds.), Proceedings of the 28th International Conference on Computers in Education, Volume 2 (pp. 29-37). Asia-Pacific Society for Computers in Education. https://apsce.net/upfile/icce2020/ICCE2020-Proceedings-Vol2-FinalUpdated.pdf