LEAD-ME Winter Training School Lisbon 2023 Programme
Accessible Embodied Interaction
16-17 March 2023
All sessions will take place in Building C6, third floor, Room 6.3.27.
Map and information on how to get to the campus
Day 1
Time (WET, Portugal Time) | Session |
---|---|
09:30 | Opening |
09:45 | Open Minds: Designing for radical inclusion (Gilberto Bernardes) [Watch the video of Designing for radical inclusion] |
11:00 | Coffee break |
11:30 | Mobile eye tracking method and practice (Krzysztof Krejtz) [Watch the video of Mobile eye tracking] |
12:45 | Lunch break |
14:00 | Virtual Environments: Developing accessible presence within an XR environment (Chris Hughes) [Watch the video of Virtual environments] |
15:15 | Embodied Musical Interaction (Luís Aly) [Watch the video of Embodied musical interaction] |
16:30 | Coffee break |
17:00 | Hands-on session |
18:30 | Day 1 Closing |
Day 2
Time (WET, Portugal Time) | Session |
---|---|
09:00 | Opening |
09:15 | Hands-on session |
10:30 | Coffee break |
10:45 | Hands-on session |
12:30 | Lunch break |
13:30 | Hands-on session |
16:15 | Coffee break |
16:30 | Student presentations |
17:30 | Closing |
Speaker Bios
Chris Hughes (Salford University, UK)
Dr Chris Hughes is the interim Director of Computer Science and Engineering at Salford University, UK. His research focuses on developing computer science solutions that promote inclusivity and diversity throughout the broadcast industry, aiming to ensure that broadcast experiences are inclusive across different languages and address the needs of people with hearing loss, low vision, learning difficulties, and older audiences. He was a partner in the H2020 Immersive Accessibility (ImAc) Project.
Previously he worked in the UX group within BBC R&D, where he was responsible for developing the concept of responsive subtitles and demonstrated several methods for automatically recovering and phonetically realigning subtitles. He has a particular interest in accessible services and is currently focused on developing new methods for providing accessibility services within immersive contexts such as Virtual Reality and 360-degree video. https://orcid.org/0000-0002-4468-6660
Gilberto Bernardes (University of Porto, Portugal)
Gilberto Bernardes has a multifaceted career as a musician, professor, and researcher in sound and music computing. He holds a PhD in Digital Media from the University of Porto and a Master of Music, cum laude, from the Amsterdamse Hogeschool voor de Kunsten. His research agenda focuses on sampling-based synthesis techniques, pitch spaces, and the intersection of music technology and well-being, with findings reported in over 70 scientific publications. His artistic activity includes regular performances at renowned venues such as the Asia Culture Center (South Korea), New York University (USA), the Concertgebouw (the Netherlands), and Casa da Música (Portugal). Bernardes is currently an Assistant Professor at the University of Porto and a senior researcher at INESC TEC.
Krzysztof Krejtz (SWPS University of Social Sciences and Humanities, Poland)
Krzysztof Krejtz is a psychologist at SWPS University of Social Sciences and Humanities in Warsaw, Poland, where he leads the Eye Tracking Research Center. In 2017 he was a guest professor at Ulm University in Ulm, Germany. He has given invited talks at, among others, the Max Planck Institute (Germany), the University of Bergen (Norway), and the University of Nebraska-Lincoln (USA). He has extensive experience in social and cognitive psychology research methods and statistics. His research focuses on eye tracking methods and the development of second-order, eye-data-based metrics that capture the dynamics of attention and information processing (transition matrix entropy, the ambient-focal coefficient K), as well as the dynamics of attention in the context of Human-Computer Interaction, multimedia learning, media user experience, and accessibility. He is a member of the Steering Committee of the ACM Symposium on Eye Tracking Research and Applications (ACM ETRA). http://orcid.org/0000-0002-9558-3039
Luís Aly (University of Porto, Portugal)
Luís Aly (Porto, 1975) is a sound designer for theatre and dance. Since 1998 he has specialised in sound design, creating and implementing all sound elements for over 20 productions. As a sound designer he places a strong emphasis on the relationship between the performing arts and digital audio technologies, exploring the diverse ways computer code influences (and is influenced by) the act of performing. As a researcher, he holds a Master's in Multimedia (Interactive Music and Sound Design) (FEUP, Porto, 2016) and is currently a PhD student in the doctoral programme in Digital Media at the University of Porto, where he has been awarded a full doctoral scholarship. Since 2017 he has been an invited assistant professor of audio technologies and sound design in the Technical Short Cycle Degrees in Game Design and Digital Animation, and in Motion Design and Visual Effects, at the School of Media, Arts and Design (ESMAD - IPP).
Abstracts
Embodied Musical Interaction
Embodied interaction means sensing the world through our bodily presence; it represents a promising avenue of research in human-computer interaction, with profound implications for the media arts. Biosignals are a robust and accurate means of designing embodied interactions with the digital realm. This session provides essential knowledge for developing interactive systems that rely on biosignals, covering development boards, sensors, and signal processing techniques, with a focus on appropriating biosignals as control metaphors for designing interactive musical systems. We will give a brief history of the subject and discuss the interaction methods and creative affordances of biosignal-driven interactive musical systems. Finally, we will introduce techniques in MaxMSP for rapidly integrating sensors, machine learning, and interactive audio, to inspire participants to create their own embodied musical interactions.
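To make the "biosignals as control metaphors" idea concrete, here is a minimal sketch of the kind of mapping the session describes, written in Python for readability rather than as a MaxMSP patch. The sensor range, smoothing window, and pitch/velocity mapping are all illustrative assumptions, not the session's actual materials.

```python
# Illustrative sketch (not session material): smooth a raw biosignal stream
# and map it to MIDI-style pitch/velocity control values.
from collections import deque

class BiosignalMapper:
    """Smooths a raw sensor stream and maps it to musical control values."""

    def __init__(self, window=16, lo=0.0, hi=1023.0):
        self.window = deque(maxlen=window)  # moving-average buffer
        self.lo, self.hi = lo, hi           # assumed raw range (e.g. a 10-bit ADC)

    def step(self, raw):
        self.window.append(raw)
        smoothed = sum(self.window) / len(self.window)     # simple low-pass filter
        norm = (smoothed - self.lo) / (self.hi - self.lo)  # normalize to 0..1
        norm = min(max(norm, 0.0), 1.0)
        pitch = 48 + round(norm * 24)      # map to two octaves above C3
        velocity = round(40 + norm * 87)   # stronger signal -> louder note
        return pitch, velocity

mapper = BiosignalMapper()
for sample in [512, 530, 610, 700, 820, 900]:  # fake EMG-like samples
    print(mapper.step(sample))
```

The design choice worth noticing is the smoothing stage: raw biosignals are noisy, so some low-pass filtering is usually needed before the values are musically usable, whatever environment (MaxMSP, Pure Data, or code) the mapping lives in.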
Mobile eye tracking method and practice
Eye tracking provides unobtrusive and objective measures of a person's visual attention: where, when, for how long, and in which sequence visual information is processed. The main goal of the workshop is to introduce the audience to the eye tracking method in the context of Human-Computer Interaction. The course will present the basics of eye tracking, with a special focus on mobile eye tracking, which makes it possible to collect, analyse, and use a user's eye movement data in their everyday environment. The course will cover methodological fundamentals, a review of current mobile eye tracking studies in pervasive computing, and practical notes for future use in Human-Computer Interaction research.
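As a concrete illustration of how raw gaze samples become the "where, when, how long" measures mentioned above, the sketch below implements a simple dispersion-threshold (I-DT) fixation detector, one common approach in eye tracking analysis. It is not course material, and the thresholds and data format are assumptions.

```python
# Illustrative sketch (not course material): dispersion-threshold (I-DT)
# fixation detection over raw gaze samples.
def detect_fixations(samples, max_dispersion=30.0, min_duration=0.1):
    """samples: list of (t_seconds, x_px, y_px).
    Returns fixations as (t_start, t_end, centroid_x, centroid_y)."""
    fixations, i = [], 0
    while i < len(samples):
        j = i
        # grow the window while points stay within the dispersion threshold
        while j + 1 < len(samples):
            xs = [x for _, x, _ in samples[i:j + 2]]
            ys = [y for _, _, y in samples[i:j + 2]]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            j += 1
        t0, t1 = samples[i][0], samples[j][0]
        if t1 - t0 >= min_duration and j > i:
            xs = [x for _, x, _ in samples[i:j + 1]]
            ys = [y for _, _, y in samples[i:j + 1]]
            fixations.append((t0, t1, sum(xs) / len(xs), sum(ys) / len(ys)))
            i = j + 1  # continue after the detected fixation
        else:
            i += 1     # window too short or too dispersed: slide forward
    return fixations

gaze = [(t * 0.01, 400 + (t % 3), 300 + (t % 2)) for t in range(30)]  # fake 100 Hz data
print(detect_fixations(gaze))
```

The fixation list is the starting point for the sequence-level measures the workshop discusses: fixation durations give "how long", centroids give "where", and the ordering gives the scanpath.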
Open Minds: Designing for radical inclusion
An open-minded approach is required to embrace and promote an attitude of radical inclusion, which, according to Marty Monson, CEO of the Barbershop Harmony Society, "means seeing the abundance in what we can do, instead of the limitations of what we can't or won't do." In this talk, I will present the first steps towards our methodology for (co-)designing radical inclusion, namely the Open Minds deck, a brainstorming tool developed to trigger awareness of problems and research questions for radical-inclusion (co-)design by reflecting on the categories of person, obstacle, environment, and context.
Virtual Environments: Developing accessible presence within an XR environment
In previous work we demonstrated the practicalities of rapidly prototyping XR tools for exploring how accessible services can be provided within XR environments. In this presentation we will discuss how this work can be extended to create a stronger feeling of presence, both by making the environments collaborative and by using accessibility to break down traditional barriers to communication and interaction. We will explore the background to this work, demonstrate a possible testing framework, and discuss how the tools can be evaluated using techniques such as eye tracking.