Registration is open.
EICS 2022 is the fourteenth international conference devoted to engineering usable and effective interactive computing systems.
Work presented at EICS covers the full range of aspects that come into play when "engineering" interactive systems, such as innovations in the design, development, deployment, verification and validation of interactive systems. Topics of interest include the design and development of systems incorporating new interaction techniques and multimodal interaction, multi-user, multi-device/screen, multi-environment interaction, mobile and pervasive systems, large-scale and big data applications, deployment of interactive systems, as well as novel development methods and processes for improving the development of interactive systems.
EICS focuses on models, languages, notations, methods, techniques and tools that support designing and developing interactive systems. The conference brings together people who study or practice the engineering of interactive systems, drawing from HCI, Software Engineering, Requirements Engineering, Conceptual Modelling, CSCW, and Ubiquitous / Pervasive Systems.
The conference proceedings are published by the ACM and appear in the ACM Digital Library. Full papers are published in the journal PACM EICS series. Further information at https://eics.acm.org/pacm/.
Engineering Interactive Geospatial Visualizations for Cluster-Driven Ultra-high-resolution Wall Displays
Emmanuel Pietriga, Inria, France
Ultra-high-resolution wall-sized displays feature a very high pixel density over a large physical surface, typically several square meters. They provide effective support for collaborative work sessions that involve the visualization of large, heterogeneous datasets. But the development of interactive visualizations for ultra-high-resolution wall displays raises significant challenges. These range from the design of input techniques adapted to such surfaces, to the design of visualizations that effectively leverage their extreme display capacity. Challenges lie not only in the design but in the technical realization of these visualizations as well, as they run on computer clusters and thus require dedicated software frameworks for the distribution and synchronization of data and graphics.
In this talk, I will essentially focus on challenges that relate to the engineering of interactive visualizations for cluster-driven wall displays, discussing different approaches that we explored over the last fourteen years to create geovisualizations and the associated multi-scale interaction techniques.
Emmanuel Pietriga is a Senior Research Scientist at Inria in France, where he leads the ILDA research team. He is also a Course Instructor at École Polytechnique, where he teaches Data Visualization. His research work focuses on the design and engineering of techniques for the visual representation and interactive manipulation of complex, heterogeneous data structures. His specific topics include input and visualization techniques for novel forms of interactive display surfaces (wall displays, AR, large pen+touch surfaces), multi-scale user interfaces, visualization techniques for multivariate networks and knowledge graphs, and HCI aspects of Web browsing. His work has been recognized with 5 paper awards, including best paper and honorable mentions from ACM CHI, as well as awards from ACM VRST and IFIP Interact. His doctoral advisees have been hired at Microsoft, the University of Edinburgh, Inria, IGN, and elsewhere. He received his PhD at Institut National Polytechnique de Grenoble (INPG) in 2002. Upon graduation, he received INPG's Doctoral Dissertation award. His previous research positions include Xerox Research Centre Europe, the Decentralized Information Group at MIT, where he worked for the World Wide Web Consortium, and Inria's research center in Santiago de Chile, where he worked in close collaboration with the ALMA observatory on the design and implementation of user interfaces for telescope operations monitoring and control. He has since worked in collaboration with other prominent astronomical observatories, including the Cherenkov Telescope Array and the Vera C. Rubin observatory.
Building Virtual and Augmented Reality passenger experiences
Stephen Brewster, University of Glasgow, UK
In Europe, people travel an average of 12,000 km per year on private and public transport, in cars, buses, planes and trains. These journeys are often repetitive, and the time is wasted. This total will rise with the arrival of fully autonomous cars, which free drivers to become passengers. I will present our work on improving passenger journeys using Virtual and Augmented Reality (together, XR) to support productivity, entertainment and collaboration on the move. Three significant challenges must be overcome to allow us to use travel time:
- Interaction: confined spaces limit our interactivity, and the social nature of travel settings can inhibit the types of interactions we may perform;
- Sensing: vehicle movements cause many challenges for sensing when simple IMUs are used, making it difficult to separate vehicle and user actions;
- Motion sickness: many people get sick when they read or play games in vehicles, and once experienced, it can take hours for symptoms to resolve.
XR headsets could allow passengers to use their travel time in new, productive ways, but only if these fundamental challenges can be overcome. Passengers would be able to use large virtual displays for productivity; escape the physical confines of the vehicle and become immersed in virtual experiences; and communicate with distant others through new embodied forms of communication. I will discuss our solutions to these challenges, focusing on the engineering problems that must be overcome. We are developing new interaction techniques for XR that work in confined, seated spaces and are socially acceptable to use. We have developed a hardware and software sensing platform that allows us to separate user and vehicle motions for co-located or remote shared experiences. We are working on overcoming motion sickness using neurostimulation and visual displays to support these novel immersive experiences.
Stephen Brewster is a Professor of Human-Computer Interaction in the School of Computing Science at the University of Glasgow. He received his PhD in auditory interface design at the University of York. At Glasgow, he leads the Multimodal Interaction Group, which is very active and has a strong international reputation in HCI. His research focuses on multimodal HCI: using multiple sensory modalities and control mechanisms (particularly audio, haptics and gesture) to create a rich, natural interaction between human and computer. His work has a strong experimental focus, applying perceptual research to practical situations. A long-term focus has been on mobile interaction and how we can design better user interfaces for users who are on the move. Other areas of interest include VR/AR, wearable devices and in-car interaction. He pioneered the study of non-speech audio and haptic interaction for mobile devices with work starting in the 1990s. He was a General Chair of CHI 2019 in Glasgow, CHI papers chair in 2013 and 2014, and has previously chaired MobileHCI, EuroHaptics and TEI. He is a member of the ACM SIGCHI Academy, an ACM Distinguished Speaker and a Fellow of the Royal Society of Edinburgh. He is also a member of the ACM CHI Steering Committee, setting the direction for the CHI conference series.
The following workshops are co-located with EICS 2022 and will take place on June 21, 2022. Click on a title for further details:
The panel "Engineering Awareness in Interfaces: Focus on Automation and Visualization" will discuss the following spotlight themes:
Submissions and topics
Submissions can be made through http://new.precisionconference.com
More information about the new PACM-HCI (EICS series) review and publication process can be found at http://eics.acm.org/pacm.
EICS 2022 focuses on models, languages, notations, methods, techniques and tools that support designing and developing interactive systems. The conference brings together people who study or practice the engineering of interactive systems, drawing from HCI, Software Engineering, Requirements Engineering, Conceptual Modelling, CSCW, and Ubiquitous / Pervasive Systems. Submissions are invited that advance the state of the art of the engineering of interactive systems.
Topics include but are not limited to:
- Modelling and analysis of interaction and interactive systems
- Processes for engineering interactive systems (e.g., design, implementation, prototyping, evaluation, verification and validation, testing)
- Integrating interaction design into the software development process
- Requirements engineering for interactive systems
- Specification of interactive systems (methods, principles and tools)
- Software architectures for interactive systems
- Frameworks, toolkits, and APIs for interactive systems (e.g., API usability, interaction-driven API design)
- Domain-specific languages for interactive systems
- Formal methods within interactive systems engineering
- Modelling and analysis of users’ activities
- Engineering innovative interactive applications (e.g., adaptive, tangible, touch and multitouch input, voice, gesture, EEG, multimodal input, mobile and wearable systems)
- Engineering hardware/software integration in interactive systems (e.g., fabrication and maker processes, physical computing, etc.)
- Engineering user experience (e.g., fun, affective)
- Engineering complex interactive systems (e.g., large datasets, large communities, enterprise systems, collaborative systems)
- Engineering interactive systems for various user categories (e.g., children, elderly, people with disabilities)
- Certification issues of interactive systems
- New datasets and evaluation data relevant for engineering interactive systems
The reviewing process for full papers follows the Proceedings of the ACM (PACM) model. The submission and review process takes place three times annually, and accepted papers are published in issues of the PACM on Human-Computer Interaction journal. More information can be found at http://eics.acm.org/pacm.
Full papers must use the single-column template described here (Section 2). The final publication format (acmsmall) is provided here (see the Overleaf template).
EICS PACM 2022 Round 1
22/07/2021 - Submission deadline
06/09/2021 - Notification of reviews
01/10/2021 - Camera ready
EICS PACM 2022 Round 2
22/10/2021 - Submission deadline
29/11/2021 - Notification of reviews
14/01/2022 - Camera ready
EICS PACM 2022 Round 3
18/02/2022 - Submission deadline
28/03/2022 - Notification of reviews
02/05/2022 - Camera ready
Full papers chairs
Kris Luyten, firstname.lastname@example.org
Philippe Palanque, email@example.com