\newcommand{\lorem}{\textcolor{gray75}{Lorem ipsum dolor sit amet, consectetur adipiscing elit.\xspace}}
-
+\usepackage{nameref}
\newcommand{\reffig}[1]{Figure~\ref{#1}}
\newcommand{\reftab}[1]{Table~\ref{#1}}
\newcommand{\refchap}[1]{Chapter~\ref{#1}}
{\LARGE\bfseries On the critical role of the sensorimotor loop on the design of interaction techniques and interactive devices}
\vspace*{\stretch{2}}
- XXX 2022
+ June 2022
\vspace*{\stretch{2}}
\end{centering}
Michel Beaudouin-Lafon & Professeur & Université Paris-Saclay\\
Dominique Bechmann & Professeure & Université de Strasbourg\\
Andy Cockburn & Professeur & University of Canterbury\\
- Christian Duriez & Directeur de Recherches & Inria\\
+ Christian Duriez & Directeur de Recherche & Inria\\
\\
\multicolumn{3}{l}{\itshape Garant}\\
- Stéphane Huot & Directeur de Recherches & Inria\\
+ Stéphane Huot & Directeur de Recherche & Inria\\
\end{tabularx}
% \end{center}
%\vspace*{\stretch{1}}
\vspace{35mm}
- \begin{textblock*}{\columnwidth}(20mm,250mm)%
+ \begin{textblock*}{\columnwidth}(20mm,252mm)%
\begin{center}
\raggedright
\includegraphics[width=3.5cm]{logo_univ_lille}
\begin{textblock*}{\columnwidth}(20mm,242mm)%
\begin{center}
\centering
- \includegraphics[width=3.5cm]{logo_cristal}
+ \includegraphics[width=3.5cm]{logo_inria}
\end{center}
- \end{textblock*}
+\end{textblock*}
- \begin{textblock*}{\columnwidth}(20mm,230mm)%
+\begin{textblock*}{\columnwidth}(20mm,236mm)%
\begin{center}
\raggedleft
- \includegraphics[width=3.5cm]{logo_inria}\\
+ \includegraphics[width=3.5cm]{logo_cristal}
\end{center}
\end{textblock*}
\begin{textblock*}{\paperwidth}(0mm,275mm)%
\begin{center}
\centering
- \centering
- Habilitation à diriger des recherches préparée au sein de l'équipe projet LOKI commune à\\l'Université de Lille, CRIStAL --- UMR CNRS 9189 et Inria Lille - Nord Europe\\
+ \small
+ Habilitation à diriger des recherches préparée au sein de l'équipe projet LOKI\\
+ commune à l'Université de Lille et Inria\\
+ au sein du centre Inria de l'Université de Lille et du laboratoire CRIStAL --- UMR CNRS 9189
\end{center}
\end{textblock*}
- \begin{textblock*}{\columnwidth}(20mm,260mm)%
- \begin{center}
- \raggedleft
- \color{gray75}
- Revision: \texttt{\StrLeft{\commit}{7}} on branch \texttt{\branch}\\
- Compiled on \today\ at \currenttime\\
- \end{center}
- \end{textblock*}
+ % \begin{textblock*}{\columnwidth}(20mm,260mm)%
+ % \begin{center}
+ % \raggedleft
+ % \color{gray75}
+ % Revision: \texttt{\StrLeft{\commit}{7}} on branch \texttt{\branch}\\
+ % Compiled on \today\ at \currenttime\\
+ % \end{center}
+ % \end{textblock*}
\end{titlepage}
The word \emph{haptics} comes from the ancient Greek word {\mygreek ἁπτικός} or \emph{haptikós} which means to touch, contact, manipulate\footurl{https://en.wiktionary.org/wiki/haptic}.
Therefore I refer to haptics not only as a way to stimulate the human sense of touch, but also as the human ability to perform motor actions.
I discuss how my focus was initially on these two notions separately, and how I finally combined these two approaches and now focus on the sensorimotor loop.
-Therefore, haptics is in the end a pretext to discuss the sensorimotor loop, which is the true focus of my work today.
+Therefore, haptics is in the end an opportunity to discuss the sensorimotor loop, which is a higher-level notion and the true focus of my work today.
I describe my research since I got my Maître de Conférences position at the University of Lille in September 2011.
I started my research at \href{https://www.cristal.univ-lille.fr/}{CRIStAL} and \href{https://www.inria.fr}{Inria} in the \href{https://www.cristal.univ-lille.fr/mint/}{Mint} project team.
%This document is intended to be useful for anybody who wants to design, implement, or evaluate interactive systems, in particular with haptics.
My core expertise is in Computer Science, but research in HCI is typically interdisciplinary.
HCI experts will probably not be surprised to read elements of experimental psychology, electronics, robotics, or design for example.
-Other readers must understand that the force of our research domain, at least the way I practice my research, is not to dig into known problems as deeply as possible, should the focus be very narrow.
+Other readers must understand that the strength of our research domain is that it allows studying research problems in a holistic and systemic way.
+This is my favorite approach, and the way we practice our research in the Loki team (and previously Mjolnir).
+%at least the way I practice my research, is not to dig into known problems as deeply as possible, should the focus be very narrow.
For example, we do not focus on a particular technology or methodology.
On the contrary, we are open to any problem, which we must characterize before searching for appropriate solutions.
Hence, we need a broad overview of methodologies, technologies, and knowledge.
% Ce mémoire décrit les travaux de recherche que j’ai effectués au sein de l’équipe Pro- grammation et Génie Logiciel du Laboratoire de Recherche en Informatique (LRI) depuis mon recrutement comme Maître de Conférences à l’Université Paris-Sud, en septembre 2001, et en tant que membre du projet In Situ de l’INRIA depuis sa création en jan- vier 2002. Ces travaux s’inscrivant dans la continuité de mes travaux antérieurs, il me semble toutefois utile de revenir brièvement sur quelques éléments mentionnés dans ma thèse [Roussel, 2000a]1.
-\paragraph{\refchap{chap:output}}
+\paragraph{\refchap{chap:output} -- \nameref{chap:output}}
\refchap{chap:output} extends my Ph.D. work, during which I used haptics to help visually impaired children at school.
I describe a haptic rendering pipeline that details the different steps on both the system and human sides, for both the hardware and software parts.
The second one is about the intertwined relationship between engineering and the evaluation of haptic devices.
The third challenge addresses the restoration of missing haptic properties of physical controls in multi-touch and gestural interaction.
The fourth challenge is about the haptic properties of tangible controls and their effect on interaction.
-Then I discuss a number of contributions to these challenges.
+Then I discuss a number of contributions to these challenges: Tactons, tactile textures, printed vibrotactile widgets, and actuated computer peripherals.
-\paragraph{\refchap{chap:input}}
+\paragraph{\refchap{chap:input} -- \nameref{chap:input}}
\refchap{chap:input} presents a mirror view of the work presented in \refchap{chap:output}.
The input pipeline I discuss is almost identical to the haptic rendering pipeline, except that the user is initiating the action.
Then I present challenges for the design of input systems.
The first challenge is about the sensing and interpretation of human abilities.
The second one addresses the design of input vocabularies.
-The third challenge is about the relevance of unnatural inputs and the futility to replicate the physical world into the digital world.
-To address these challenges I present several contributions.
+The third challenge is about the relevance of unnatural inputs and the limitations of merely replicating the physical world in the digital world, which misses many opportunities to augment what humans can do with digital tools.
+%futility to replicate the physical world into the digital world.
+To address these challenges, I present several contributions about latency measurement, flexible pens, finger identification, and interaction techniques in Virtual Reality.
-\paragraph{\refchap{chap:loop}}
+\paragraph{\refchap{chap:loop} -- \nameref{chap:loop}}
While \refchap{chap:input} and \refchap{chap:output} described the vision of HCI I had at the beginning of my research career, this vision evolved as the sensorimotor loop gained increasing importance in my work.
%integrate the first two chapters into a whole that is worth more than its parts.
%explain I should not have separated these three chapters.
This chapter starts with studies that failed to improve interaction substantially because they followed the vision of the first two chapters.
-Then I discuss the connections between computing and the sensorimotor loop through models of human behavior, system architectures, and interaction paradigms.
+Then I address these issues with contributions that leverage the sensorimotor loop for gestural interaction, and the sense of embodiment in immersive virtual reality.
+I conclude with a discussion about the connections between computing and the sensorimotor loop through models of human behavior, system architectures, and interaction paradigms.
I propose an interpretation of these models that describes both the software and hardware levels.
-I conclude with the description of contributions that leverage the sensorimotor loop for gestural interaction, and the sense of embodiment in immersive virtual reality.
-
-% \begin{verbatim}
-% ==
-% !=
-% <=
-% >=
-% \end{verbatim}
+%suite à cette reflexion globale sur ton travail/domaine, tu dresses un bilan et propose des perspectives et grandes lignes des questions que ça implique et que tu souhaites mener ensuite, en donnant 1 ou 2 exemples (pour éviter de tomber dans la phrase trop générale “et on dresse quelques perspectives pour la suite”).
\begin{Abstract}
We discuss haptics as the sense of touch, and the implications for the design and implementation of haptic systems.
To do so, we present the haptic pipeline that illustrates the hardware and software parts of both interactive systems and users.
-First it shows the diversity of disciplines involved in the design and implementation of haptic systems
+First, it shows the diversity of disciplines involved in the design and implementation of haptic systems.
Second, it reveals pitfalls that potentially alter the message transmitted to users through touch at every stage of the pipeline.
-We present the main general research questions that guided my research: the output vocabulary, the engineering and evaluation of haptic devices, the haptic properties of physical objects, and the use of tangible ofbjects for haptic interaction.
+We present the main general research questions that guided my research: the output vocabulary, the engineering and evaluation of haptic devices, the haptic properties of physical objects, and the use of tangible objects for haptic interaction.
We illustrate these research questions with several research projects: vibrotactile Tactons for activity monitoring, tactile textures with programmable friction, printed vibrotactile widgets, and actuated computer peripherals.
\end{Abstract}
Haptics is generally seen as an output modality, and most haptic systems are designed for providing \emph{haptic feedback}.
Despite the usual user-centered approach in HCI, this is the typical convention in computer science and robotics, with a system-centered point of view.
This is not the case for all scientific disciplines.
-For example, I had misunderstandings with Ludovic Potier when he joined our team as a Postdoc researcher.
-He has a background in cognitive sciences, and what he refers to as input and output is the opposite of the convention I use.
+For example, I had misunderstandings with a postdoc researcher with a background in cognitive sciences when he joined the Mint group a few years ago.
+What he referred to as input and output was the opposite of the convention I use.
+Initially, it created confusion between us, but soon after it helped us to understand each other's points of view.
+It also helped me to characterize the symmetries and asymmetries between humans and systems that I will discuss in \refchap{chap:loop}.
This chapter will cover haptics as a way to stimulate the sense of touch.
%The sense of touch is the primary sense of newborns.
The objective of this pipeline is to transmit information to users through their sense of touch.
This is indeed a simplified pipeline, which nevertheless shows that several scientific disciplines are interested in this topic. %, and each step is essentially studied by several distinct research communities.
While each step of the pipeline is essentially studied by one or two scientific fields, the role of Human-Computer Interaction is to connect them in a meaningful and useful way for people.
+It is an illustration of our holistic approach, which enables us to study interaction phenomena by taking into account not only the system, but also the users, their tasks, and the environment.
\input{figures/hapticpath.tex}
The haptic properties of physical interfaces are however typically missing on multi-touch interfaces.
Every widget feels like a flat surface.
+However, humans rely on haptic feedback all the time for everyday interactions with physical objects~\cite{maclean09}.
Efforts were made to restore this missing haptic feedback.
For example, vibrotactile actuators can reproduce the clicking sensation of buttons~\cite{nashel03,lylykangas11}.
Several variable friction technologies can reproduce texture sensations~\cite{amberg11,bau10,levesque11}.
Thanks to the tactile sensitivity of our fingers we can perceive the shape, size, and material of the key to some extent.
We can identify ripe fruits and vegetables based on their hardness.
We can use a TV remote in the dark because we locate the keys with proprioception.
-We can tell if an opaque bottle is empty, full ,or in between because we feel its weight, and we can feel the liquid splashing inside.
+We can tell if an opaque bottle is empty, full, or in between because we feel its weight, and we can feel the liquid splashing inside.
To leverage these haptic properties of everyday objects, there is indeed a compelling intersection with \defword{tangible interaction}.
Ullmer and Ishii described Tangible User Interfaces (TUI) this way: “TUIs will augment the real physical world by coupling digital information to everyday physical objects and environments.”~\cite{ishii97}.
Using haptics as an output vocabulary to transmit information to users directly stems from my Master and Ph.D. work.
I had the privilege and pleasure to collaborate with Stephen Brewster, who is also the inventor of the concept of Tactons~\cite{brewster04}.
I worked on several Tacton sets at the time: active~\cite{pietrzak05} and passive~\cite{pietrzak05a} force feedback, as well as pin-array Tactons~\cite{pietrzak06,pietrzak09}.
-Here, I will discuss another project on Tactons called Activibe~\cite{cauchard16}.
+Here, I will discuss another project on Tactons called Activibe, which was a collaboration with Jessica Cauchard, James Landay, and Janette Cheng from Stanford University~\cite{cauchard16}.
The idea was to design haptic feedback for activity monitoring.
Fitness trackers became mainstream in the last decade.
This next contribution is also about the extension of the haptic output vocabulary.
However, contrary to the previous contribution, in this project we studied the output vocabulary for a new kind of tactile technology: programmable friction.
+This work was mainly done by Ludovic Potier, a postdoc with a background in cognitive sciences, whom I co-supervised with Nicolas Roussel and Géry Casiez when we were part of the Mint team.
+The design and implementation of the tactile device we used are the result of years of research by the experts in electrical engineering in the Mint team: Frédéric Giraud, Michel Amberg, Betty Semail, and their students.
+
Two different technologies exist for changing the perceived friction of a surface.
The first one is electrovibration.
It uses a high voltage signal on an electrode to create an electrostatic attraction that sticks the user's finger to the surface~\cite{strong70}.
First of all, it is important to recall that the squeeze film effect reduces friction.
Therefore, when the command is high, the surface is more slippery than when the command is low.
-Therefore on the figures, the left and bottom sides correspond to high friction, and the right and top sides correspond to low friction.
-Therefore we explain the fact that the higher difference between the reference levels and the JND are on the edge values (\ang{0}, \ang{144}, and \ang{180}) with the non-linearity of both the effect produced by the command (effect on \ang{0}) and the perception of the mechanical effect (effect on \ang{144}, and \ang{180}).
+On the figures, the left and bottom sides correspond to high friction, and the right and top sides correspond to low friction.
+Hence, we explain the fact that the higher differences between the reference levels and the JND occur at the edge values (\ang{0}, \ang{144}, and \ang{180}) by the non-linearity of both the effect produced by the command (effect at \ang{0}) and the perception of the mechanical effect (effect at \ang{144} and \ang{180}).
The overall and conservative recommendation here is to use differences of phase shift greater than \ang{100} to create patterns with this implementation of Stimtac (larger mean JND plus standard deviation).
In any case, it is unlikely users can perceive a difference of phase shift lower than \ang{10} (lower mean JND minus standard deviation).
Its parameters are position and size (width).
The fourth pattern is a \emph{field}, a regular repetition of a shape.
With a sufficiently small size and a high number of repetitions, users are not able to count the items while exploring the surface.
-They rather have a sensation of roughness\fixme{REF?}.
+Rather, they perceive a sensation of roughness.
+%\fixme{REF?}.
One of the parameters is the duty cycle, which is the ratio between the signal size and the period size.
We can also adjust the number of repetitions and the width, which together control the density of the pattern.
The fifth pattern is a \emph{gradient}, a repetition of a shape with a variable size.
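To make these parameters concrete, here is a minimal sketch (a hypothetical illustration, not the actual programming interface of the device) that generates a one-dimensional friction-command profile for a field pattern, keeping in mind that a high command means low friction:
\begin{verbatim}
import numpy as np

def field_pattern(surface_mm=80.0, resolution_mm=0.1,
                  repetitions=10, width_mm=2.0, duty_cycle=0.5):
    # Hypothetical 1D friction-command profile for a field pattern.
    # duty_cycle = width of the shape / period of the repetition.
    # With the squeeze film effect a HIGH command means LOW friction,
    # so shapes are rendered as LOW commands on a HIGH background.
    # A gradient pattern would additionally vary width_mm across repetitions.
    n = int(surface_mm / resolution_mm)
    command = np.ones(n)                  # slippery background
    period_mm = width_mm / duty_cycle     # duty cycle ties width and period
    for k in range(repetitions):
        start = int(k * period_mm / resolution_mm)
        stop = min(start + int(width_mm / resolution_mm), n)
        command[start:stop] = 0.0         # high-friction shape
    return command
\end{verbatim}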
%The evaluation of tactile patterns is complex because the design space is large.
The first interesting research question is whether users can distinguish the patterns.
Evaluating this is however not trivial because the design space is large and it is difficult, if not impossible, to present all patterns with sufficient repetitions for an accurate analysis.
-Therefore we must evaluate a subset of all possible patterns
+Therefore we must evaluate a subset of all possible patterns.
Recent work addressed this issue with a new method for sampling the design space~\cite{demers21}.
At the time, we used multidimensional scaling (MDS) because of the multidimensional nature of our patterns.
This method consists in asking participants to group items and using the number of times two items end up in the same group as a similarity metric.
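The following sketch, with illustrative pattern names and grouping data, and assuming the scikit-learn implementation of MDS, shows the principle: co-occurrences in the same group are turned into dissimilarities, from which MDS computes a two-dimensional perceptual map.
\begin{verbatim}
import numpy as np
from sklearn.manifold import MDS

# One dictionary per participant, mapping each pattern to a group label.
groupings = [
    {"grating": 0, "field": 0, "gradient": 1, "step": 1},
    {"grating": 0, "field": 1, "gradient": 1, "step": 2},
    {"grating": 0, "field": 0, "gradient": 1, "step": 2},
]
patterns = sorted(groupings[0])
n, p = len(patterns), len(groupings)

# Co-occurrence counts -> similarity -> dissimilarity.
co = np.zeros((n, n))
for g in groupings:
    for i, a in enumerate(patterns):
        for j, b in enumerate(patterns):
            co[i, j] += (g[a] == g[b])
dissimilarity = 1.0 - co / p

embedding = MDS(n_components=2, dissimilarity="precomputed",
                random_state=0).fit_transform(dissimilarity)
for name, (x, y) in zip(patterns, embedding):
    print(f"{name}: ({x:.2f}, {y:.2f})")
\end{verbatim}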
In the previous section, we investigated the output vocabulary for a new device providing a new type of haptic feedback.
We did not explore a particular context or application but rather studied the possibilities and limitations of the technology.
In this project, we were interested in vibrotactile feedback, which is well covered in the literature.
-We are however interested in a particular case: restoring haptic feedback on touchscreens.
+We were however interested in a particular case: restoring haptic feedback on touchscreens.
Indeed touchscreens have many advantages compared to physical interfaces.
They can be updated.
They have no mechanical parts that wear over time.
%Replacing physical controls with touchscreens have advantages: updates, reconfigurable, visual feedback, but most of haptic properties are lost: click sensations of buttons, detents on slides. Impact on interaction. Technologies to restore haptic feedback.
+This work was part of the \href{https://cordis.europa.eu/project/id/645145}{H2020 Happiness} project.
+I was the leader of the Human Factors work package, as well as the leader for Inria, which was represented by the Mjolnir and Hybrid research groups.
+In this project I supervised Christian Frisson during his postdoc in the group, as well as Julien Decaudin, the engineer who implemented the software library, demos, and electronic prototypes.
+
The manufacturing process of touch interfaces such as dashboards augmented with haptic feedback is complex.
Mechanical actuators must be attached underneath such that the vibration is transmitted to the interactive areas of the surface.
In this project, we investigate a new kind of actuator.
In particular, we studied a haptic technology that could restore the haptic feedback of physical controls.
This project is the exact opposite.
We embrace physical controls and their haptic properties, and we study how we can use them differently.
+It introduces the idea we will develop in \refchap{chap:input} and \refchap{chap:loop} that haptics is not only about the sense of touch, but also about manipulation.
%better include them in our daily activities beyond the way they usually work.
-In this work, we focused on desktop interaction with the augmentation of desktop peripherals.
+
+This project was a collaboration with Gilles Bailly, during his postdoc at Deutsche Telekom and his debut as a CNRS researcher.
+In the first part of this project I designed and implemented the Métamorphe prototype, and we collaborated for the user studies with my former colleagues at the University of Toronto: Daniel Wigdor and Jonathan Deber.
+In the second part of the project we worked with Sylvain Malacria, just before he joined the Mjolnir team, and with Sidarth Sahdev, an electrical engineering master's student who designed and implemented the Living Desktop hardware before joining the University of Toronto as a Ph.D. student.
% peripherals: keyboards, mice, and screens.
+In this work, we focused on desktop interaction with the augmentation of desktop peripherals.
There are many examples of extensions of desktop peripherals in past research, in particular keyboards.
For example, additional sensors enable contact sensing on the keys of a keyboard~\cite{rekimoto03}, gesture on the whole keyboard surface~\cite{block10,kato10,taylor14,zhang14}, or force sensing on keys~\cite{dietz09}.
In other works, actuators are embedded in each key to make them harder to press~\cite{hoffmann09,savioz11}.
These two research projects are complementary, and they focus on different levels.
Métamorphe focuses on the device level while Living Desktop focuses on the desktop level.
In both cases, we use actuation as a mechanism to provide new features, with two paradigms in mind.
-The first one is shape-changing interaction: the shape of an object is a signifier of its affordances.
+The first one is shape-changing interaction: the shape of an object is one of the signifiers of its affordances.
Hence changing the shape of a device is an interesting way of providing and advertising an extended interactive vocabulary.
The second paradigm is tangible interaction, which “augments the real physical world by coupling digital information to everyday physical objects and environments.”~\cite{ishii97}.
Here we see desktop peripherals as everyday objects that we manipulate as such.
\includegraphics[height=4.4cm]{metamorphe_raised}\hfill
\includegraphics[height=4.4cm]{metamorphe_pinch}
% \vspace*{-7mm}
- \caption[Métamorphe keyboard]{Métamorphe is a keyboard with actuated keys, which can either be up or down. Left: view of the keyboard with two keys up. Right: raised keys have new affordances. They can be pushed or pinched.}
+ \caption[Métamorphe keyboard]{Métamorphe is a keyboard with actuated keys, which can either be up or down. Left: view of the keyboard with two keys up. Right: raised keys have new interaction possibilities. For example, they can be pushed or pinched.}
\label{metamorphe}
% \vspace*{-3mm}
\end{figure}
\epigraph{The true delight is in the finding out rather than in the knowing.}{Isaac Asimov}
\begin{Abstract}
-After considering haptics as the sense of touch, we discuss here haptics as the human ability to tough and manipulate the environments and the objects it contains.
+After considering haptics as the sense of touch, we discuss here haptics as the human ability to touch and manipulate the environment and the objects it contains.
We present the motor sensing pipeline that is the mirror of the haptic rendering pipeline discussed in the previous chapter.
It reveals the research questions I addressed in my research: the sensing and interpretation of the users' gestures, the input vocabulary, and the design of interaction techniques for unnatural actions.
Then I discuss four contributions: a system latency measurement methodology and tool, flexion as a new degree of freedom for pen interaction, finger identification as a new property of multi-touch interaction, and interaction techniques in virtual reality.
% 6) Wrist ulnar/radial bend %(19/16)
% 7) Wrist flexion extension %(62/40)
Knowledge about the range of movements and maximum forces is necessary for the design and layout of workstations.
-For example, NASA documents these values with and with gravity, or with pressurization because they need to provide precise and documented specifications for spacecrafts~\cite{nasa14}.
+For example, NASA documents these values with and without gravity, or with pressurization because they need to provide precise and documented specifications for spacecrafts~\cite{nasa14}.
They also use these specifications to design clothes that astronauts can wear comfortably to perform routine tasks.
%motion range has application to
%- Workstation Design and Layout
Other devices use one or more RGB or infrared cameras to detect markers or a projected pattern.
They are used for body motion capture or for tracking objects.
VR headsets also use this technology to track the controllers.
-The last family of technologies is Microelectromechanical systems (\defword{MEMS}).
+The last family of technologies is Microelectromechanical systems (\defacronym{MEMS}).
They sense many kinds of physical phenomena such as acceleration, rotation, magnetic fields, or fluid pressure.
-Input systems typically use combinations of accelerometers, gyroscopes, and magnetometers called Inertial measurement units (\defwords{IMUs}{IMU}).
+Input systems typically use combinations of accelerometers, gyroscopes, and magnetometers called Inertial measurement units (\defacronym{IMU}).
The signal coming from these sensors requires several transformations.
Contact inputs such as buttons require a software or hardware \defword{debouncing} mechanism to avoid unwanted multiple activations.
Threshold-based inputs such as capacitive sensing not only require adjusting sensitivity and threshold values, but often require a \defword{hysteresis} mechanism to avoid multiple activations as well.
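As an illustration of these two mechanisms, here is a minimal sketch (generic code, not taken from any particular device) of a time-based debouncer and a dual-threshold hysteresis trigger:
\begin{verbatim}
class Debouncer:
    # Ignore state changes occurring within `delay` seconds of the last one.
    def __init__(self, delay=0.01):
        self.delay, self.state, self.last_change = delay, False, float("-inf")

    def update(self, raw_state, now):
        if raw_state != self.state and (now - self.last_change) >= self.delay:
            self.state, self.last_change = raw_state, now
        return self.state


class HysteresisTrigger:
    # Activate above `high`, deactivate only below `low` (low < high),
    # so a noisy value hovering around a single threshold cannot
    # trigger repeated activations.
    def __init__(self, low, high):
        self.low, self.high, self.active = low, high, False

    def update(self, value):
        if not self.active and value > self.high:
            self.active = True
        elif self.active and value < self.low:
            self.active = False
        return self.active
\end{verbatim}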
-Analog signals must be transformed to digital values with an Analog-to-digital converter (\defword{ADC}).
+Analog signals must be transformed to digital values with an Analog-to-digital converter (\defacronym{ADC}).
Input values often have noise that must be \defwords{filtered}{filter}.
Many possible filters remove noise, at the cost of latency~\cite{casiez12}.
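For example, a speed-adaptive low-pass filter in the spirit of~\cite{casiez12} smooths slow movements to remove jitter while following fast movements closely to limit lag. A minimal sketch of this idea (simplified, with illustrative default parameters) is the following:
\begin{verbatim}
import math

def alpha(dt, cutoff):
    # Smoothing factor of an exponential low-pass filter with a given
    # cutoff frequency (Hz) and sampling period dt (s).
    r = 2 * math.pi * cutoff * dt
    return r / (r + 1)

class SpeedAdaptiveFilter:
    # The cutoff grows with the (filtered) speed of the signal, so slow
    # motion is heavily smoothed (less jitter) while fast motion is
    # barely delayed (less lag).
    def __init__(self, min_cutoff=1.0, beta=0.01, d_cutoff=1.0):
        self.min_cutoff, self.beta, self.d_cutoff = min_cutoff, beta, d_cutoff
        self.x_prev, self.dx_prev = None, 0.0

    def __call__(self, x, dt):
        if self.x_prev is None:
            self.x_prev = x
            return x
        dx = (x - self.x_prev) / dt
        a_d = alpha(dt, self.d_cutoff)
        dx_hat = a_d * dx + (1 - a_d) * self.dx_prev
        cutoff = self.min_cutoff + self.beta * abs(dx_hat)
        a = alpha(dt, cutoff)
        x_hat = a * x + (1 - a) * self.x_prev
        self.x_prev, self.dx_prev = x_hat, dx_hat
        return x_hat
\end{verbatim}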
Some kinds of input require further transformation.
In this case, the device implements a communication protocol that the host has to follow.
There is no standard protocol, but the overall idea is usually similar.
These buses use no correction codes; therefore they are fast but sensitive to interference.
-Thus, devices that users can plug use more robust buses such as USB with the Human Interface Devices (\defword{HID}) class\footnote{\href{https://www.usb.org/hid}{https://www.usb.org/hid}}.
+Thus, devices that users can plug use more robust buses such as USB with the Human Interface Devices (\defacronym{HID}) class\footnote{\href{https://www.usb.org/hid}{https://www.usb.org/hid}}.
This class defines a standard communication protocol for interactive devices.
When the device is plugged in, it sends descriptors that list its features.
In particular, the HID descriptor details the format and semantics of the data packets the device will send at a fixed frequency.
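For instance, a mouse using the standard boot protocol sends 3-byte reports containing a button bitmask followed by signed relative displacements. The following sketch decodes such a report (an illustration only: real devices define their own report format in their descriptor, which must be parsed to know the size and usage of each field):
\begin{verbatim}
import struct

def parse_boot_mouse_report(report: bytes):
    # Boot-protocol mouse report: byte 0 = button bitmask,
    # byte 1 = signed dx, byte 2 = signed dy.
    buttons, dx, dy = struct.unpack("<Bbb", report[:3])
    return {"left":   bool(buttons & 0x01),
            "right":  bool(buttons & 0x02),
            "middle": bool(buttons & 0x04),
            "dx": dx,   # relative horizontal displacement
            "dy": dy}   # relative vertical displacement

print(parse_boot_mouse_report(bytes([0x01, 0x05, 0xFB])))
# left button pressed, dx = +5, dy = -5
\end{verbatim}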
Similar to the haptic pipeline, the motor pipeline reveals pitfalls that could lead systems to behave differently than what users had in mind.
%The first limitations are due to the limited capacities of humans.
-Users may ignore the way to perform they intend to do.
+Users may not know how to perform what they intend to do.
They may know the action they have to perform, but find it challenging to execute.
There can be obstacles in the physical world that prevent systems from sensing these actions correctly.
The physical effects can be outside the sensing range of the system.
The system may interpret what it sensed incorrectly.
This pipeline is therefore a profuse source of HCI research questions.
-I focus here on three categories of research questions that I addressed in my research in the last decade.
+I focus here on three categories of research questions that I addressed in my research in the last decade: sensing and interpretation, input vocabulary, and unnatural input.
\subsection{Sensing and interpretation}
Now, if the naturalness of gestural interaction is not an essential benefit for the design of interactive systems, we can think about this modality differently.
%The physical world has constraints that the digital world does not have.
-The digital world does not have some of the limitations of the physical world.
+The digital world does not have some of the limitations of the physical world~\cite{jacob08}.
For example, we can easily teleport inside a virtual environment.
We can move through objects or even fly.
We can manipulate objects remotely and independently of their weight, size, or shape.
This latency is known to cause performance and usability issues~\cite{deber15,jota13,teather09,waltemate16}.
Therefore, there are research studies about strategies to mitigate these effects or reduce latency~\cite{cattan15,nancel18}.
+This work was part of the \href{https://anr.fr/Projet-ANR-14-CE24-0009}{ANR Turbotouch} project, coordinated by Géry Casiez.
+He was involved in this work, along with Nicolas Roussel and Mathieu Falce, who was an engineer in the team.
+I designed and implemented the Lagmeter device with the help of Damien Marchal, who is a permanent CNRS research engineer at the CRIStAL laboratory.
+
\input{figures/lagmeter.tex}
Before reducing latency or mitigating its effects, it is important to measure it and understand the contribution of each part of the system to it.
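As a simple illustration of such a decomposition (with hypothetical stage names and numbers), if the same event can be timestamped at successive stages of the pipeline, the contribution of each stage is the difference between consecutive timestamps:
\begin{verbatim}
from statistics import mean, stdev

# One record per measured event; timestamps in milliseconds,
# hypothetical stage names and values for illustration only.
events = [
    {"physical": 0.0, "usb": 2.1, "toolkit": 6.3, "display": 18.9},
    {"physical": 0.0, "usb": 1.8, "toolkit": 5.9, "display": 17.5},
    {"physical": 0.0, "usb": 2.4, "toolkit": 6.8, "display": 19.6},
]
stages = ["physical", "usb", "toolkit", "display"]

for a, b in zip(stages, stages[1:]):
    deltas = [e[b] - e[a] for e in events]
    print(f"{a} -> {b}: {mean(deltas):.1f} ms (sd {stdev(deltas):.1f})")

end_to_end = [e["display"] - e["physical"] for e in events]
print(f"end to end: {mean(end_to_end):.1f} ms")
\end{verbatim}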
\subsection{Flexible pens}
\label{sec:flexiblepens}
+Latency is a technical limitation for interaction.
+In contrast, limited input vocabularies can be improved with technical solutions.
Pen interaction is a good example of the exploration of new degrees of freedom.
This is certainly because pens are used in many contexts, in particular for artistic creation.
Typical interactive pens sense the x-y position as well as proximity.
But it also enables selecting commands and offers richer interactions, whether it is with combinations of pen and touch interactions~\cite{hinckley10} or by leveraging physical attributes of the pen~\cite{vogel11}.
In this work, we were interested in the bending of a flexible pen as additional degrees of freedom.
+This project initiated my collaborations with Audrey Girouard from Carleton University.
+The first part of this work was done by Nicholas Fellion, who was a master's student at Carleton.
+He did a six-month internship in Lille to work on the FlexStylus prototype.
+The second part of this work was done by Alfrancis Guerrero, who was also a master's student at Carleton University.
+He designed and implemented the Hyperbrush prototypes.
+
\subsubsection{Prototypes}
We built two series of prototypes, the first one being \emph{FlexStylus}~\cite{fellion16,fellion17} (\reffig{fig:penprototypes}, top).