%!TEX root = ../hdrmain.tex
-\chapter{Haptics as an output modality}
+\chapter{The sense of touch}
\epigraph{\lorem}{Auteur}
\begin{Abstract}
\loremipsum
\end{Abstract}
-Haptics is generally seen as an output modality, and typically designated as \emph{haptic feedback}.
-Here we use the computer science convention that an output modality enacles systems to transmit information to people.
-I was surprized when I discussed with a postdoc a few years ago, who has a background in cognitive sciences and therefore categorized haptics as input.
-In his research field they adopted the point of view of humans, hence inverting input and output compared to what I was used to.
+Haptics is generally seen as an output modality, and most haptic systems are designed to provide \emph{haptic feedback}.
+Although HCI usually takes a user-centered approach, this is the typical convention in computer science and robotics, which adopt a system-centered point of view.
+Not all scientific disciplines share this convention.
+For example, I had misunderstandings with Ludovic Potier when he arrived as a postdoc in my team.
+He has a background in cognitive sciences, where input and output are defined from the human's point of view, so what he refers to as input and output is the opposite of the convention I use.
+This chapter covers haptics as a way to stimulate the sense of touch.
-Most of the time, haptic is seen ad a feedback modality.\par
-- impairment (Ph.D.), sensory substitution\par
-- other modalities busy\par
-- combination with vision, audio
+The sense of touch is the primary sense of newborns.
+It takes months before their vision can perceive shapes and colors.
+Therefore, they start exploring the world by touching it with their hands and mouth.
+Several years are necessary to reach \fixme{optimal} visual acuity.
+However, children quickly come to use vision as their primary source of information.
+\fixme{find a reference for all of this}
+Hence, haptics is rarely the primary focus in the design of user interfaces; the focus is generally on graphics, and on sound to some extent.
+%The main uses of haptics in interactive systems are the following.
+As a result, haptics is mainly used either as a replacement for vision, or as a complement to visual and audio stimulation.
-\cite{maclean09}
+\paragraph{Replace vision}
+Sensory substitution refers to situations in which sensations that are typically perceived with one sense are translated to another sense.
+Bach-y-Rita introduced this concept~\cite{backyrita72} and invented the Tactile Vision Substitution System (TVSS)~\cite{collins73}.
+In this article, the authors describe their apparatus, but also mention several other systems that already existed at the time.
+One of them was designed by Linvill and Bliss.
+It had an $8\times 12$ array of photosensors connected to piezo actuators~\cite{linvill66}.
+Users could explore documents with the sensor, and feel a tactile version of the text and graphics under their fingers.
+The authors conducted user studies and measured a reading rate of 50 words per minute with an expert user, and 10 words per minute with other trained users~\cite{bliss70}.
+The same principle was used to replace visual information with auditory information~\cite{auvray05}.
-\section{Haptic output vocabulary}
+Other systems followed the same approach, such as the Vibe, which translates vision into audition~\cite{hanneton10}, and the Tongue Display Unit (TDU), which displays visual information on the tongue~\cite{sampaio01}.
+Sensory substitution is also used in contexts like surgery, in which vision is required for the primary task, and haptics replaces vision at a different scale and point of view~\cite{robineau07}.
-Tactile Textures~\cite{potier12,potier16}
+Lenay et al. discuss the limits and perspectives of sensory substitution~\cite{lenay03}.
-\section{Haptic feedback for activity monitoring (Activibe)}
-\cite{cauchard16}
+\paragraph{Active haptics}
-\subsection{Vibrotactile widgets}
+Active haptics corresponds to haptics in the common sense of the term: the system uses actuators to generate stimuli and transmit information to the user.
-Replacing physical controls with touchscreens have advantages: updates, reconfigurable, visual feedback, but most of haptic properties are lost: click sensations of buttons, detents on slides. Impact on interaction. Technologies to restore haptic feedback.
+\paragraph{Passive haptics}
+In the early days of tangible interaction, Ishii and Ullmer described it this way: “TUIs will augment the real physical world by coupling digital information to everyday physical objects and environments”~\cite{ishii97}.
+The idea is to break the barrier between the physical and the digital world.
+With this paradigm, any object can either represent digital information or be a proxy for manipulating digital information.
+Similar to input and output devices, these objects are instrumented with sensors or actuators to create links with the digital world.
-Vibrotactile widgets~\cite{frisson17,frisson20}
-Leverages vibrotactile feedback for touch surfaces.
-,
\ No newline at end of file
+%\cite{deroy12}
+
+
+\paragraph{Complement vision and audio} Haptics can also complement the visual and auditory modalities, as in virtual reality, where haptic stimulation is combined with the other modalities to enrich the experience.
+
+\paragraph{Sensory restoration} Haptics can also restore haptic sensations lost when physical controls are replaced by digital surfaces~\cite{maclean09}.
+
+
+%Most of the time, haptic is seen as a feedback modality.
+
+
+\section{Active haptics}
+
+ Designing haptic messages requires a vocabulary of haptic variables, which can be combined into structured messages such as Tactons.
+ The diversity of tactile sensations~\cite{lederman87} translates into a diversity of devices~\cite{seifi19}.
+
+ % The features are: linguistic/nonlinguistic, analogue/non-analogue, arbitrary/non-arbitrary, static/dynamic\cite{bernsen93a}
+
+ % 8. Touch language Touch letters, numerals, words, other touch language related signs, text, list and table orderings.
+ % Example: Braille
+ % 18. Real-world touch Single touch representations, touch sequences.
+ % 20. Touch graphs 1D, 2D or 3D graph space with geometrical forms.
+ % Pure charts (dot charts, bar charts, pie charts, etc.).
+ % 24. Arbitrary touch Touch signals of differents sorts.
+ % 28. Touch structures Form fields, frames, grids, line separations, trees.
+
+\begin{figure}[htb]
+\centering
+\definecolor{cellred}{rgb} {0.98,0.17,0.15}
+\definecolor{cellblue}{rgb} {0.17,0.60,0.99}
+
+\newcommand{\labelcell}[2]{
+\node[minimum width=3.0cm, minimum height=.75cm,text width=3.5cm, align=center, outer sep=0](#1) {\textbf{#2}};
+}
+\newcommand{\bluecell}[2]{
+ \node[minimum width=3.0cm, minimum height=1.5cm,fill=cellblue, text=white,text width=3.5cm, align=center, rounded corners=2ex, outer sep=0](#1) {#2};
+}
+\newcommand{\redcell}[2]{
+ \node[minimum width=3.0cm, minimum height=1.5cm,fill=cellred, text=white,text width=3.5cm, align=center, rounded corners=2ex, outer sep=0](#1) {#2};
+}
+\begin{tikzpicture}
+ \small
+ \matrix[row sep=1cm, column sep=4cm,inner sep=0, node distance=0, outer sep=5mm] (cells) {
+ \bluecell{mechanics}{Electro-Mechanical\\System} & \redcell{sensorial}{Sensorial\\system}\\
+ \bluecell{software}{Software\\Controller} & \redcell{cognitive}{Cognitive\\system}\\
+ \labelcell{info}{Information} & \labelcell{perception}{Perception} \\
+ };
+ \draw [->, -stealth', thick] (info.north) -- (software.south) node [midway, left] {Data};
+ \draw [->, -stealth', thick] (software.north) -- (mechanics.south) node [midway, left] {Command};
+ \draw [->, -stealth', thick] (mechanics.east) -- (sensorial.west) node [midway, above] {Mechanical effect};
+ \draw [->, -stealth', thick] (sensorial.south) -- (cognitive.north) node [midway, right] {Sensation};
+ \draw [->, -stealth', thick] (cognitive.south) -- (perception.north) node [midway, right] {Interpretation};
+
+\end{tikzpicture}
+\caption{Transmission of information through haptics: the software controller translates data into commands for the electro-mechanical system; its mechanical effect produces sensations in the user's sensorial system, which the cognitive system interprets into a perception.}
+\label{fig:loops}
+\end{figure}
+
+
+ \subsection{Tactons}
+ We explored haptic feedback for activity monitoring in the Activibe project~\cite{cauchard16}.
+
+ We targeted an off-the-shelf smartwatch: its simple ERM actuator only allows simple feedback, and therefore a limited vocabulary.
+
+ Can people notice and correctly interpret information when they do not expect the tactile cues?
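+
+ As an illustration, the sketch below encodes a progress value as a train of long and short pulses for a single on/off ERM motor.
+ This is a minimal sketch under assumptions of ours: the durations and the 50\%/10\% mapping are hypothetical, not the actual Activibe design.
+
+\begin{verbatim}
+# Minimal sketch: encode a progress value (0-100) as a pulse train
+# for a single on/off ERM actuator. Durations (ms) are hypothetical.
+LONG_MS, SHORT_MS, GAP_MS = 400, 100, 200
+
+def encode_progress(progress):
+    """A long pulse stands for 50%, a short pulse for 10%."""
+    progress = max(0, min(100, int(round(progress / 10.0)) * 10))
+    longs, rest = divmod(progress, 50)
+    shorts = rest // 10
+    pattern = []
+    for _ in range(longs):
+        pattern += [("on", LONG_MS), ("off", GAP_MS)]
+    for _ in range(shorts):
+        pattern += [("on", SHORT_MS), ("off", GAP_MS)]
+    return pattern
+
+# encode_progress(70) -> one long pulse, then two short pulses
+\end{verbatim}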
+
+ \subsection{Tactile textures}
+
+ A tactile texture is a pattern of tactile sensations perceived when exploring a surface.
+
+ We studied how tactile textures can transmit information~\cite{potier12,potier16}.
+
+ Transmitting information this way follows the chain of Figure~\ref{fig:loops}: the information is coded, translated into a command, rendered as a physical effect, perceived as a sensation, and finally interpreted back into information.
+
+ Information can get lost at many stages along this chain: limited command resolution, non-linear mechanical effects, bad contact between the device and the user, effects outside the perceptual range, and haptic illusions.
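+
+ The toy sketch below illustrates two of these losses: a value encoded as a texture spatial period is quantized by the command resolution, then clipped to the perceptual range.
+ The device and perceptual limits are hypothetical numbers, for illustration only.
+
+\begin{verbatim}
+# Toy example of loss along the transmission chain: encode a value
+# as a spatial period, render it with limited resolution and range,
+# then decode it back. All limits are hypothetical.
+STEP_MM = 0.5            # command resolution of the device
+P_MIN, P_MAX = 1.0, 8.0  # perceptual range of spatial periods (mm)
+
+def encode(value):       # value in [0, 1] -> period in mm
+    return P_MIN + value * (P_MAX - P_MIN)
+
+def render(period_mm):   # what the device actually produces
+    quantized = round(period_mm / STEP_MM) * STEP_MM
+    return min(max(quantized, P_MIN), P_MAX)
+
+def decode(period_mm):   # ideal observer, inverse of encode
+    return (period_mm - P_MIN) / (P_MAX - P_MIN)
+
+print(decode(render(encode(0.33))))  # ~0.357: not 0.33
+\end{verbatim}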
+
+
+
+\section{Passive haptics}
+ % Leverage the physical properties of computer peripherals. Use them as tangibles~\cite{pietrzak17}.
+ % Actuated peripherals.
+
+The question of whether computer peripherals such as a mouse can be used as TUIs is subject to debate.
+On the one hand, it complies with Ishii and Ullmer's definition quoted at the beginning of this chapter.
+Since the introduction of this definition, computer peripherals have actually become everyday objects.
+Augmenting them to couple them with digital information would make them TUIs.
+On the other hand, computer peripherals were specifically designed for interaction with digital information, and would not exist otherwise.
+Hence, considering them as TUIs would go against the very concept of a TUI.
+
+Now, consider users giving a talk with a slideshow.
+They hold the mouse in their hand and just use the buttons to move to the next slide, the same way they would with a remote control.
+In this case they are not using the mouse for what it was designed for.
+Is this sufficient to consider the computer mouse as a TUI in this specific scenario?
+
+Going further, imagine a computer mouse actuated so that it can move around on the desk.
+This mouse moves to give users notifications when they are not watching the screen.
+In this situation, the mouse is clearly not used as it was designed to be.
+In this work we explore actuation and motion as ways of interacting with computer peripherals that they were not designed for.
+%We discuss scenarios in which computer peripherals are tangible objects for interaction with digital information.
+
+\subsection{Related work}
+
+We describe below evolutions of the desktop interface, the use of motion as an output modality, and shape changing interfaces.
+
+\paragraph{Rethinking desktop interaction}
+
+The way we interact with computer peripherals has not changed much since their invention.
+From an interaction point of view, mice, keyboards, and screens have remained essentially the same.
+%The mouse wheel is an exception to this observation.
+%The touchpad replaced the mouse in many situations, but mostly for laptop interaction.
+%Besides, the touchpad is not an evolution of the mouse, but rather a new device itself.
+%Touchscreens are as evolutions of computer screens.
+%However touchscreens on a desktop setup hardly compete with indirect pointing devices such as a mouse or a touchpad.
+
+%We see two reasons why the devices hardly evolve.
+%The first one is that these devices became mainstream because they are well designed.
+%The second one is that users have years of experience with these devices.
+Studies have shown that some design choices are questionable.
+For example, Pietrzak et al. studied the impact of mode delimiters for keyboard shortcuts by replicating the \textsc{Ctrl} and \textsc{Shift} keys on the thumb buttons of a mouse~\cite{pietrzak14}.
+They observed performance similar to shortcut entry on the keyboard.
+This suggests it makes sense to revisit design choices made decades ago.
+
+Research has explored additional dimensions to extend the capabilities of computer peripherals.
+Rekimoto et al. added capacitive sensing to the keys of a keyboard~\cite{rekimoto03}.
+This enables sensing whether the user is touching a key or not.
+They proposed scenarios in which this information displays feedforward, and others which take advantage of this extended vocabulary to enhance interaction.
+
+Beyond rethinking desktop devices, Bi et al. used the desk itself for interaction~\cite{bi11}.
+They extended the peripherals' capabilities with interaction on the desk, both for multi-touch input and a projected display.
+Conversely, Gervais et al. used everyday objects as viewports, which share or extend the screen's real estate~\cite{gervais16}.
+These systems explore tangible properties of the desktop environment to extend interaction.
+
+\paragraph{Motion output}
+
+Motion is a property of the interaction with an object.
+It is commonly used as an input value, but here we are interested in the motion of a physical object as an output modality.
+Motion as output produces both visual and haptic feedback.
+
+Löffler et al. designed insect-like desk companions~\cite{loffler17}.
+These companions move around on the desk to give the user notifications through the visual channel.
+The authors focused on their affective effect on the user.
+Interestingly, motion-based interfaces can take advantage of both the visual and haptic aspects of movement.
+Zooids are small robots which cooperate to achieve a particular task~\cite{legoc16}.
+In some situations they represent points on a graph.
+In others they move an object on a table.
+%\thomas{trouver ref sur l'audio ?}
+
+Actuating an object enables dynamic force feedback when the user touches it.
+For example, Roudaut et al. explored actuating a layer over a touchscreen to guide the finger touching the device~\cite{roudaut13}.
+This makes it possible to teach users gestures, such as gesture keyboard symbols.
+Other studies use motion to encode information.
+Either the system controls the movement~\cite{enriquez03,pietrzak05}, or it only constrains the movements of the user~\cite{pietrzak05a}.
+Similarly to Tactons~\cite{brewster04-2}, information is coded by mapping pieces of information to signal parameters such as amplitude, size or shape.
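+
+As a toy illustration of such a mapping (the shapes and scales are our own hypothetical choices, not taken from the cited systems), the sketch below generates a 2D motion trajectory whose shape encodes a category and whose amplitude encodes a magnitude.
+
+\begin{verbatim}
+import math
+
+# Toy mapping of information onto motion parameters: the shape of
+# the trajectory encodes a category, its amplitude a magnitude.
+def trajectory(category, magnitude, n=32):
+    amplitude = 5.0 + 15.0 * magnitude  # mm, hypothetical scale
+    if category == "circle":
+        return [(amplitude * math.cos(2 * math.pi * i / n),
+                 amplitude * math.sin(2 * math.pi * i / n))
+                for i in range(n)]
+    if category == "line":              # back-and-forth segment
+        return [(amplitude * math.sin(2 * math.pi * i / n), 0.0)
+                for i in range(n)]
+    raise ValueError(category)
+\end{verbatim}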
+Below, we explain how we use motion to extend interaction with computer peripherals.
+
+\paragraph{Shape changing interfaces}
+
+Actuating objects also makes it possible to change their shape, and therefore their affordances.
+KnobSlider is an example of an interactive object which is either a knob or a slider, depending on its shape~\cite{kim16}.
+This object was specifically designed to behave this way.
+Conversely, Kim et al. designed an inflatable mouse~\cite{kim08} which can either give notifications or be used as an elastic input device for continuous rate control.
+
+\subsection{Actuated peripherals}
+
+In this project we used both concepts, motion as output and shape change, to redesign computer peripherals.
+We discuss design rationales at the device level and the desktop level, and envision extending the concept to an entire room or house.
+
+\subsubsection{Device level}
+
+Motion is an essential aspect of interaction with peripherals.
+Pointing devices rely on movement measurements.
+Keyboards use binary key positions as input data.
+In the Métamorphe project~\cite{bailly13}, we actuated the keys so that they can either be up or down (Figure~\ref{metamorphe}, left).
+A key can still be pressed, whether it is up or down.
+
+\begin{figure}[!htb]
+ \centering
+% \vspace{-3mm}
+ \includegraphics[height=4.4cm]{metamorphe_raised}\hfill
+ \includegraphics[height=4.4cm]{metamorphe_pinch}
+% \vspace*{-7mm}
+ \caption{Métamorphe is a keyboard with actuated keys, which can either be up or down. Left: view of the keyboard with two keys up. Right: raised keys have new affordances. They can be pushed or pinched.}
+ \label{metamorphe}
+% \vspace*{-3mm}
+\end{figure}
+
+This shape changing keyboard has new properties compared to regular keyboards.
+When a key is up, the user can push it in four directions, or even pinch it (Figure~\ref{metamorphe}, right).
+With a touch sensor all around it, the key could be used as an isometric pointing device such as a trackpoint.
+
+Our previous studies showed that raising keys eases eyes-free interaction with the keyboard.
+Specifically, we observed that users can locate raised keys, and the keys surrounding them, more easily.
+
+% \begin{figure}[!htb]
+% \centering
+% % \vspace{-3mm}
+% \includegraphics[width=\columnwidth]{metamorphe_pinch}
+% % \vspace*{-7mm}
+% \caption{Raised keys have new affordances. They can be pushed or pinched.}
+% \label{pinch}
+% % \vspace*{-3mm}
+% \end{figure}
+
+The possibilities of such a keyboard go beyond text typing and keyboard shortcuts. Similarly to Relief~\cite{leithinger10}, it is a shape changing device which can be used to display information.
+
+\subsubsection{Desktop level}
+
+%Several augmented devices exist in the literature.
+%They are essentially isolated devices, not a set of coherent devices which share the same behavior.
+We observed people using desktop computers, and identified situations in which they move their peripherals for purposes other than interacting with the computer.
+For example, we observed people turning their screen to avoid sun reflections.
+Other users turned their screen either to show visual content to somebody, or to show something in the room during a video conference with the camera affixed to the screen.
+It is also frequent to move the mouse and keyboard to make space on the desk for something else.
+
+In the Living Desktop project~\cite{bailly16}, we actuated a mouse, keyboard and screen (Figure~\ref{livingdesktop}):
+\begin{itemize}
+\item The mouse can translate in the $x,y$ plane.
+\item The keyboard can rotate, and translate in the $x,y$ plane.
+\item The screen can rotate, and translate along the $x$ axis.
+\end{itemize}
+
+\begin{figure}[!htb]
+ \centering
+% \vspace{-3mm}
+% \includegraphics[width=\columnwidth]{livingdesktop}
+% \includegraphics[width=\columnwidth]{livingdesktop_setup}
+ \includegraphics[height=6cm]{livingdesktop_concept}\hfill
+ \includegraphics[height=6cm]{livingdesktop_poc}
+% \vspace*{-7mm}
+ \caption{The Living Desktop is a concept in which desktop peripherals can move around on the desk.}
+ \label{livingdesktop}
+% \vspace*{-3mm}
+\end{figure}
+
+With these capabilities, devices can move on their own, without requiring the user to move them.
+The interesting question here is the degree of control users have over their devices.
+There is a continuum between full control and full automation, in which we identify some particular degrees (see the sketch after the list):
+
+\begin{itemize}
+ \item Telekinesis: the user moves the devices through remote control.
+ \item Tele-operation: the user suggests movements; the device decides to what degree it complies.
+ \item Constraint: the user defines the constraints on the devices' movements.
+ \item Insurrection: the user has no influence on the devices' movements.
+\end{itemize}
+\fixme{work this list a little}
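+
+The sketch below illustrates this continuum as a mediation policy, with scalar positions for simplicity; the levels follow the list above, and the compliance weight is an arbitrary choice of ours.
+
+\begin{verbatim}
+# Decide the executed motion from a user request and the device's
+# own intent, depending on the degree of control. Positions are
+# scalars for simplicity; the 0.5 compliance weight is arbitrary.
+def executed_motion(level, user_request, device_intent, lo, hi):
+    if level == "telekinesis":    # user fully controls the device
+        return user_request
+    if level == "tele-operation": # device partially complies
+        return 0.5 * user_request + 0.5 * device_intent
+    if level == "constraint":     # user only bounds the motion
+        return min(max(device_intent, lo), hi)
+    if level == "insurrection":   # user has no influence
+        return device_intent
+    raise ValueError(level)
+\end{verbatim}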
+
+We implemented a couple of scenarios, which illustrate the concept.
+
+\paragraph{Peephole display}
+
+Even with a large screen, the interactive screen real estate is limited.
+We propose to use the screen as a peephole display into a larger working space.
+In this scenario, the screen moves along the $x$ axis and its pixels show the content of the workspace at its current position.
+The screen acts like a moving physical window.
+In this scenario, the user controls the screen position.
+
+\paragraph{Video Conference}
+
+When video-conferencing with a desktop computer, the camera is usually affixed to the screen.
+A problem arises when users would like to show something they are manipulating outside the camera range.
+They have to move the screen at the same time as they are manipulating the object.
+In this scenario, the screen follows users so that they can always see the video conference, and show what they are doing to their collaborators.
+Users do not control the screen position in the strict sense of the term. However, they can activate or deactivate this behavior, and still control the screen position manually or with another interface.
+
+\paragraph{Ergonomic coach}
+
+Being well seated is essential for healthy office work.
+It reduces fatigue and pain.
+It is however difficult to pay attention to our posture all day.
+In this scenario, devices move away if we are not seated correctly on the chair.
+The user has no control over the devices in this situation.
+
+
+\subsubsection{Going further}
+
+Looking at the office environment, many other objects are involved.
+They can be actuated to support other interactive scenarios.
+Probst et al. presented a chair prototype used for input~\cite{probst14}.
+These chairs are not actuated, but equipped with sensors.
+Nissan, however, designed self-parking chairs\footnote{\url{https://youtu.be/O1D07dTILH0}} which can move around.
+In their scenario the chairs move back under the table to tidy the room.
+But we can envision other scenarios.
+Pull-down beds are another example of existing moving objects, which are easy to actuate.
+
+At a larger scale, the concept of moving walls makes it possible to fit many rooms in a small flat\footnote{\url{https://vimeo.com/110871691}}.
+Each wall has specific equipment, suitable for a particular room.
+Thinking bigger still, rotating houses are another example of actuated environments\footnote{\url{https://youtu.be/dIHUwp9x8Fg}}.
+The obvious application is to maintain sunlight at a specific location in the house.
+But there may be many interesting interactive scenarios to study with such a building.
+
+\subsection{Conclusion}
+
+Early studies of TUIs considered everyday objects for interaction.
+Nowadays, computer peripherals have become everyday objects.
+As such, they can also be considered as TUIs, as long as they are not used as the devices they were designed to be.
+We discussed how actuating computer peripherals enables new interactions.
+We presented a keyboard prototype with actuated keys which can move up and down.
+We also presented a concept in which computer peripherals move around on the desk.
+We envision this concept applying to many other objects in our environment.
+The question we must always keep in mind is the degree of control we would like to keep over these wandering TUIs.
+
+
+\section{Conclusion}