\node[minimum width=1.0cm, minimum height=.75cm,text width=3.0cm, align=center, outer sep=0, column sep=0cm](#1) {\textbf{#2}};
}
\newcommand{\bluecell}[2]{
- \node[minimum width=3.0cm, minimum height=1.5cm,fill=cellblue, text=white,text width=3.5cm, align=center, rounded corners=2ex, outer sep=0](#1) {#2};
+ \node[minimum width=3.0cm, minimum height=1.3cm,fill=cellblue, text=white,text width=3.5cm, align=center, rounded corners=2ex, outer sep=0](#1) {#2};
}
\newcommand{\redcell}[2]{
- \node[minimum width=3.0cm, minimum height=1.5cm,fill=cellred, text=white,text width=3.5cm, align=center, rounded corners=2ex, outer sep=0](#1) {#2};
+ \node[minimum width=3.0cm, minimum height=1.3cm,fill=cellred, text=white,text width=3.5cm, align=center, rounded corners=2ex, outer sep=0](#1) {#2};
}
\tikzexternalenable
\begin{tikzpicture}
% (software.north) edge (mechanics.south);
\end{tikzpicture}
\tikzexternaldisable
- \caption[Motor pipeline.]{Motor pipeline.}
+ \caption[Motor sensing pipeline.]{Motor sensing pipeline with the user side and system side. Both have a hardware and a software aspect.}
\label{fig:motorpath}
\end{figure}
\paragraph{Motor ability}
-In this chapter I use the term \defword{motor ability} as the mirror of the notion of \emph{sense of touch}.
+\begin{definition}{ability}
+ The human \defwords{abilities}{ability} are the capacities humans have to act on their environment, similarly to the human \defwords{senses}{sense}, which are the capacities humans have to get information from their environment.
+\end{definition}
+
+%In this chapter I use the term \defword{motor ability} as the mirror of the notion of \emph{sense of touch}.
+In this chapter we focus on the \emph{motor ability}, which leverages the \emph{motor system} to touch and manipulate the environment and the objects it contains.
As depicted in \reffig{fig:motorpath}, this notion comprises both the motor system and the associated part of the cognitive system.
Specifically, when users would like to perform an action, they form an intention that the cognitive system turns into execution commands for the motor system.
The motor system, typically muscles, tendons, and articulations, actuates the body, which in turn produces physical effects.
Conversely, the input of depth cameras is filtered on the host side, because the host retrieves raw data and computes, for example, a skeleton~\cite{shotton11}.
The transfer function is typically computed on the host because it requires information about display.
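To make this concrete, here is a minimal Python sketch of such a transfer function: it converts raw device counts into display pixels and applies a speed-dependent gain. The function name, the saturating gain curve, and all parameter values are illustrative assumptions, not the transfer function of any actual system.

```python
def transfer(dx_counts, dt_s, cpi=1000, ppi=96, gain_max=3.0):
    """Map raw device counts to display pixels with a speed-dependent gain.

    Illustrative sketch: cpi is the device resolution (counts per inch),
    ppi the display resolution (pixels per inch); the gain grows with
    device speed and saturates at gain_max.
    """
    speed_ips = (dx_counts / cpi) / dt_s          # device speed, inches per second
    gain = min(1.0 + speed_ips / 10.0, gain_max)  # simple saturating acceleration curve
    return dx_counts * (ppi / cpi) * gain         # displacement on screen, in pixels
```

This sketch shows why the host needs display information: the same device motion maps to different pixel displacements depending on the display resolution and the gain curve.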
-When devices are integrated in the interactive system, they are connected to the host with a simple bus like SPI or I2C\footnote{\href{https://en.wikipedia.org/wiki/Serial_Peripheral_Interface}{https://en.wikipedia.org/wiki/Serial\_Peripheral\_Interface} \href{https://en.wikipedia.org/wiki/I\%C2\%B2C}{https://en.wikipedia.org/wiki/I\textsuperscript{2}C}}.
+When devices are integrated in interactive systems, they are connected to hosts with a simple bus like SPI or I2C\footnote{\href{https://en.wikipedia.org/wiki/Serial_Peripheral_Interface}{https://en.wikipedia.org/wiki/Serial\_Peripheral\_Interface} \href{https://en.wikipedia.org/wiki/I\%C2\%B2C}{https://en.wikipedia.org/wiki/I\textsuperscript{2}C}}.
In this case, the device implements a communication protocol that the host has to follow.
There is no standard protocol, but the overall idea is usually similar.
These buses use no error-correction codes; they are therefore fast but sensitive to interference.
Thus, devices that users plug in themselves use more robust buses such as USB with the Human Interface Device (\defword{HID}) class\footnote{\href{https://www.usb.org/hid}{https://www.usb.org/hid}}.
This class defines a standard communication protocol for interactive devices.
When the device is plugged in, it sends descriptors that list its features.
-In particular, the HID descriptor details the format and the semantic of the data packets the device will send at a fixed frequency.
-Thanks to this protocol, the host can interpret virtually any HID device with a generic driver.
+In particular, the HID descriptor details the format and semantics of the data packets the device will send at a fixed frequency.
+With this protocol, the host can interpret virtually any HID device with a generic driver.
Regardless of the communication method between the host and the device, the drivers of the operating system create \defwords{input events}{input event} that applications will interpret for their own use.
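As a concrete example of such fixed-format reports, the HID boot protocol defines a 3-byte mouse report: a button bitmap followed by signed 8-bit X and Y displacements. The Python sketch below decodes one report; it is illustrative host-side code, not part of any actual driver.

```python
import struct

def parse_boot_mouse_report(report: bytes) -> dict:
    """Decode a 3-byte USB HID boot-protocol mouse report.

    Byte 0 is a button bitmap (bit 0: left, bit 1: right, bit 2: middle);
    bytes 1 and 2 are signed 8-bit X and Y displacements.
    """
    buttons, dx, dy = struct.unpack("<Bbb", report[:3])
    return {
        "left": bool(buttons & 0x01),
        "right": bool(buttons & 0x02),
        "middle": bool(buttons & 0x04),
        "dx": dx,
        "dy": dy,
    }
```

Non-boot devices describe richer report layouts in their HID report descriptor, which is what allows a generic driver to decode virtually any of them.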
%Buxton collection of interactive devices\footnote{\href{https://www.microsoft.com/buxtoncollection}{https://www.microsoft.com/buxtoncollection}}
When I started writing this chapter, the idea was to highlight the symmetry between output and input, in particular in the case of haptics.
The previous chapter was about the sense of touch, one of the five senses.
% humans use to get information from their environment.
-I was however surprised by the difficulty for me to find an equivalent of the word “senses”, as the human input system, for the human output system.
-I decided to use the word \defword{ability}, that I refer to ways humans have to act on their environment, similarly to the way \defword{senses} enable them to get information from their environment.
+I was, however, surprised by how difficult it was to find an equivalent of the word “senses”, which denotes the human input systems, for the human output systems.
+I decided to use the word \defword{ability}, which I use to refer to the ways humans act on their environment, similarly to the way their \defwords{senses}{sense} enable them to get information from their environment.
In fact, humans do not have many kinds of ways to act on their environment.
This is maybe the reason why I could not find the word I was searching for.
-To the best of my knowledge, humans can do so with movements, voice, and fluid secretion.
+To the best of my knowledge, humans can do so with movements, voice, and fluid secretions.
This chapter focused on \emph{motor abilities}, which are abilities that leverage the human motor system.
The motor system enables people to touch and manipulate their environment, therefore it is the output part of the human haptic system.
I described in \reffig{fig:motorpath} the process between the moment a user plans a manipulation action and the moment the system produces information based on the sensed effects of this manipulation.
-Similarly to the analogous figure in the previous chapter, this description clarifies the software and hardware parts on both the human and system side, as well as the connection between hamans and systems.
+Similarly to the analogous figure in the previous chapter (\reffig{fig:hapticpath}), this description clarifies the software and hardware parts on both the human and system side, as well as the connection between humans and systems.
This process revealed general research questions that I addressed in some of my research projects.
The first research question was about the way we design and evaluate the sensing and interpretation of physical effects resulting from a human manipulation with an interactive system.
This question is essentially related to the system.
In particular I discussed a methodology and tool we designed and implemented to measure and slice the latency of interactive systems.
-The second research question is about the interactive input vocabulary.
+The second research question was about the interactive input vocabulary.
This is a whole research domain in itself.
It involves both users and systems because the system produces the input vocabulary, and users need the physical ability to perform the appropriate actions.
I discussed first the design of alternative input methods with flexible pens.
Then I discussed the mapping and reduction of degrees of freedom with finger identification for multi-touch interaction.
-The third research question is about unnatural input, or ways to leverage the properties of virtual environments to perform actions that are impossible in the physical world.
+The third research question was about unnatural input, or ways to leverage the properties of virtual environments to perform actions that are difficult or impossible in the physical world.
This research is essentially about interaction techniques, therefore it is both about the users and the system.
The contributions I describe are interaction techniques for immersive virtual reality.
The first one is a distant 3D pointing technique with proximity selection, and the second one is a facial expression selection technique.
%unnatural input
% vr
-limitations
-- latency
-- flexible pens
-- finger identification
-- immersive VR
-
-Transition
-
-All these input techniques use hands dexterity and our capacity to touch and manipulate.
-But the sense of touch is barely used.
-
-
+Indeed, all the studies we discussed in this chapter focus on hand dexterity and our capacity to touch and manipulate.
+However, haptics, as the sense of touch, was barely used.
+
+The latency measurement study assumed the output was visual.
+We performed measurements on Linux, macOS, and Windows with several graphics libraries, which send data to a graphics card connected to a monitor.
+We showed that this part of the process was the major source of latency.
+Haptic systems are quite different, and there is no clear standard.
+There is typically a haptic loop running at around 1\,kHz that computes forces or vibrations faster than the human haptic sensitivity~\cite{salisbury04}.
+Audio-haptic systems with physical simulations use loops up to 10\,kHz~\cite{leonard15}.
+In the future, I would like to study the effect of the haptic loop on the perception of different kinds of force models.
+I will also measure the latency of several types of haptic systems, to compare them to visual systems.
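To illustrate what such a haptic loop looks like, here is a minimal Python sketch of a fixed-rate loop rendering a virtual wall as a stiff spring (force proportional to penetration depth). The device callbacks `read_position` and `send_force` are hypothetical placeholders; a real implementation would run on a real-time thread or a dedicated controller.

```python
import time

def haptic_loop(read_position, send_force, rate_hz=1000, duration_s=1.0,
                wall_x=0.0, stiffness=500.0):
    """Fixed-rate haptic loop rendering a virtual wall as a stiff spring.

    Each tick reads the probe position, computes a penalty force
    F = stiffness * penetration when the probe is inside the wall,
    and sends it to the actuator.  read_position and send_force are
    hypothetical device callbacks.
    """
    period = 1.0 / rate_hz
    next_tick = time.perf_counter()
    for _ in range(int(duration_s * rate_hz)):
        x = read_position()
        penetration = wall_x - x                  # > 0 when inside the wall
        send_force(stiffness * penetration if penetration > 0 else 0.0)
        next_tick += period
        remaining = next_tick - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)                 # keep the fixed rate
```

The loop rate matters because, in such penalty-based rendering, the stiffness that can be rendered stably grows with the update rate; this is one reason haptic loops run an order of magnitude faster than visual ones.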
+
+The flexible pens we designed provide passive force feedback with their flexural stiffness.
+We built several prototypes with different flexural stiffnesses and studied their effect on both objective and subjective measures.
+However, we could also provide active vibrotactile feedback with an actuator.
+With controllable haptic feedback, we could give users an immediate haptic response while they bend the stylus.
+My hypothesis is that such feedback, like haptic detents, would help users control the bend of the device with continuous gestures.
+We could also create discrete input with click sensations for activation gestures.
+The overall idea would be to use haptic feedback to support direct manipulation.
+We will discuss this concept in a different context with a haptic wristband in \refsec{sec:hapticdm} of \refchap{chap:loop}.
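A minimal Python sketch of the detent idea: the continuous bend angle is quantized into bins, and a vibrotactile click would be triggered each time the trajectory crosses a bin boundary. The function and the step value are illustrative assumptions, not an implemented design.

```python
def detent_events(bend_angles_deg, detent_step_deg=10.0):
    """Return the sample indices where the bend crosses a detent boundary.

    Quantizes a continuous bend trajectory into bins of detent_step_deg
    degrees and emits an event whenever the bin changes; each event could
    trigger a short vibrotactile click.  Purely illustrative.
    """
    events = []
    prev_bin = int(bend_angles_deg[0] // detent_step_deg)
    for i, angle in enumerate(bend_angles_deg[1:], start=1):
        cur_bin = int(angle // detent_step_deg)
        if cur_bin != prev_bin:
            events.append(i)
            prev_bin = cur_bin
    return events
```

For example, `detent_events([0, 4, 9, 11, 15, 21])` returns `[3, 5]`: two boundary crossings, hence two clicks along the bending gesture.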
+
+Our multi-touch interaction paradigm with finger identification relies heavily on visual cues.
+It uses visual feedforward and feedback, as well as additional visuals to promote learnability and discoverability.
+Tactile feedback on touch surfaces is usually poor.
+At best, smartphones have a low-quality tactile actuator.
+We started this project with tabletops, which are difficult to actuate.
+%But above all, there is currently no convenient way to provide
+We discussed the design and evaluation of several potential technologies for this in \refsec{sec:stimtac} and \refsec{sec:printgets}.
+It would be interesting to see how such haptic feedback could reduce the visual clutter of our multi-touch interaction paradigm with finger identification.
+
+Finally, we discussed the way that immersive virtual environments enable users to perform actions that are difficult or impossible to perform in the physical world.
+In particular, we discussed two contributions: the first one was a distant pointing technique with proximity selection, and the second one was a facial expression selection technique.
+These techniques use basic haptic feedback, for example simple clicks upon selection activation.
+However, this feedback does not permit users to feel the objects they touch in the environment.
+Haptic devices for immersive Virtual Reality are an active research topic~\cite{bouzbib21}.
+However, solutions usually focus on particular manipulations.
+One of my current projects is about visuo-haptic integration and includes the design of an expressive vibrotactile VR controller.