%!TEX root = ../hdrmain.tex
\chapter{The motor ability}\label{chap:input}
\epigraph{The true delight is in the finding out rather than in the knowing.}{Isaac Asimov}
\begin{Abstract}
The sense of touch is the most common interpretation of the word “haptic”.
However, haptics also refers to our ability to touch and manipulate.
%In the context of interactive systems, this refers to inputs.
In this chapter we will focus on gestures we perform with our fingers, hands and arms.
%Most of input modalities require touch and manipulation.
It covers most of the commonly used input modalities: buttons, pointing interfaces, touch, and gestural interaction.
All these modalities rely on gestures, but they leverage different parameters.
Typically, buttons only sense binary contacts, regardless of which finger presses them or of the force or speed of actuation.
Pointing interfaces such as mice or touchpads sense 2D movements in addition to contacts.
Multi-touch interfaces can even sense several contact points and their movements simultaneously.
Finally, gestures can be sensed in 3D, in the air, without any contact.
%My hope is that skeptical readers will agree it makes sense at the end of the manuscript.
The relation between gesture input in the broad sense and haptics may seem far-fetched at first sight.
However, if we look at \reffig{fig:hapticpath} and switch the user and the system, we obtain \reffig{fig:motorpath}.
The user produces a mechanical effect that the system will sense and interpret.
Therefore, the user will play the same role as a haptic device.
With this in mind, it is not surprising that cognitive scientists call this phenomenon \emph{output}, whereas computer scientists call it \emph{input}.
\reffig{fig:motorpath} depicts both the user and the system; each has a hardware part, in the physical world, and a software part.
The software part of the user corresponds to ideas, or the mind in general, whereas the software part of the system refers to the code and its execution.
The purpose of the modalities mentioned above is essentially to sense and interpret gestures of the fingers, hands, and arms.
This is a much more complex task than it seems at first sight.
The hand alone has 27 degrees of freedom~\cite{elkoura03}.
The movement range of each of these degrees of freedom depends on multiple factors, including morphology, physical condition, age and gender~\cite{nasa14}.
It is therefore not surprising that input systems only sense a small part of the possible human movements.
\end{Abstract}

\begin{figure}[htb]
\centering
\definecolor{cellred}{rgb} {0.98,0.17,0.15}
\begin{tikzpicture}
\node[anchor=south, minimum width=\textwidth,minimum height=.75mm, inner sep=0, outer sep=0](thebar3) at (0,3.25) {\textbf{Physical World}};
\matrix[row sep=1.25cm, column sep=7mm,inner sep=0, node distance=0, outer sep=0mm] (cells) {
\labelcell{biopsycho}{Biology\\Movement Sciences} & \redcell{motor}{Motor\\system} & & & \bluecell{mechanics}{Sensing\\System} & \labelcell{elecmeca}{Electronics\\Mechanics}\\
 \labelcell{ergocs}{Cognitive sciences\\Psychology} & \redcell{movement}{Cognitive\\system} & & & \bluecell{software}{Software\\Controller} & \labelcell{csmath}{Computer Science\\Mathematics}\\
& \labelcell{action}{Action} & & & \labelcell{info}{Information} \\
};
\draw [->, -stealth', thick] (action.north) -- (movement.south) node [midway, left] {Intention};
% (software.north) edge (mechanics.south);
\end{tikzpicture}
\tikzexternaldisable
 \caption[Motor pipeline.]{Motor pipeline.}
 \label{fig:motorpath}
\end{figure}
\paragraph{Motor ability}
In this chapter, I use the term \defword{motor ability} as the mirror of the notion of \emph{sense of touch}.
As depicted in \reffig{fig:motorpath}, this notion comprises both the motor system and the associated part of the cognitive system.
Specifically, when users would like to perform an action, they form an intention that the cognitive system turns into execution commands for the motor system.
The motor system, typically muscles, tendons and joints, actuates the body, which in turn produces physical effects.
These physical effects enable the manipulation of objects in contact with the mobile parts of the body.
In doing so, the motor ability produces both forces and movements, which are precisely the quantities that input systems sense and interpret.

\paragraph{Input systems}
The Buxton collection of interactive devices\footnote{\href{https://www.microsoft.com/buxtoncollection}{https://www.microsoft.com/buxtoncollection}} illustrates the great diversity of input systems that have been designed over the years.
Input systems typically rely on sensors that transduce these forces and movements into signals.
These signals are then filtered and mapped through transfer functions before being turned into the events delivered to interactive applications.

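To make this pipeline concrete, the sketch below (in Python, with purely illustrative names and constants, not tied to any particular driver or toolkit) shows one possible path from raw sensor samples to pointer events: raw displacements are smoothed by a simple low-pass filter, then scaled by a transfer function before being emitted as events.
\begin{verbatim}
# Illustrative input pipeline: raw sensor samples -> low-pass filter
# -> transfer function -> pointer events. Names and constants are
# examples only, not the API of any real system.

class PointerEvent:
    """Displacement event delivered to applications, in pixels."""
    def __init__(self, dx, dy):
        self.dx, self.dy = dx, dy

def low_pass(previous, current, alpha=0.5):
    """Exponential smoothing: alpha close to 1 follows the raw
    signal, alpha close to 0 smooths more but adds lag."""
    return alpha * current + (1.0 - alpha) * previous

def transfer_function(counts, gain=2.0):
    """Map motor displacement (sensor counts) to display displacement
    (pixels); real transfer functions usually depend on velocity."""
    return gain * counts

def process(samples):
    """Turn a stream of raw (dx, dy) samples into pointer events."""
    fx = fy = 0.0
    events = []
    for raw_dx, raw_dy in samples:
        fx = low_pass(fx, raw_dx)
        fy = low_pass(fy, raw_dy)
        events.append(PointerEvent(transfer_function(fx),
                                   transfer_function(fy)))
    return events

if __name__ == "__main__":
    for event in process([(1, 0), (3, 1), (10, 4)]):
        print(event.dx, event.dy)
\end{verbatim}
Actual systems use more elaborate filters and velocity-dependent transfer functions; each stage trades off noise, precision and latency.
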
%Pseudo haptics \cite{lecuyer01}
\section{Research questions}
\subsection{Sensing and interpretation}

=> Lagmeter

\subsection{Input vocabulary}
lexical, syntactic, semantic

combination, size
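As a purely illustrative example of how the sensed parameters determine the size of the input vocabulary: if a system can identify which of the ten fingers produces a contact, a single tap already yields $10$ distinct inputs, and simultaneous combinations of fingers (chords) yield
\[
2^{10} - 1 = 1023
\]
possible inputs, before even considering movement, force or timing.
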
=> Flexible pens, finger identification

\subsection{Engineering and evaluation of input techniques and devices}
=> RayCursor, facial expressions
\section{Contributions}