% 2: label
% 3: position
\newcommand{\labelcell}[3]{
\node[minimum width=25mm, minimum height=8.5mm, text width=20mm, align=center, outer sep=0](#1) at (#3) {#2};
}
% 1: name
% 2: label
% 3: color
% 4: position
\newcommand{\mycell}[4]{
\node[minimum width=21mm, minimum height=8.5mm, fill=#3, text=white, text width=20mm, align=center, rounded corners=2ex, outer sep=0](#1) at (#4) {#2};
}
\tikzexternalenable
\begin{tikzpicture}[x=1mm, y=1mm]
\small
% Common/shared environment?
\node[minimum width=20mm, draw=black!20, fill=black!20, text=black, text width=20mm, line width=10mm, circle, align=center, outer sep=0, execute at begin node=\setlength{\baselineskip}{10mm}](environment) at (0,0) {Shared environment};
% Middle line
%\draw [line width=30, draw=black!20, fill=black!50] (0,-15) -- (0,15);
\mycell{user}{User}{myred}{-60, 0}
\mycell{system}{System}{myblue}{60, 0}
\labelcell{sensorimotor}{Sensorimotor loop}{-30,0}
\labelcell{execution}{Execution loop}{30,0}
%Perception
\draw[->, -stealth', line width=1mm, draw=myred!50!black, shorten >= -4.0, shorten <= 0.0] (environment.west) to[out=90,in=45, looseness=1.1] node[pos=0.8, above, rotate=20] {Perception} (user.north east);
%Action
\draw [->, -stealth', line width=1mm, draw=myred!50!black, shorten <= -4.0] (user.south east) to[out=-45,in=270, looseness=1.1] node[pos=0.2, below, rotate=-20] {Action} (environment.west);
%Input
\draw[->, -stealth', line width=1mm, draw=myblue!50!black, shorten >= -4.0, shorten <= 0.0] (environment.east) to[out=-90,in=225, looseness=1.1] node[pos=0.8, below, rotate=20] {Input} (system.south west);
%Output
\draw [->, -stealth', line width=1mm, draw=myblue!50!black, shorten <= -4.0] (system.north west) to[out=135,in=90, looseness=1.1] node[pos=0.2, above, rotate=-20] {Output} (environment.east);
\end{tikzpicture}
\tikzexternaldisable
\caption{The similarity of a user and a system interacting with their environment, through which they can also interact with each other.}
%Or this is the most efficient, most obvious, or only way to do this.
\subsection{Human behavior}
\label{sec:humanbehavior}
%The strongest roots of the understanding of human behavior that had the most significance on HCI research, at least related to the topic of this manuscript, is Gibson's work.
%Perception/action cycle~\cite{gibson79}
% humans and systems as autonomous entities
Humans and interactive systems share the same overall schema in the sense that they are both autonomous entities that interact with their environment through input and output streams (\reffig{fig:loops}).
People perceive and act on their environment thanks to their sensorimotor loop.
Therefore, our perception depends on both our sensory and motor capabilities.
These capabilities have limitations that influence the way we perceive our environment.
There are \emph{type} limitations: for example, we can perceive light waves but not magnetic fields.
There are \emph{range} limitations: for example, we can hear a sound wave of \SI{400}{\hertz}, but not of \SI{400}{\kilo\hertz}, and we can reach objects \SI{50}{\centi\metre} away from us, but not \SI{50}{\metre} away.
There are also \emph{precision} limitations: for example, we can distinguish colors whose wavelengths are \SI{100}{\nano\metre} apart, but not \SI{1}{\nano\metre} apart.
Precision is actually a range in differences; according to Weber's law, the discrimination threshold is proportional to the base stimulus value~\cite{fechner60}.
Finally, there are \emph{processing} limitations, related to our cognitive ability to interpret the signals resulting from our perceptions and actions.
Typically, illusions are distortions of what we could consider as ground truth.
Interactive systems have the same kinds of limitations.
They are limited by the inputs and outputs they can handle, and by the computing capacities available to interpret them.
Here, inputs and outputs do not necessarily refer to bit streams, but more generally to streams of physical phenomena of their environment that systems can sense or produce (\eg movements, sounds, light).
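Weber's law, mentioned above, can be written as
\[
  \frac{\Delta I}{I} = k,
\]
where $I$ is the base stimulus intensity, $\Delta I$ the just-noticeable difference, and $k$ the Weber fraction, a constant that depends on the sensory modality.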

%Enaction\cite{varela92,thompson10}
\input{figures/interactingloops.tex}
% humans and systems communicate
% they know their interpretation of the other entities, not what they actually are
Humans and machines are autonomous entities, but they are not independent.
On one side, humans use machines to extend the limitations discussed above.
Extending human capacities with computers marked the beginning of Human-Computer Interaction, with the pioneering visions of Bush~\cite{bush45} and Engelbart~\cite{engelbart68}.
On the other side, all machines need humans, otherwise they have no purpose.
They all need instructions and data, and they all modify the environment or produce information.
These interactions between entities, whether they are humans or machines, require communication.
What I mean by communication is producing a physical effect on a shared environment that the other entity can perceive.
%It is important to note that the second entity does not perceive this physical effect itself, but its own interpretation of it.
In \refsec{sec:humanbehavior} we discussed how people perceive their environment, and in particular interactive systems.
We presented how Norman's theory of action (see \reffig{fig:sevenstages}) explains the difference between the conceptual model of the system and the perceptual model the users build of it based on their perception.

\paragraph{Seven stages of reaction}

I suggest that interactive systems follow a similar perceptual scheme, as depicted in \reffig{fig:mysevenstages}.
% The system senses a physical effect in its environment.
% Then interprets it to form input events, which are filtered and normalized interpretations of these effects.
% The system combines these events into input phrases with interaction techniques.
% After this step,
The input chain begins with the \emph{sensing} stage.
Physical sensors measure physical effects in the environment.
They typically measure the users' movements, but they can also sense various other information such as light, temperature, moisture, or vibrations.
All this information is transformed into \emph{input events}.
At this stage, we notice that the infinite richness of the world is reduced to a small number of bits.
%Let's discuss the simple example of a keypress on a keyboard.
%The only information in the digital world is whether a key is pressed or not.
For example, when a user presses a key on a keyboard, the interactive system only senses whether the key is pressed or not.
There is no information about the finger that pressed it, nor about the speed of the finger or its trajectory.
There is no way to know whether a finger is hovering over the key, whether several fingers are pressing it, or even whether it was pressed with the nose.
Input events have to be treated as tokens, or lexical units.
They are interpreted as \emph{input phrases} with grammars or finite automata~\cite{appert06}, and form the building blocks of \defwords{interaction techniques}{interaction technique}~\cite{nigay93}.
The \emph{software} part is an interaction machine that receives streams of input phrases, interprets them with algorithms, and eventually sends back information through output streams.
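To make the lexical analogy concrete, here is a minimal sketch (in Python) of a finite automaton that combines press/release input events into a double-click input phrase; the event names and the \SI{300}{\milli\second} timeout are illustrative assumptions, not the API of any particular toolkit.

```python
# Hypothetical sketch: combining low-level input events (tokens) into an
# input phrase with a finite automaton, here recognizing a double click.
# Event names and the 0.3 s timeout are illustrative assumptions.

def recognize_double_click(events, max_gap=0.3):
    """events: list of (name, timestamp); True if they form a double click."""
    state = "idle"
    last_time = None
    for name, t in events:
        if state == "idle" and name == "press":
            state = "first_press"
        elif state == "first_press" and name == "release":
            state, last_time = "first_click", t
        elif state == "first_click" and name == "press":
            if t - last_time <= max_gap:
                state = "second_press"
            else:
                state = "first_press"  # too slow: treat as a new first press
        elif state == "second_press" and name == "release":
            return True  # input phrase completed
        else:
            state = "idle"  # unexpected token: reset the automaton
    return False
```

The same structure generalizes to richer input phrases: each accepting state of the automaton triggers one step of the interaction technique.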
%chooses whether there are algorithms to run, which algorithms to run, and with which parameters.
%It occurs typically when a command is selected, or an object is being manipulated.
%Every computer system has at some point an effect on the world.
%We extend the notion of dead code to not only code that is never executed, but also code whose result has no effect whatsoever on the physical world.
%Even a routing algorithm will at some point transmit data to a computer which will display it or print it in any way.
% Quantum information: stored in computer memory. Will it be observed?
%Bringing a piece of information to the physical world requires several steps.
First of all, the system must \emph{encode} the pieces of information.
A visual encoding can be an icon or a piece of text.
Audio encodings can be sounds~\cite{gaver93} or melodies~\cite{brewster93}.
Various haptic encodings include vibrations~\cite{brewster04}, forces~\cite{maclean03a,pietrzak05b,pietrzak05}, or textures~\cite{pietrzak09,pietrzak06,potier16}.
%Effectors can produce light (like screens), sounds, vibrations, forces, …
Output devices have driving electronics which require specific \emph{commands}, and turn them into \emph{physical effects}.
These physical effects are typically lights (as with screens), sounds, vibrations, and forces.
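As an illustration, here is a minimal sketch of the encoding stage; the battery-level example, the threshold, and the encoding names are all hypothetical.

```python
# Hypothetical sketch of the encoding stage: the same piece of
# information (a battery level) mapped to several encodings before
# being sent to output devices. All names and values are illustrative.

def encode_battery_level(level):
    """level in [0, 1]; returns one encoding per output modality."""
    low = level < 0.2
    return {
        "visual": "battery-low-icon" if low else "battery-icon",
        "audio": "warning-melody" if low else None,
        "haptic": {"vibration_ms": 200} if low else None,
    }
```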

\input{figures/sevenstages2.tex}

\paragraph{Funnel of evaluation}

The funnel of evaluation designates the fact that the input stages reduce the complexity of the system's environment to a few bits.
A good design of the input chain senses the right phenomena, at an appropriate amplitude, with sufficient spatial and temporal resolution, and with little distortion.
This information must be combined correctly to form a meaningful sequence of actions for the users to perform.
For example, the \defword{Midas touch} problem is a common issue with 3D gestural interaction.
Since the sensors continuously observe the users' movements, there is no obvious segmentation.
The system has no way to know whether the users move their hand to interact with the system or, for example, to scratch their nose.
\defword{Occlusion} is the opposite issue.
%, occlusion is the other problem with vision-based gesture sensors.
The sensor cannot get position information for objects outside of its field of view.
In these situations, we can widen the funnel of evaluation by adding segmentation gestures and by using multiple cameras, respectively.
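A minimal sketch of such a segmentation gesture, a clutch: hand movements are forwarded to the system only while an explicit pinch is held. The frame structure and field names are hypothetical, not a real tracker API.

```python
# Hypothetical sketch of gesture segmentation against the Midas touch
# problem: movements count as interaction only while a clutch gesture
# (here, a pinch) is engaged. Frame fields are illustrative assumptions.

def segment_gestures(frames):
    """frames: list of dicts with 'pinch' (bool) and 'pos' (x, y, z).
    Returns the list of strokes performed while the clutch was engaged."""
    strokes, current = [], None
    for frame in frames:
        if frame["pinch"]:
            if current is None:
                current = []           # clutch engaged: start a new stroke
            current.append(frame["pos"])
        elif current is not None:
            strokes.append(current)    # clutch released: close the stroke
            current = None
    if current is not None:
        strokes.append(current)        # stream ended mid-stroke
    return strokes
```

Movements performed outside the clutch, such as scratching one's nose, are simply discarded.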

\paragraph{Funnel of execution}

The funnel of execution is symmetrical to the funnel of evaluation.
%The software part of the system keeps a model~\cite{reenskaug79a} (or abstraction~\cite{coutaz87}), and it has to display parts of it in an intelligible way for users.
The way the objects of the internal model are shown to the users can have a huge impact on how they interact with them~\cite{zhang94}.
Therefore, the encoding stage is crucial, and it is a first filter that reduces the internal complexity of the system.
The specifications of the output device are a second filter.
There is a limit to the physical effects (\eg force, color, brightness, frequency) each device can produce in practice.
There is also a limit of precision, which depends on the electronics and mechanics.
Last, the physical effect can be inconsistent for the same command.
Some haptic devices behave differently depending on ambient conditions (\eg temperature, finger moisture, cleanliness).
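These range and precision limits can be sketched as follows, assuming a hypothetical force-feedback device with a 0 to \SI{3}{\newton} range and an 8-bit resolution; the numbers are illustrative, not the specification of any real device.

```python
# Hypothetical sketch of the output-device filter: a requested physical
# effect is clamped to the device's range and quantized to its
# resolution. The 0-3 N range and 8-bit (256 levels) resolution are
# illustrative assumptions.

def to_device_command(force_n, max_force=3.0, levels=256):
    """Map a requested force (in newtons) to an integer device command."""
    clamped = min(max(force_n, 0.0), max_force)  # range limitation
    step = max_force / (levels - 1)              # precision limitation
    return round(clamped / step)                 # quantized command

def command_to_force(command, max_force=3.0, levels=256):
    """Physical effect actually produced for a given command."""
    return command * max_force / (levels - 1)
```

The inconsistency limitation would appear as `command_to_force` varying with ambient conditions, which this deterministic sketch does not model.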

\begin{idee}
Communication: humans and systems both have a perceptual model of the other.
It does not necessarily match their conceptual model.
On the human side, the conceptual model of systems is known by the designer, the perceptual model comes from experience, and the software part evolves.
On the system side, the conceptual model of humans is unknown and studied by psychologists. The perceptual model depends on sensors and algorithms that
=> Towards computing affordance, a generalized notion of computability
\end{idee}
% something…
Abowd \& Beale's interaction framework~\cite{abowd91}
Extension of our capacities: instruments, delegates, collaborators…
Importance of communication: throughput, time and space constraints, language

\input{figures/wholeschema.tex}

Not only is the behavior of machines different from human behavior, but most importantly, the purpose of their behavior is different.
Computers are human inventions, and they are built to follow human-made specifications.
Their behavior is
Control and automation: Moravec's paradox
\end{idee}