We did not explore a particular context or application but rather studied the possibilities and limitations of the technology.
In this project, we are interested in vibrotactile feedback, which is well covered in the literature.
We are however interested in a particular case: restoring haptic feedback on touchscreens.
Indeed, touchscreens have many advantages compared to physical interfaces.
They can be updated.
They have no mechanical parts that wear over time.
They are flat, so they are easy to clean.
They can even be integrated into curved interactive surfaces.
%This is convenient for dashboards with tactile input.
%Manufacturers are interested in this solution because it simplifies the fabrication process because they can include a flexible actuator sheet in their plastic injection molds.
Our colleagues at CEA LITEN\footnote{\href{https://www.cea.fr/cea-tech/liten/english/Pages/Work-with-us/Technology-platforms/Large-Surface-Printed-Electronics.aspx}{Pictic platform}} designed and implemented the actuators, and Walterpack\footnote{\href{http://www.walterpack.com/}{http://www.walterpack.com/}} worked on the integration of the actuators in the plastic injection mold.
We prototyped the driving electronics and clamping system, designed tactile widgets, and implemented the software for a dashboard prototype that was showcased at the Geneva Motor Show 2017\footnote{\href{https://www.carscoops.com/2017/03/this-is-what-sbarro-mojave-really-looks/}{Mojave} concept car, produced by the \href{http://www.e-sbarro.fr/}{Esperra Sbarro} school} (\reffig{fig:mojave}).
\begin{figure}[htb]
\centering
\includegraphics[height=5.2cm]{figures/mojave}%
\includegraphics[height=5.2cm]{figures/mojavedashboard}
	\caption[Mojave concept car.]{The Mojave concept car designed by the Esperra Sbarro school, showcased at the Geneva Motor Show 2017. We implemented the software part of the dashboard and the driving electronics.}
\label{fig:mojave}
\end{figure}
\subsubsection{Experimental platform}
\label{sec:printgetsplatform}
The design of these actuators is a trade-off between their size and thickness, the number of layers, and the signal voltage.
Our colleagues who designed and implemented these actuators performed FEM simulations and measured the response to the driving signal with laser vibrometers~\cite{poncet17,poncet16}.
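As a first-order intuition for this trade-off, one can look at the fundamental resonance of an idealized clamped circular plate; this is only a textbook estimate, and the multilayer printed actuators and their clamping were actually modeled with FEM:
\[
f_{01} \approx \frac{10.22}{2\pi a^{2}} \sqrt{\frac{D}{\rho h}},
\qquad
D = \frac{E h^{3}}{12\,(1 - \nu^{2})},
\]
where $a$ is the clamped radius, $h$ the thickness, $E$ the Young's modulus, $\nu$ the Poisson ratio, and $\rho$ the density.
The resonance frequency thus scales as $h/a^{2}$, while the deflection per volt of a bending actuator scales roughly as $a^{2}/h^{2}$, which is why size, thickness, number of layers, and signal voltage have to be balanced against each other.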
They gave us several prototypes that we could use to design vibrotactile widgets.
The first prototype (top of~\reffig{fig:printedslider}) had six buttons embedded in a molded plastic dashboard part.
We added tension strings to tighten the frame so that the clamping area could resonate.
This is the same approach as tensioning a drum head on a drum shell.
The capacitive touch sensor, printed with silver ink, is under the actuators.
The setup is depicted at the bottom of~\reffig{fig:printedslider}.
\subsubsection{Vibrotactile widgets}
The idea of using vibrations to simulate button clicks on touch surfaces started with vibrotactile actuators attached to PDAs~\cite{fukumoto01,nashel03,poupyrev02}, then mobile phones~\cite{brewster07,hoggan08}.
Current mobile phones use low-quality actuators (ERM or LRA), and vibrations are still mostly used for message and call notifications.
Mobile phones sometimes use vibrations for button presses, but the quality of this feedback is poor.
Indeed, other actuators provide sharper vibrotactile feedback, as we discussed at the beginning of this chapter.
Voice coil actuators provide precise and strong vibrations~\cite{yao10}, and piezo actuators have a smaller form factor and are convenient for implementing buttons~\cite{tashiro09,lylykangas11}.
The design of the tactile feedback for the slopes and points that we described in the previous paragraph requires an iterative process.
Several tools have been designed to support the creation of vibrotactile animations~\cite{schneider15,schneider16}, but they are not necessarily suited to designing haptic feedback for tactile widgets.
As we discussed in section~\ref{sec:printgetsplatform}, the haptic drivers offer an audio mode in which we can supply an audio signal to drive the actuators.
We leveraged this feature with Purr Data, a visual programming language for audio synthesis with a web-based interface, to design the vibrotactile feedback~\cite{frisson16}.
We describe the design of the vibrotactile feedback for tactile buttons in~\cite{frisson20}.
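To make this loop concrete, the sketch below shows how such a feedback signal can be generated and sent to the driver through its audio input; it is a minimal Python example, not our actual Purr Data patches, and the carrier frequency, decay time, and amplitude are illustrative values rather than the tuned parameters of the prototype.
\begin{verbatim}
# Minimal sketch: generate a short decaying sine burst -- a common
# "click" profile -- and send it to the haptic driver via an audio output.
import numpy as np
import sounddevice as sd   # any audio output wired to the haptic driver works

RATE = 48000               # audio sample rate (Hz)

def click(freq_hz=175.0, duration_s=0.030, decay_s=0.008, amplitude=0.8):
    """Return a short decaying sine burst (illustrative parameter values)."""
    t = np.arange(int(duration_s * RATE)) / RATE
    envelope = np.exp(-t / decay_s)       # sharp attack, exponential decay
    return (amplitude * envelope
            * np.sin(2 * np.pi * freq_hz * t)).astype(np.float32)

def play(signal):
    """Send the signal to the default audio output, i.e. the haptic driver."""
    sd.play(signal, RATE)
    sd.wait()

if __name__ == "__main__":
    play(click())                         # press feedback
    play(click(freq_hz=120.0))            # a duller variant, e.g. for release
\end{verbatim}
Iterating on these few parameters is precisely the loop that Purr Data made interactive for us.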
\subsubsection{Discussion and conclusion}
This work required interdisciplinary skills and knowledge to assemble existing building blocks into a complete system.
This is what I consider a strength of Human-Computer Interaction research, at least the way I practice it.
We have skills in many research domains, which allows us to design, implement and evaluate interactive systems.
When more expertise is required, we can efficiently collaborate with experts in other domains.
In this project, we collaborated with experts in materials science who designed the tactile actuators and modeled and simulated their vibrations.
%Further, Gervais \etal turned every object in the environment into a screen, to enrich our interaction with interactive systems~\cite{gervais16}.
Actuation is another modality that augments desktop interaction.
For example, a fleet of small robots can represent data dynamically~\cite{legoc16}.
However, the combination of ubiquitous displays with actuation brings another dimension that increases the interaction vocabulary~\cite{roudaut13a}.
Our approach with Living Desktop also considers the desktop workstation as a whole that should be integrated into its environment~\cite{bailly16}.
However, our primary focus is on augmenting devices and their interaction by leveraging their physical properties.
In both cases, we use actuation as a mechanism to provide new features, with two paradigms in mind.
The first one is shape-changing interaction: the shape of an object is a signifier of its affordances.
Hence, changing the shape of a device is an interesting way of providing and advertising an extended interaction vocabulary.
The second paradigm is tangible interaction, which “augments the real physical world by coupling digital information to everyday physical objects and environments”~\cite{ishii97}.
Here we see desktop peripherals as everyday objects that we manipulate as such.
\subsubsection{Device level}
\begin{figure}[htb]
	\centering
	% \vspace{-3mm}
% \includegraphics[width=\columnwidth]{livingdesktop}
% \includegraphics[width=\columnwidth]{livingdesktop_setup}
	\includegraphics[height=5.2cm]{livingdesktop_concept}\hfill
	\includegraphics[height=5.2cm]{livingdesktop_poc}
% \vspace*{-7mm}
\caption[Living Desktop]{The Living Desktop is a concept in which desktop peripherals can move around on the desk.}
\label{livingdesktop}
\end{figure}
With these capabilities, devices can move on their own without requiring the user to move them.
The interesting question here is the degree of control the user has over their devices.
Beaudouin-Lafon defines two interaction paradigms: \emph{computer-as-a-tool} and \emph{computer-as-a-partner}~\cite{mbl04}.
In the first case, the system only reacts to the users' actions.
In the second case, the users delegate tasks to the system.
Here I argue for a continuum between full control and full automation.
We discuss examples of application scenarios for four degrees of control.
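As a minimal illustration of this continuum, the sketch below models the request to move a peripheral under four degrees of control; the degree names and the \texttt{ask\_user}/\texttt{move\_device} callbacks are hypothetical placeholders, not the actual scenarios of Living Desktop.
\begin{verbatim}
# Hypothetical degrees of control for an actuated peripheral (illustrative
# names, not the scenarios discussed in the Living Desktop paper).
from enum import Enum

class ControlDegree(Enum):
    MANUAL = 1        # tool: the user moves the device, the system never does
    CONFIRMED = 2     # the system proposes a move, the user validates it
    OVERRIDABLE = 3   # the system moves the device, the user can cancel
    AUTONOMOUS = 4    # partner: the system moves the device without asking

def handle_move_request(degree, ask_user, move_device):
    """Handle a request to move a peripheral for a given degree of control.

    ask_user and move_device are placeholders for the actual prompt and
    actuation commands of the platform.
    """
    if degree is ControlDegree.MANUAL:
        return False                      # never actuated by the system
    if degree is ControlDegree.CONFIRMED and not ask_user("Move the keyboard?"):
        return False                      # the user declined the proposal
    move_device()                         # CONFIRMED (accepted), OVERRIDABLE,
    return True                           # and AUTONOMOUS all end up here

if __name__ == "__main__":
    handle_move_request(ControlDegree.CONFIRMED,
                        ask_user=lambda msg: input(msg + " [y/n] ") == "y",
                        move_device=lambda: print("moving..."))
\end{verbatim}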
In the first case, I also worked with a postdoc who had a background in cognitive sciences.
Therefore our studies focused on the perception and interpretation of tactile textures.
In the second case, I worked with a postdoc who had a background in audio and music technologies.
As a consequence, our work focused on signal generation, authoring tools, and the technical apparatus.
Finally, with the \emph{actuated devices} project I mainly worked with HCI colleagues and a master intern with an electrical engineering background.
The intern designed and implemented most of the robotics part of Living Desktop.
I worked mostly on the hardware part of Métamorphe, the software part of both, and the observation study.