The independent control of both the frequency and amplitude of the signal is necessary for an expressive output vocabulary.
The typical way to drive precise vibrotactile actuators is to use a sound generation system.
This is convenient because the parameters of the signal are the same: frequency, amplitude, and shape.
The main difference is the frequency range: \qtyrange{1}{1000}{\hertz} for haptics and \qty{200}{\hertz} to \qty{20}{\kilo\hertz} for sound.
Managing the amplitude is easier with vibrations because the required amplitude levels are much lower.
In the end, the shape parameter is, in my opinion, the main source of complexity in the implementation of vibrotactile devices because it imposes a much higher sampling rate.
It makes the design of sound generation systems complex, especially with the microcontrollers available when this project started.
For the sake of simplicity, I opted instead for a straightforward design that enabled precise control of both frequency and amplitude at the cost of limited control of the signal shape\footnote{Controlling the signal shape remains possible, with a software $\Delta\Sigma$ modulation \href{https://tiny.one/DeltaSigma}{https://tiny.one/DeltaSigma}.}.
The idea is to control the frequency and amplitude with two PWM signals generated by the timers of a microcontroller (\reffig{fig:actuatorcircuit}).
The frequency signal typically ranges from \qtyrange{1}{1000}{\hertz}.
The amplitude is controlled with the duty cycle of a high-frequency signal.
We used voice coil actuators, which behave like low-pass filters: they smooth out this high-frequency signal, so that lowering the duty cycle reduces the amplitude of the actuator's movement.
Our prototypes used \qty{16}{\mega\hertz} controllers with 8-bit timers, which gives a \qty{62.5}{\kilo\hertz} PWM loop with 256 levels of amplitude.
They communicated with a host computer through a serial protocol over Bluetooth.
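To make this design concrete, here is a minimal firmware sketch of the two-timer scheme, assuming an ATmega328-class \qty{16}{\mega\hertz} AVR; the pin and register choices are illustrative and not the exact firmware of our prototypes.
\begin{verbatim}
/* Illustrative sketch for a 16 MHz ATmega328-class AVR (assumed target).
 * Timer0: fast PWM at 16 MHz / 256 = 62.5 kHz, duty cycle = amplitude.
 * Timer1: CTC interrupt toggling the driver, giving the 1--1000 Hz signal. */
#include <stdint.h>
#include <avr/io.h>
#include <avr/interrupt.h>

static void set_amplitude(uint8_t level)   /* 0..255 */
{
    OCR0A = level;                         /* duty cycle of the 62.5 kHz PWM */
}

static void set_frequency(uint16_t hz)     /* 1..1000 Hz (integer rounding) */
{
    /* Timer1 ticks at 16 MHz / 256 = 62.5 kHz; two toggles per period. */
    OCR1A = (uint16_t)(62500UL / (2UL * hz)) - 1;
}

ISR(TIMER1_COMPA_vect)
{
    PORTB ^= _BV(PB0);                     /* toggle the actuator driver */
}

int main(void)
{
    DDRD |= _BV(PD6);                      /* OC0A: amplitude PWM output */
    DDRB |= _BV(PB0);                      /* low-frequency drive output */

    TCCR0A = _BV(COM0A1) | _BV(WGM01) | _BV(WGM00);  /* fast PWM on OC0A */
    TCCR0B = _BV(CS00);                              /* no prescaling    */

    TCCR1B = _BV(WGM12) | _BV(CS12);                 /* CTC mode, /256   */
    TIMSK1 = _BV(OCIE1A);

    set_amplitude(128);                    /* half amplitude   */
    set_frequency(250);                    /* 250 Hz vibration */
    sei();

    for (;;) { /* in practice, both values are updated from the host link */ }
}
\end{verbatim}
The point is that the duty cycle of the fast PWM sets the amplitude while the slow timer sets the vibration frequency, so both can be changed independently.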
\input{figures/actuatorcircuit.tex}
\input{figures/dwellpointing.tex}
The cursor has no color and gives no tactile feedback when it is not over a target.
When it hovers a target, it is colored and all four actuators vibrate at \qty{50}{\hertz}.
After \qty{1}{\s} over the button, a ring is drawn around the cursor, and all four actuators vibrate at \qty{250}{\hertz} for \qty{150}{\ms}, then they stop for another \qty{150}{\ms}.
It warns the users that the animation is about to start.
The animation vibrates the four actuators in a clockwise sequence, each at \qty{250}{\hertz} for \qty{200}{\ms} followed by a \qty{175}{\ms} pause.
After this animation, all the actuators vibrate at \qty{250}{\hertz} for \qty{200}{\ms}, then the target is activated.
Overall, users therefore had to hover a button for \qty{3}{\s} to activate it.
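To make the overall timing explicit, the sketch below lists the feedback sequence as I read the description above; the actuator indices and exact event boundaries are illustrative, but the per-actuator \qty{200}{\ms} bursts and \qty{175}{\ms} pauses account for the \qty{3}{\s} total.
\begin{verbatim}
/* Sketch of the dwell feedback timeline described above (3 s in total).
 * Actuator indices 0..3 are illustrative; -1 means "all four". */
struct feedback_event {
    unsigned t_ms;        /* time since the cursor entered the target */
    int      actuator;    /* -1 = all actuators                       */
    unsigned freq_hz;     /* 0 = silence                              */
    unsigned dur_ms;
};

static const struct feedback_event dwell_timeline[] = {
    {    0, -1,  50, 1000 },  /* hover: 50 Hz (first second shown)      */
    { 1000, -1, 250,  150 },  /* warning burst                          */
    { 1150, -1,   0,  150 },  /* pause before the animation             */
    { 1300,  0, 250,  200 },  /* clockwise animation: each actuator     */
    { 1675,  1, 250,  200 },  /*   vibrates 200 ms, then 175 ms pause   */
    { 2050,  2, 250,  200 },
    { 2425,  3, 250,  200 },
    { 2800, -1, 250,  200 },  /* final burst, then the target activates */
};                            /* 2800 + 200 = 3000 ms = 3 s             */
\end{verbatim}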
We ran an experiment with the idea of measuring an increase in performance in a tactile condition over a visual-only condition.
This was motivated by the fact that, when we tried the buttons ourselves, tactile feedback seemed to bring some benefit.
Participants were presented with a screen showing a four-by-four array of buttons.
They had to select a series of buttons indicated by a highlight.
The details of the experiment do not matter.
We failed to detect a significant difference between the conditions, in either selection time or error rate.
I am now convinced that performance is not where the benefit of tactile feedback lies in this situation.
That being said, several participants reported that tactile feedback was a good addition to visual feedback, as it reduced the visual attention required.
For example, two participants said: “without tactile feedback I have to focus more on the visual feedback” and “I found [tactile feedback] more helpful than the visual feedback because I didn't have to focus 100\% visually with the tactile redundancy.”
Users also appreciated that tactile feedback indicated when they hovered a button.
For example, one of the participants said: “It felt more like interacting with a physical button when activating the button produced a sharp buzz.”
Therefore, the benefits of the tactile feedback seemed to be qualitative rather than quantitative.
This is why in the next iteration we focused on the qualitative benefits of tactile feedback for gestural interaction.
\subsection{Qualitative limitations}
The users steered the car by moving their arms as if they were holding a steering wheel.
The Kinect API computed a skeleton of the user.
The steering angle was computed as a function of the relative position of the hands of the skeleton.
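For illustration, a plausible way to compute this angle is to take the orientation of the line joining both hands; this is an assumption for the sketch below, not necessarily the exact formula we used.
\begin{verbatim}
/* Illustrative steering angle from the Kinect skeleton: the angle of the
 * line joining the two hands (assumed formula, names are illustrative). */
#include <math.h>

typedef struct { float x, y; } point2d;

static float steering_angle_deg(point2d left_hand, point2d right_hand)
{
    float dx = right_hand.x - left_hand.x;
    float dy = right_hand.y - left_hand.y;
    return atan2f(dy, dx) * 180.0f / (float)M_PI;  /* 0 = hands level */
}
\end{verbatim}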
The spatial location of the vibrations indicated the steering angle, as shown in \reffig{fig:cargame}.
The speed of the car was mapped to a modulation of the \qty{250}{\hertz} signal by a low-frequency signal between \qty{1}{\hertz} and \qty{25}{\hertz}, with bursts of \qty{50}{\ms}.
This modulation made users feel as if equidistant strips covered the road: the faster the car went, the more frequent the vibrations.
In addition, the bottom actuator vibrated for \qty{200}{\ms} at \qty{100}{\hertz} when the car was braking.
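A minimal sketch of this speed-to-modulation mapping is given below; the linear mapping and the constant names are assumptions for illustration, not the exact implementation.
\begin{verbatim}
/* Illustrative mapping from car speed to the "road strips" modulation:
 * the host emits a 50 ms burst of the 250 Hz carrier at a rate between
 * 1 Hz (slow) and 25 Hz (top speed). */
#define MOD_FREQ_MIN_HZ   1.0f
#define MOD_FREQ_MAX_HZ  25.0f
#define BURST_MS         50
#define CARRIER_HZ       250

/* speed_norm in [0, 1]: 0 = stopped, 1 = top speed. */
static float modulation_frequency(float speed_norm)
{
    if (speed_norm < 0.0f) speed_norm = 0.0f;
    if (speed_norm > 1.0f) speed_norm = 1.0f;
    return MOD_FREQ_MIN_HZ + speed_norm * (MOD_FREQ_MAX_HZ - MOD_FREQ_MIN_HZ);
}
\end{verbatim}
Every $1/f$ seconds, with $f$ the value returned above, the host would then send a \qty{50}{\ms} burst of the \qty{250}{\hertz} carrier, which produces the strip-like sensation.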
\input{figures/cargame.tex}
The users' feedback from this first experiment suggested that the tactile sensations provided benefits in terms of realism and immersion in the game.
Therefore I conducted another experiment, this time focused on qualitative benefits.
In the next submission, we performed a new user study and measured emotions with the PAD (Pleasure Arousal Dominance) questionnaire~\cite{mehrabian96} and presence with the PQ questionnaire of Witmer \etal~\cite{witmer98}.
%, significant difference in sensory and realism factors.