## Hover sensor tremor dataset

Original DOI: http://dx.doi.org/10.5525/gla.researchdata.246
Creator: John Williamson / University of Glasgow
Creation year: 2015
Format: Multiple CSV files, consisting of sensor time series

**See TremorExplanation.ipynb for a working IPython notebook which loads this data format.**

The dataset consists of measurements of a human hand above a capacitive touch sensor (x,y,z finger position time series), and measures the synchronisation of muscle tremor between fingers of the same hand. This data corresponds to the paper:

*"Fingers of a Hand Oscillate Together: Phase Synchronisation of Tremor in Hover Touch Sensing", J. Williamson, Proceedings of ACM SIGCHI 2016, 2016*

The sensor has 160 long-range electric field sensors arranged in a 10x16 grid, which produce intensity values that vary as conductive bodies approach the surface. The sensor samples at 120Hz, but the capture software provides data at ~100Hz. The sensor has a maximum sensing range of approximately 5cm above the surface, but is most effective in the 0-2cm range. Onboard firmware computes up to five independent finger $x,y,z$ positions, which are streamed along with the raw sensor values to a remote logging application.

### Data file format

The data is stored in CSV files, one directory per user. Each user has 9 conditions, as detailed below, and each condition has three repetitions. There are N=8 users in the study; 6 male and 2 female. All data is fully anonymised; only randomised user IDs are used.

Files are named `<user>/flow_<user>_<condition>_<rep>.csv`, where:

* **user** is the random alphabetic user ID (e.g. **xkx**)
* **condition** is one of the finger pose conditions described below (e.g. **1-1s**)
* **rep** is the repetition number of the trial, 1, 2 or 3

The CSV file format is:

    timestamp, rep, state1, x1, y1, z1, confidence1, slopex1, slopey1, slopeconfidence1, hxx1, hyy1, hxy1, fingerlikelihood1, likelihoodconf1, state2, ..., likelihoodconf2, state3, ..., likelihoodconf3, state4, ..., likelihoodconf4, state5, ..., likelihoodconf5, raw0, raw1, ..., raw159

This makes for a total of 227 columns. The [state*n*, x*n*, y*n*, z*n*, ..., likelihoodconf*n*] block is repeated five times, once for each possible finger $n$ (up to five fingers can be sensed at once). If a finger is not sensed, the corresponding state*n* is zero and the other values for that finger are zero as well.

* **timestamp** is in (fractional) seconds since the start of the trial
* **rep** is the repetition number (1, 2 or 3 -- matches the **rep** in the filename)
* **state*n*** indicates the touch state of the sensor for finger *n*: 0=absent, 1=above, 3=touching the screen
* **x*n*, y*n*, z*n*** is the *x,y,z* co-ordinate of finger *n*
* **confidence*n*, slopex*n*, slopey*n*, slopeconfidence*n*, hxx*n*, hyy*n*, hxy*n*, fingerlikelihood*n*, likelihoodconf*n*** are not used
* **raw*k*** is the raw value of sensor *k*; there are 160 = *16 x 10* sensors arranged in a grid, as shown in the code and images below.
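For reference, below is a minimal loading sketch in Python/pandas. It is an illustration only, not the dataset's own tooling (see TremorExplanation.ipynb for the working loader): it assumes the files have no header row and follow the 227-column layout above, and the file path simply fills the naming pattern with example values.

```python
import pandas as pd

# The 13 per-finger fields described above, repeated for fingers 1..5.
finger_fields = ["state", "x", "y", "z", "confidence", "slopex", "slopey",
                 "slopeconfidence", "hxx", "hyy", "hxy",
                 "fingerlikelihood", "likelihoodconf"]

columns = ["timestamp", "rep"]
for n in range(1, 6):                          # up to five fingers per frame
    columns += [f"{field}{n}" for field in finger_fields]
columns += [f"raw{k}" for k in range(160)]     # 160 raw sensor values

# Assumed: no header row in the CSV; the path is an illustrative example.
df = pd.read_csv("xkx/flow_xkx_1-1s_1.csv", header=None, names=columns)

# x,y,z trajectory of finger 1, keeping only frames where it is actually sensed.
finger1 = df.loc[df["state1"] != 0, ["timestamp", "x1", "y1", "z1"]]

# One frame of raw values reshaped into a 2D image of the sensor pad.
# Whether the grid is 10 rows x 16 columns or the transpose is an assumption
# here; the notebook and images show the actual layout.
raw_cols = [f"raw{k}" for k in range(160)]
frame = df.iloc[0][raw_cols].to_numpy(dtype=float).reshape(10, 16)
```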
## Condition codes

Each code identifies how many fingers of each hand were above the device. The hands are either static above the device (identified by an "S" suffix in the condition code) or moving in slow circles (identified by a "D" suffix in the condition code).

Conditions:

* 1-1S: one finger right hand, one finger left hand, static
* 2-0S: two fingers right hand, static
* 1-1D: one finger right hand, one finger left hand, moving in circles
* 2-0D: two fingers right hand, moving in circles
* 2-1S: two fingers right hand, one finger left hand, static
* 2-2S: two fingers right hand, two fingers left hand, static
* 3-0S: three fingers right hand, static
* 4-0S: four fingers right hand, static

## Experimental details

Participants were seated with the sensor resting on a table about 30cm in front of them. The experimental trials consisted of a series of hand poses given via instruction images. Each pose was held for 20 seconds and repeated three times. There was a mix of static poses (i.e. hands held at rest) and dynamic poses (fingers moving). Participants were instructed to keep their fingers around 1cm above the screen and to avoid touching it. Participants sat with the device in landscape orientation, with its right edge facing them.

----
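As a small addendum (not part of the original dataset tools), the hypothetical helper below decodes a condition code string into the number of right-hand fingers, the number of left-hand fingers, and the movement type, following the description above. Note that the filenames use a lowercase suffix (e.g. **1-1s**).

```python
def parse_condition(code):
    """Decode a condition code such as '2-1S' (or '1-1s') into its parts."""
    right, left = code[:-1].split("-")          # fingers on right / left hand
    movement = "static" if code[-1].upper() == "S" else "circling"
    return int(right), int(left), movement

print(parse_condition("1-1D"))   # (1, 1, 'circling')
print(parse_condition("4-0S"))   # (4, 0, 'static')
```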