=== Explaining the data:

The data for this experiment is split across multiple folders. Each folder is named 'participantName_observedObject_recordingNumber' (e.g. Katy_Gnocchi_1) and contains two files: 'livedata.json' and 'fullstream.mp4'. The file 'livedata.json' holds the raw eye-tracking data gathered during the experiment, such as the 2D gaze coordinates (x, y), the eye-tracking timestamp, and the video timestamp. The file 'fullstream.mp4' is a video recording of what was in front of the participant while wearing the glasses. Data was collected from two participants observing 9 objects, one participant and one object at a time. The objects were observed from different angles, so there are multiple short recordings for each object.

=== The hardware:

The human eye-tracking data was collected with Tobii Pro eye-tracking glasses.

=== Running the scripts:

To run all scripts that process the data (generate frames, filter the data, cluster, find the convex hull, and generate cortical images and fixation crop images), open a terminal in the 'Data' folder, where the run_scripts.py file is located, and type:

python run_scripts.py folder_prefix

E.g. python run_scripts.py Katy_

If you use 'Katy_Gnocchi' as the prefix, only the Gnocchi data from the participant 'Katy' will be run; if you use 'Katy_', all object data from the participant 'Katy' will be run:

python run_scripts.py Katy_Gnocchi
python run_scripts.py Katy_
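
The folder naming convention above can be decoded programmatically. Below is a minimal sketch; the function name and return format are illustrative and not part of the provided scripts.

```python
# Hypothetical helper: decode a recording folder name of the form
# 'participantName_observedObject_recordingNumber' (e.g. 'Katy_Gnocchi_1').
def parse_folder_name(name: str):
    """Split a folder name into (participant, object, recording number)."""
    participant, observed_object, number = name.split("_")
    return participant, observed_object, int(number)
```

For example, parse_folder_name("Katy_Gnocchi_1") yields the participant 'Katy', the object 'Gnocchi', and recording number 1. Note this assumes neither the participant name nor the object name contains an underscore.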
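
For readers who want to inspect 'livedata.json' directly, here is a hedged sketch of reading gaze samples from it. It assumes the file is newline-delimited JSON and that 2D gaze coordinates appear under a "gp" key with the eye-tracking timestamp under "ts" (the layout Tobii Pro glasses typically produce); check the keys in your own files before relying on this.

```python
import json

# Sketch only: key names "gp" and "ts" are assumptions about the
# Tobii live-data schema, not guaranteed by the scripts in this repo.
def read_gaze_samples(path):
    """Yield (timestamp, x, y) for every line that carries a 2D gaze point."""
    with open(path) as f:
        for line in f:
            record = json.loads(line)
            if "gp" in record:
                x, y = record["gp"]
                yield record["ts"], x, y
```

Lines without a "gp" entry (e.g. pupil or sync records) are simply skipped.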
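
The prefix matching described for run_scripts.py can be sketched as a simple string-prefix filter over folder names. This is an illustration of the selection behaviour, not the actual implementation inside run_scripts.py.

```python
# Illustrative sketch: a prefix such as 'Katy_' selects every recording
# folder whose name starts with it, mirroring 'python run_scripts.py Katy_'.
def select_folders(folders, prefix):
    """Return the folder names that match the given prefix."""
    return [name for name in folders if name.startswith(prefix)]
```

With the prefix 'Katy_Gnocchi' only the Gnocchi recordings are selected, while 'Katy_' selects every recording from that participant.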