Sound source localization in virtual reality with ambisonics sound reproduction - Dataset.csv
Dataset posted on 28.09.2021, 14:18, authored by Thirsa Huisman, Ewen MacDonald, and Axel Ahrens.
This dataset was collected for a study on the effect of virtual reality glasses on auditory localization with ambisonics sound production.
Participants localized the perceived origin of an ambisonics-reproduced sound source, both with and without a virtual reality headset and both blindfolded and with visual information. They used a virtual reality controller to point at the perceived location.
The localization performance of 21 subjects was measured. All participants provided written informed consent. Note that subject 7 did not have normal hearing thresholds; data from this subject were excluded from the analysis. The subject number is stored as subject in the dataset.
Four ambisonics orders (1st, 3rd, 5th, and 11th) were used to produce the sound sources. The ambisonics order is stored as ambi_order in the dataset.
Stimuli (short noise bursts) were presented from -90 to 90 degrees azimuth in 7.5-degree steps. The presented azimuth is stored as x in the dataset.
6 Conditions were tested:
1 - Blindfolded without HMD
2 - Blindfolded with HMD
3 - Visual information without HMD
4 - Visual information with HMD
5 - Visual localization without HMD (to measure a pointing bias, localization error corrections are based on these data)
6 - Visual localization with HMD (to measure a pointing bias, localization error corrections are based on these data)
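The pointing-bias correction described for conditions 5 and 6 can be sketched as follows. This is a minimal illustration with made-up stand-in rows, not the dataset itself; the column names (subject, condition, x, azi) follow the description above, and the exact correction procedure used in the study may differ (e.g. interpolation or model-based bias estimates rather than a per-target mean).

```python
import pandas as pd

# Stand-in rows mimicking the dataset's columns; real data would come
# from the CSV file. condition 5 = visual pointing, 1 = blindfolded.
df = pd.DataFrame({
    "subject":   [1, 1, 1, 1],
    "condition": [5, 5, 1, 1],
    "x":         [-30.0, 30.0, -30.0, 30.0],   # presented azimuth
    "azi":       [-33.0, 32.0, -40.0, 35.0],   # pointed azimuth
})

# Pointing bias per subject and target, from the visual pointing condition
bias = (df[df.condition == 5]
        .assign(bias=lambda d: d.azi - d.x)[["subject", "x", "bias"]])

# Corrected localization error for the test condition:
# raw error (azi - x) minus the subject's pointing bias at that target
test = df[df.condition == 1].merge(bias, on=["subject", "x"])
test["corrected_err"] = test.azi - test.x - test.bias
print(test["corrected_err"].tolist())  # [-7.0, 3.0]
```

The per-target merge keeps the correction specific to each subject and loudspeaker position, which is why conditions 5 and 6 cover the same azimuth range as the test conditions.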
The perceived azimuth and elevation are stored as azi and elev in the dataset.
The corrected_err column is the azimuth localization error after correcting for the stimulus position and for pointing bias (estimated from the visual localization data, conditions 5 and 6).
The data are stored as a CSV file; no special software is required to open and analyze them.
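As one possible starting point, the CSV can be loaded and summarized with pandas. The snippet below uses a small inline stand-in instead of the real file (pass the downloaded file's path to pd.read_csv instead); the column names follow the description above, and the values are invented for illustration only.

```python
import io
import pandas as pd

# Stand-in for the downloaded CSV file; columns as described above.
csv = io.StringIO(
    "subject,condition,ambi_order,x,azi,elev,corrected_err\n"
    "1,1,1,-30.0,-35.0,2.0,-5.0\n"
    "1,1,11,-30.0,-31.0,1.0,-1.0\n"
    "7,1,1,-30.0,-50.0,3.0,-20.0\n"
)
df = pd.read_csv(csv)

# Exclude subject 7 (non-normal hearing thresholds, excluded in the analysis)
df = df[df.subject != 7]

# Mean absolute corrected azimuth error per ambisonics order
summary = df.groupby("ambi_order")["corrected_err"].apply(lambda s: s.abs().mean())
print(summary)
```

Grouping by ambi_order (and, analogously, by condition) mirrors the study's comparison of reproduction orders and viewing conditions.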