Mapping the perceptual topology of auditory space permits the creation of hyperstable virtual acoustic environments
- Brimijoin W. Owen (Facebook Reality Labs)
- Featherly Shawn (Facebook Reality Labs)
- Robinson Philip (Facebook Reality Labs)
Description
The perception of acoustic motion is not uniform as a function of azimuth: listeners need roughly twice as much motion at the side as at the front to judge the two motions as equivalent. Self-generated acoustic motion perception has also been shown to be distorted: sounds that move slightly with the listener's head are more consistently judged to be world-stable than sounds that are truly static. These distortions can be captured by a model that incorporates a head-centric warping of perceived sound location, characterized by a displacement of apparent sound location away from the acoustic midline. Such a distortion has been demonstrated: listeners tend to overestimate azimuth when asked to point at a sound source while keeping their head and eyes fixated straight ahead. Here we show that this mathematical framework may be inverted, and we demonstrate the benefits of remapping sound source locations toward the auditory midline. We show that listeners prefer different amounts of spatial remapping, but none preferred no remapping. Modelling shows minimal impact on spatial release from masking for small amounts of remapping, demonstrating that a more stable perceptual environment can be achieved without sacrificing speech intelligibility in spatially complex environments.
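As a rough sketch of the remapping idea described in the abstract, the Python below compresses head-centric azimuths toward the midline before converting back to world coordinates for rendering. The linear gain, its default value of 0.9, and the function names (`remap_azimuth`, `render_azimuth`) are illustrative assumptions; the paper's fitted warping function and listeners' preferred remapping amounts are not reproduced here.

```python
def remap_azimuth(azimuth_deg: float, gain: float = 0.9) -> float:
    """Compress a head-centric azimuth toward the auditory midline (0 degrees).

    A gain below 1.0 pulls the rendered location toward the front. The linear
    form and the default value are illustrative assumptions, not the model
    fitted in the paper.
    """
    return gain * azimuth_deg


def wrap_deg(angle_deg: float) -> float:
    """Wrap an angle in degrees to the interval [-180, 180)."""
    return (angle_deg + 180.0) % 360.0 - 180.0


def render_azimuth(world_azimuth_deg: float, head_yaw_deg: float,
                   gain: float = 0.9) -> float:
    """Return the world-frame azimuth handed to a binaural renderer so that a
    nominally world-fixed source is compressed toward the midline in
    head-centric coordinates."""
    head_relative = wrap_deg(world_azimuth_deg - head_yaw_deg)  # source relative to the head
    remapped = remap_azimuth(head_relative, gain)               # pull toward the midline
    return wrap_deg(remapped + head_yaw_deg)                    # back to world coordinates


# Example: a source at 90 degrees to the listener's right, with the head facing
# forward, is rendered at 81 degrees when the gain is 0.9.
print(render_azimuth(world_azimuth_deg=90.0, head_yaw_deg=0.0))
```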
Published in
- Acoustical Science and Technology, 41 (1), 245-248, 2020-01-01
- Acoustical Society of Japan
Details
- CRID: 1390565134814962560
- NII Article ID: 130007782688
- ISSN: 1347-5177, 0369-4232, 1346-3969
- Text language: en
- Data sources: JaLC, Crossref, CiNii Articles
- Abstract license flag: not available