References
Adams, Ernest (2009). Fundamentals of Game Design. Thousand Oaks: New Riders Publishing.
Adcock, Matt and Stephen Barrass (2004). “Cultivating design patterns for auditory displays.” In Stephen Barrass and Paul Vickers (eds.), Proceedings of the 10th International Conference on Auditory Display (n.p.).
Alves, Valter and Licinio Roque (2010). “A pattern language for sound design in games.” In Katarina Delsing and Mats Liljedahl (eds.), Proceedings of the 5th Audio Mostly Conference: A Conference on Interaction with Sound.
Barrass, Stephen (1996). “EarBenders: Using Stories About Listening to Design Auditory Interfaces.” In Proceedings of the First Asia-Pacific Conference on Human Computer Interaction APCHI’96.
Barrass, Stephen (2003). “Sonification Design Patterns.” In Eoin Brazil and Barbara Shinn-Cunningham (eds.), Proceedings of the 2003 International Conference on Auditory Display (pp. 170-175).
Blattner, Meera M., Denise A. Sumikawa and Robert M. Greenberg (1989). “Earcons and icons: their structure and common design principles.” Human-Computer Interaction 4/1: 11-44.
Blizzard Entertainment (2012). World of Warcraft: Mists of Pandaria (PC/Mac Software). Irvine, CA: Activision Blizzard Inc.
Block, Richard A. (2003). “Psychological timing without a timer: The roles of attention and memory.” In Hede Helfrich (ed.), Time and mind II: Information processing perspectives (pp. 41-59). Göttingen: Hogrefe & Huber.
Brewster, Stephen A. and Ashley Walker (2000). “Non-Visual Interfaces for Wearable Computers.” In Proceedings of IEE Workshop on Wearable Computing. London: Institution of Electrical Engineers.
Brewster, Stephen A., Peter C. Wright and Alistair D. N. Edwards (1993). “An evaluation of earcons for use in auditory human-computer interfaces.” In Ashlund, Mullet, Henderson, Hollnagel and White (eds.), Proceedings of INTERCHI’93 (pp. 222-227). Amsterdam: ACM Press, Addison-Wesley.
Chang, Jaeseung and Marie-Luce Bourguet (2008). “Usability framework for the design and evaluation of multimodal interaction.” In Proceedings of the 22nd British HCI Group Annual Conference on People and Computers: Culture, Creativity, Interaction 2: 123–126.
Dumas, Bruno, Denis Lalanne and Sharon Oviatt (2009). “Multimodal Interfaces: A Survey of Principles, Models and Frameworks.” In Denis Lalanne and Jürg Kohlas (eds.), Human Machine Interaction. Lecture Notes in Computer Science, vol. 5440. Berlin: Springer.
Ekman, Inger (2005). “Meaningful noise: Understanding sound effects in computer games.” In Diddle A. (ed.), Proceedings of the 6th Digital Arts and Culture 2005 Conference: Digital experience: design, aesthetics, practice.
El-Nasr, Magy Seif and Su Yan (2006). “Visual Attention in 3D games.” In Hiroshi Ishii, Newton Lee, Stephane Natkin and Katsuhide Tsushima (eds.), Proceedings of the ACM SIGCHI International Conference on Advances in Computer Entertainment Technology. New York: ACM.
Friberg, Johnny (2004). “Audio games: New perspectives on game audio.” In Ryohei Nakatsu, Mark Billinghurst and Gino Yu (eds.), Proceedings of the 2004 ACM SIGCHI International Conference on Advances in Computer Entertainment Technology (pp. 148-154).
Gärdenfors, Dan (2003). “Designing sound-based computer games.” Digital Creativity 14/2: 111-114.
Gaver, William (1986). “Auditory icons: Using sound in computer interfaces.” Human-Computer Interaction 2: 167-177.
Gaver, William (1989). “The SonicFinder: An interface that uses auditory icons.” Human-Computer Interaction 4/1: 67-94.
Gaver, William, Randall Smith and Tim O’Shea (1991). “Effective sounds in complex systems: the ARKola simulation.” In Scott P. Robertson, Gary M. Olson and Judith S. Olson (eds.), Proceedings of CHI ’91 (pp. 85-90). New York: ACM.
Graham, R. (1999). “Use of auditory icons as emergency warnings: evaluation within a vehicle collision avoidance application.” Ergonomics 42/9: 1233-1248.
Grimshaw, Mark, Craig A. Lindley and Lennart Nacke (2008). “Sound and Immersion in the First-Person Shooter: Mixed Measurement of the Player’s Sonic Experience.” In Proceedings of the 3rd Audio Mostly Conference.
Grimshaw, Mark and Gareth Schott (2008). “A Conceptual Framework for the Analysis of First-Person Shooter Audio and its Potential Use for Game Engines.” International Journal of Computer Games Technology 2008.
Hollender, Nina, Cristian Hofmann, Michael Deneke and Bernhard Schmitz (2010). “Integrating cognitive load theory and concepts of human–computer interaction.” Computers in Human Behavior 26/6: 1278–1288.
Jaimes, Alejandro and Nicu Sebe (2007). “Multimodal human–computer interaction: a survey.” Computer Vision and Image Understanding 108/1: 116–134.
Jørgensen, Kristine (2006). “On the Functional Aspects of Computer Game Audio.” In Proceedings of the Audio Mostly Conference (pp. 48-52).
Kahneman, Daniel (1973). Attention and effort. Englewood Cliffs, NJ: Prentice-Hall.
Kong, Jun, Wei Yi Zhang, Nan Yu and Xiao Jun Xia (2011). “Design of human-centric adaptive multimodal interfaces.” International Journal of Human-Computer Studies 69/12: 854-869.
Kramer, Gregory (1994). Auditory Display: Sonification, Audification, and Auditory Interfaces. Boston, MA: Addison-Wesley Longman Publishing.
Lalanne, Denis, Laurence Nigay, Philippe Palanque, Peter Robinson, Jean Vanderdonckt and Jean-Francois Ladry (2009). “Fusion engines for multimodal input: a survey.” In Proceedings of the 2009 International Conference on Multimodal Interfaces (pp. 153-160). New York: ACM.
McGee, Marilyn, Phil Gray and Stephen Brewster (2000). “The Effective Combination of Haptic and Auditory Textural Information.” In Stephen Brewster and Roderick Murray-Smith (eds.), Proceedings of the First International Workshop on Haptic Human-Computer Interaction (pp. 118-126). London: Springer Verlag.
Morgan, Alisa L. R. and John F. Brandt (1989). “An auditory Stroop effect for pitch, loudness, and time.” Brain and Language 36/4: 592-603.
Nesbitt, Keith (2004). “Comparing and Reusing Visualisation and Sonification Designs using the MS-Taxonomy.” In Stephen Barrass and Paul Vickers (eds.), Proceedings of the 10th International Conference on Auditory Display. Sydney, Australia.
Nesbitt, Keith V. and Stephen Barrass (2004). “Finding trading patterns in stock market data.” IEEE Computer Graphics and Applications 24/5: 45-55.
Nesbitt, Keith V. and Ian Hoskens (2008). “Multi-sensory game interface improves player satisfaction but not performance.” In Beryl Plimmer and Gerald Weber (eds.), Proceedings of the Ninth Conference on Australasian User Interface, Volume 76 (pp. 13-18). Darlinghurst: Australian Computer Society.
Ng, Patrick and Keith Nesbitt (2013). “Informative Sound Design in Video Games.” In Stefan Greuter, Christian McCrea, Florian Mueller, Larissa Hjorth, and Deborah Richards (eds.), Proceedings of the 9th Australasian Conference on Interactive Entertainment: Matters of Life and Death.
Ng, Patrick, Keith Nesbitt and Karen Blackmore (2015). “Sound improves player performance in a multiplayer online battle arena game.” In Stephan Chalup, Alan Blair and Marcus Randall (eds.), Artificial Life and Computational Intelligence: Lecture Notes in Computer Science (pp. 166-174). Switzerland: Springer International Publishing.
Nigay, Laurence and Joelle Coutaz (1993). “A design space for multimodal systems: concurrent processing and data fusion.” In Bert Arnold, Gerrit van der Veer and Ted White (eds.), INTERCHI: Conference on Human Factors in Computing Systems (pp. 172-178). Amsterdam: ACM.
Oviatt, Sharon (1997). “Multimodal interactive maps: designing for human performance.” Human-Computer Interaction 12/1: 93–129.
Oviatt, Sharon, Rebecca Lunsford and Rachel Coulston (2005). “Individual differences in multimodal integration patterns: what are they and why do they exist?” In Wendy Kellogg, Shumin Zhai, Gerrit van der Veer and Carolyn Gale (eds.), Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 241-249). New York: ACM.
Pao, Lucy Y. and Dale A. Lawrence (1998). “Synergistic Visual/Haptic Computer Interfaces.” In Proceedings of the Japan/USA/Vietnam Workshop on Research and Education in Systems, Computation and Control Engineering (pp. 155-162).
Parker, Jim R. and John Heerema (2008). “Audio Interaction in Computer Mediated Games.” International Journal of Computer Games Technology 2008.
Patterson, R. D. (1982). Guidelines for Auditory Warning Systems on Civil Aircraft. Cheltenham: Civil Aviation Authority.
Polotti, Pietro and Guillaume Lemaitre (2013). “Rhetorical Strategies for Sound Design and Auditory Display: A Case Study.” International Journal of Design 7/2: 67-82.
Ramos, Daniel and Eelke Folmer (2011). “Supplemental Sonification of a Bingo Game.” In Marc Cavazza, Katherine Isbister and Charles Rich (eds.), Proceedings of the 6th International Conference on Foundations of Digital Games (pp. 168-173). New York: ACM.
Reeves, Leah M., Jennifer Lai, James A. Larson, Sharon Oviatt, T. S. Balaji, Stéphanie Buisine, Penny Collings, Phil Cohen, Ben Kraal, Jean-Claude Martin, Michael McTear, T. V. Raman, Kay M. Stanney, Hui Su and Qian Ying Wang (2004). “Guidelines for multimodal user interface design.” Communications of the ACM 47/1: 57–59.
Röber, Niklas and Maic Masuch (2005). “Leaving the Screen, New Perspectives in Audio-only Gaming.” In Proceedings of the 11th International Conference on Auditory Display (pp. 92-98).
Smith, Daniel R. and Bruce N. Walker (2005). “Effects of auditory context cues and training on performance of a point estimation sonification task.” Applied Cognitive Psychology 19/8: 1065-1087.
Something Else (2013). Papa Sangre (iOS Software). London: Something Else.
Taito Corporation (1978). Space Invaders (Arcade Software). Tokyo: Taito Corporation.
Square Enix (2014). Thief (PC Software). Tokyo: Square Enix.
Stanton, Neville A. and Judy Edworthy (1999). “Auditory warnings and displays: An overview.” In Neville A. Stanton and Judy Edworthy (eds.), Human factors in auditory warnings (pp. 3-30). Aldershot: Ashgate.
Stockburger, Axel (2003). “The Game environment from an auditive perspective.” In Marinka Copier and Joost Raessens (eds.), Proceedings of Level Up: Digital Games Research Conference. Utrecht: Utrecht University.
Tan, Siu-Lan, John Baxa and Matthew P. Spackman (2010). “Effects of Built-in Audio versus Unrelated Background Music on Performance in an Adventure Role-Playing Game.” International Journal of Gaming and Computer-Mediated Simulations 2/3: 1-23.
Townsend, Jim T. and Ami Eidels (2011). “Workload capacity spaces: A unified methodology for response time measures of efficiency as workload is varied.” Psychonomic Bulletin and Review 18/4: 659-681.
Turk, Matthew (2014). “Multimodal interaction: A review.” Pattern Recognition Letters 36/1: 189-195.
Valente, Luis, Clarisse Sieckenius de Souza and Bruno Feijó (2008). “An exploratory study on non-visual mobile phone interfaces for games.” In Lucia Filgueiras and Marco Winckler (eds.) Proceedings of the VIII Brazilian Symposium on Human Factors in Computing Systems (pp. 31-39).
Valve Corporation (2013). Dota 2 (PC/Mac Software). Bellevue, WA: Valve, LLC.
Walker, Bruce N. and Gregory Kramer (2006). “Auditory Displays, Alarms, and Auditory Interfaces.” In W. Karwowski (ed.), International Encyclopedia of Ergonomics and Human Factors (pp. 1021-1025). Second edition. New York: CRC Press.
Wickens, Christopher D. (2008). “Multiple resources and mental workload.” Human Factors 50/3: 449-455.
Williams, Paul, Keith Nesbitt, Ami Eidels and David Elliott (2011). “Balancing risk and reward to develop an optimal hot-hand game.” Game Studies 11.
Wolfson, Stephen and G. Case (2000). “The effects of sound and colour on responses to a computer game.” Interacting with Computers 13/2: 183-192.
Zakay, Dan and Richard A. Block (1995). “An attentional-gate model of prospective time estimation.” In Véronique de Keyser, Géry d'Ydewalle and André Vandierendonck (eds.), Time and the dynamic control of behavior (pp. 167-178). Seattle: Hogrefe & Huber Publishers.