Auditory display is an often underutilized interface modality for conveying information to a user. However, audio has proven effective in a variety of information presentation use cases and is particularly valuable when the user cannot attend to a visual interface, whether because of a disability or a temporary constraint such as vehicle operation. In addition to auditory representations of data (sonifications), audio can also be used to present a list of commands, or menu, within an interface. This thesis presents a concept for auditory menus that minimizes both the number of user inputs and the number of tactile controls required, thereby limiting simultaneous manual interactions when the user is also engaged in another demanding motor task. This approach to auditory menu interaction is referred to as a push menu and can be thought of as an alternative to more conventional auditory menus, referred to here as pull menus. A push menu presents menu items in an automated sequence; the user recognizes the desired item and makes a selection within its selection interval. In contrast, a pull menu requires the user to navigate through a combination of multiple navigation inputs and item selections. The general hypothesis of this thesis is that a primary visual-motor task, such as operating a vehicle, will be less negatively impacted by the secondary task of auditory menu interaction when the menu is a push menu rather than a pull menu.
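
The push-menu interaction loop described above can be summarized in a short sketch. The following Python code is a hypothetical illustration only, not the thesis's actual implementation; all names (PushMenu, dwell_s, select, run) are assumptions introduced for clarity. It cycles through menu items automatically and treats a single input received within the current item's selection interval as a selection of that item.

```python
"""Minimal sketch of a push-menu interaction loop (illustrative only)."""
import threading


class PushMenu:
    def __init__(self, items, dwell_s=2.0):
        self.items = items          # ordered menu items to announce
        self.dwell_s = dwell_s      # selection interval per item, in seconds
        self._pressed = threading.Event()

    def select(self):
        """The single tactile input: press while an item's interval is open."""
        self._pressed.set()

    def run(self, announce=print):
        """Announce each item in turn; return the item active when input arrives."""
        for item in self.items:
            announce(item)          # stand-in for text-to-speech playback
            if self._pressed.wait(timeout=self.dwell_s):
                return item         # input arrived within this item's interval
        return None                 # sequence finished with no selection
```

Note how the sketch reflects the contrast drawn above: a pull menu would require separate navigation inputs (e.g., next/previous) plus a confirmation input, whereas the push menu reduces the interaction to one input and one control.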