A novel interface for generating choreography based on augmented reality


Tafadzwa Joseph Dube, Gökhan İnce

Computer Engineering Department, Istanbul Technical University, Istanbul, Turkey

Keywords: Choreography, Virtual reality, Augmented reality, User experience, Natural user interfaces

ABSTRACT


Choreography is the creative art of defining sequences of movement. The choreographing process benefits greatly from the use of digital applications. In this study, we investigate the effects of different interaction techniques on user experience for a choreography generator interface. We develop a virtual reality-based and an augmented reality-based interface and compare them with 1) a personal computer-based choreography generator interface and 2) a touch-based mobile application for choreography generation. We evaluate user performance and user experience on the interfaces in terms of task completion time, as well as subjective criteria such as mental stress, physical stress, and pleasure. Our research contributes to the study of how different interaction methods for the same application affect user experience, and to the study of virtual and augmented reality in education and training applications. The results verify the effectiveness of augmented reality in developing training and design applications, particularly for choreography: the augmented reality interface emerged as the users' preferred choice. Furthermore, the results demonstrate the effectiveness of natural user interaction approaches in interface design.

1. Introduction

Choreography generation is a creative process for designing movement that gains a lot from the use of digital tools. The wave of digitization that swept across different sectors has transformed the way choreography has been done since the late 1960s (Cunningham, 1969; Noll and Hutchinson, 1967). Nonetheless, choreographers still lack access to digital tools that enhance their creative process (Felice et al., 2016). This motivates a need to explore more interactive ways to assist in choreography. Augmented reality (AR) and virtual reality (VR) show promise in education and training (Sinclair, 2016). As researchers are finding ways to embrace digital tools for choreography, this study explores VR and AR in the development of interfaces for choreography design. Furthermore, it is imperative to develop interfaces that are user-friendly so that users fully embrace these tools. It is therefore vital to develop interfaces that are effective, efficient and satisfying, as outlined in the ISO 9241 standard Ergonomics of Human-System Interaction (ISO, 2008; Shneiderman, 1997).

Natural User Interfaces (NUIs) show much promise in the Human-Computer Interaction (HCI) field. The NUI approach emphasizes interfaces that allow users to perform tasks in a natural way, including touch-based techniques, gestures and voice, and moves away from the traditional approach of using the keyboard and mouse (Besançon et al., 2016). In this study, we develop interfaces that use the traditional interactive approach as well as interfaces that incorporate aspects of NUI. NUIs have a pivotal role to play in education and training applications. With the rapid advances in technology, VR and AR have emerged as leading technologies of the 21st century (Wojciechowski et al., 2004), as predicted by David Hawks in 1997 (Hendee and Wells, 1997). Choreographers are seeking novel ways to enhance their work and make it more enjoyable. Therefore, in this study, we sought to use AR and VR based interfaces to enhance the choreographing process.

VR traces its roots to Ivan Sutherland's 1965 vision of computer displays as windows into a virtual world that is as real as possible (Sutherland, 1965). David Hawks defines VR as the process by which a computer system replaces the sensory stimuli of vision, sound, or touch that would be provided by the natural environment, to give an impression of being in an artificial or virtual world (Hendee and Wells, 1997). Developments in computer graphics and animation, particularly in the entertainment industry, have produced realistic virtual environments (Hendee and Wells, 1997; Lee et al., 2009). VR has since gained use in many critical sectors, unlike in its early days when it was used only for simulators and entertainment applications (Brooks, 1999). VR has also been widely explored in learning and education (Psotka, 1995) and has proved effective in providing simulation environments for many applications. Virtual environments offer an engaging medium for both interface developers and users.

Corresponding author. E-mail address: [email protected] (G. İnce).

https://doi.org/10.1016/j.ijhcs.2019.07.005
Received 24 October 2017; Received in revised form 15 April 2019; Accepted 10 July 2019; Available online 15 July 2019
1071-5819/ © 2019 Published by Elsevier Ltd.


AR presents an interesting approach to interface development. AR together with the touch screen affords NUI features. AR brings together the real world and the virtual world; it affords interaction with 3-dimensional objects superimposed onto the real world (Yilmaz, 2016). Using AR, users can interact with both the virtual and real world simultaneously. Early AR devices were characterized by bulky head-mounted displays. However, the surge in mobile devices as a cheaper AR platform has seen an increase in mobile AR applications (Craig, 2013). Rapid advances in computer graphics and the increasing processing power of modern-day mobile devices have driven this rise (Lee et al., 2009). Bujak et al. (2013) contend that the superimposition of virtual 3D objects onto the real world provides a novel experience that gives pleasure and generates amazement and curiosity. In 2016, AR took major strides in the gaming industry after the launch of “Pokemon Go”. The New York Times contends that the release of Pokemon Go could usher in a new era for AR and break through its current status to something bigger (Wingfield and Isaac, 2016). Consequently, it is imperative to embrace AR in interface development for a rich experience. Thus, we sought to incorporate AR into the choreographing process.

In this study, we design, implement and compare VR and mobile AR based interfaces with 1) a Personal Computer (PC) based choreography generator and 2) a touch-based mobile application for choreography generation. Our aim is to investigate the effects of different interactive techniques of the same application on user experience. We also investigate in general how VR and AR affect user experience in training tools, particularly choreography. The results of the experiments are important in understanding the impact of different interaction techniques on user experience, as well as the effects of VR and AR on training applications. In this study, we provide an important contribution presenting VR and AR choreography applications, bridging a gap in the creative technological field.

2. Related work

Over the years, different approaches have been explored to fully embrace technology for choreography generation. Judith Gray studied the evolution of dance learning with a particular focus on how technology shaped choreography over time (Gray, 1989). The author highlighted the coexistence of dance learning and technology over time. However, the lack of standards in the development of choreography applications hinders advances (Alaoui et al., 2014). Creating standards to shape the development of choreographing applications remains a major challenge. Although this allows creativity by enabling developers to come up with different ideas, it also leads to the development of applications that lack a proper structure.

Digital tools for choreography generation were introduced as early as the 1960s (Gray, 1989). Noll and Hutchinson (1967) and a New York choreographer (Cunningham, 1969) pioneered the transformation to digital choreography almost at the same time. The first choreographing computer system was developed by Noll in 1967. It was influenced by the need to create dance annotations without the need for a physical space or physical dancers (Noll and Hutchinson, 1967). The system made use of a two-dimensional interface, which utilized stick-figure representations of dancers displayed on the computer screen. Choreographers crafted dance annotations using different buttons and controls to manipulate the stick-figure representations. The system played an important role in paving a way for choreographing in the digital space. However, the major weakness of such a system is its dependence on 2D visualization, which hinders the natural feel of the interface. Furthermore, early systems suffered from the limited input techniques and low processing power of computers, whereas the modern era has seen an increase in processing power and the availability of different input techniques.

The major breakthrough in dance animation came courtesy of Rebecca Allen at the New York Institute of Technology. In research entitled “the bionic dancer”, she designed a wonderful, agile computerized dancer for Twyla Tharp’s video choreography titled “Catherine Wheel” (Allen, 1983). This became the backbone of most modern 3D dance animation systems; the model produced realistic representations of human motion. Calvert et al. (1993) describe the evolution of a choreography design interface in the early 90s and highlight the importance of a computer interface in choreography design. This computer based interface depended on the traditional keyboard and mouse for interaction.

VR technologies offer theatre and performance unique and compelling possibilities (Dixon, 2015); thus, they have been used in choreography tool interface development. DeLahunta (2002) studied aspects of VR in the theatre. The research underlined the importance of VR in the arts, further strengthening the notion that head-mounted displays equipped with headphones are very effective for immersion because they can essentially isolate participants from their surroundings and provide them with only the sights and sounds of the virtual world (Heim, 1998). In 2016, the Australian Centre for the Moving Image (ACMI) and Sydney Dance Company collaborated to introduce an immersive film that combines choreography with virtual reality and allows the audience to view the production from all angles (Jana, 2016). However, despite the invention of a 3D VR creation, manipulation and editing system including voice and gesture input interfaces (Cox et al., 2000), these interfaces have not been explored completely in choreography design. Issues that have impeded advances in VR for choreography include costs and time (Dixon, 2015). However, current trends have seen many VR head-mounted devices, like the Oculus powered Gear VR¹ and Google VR², unleashed on the market, and a corresponding decrease in the cost of VR. Thus, the passion of John Walker and the forecasts of other VR legends of the late 20th century can now be fully realized in creative VR applications. Therefore, our research explores the use of VR in choreography.

As digital tools evolved, AR has been incorporated in different applications. AR has gained a lot of use in teaching, training and visualization in many different institutions over the last decade (Lee, 2012). Choreography has also embraced AR and VR in recent years. Broll et al. (2004) describe an interactive mixed reality system for stage management and choreography. The system is a hybrid choreography application that uses both VR and AR, and it facilitates planning for stage shows and events. It is a flexible 3D environment that gives the choreographer and stage managers a chance to use digital objects to visualize their different tasks. It makes use of head-mounted devices to allow the choreographer to design a dance for 3D generated characters. Furthermore, 3D props can be used in a stage set-up to visualize how the real stage will look. To define choreography, the researchers used tracked tangible units which are linked to the virtual characters, and a voice command is used to initiate choreography generation. However, the major downside of this approach is its dependence on bulky and expensive hardware. Furthermore, it requires a lot of space for setting up the miniature stage, making it difficult for mobile users. This cumbersome approach to AR has been improved by recent advances in mobile hand-held devices like smart phones (Van Krevelen and Poelman, 2010). Therefore, in the interface we developed, users make use of the touch screen of a hand-held device to define and edit choreography.

Another digital approach to dance learning, based on a Web3D environment, is described by Davcev et al. (2003). This interactive system realizes dance animations for training and education. Using the application, a choreographer can compose various dance annotations. The Web3D environment is effective for observing fast and slow dance steps from different viewing angles. The system enables the dancer to learn the annotations defined by the choreographer from different angles and speeds.

¹ https://www.oculus.com/gear-vr/
² https://vr.google.com/


This 3D animation environment provides an easy way for dancers and choreographers to perform their work without the need for a physical platform. However, portability is a challenge, as the interface is only effective on desktop browsers.

VR and AR have been compared in different fields, especially in manufacturing and industry. A study by Boud et al. (1999) discusses the advantages and limitations of using VR and AR as training tools in assembly operations. The experiments focused on the completion times of different tasks using VR and AR. The results demonstrate the importance of AR and VR, as both gave an advantage of improved performance for assembly training over the traditional approach; however, they also produced different time outputs. The researchers concluded that both AR and VR have relative merits for training purposes, but their usefulness depends upon the particular application. Therefore, before employing these technologies it is important to investigate the task, to ensure that the benefits offered by both technologies are maximized. Thus, our research investigates these aspects with regard to choreographing tools.

Much work has been conducted to compare different interactive techniques. Besançon et al. (2016) investigate the usability of mouse-based and touch-based interactive approaches for manipulating objects in a 3D virtual environment. In the study, they measure subjective aspects such as fatigue, workload, and preference. The researchers used docking tasks to accomplish the investigation. The experiments were conducted in a well-controlled environment, which allowed the users to continuously give feedback. The results were important in showing that the two approaches provide relatively similar levels of precision; however, the time of interaction differs. The subjective results also showed which of the interactive techniques users preferred. The study contributed to research on interactive techniques and the usability of applications. However, to the best of our knowledge, there is no work comparing the effects of different interactive techniques on user experience for choreographing applications that also takes AR and VR into consideration. Therefore, our study focuses mainly on user experience with a choreographing tool. Furthermore, we focus on AR and VR for training and education, and we seek to develop a clear understanding of the effects of mobile AR on user experience.

3. Interfaces

We developed four interfaces: the PC based interface, the touch-based mobile application interface, the VR based interface and the AR based interface. We employed the participatory design approach, which puts emphasis on collaboration between developers and users (Schuler and Namioka, 1993). It is a huge step towards a good user experience (Wiklund Axelsson et al., 2017) and has been explored before in the design of choreography applications (Calvert et al., 1993; Felice et al., 2016). Therefore, choreographers helped to craft the design of the interfaces based on their needs. However, LaViola et al. (2017) argue that users may not be aware of the relevant perspectives or guidelines. Therefore, to finalize the designs, we followed established principles and guidelines. The design process was also informed by other players in the arts field. This interactive, iterative process played a big role in crafting the controls for the characters in the choreographing scene. The PC based interface was developed first; the rest of the interfaces inherit their functionalities from it. Although there are no standard guidelines for choreography applications, prior work (Calvert et al., 1998) also helped shape the design of the choreography interface. In the application, we limit choreography animation to movement only, for the sake of simplicity. Nonetheless, this kind of choreography is also important for military drills and drum majorettes.

3.1. PC based interface

We first developed the PC based choreography generator. The interface is a 3D choreographing environment that a choreographer uses to define different annotations. The presented stage is rich in props and resembles real theatre stages; its design was informed by the choreographers who participated in the study. The choreographer is presented with controls to add dancers and props to the stage. Fig. 1 shows the PC based interface with dancers and props added.

Fig. 1. A screen shot from the PC based interface showing props added.

The choreography generator interface can be customized to suit different setups. In this way, users can define their stages in different ways depending on the type of choreography they work on. Furthermore, an individual character profile can be created when adding a dancer; the profile contains attributes such as name, age, gender and height. This functionality allows the choreographer to define 3D virtual characters with attributes that make it easy for real performing artists to understand their roles in the choreography. The user can zoom into and out of the scene using the mouse and can also rotate the scene to view it from different angles. The interface provides high levels of zooming that permit both the choreographer and the dancers to view the defined annotations clearly.

To add a dancer to the scene, the user utilizes the “add dancer” button, then selects the character from the presented grid. Once the dancer is added to the scene, the choreographer can define the path that the dancer must follow. The “draw path” button sets the scene into a mode for defining the path, and the user drags the character along the intended path using the mouse. A line trail shows the user the path being defined, as shown in Fig. 2. After defining the choreography, the “play” button plays the defined choreography. While the scene is playing, the choreographer can use a slider to control the speed of the characters and the timeline of events.

Fig. 2. PC based interface: draw path.
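To make the draw-path interaction concrete, the following is a minimal sketch of the waypoint logic described above: drag positions are sampled into a path while “draw path” mode is active, and playback advances the character along the stored waypoints. All names are hypothetical; the actual tool is a Unity3D application (Section 4.1), so this Python sketch only mirrors its logic.

```python
# Minimal sketch of the draw-path / playback logic described above.
# Names are hypothetical; the actual tool is implemented in Unity3D.
from dataclasses import dataclass, field

@dataclass
class Dancer:
    name: str
    path: list = field(default_factory=list)   # recorded (x, y) waypoints
    position: tuple = (0.0, 0.0)

def record_drag(dancer: Dancer, drag_position: tuple) -> None:
    """Called while 'draw path' mode is active: each sampled mouse/touch
    position becomes the next waypoint, leaving a visible line trail."""
    dancer.path.append(drag_position)

def step_playback(dancer: Dancer, t: float) -> None:
    """Advance the dancer along its stored path; t in [0, 1] comes from
    the timeline, scaled by the speed slider."""
    if not dancer.path:
        return
    i = min(int(t * (len(dancer.path) - 1)), len(dancer.path) - 1)
    dancer.position = dancer.path[i]

# Example: record a straight back-to-front path, then play it halfway.
d = Dancer("Dancer 1")
for y in range(11):
    record_drag(d, (0.0, y / 10))
step_playback(d, t=0.5)
print(d.position)  # (0.0, 0.5)
```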


Other animations include a bubble that keeps track of any text a performing artist utters in the scene, and the ability to raise hands as a sign for different signals. The choreographer can set different signals that dancers use in the scene, including voice signals, which appear as note bubbles on the interface. The bubble carries a message that a character must utter, as shown in Fig. 3. This is analogous to a lead dancer calling out instructions during the dance or an actor delivering his lines.

Fig. 3. PC based interface: text bubble.

3.2. Mobile application interface

Mobile touch screens, such as those of phones and tablets, remove the reliance on the keyboard and mouse and present a touch-based interaction technique. The touch-based approach presents a new dimension for controlling the characters and interacting with the interface. This interface is designed to function on hand-held devices of different screen sizes. The application retains the same functionality and graphics as the PC based application but relies on the finger for interaction. The user can select the actors to add to the scene by touching a specific button on the character grid presented in the interface. Once the character has been added, the finger is used to draw the path the character follows: the user drags the character along the intended path, and a line trail utility sets a visual trail showing the path being defined, as shown in Fig. 4. Drawing the path using the finger is a natural way of performing this task. It gives the user an easy way to achieve the required result and, in addition, facilitates defining paths that are difficult to achieve using the mouse.

Fig. 4. Mobile application interface: draw path.

To play the scene, the “play” button is used. Whilst playing the choreography, the choreographer can view the scene from different angles and zooming levels. To zoom in or out, the interface allows the simultaneous use of two fingers on the touchscreen. Rotating the scene is achieved by dragging the view towards the required viewing angle using one finger. The mobile application also allows the user to add different props to decorate the scene. The underlying visual graphics of the PC based interface and the mobile application interface are generally the same, with small differences owing to the nature of mobile devices’ screens.
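The touch gestures just described reduce to simple geometry. The sketch below shows one plausible mapping, not the application's actual code: a two-finger pinch becomes a zoom factor and a one-finger horizontal drag becomes a rotation angle; the scaling constant is an assumption.

```python
# Illustrative gesture math for the touch interface described above:
# two-finger pinch -> zoom factor, one-finger drag -> rotation.
# This is a sketch of the principle, not the application's actual code.
import math

def pinch_zoom(p1_old, p2_old, p1_new, p2_new) -> float:
    """Zoom factor = ratio of current to previous finger separation;
    > 1 zooms in (fingers spread), < 1 zooms out (fingers pinch)."""
    d_old = math.dist(p1_old, p2_old)
    d_new = math.dist(p1_new, p2_new)
    return d_new / d_old if d_old > 0 else 1.0

def drag_rotation(start, end, degrees_per_pixel: float = 0.5) -> float:
    """Map a horizontal one-finger drag to a camera rotation angle.
    The degrees-per-pixel constant is an assumed tuning value."""
    return (end[0] - start[0]) * degrees_per_pixel

print(pinch_zoom((0, 0), (100, 0), (0, 0), (150, 0)))  # 1.5 -> zoom in
print(drag_rotation((200, 300), (260, 300)))           # 30 degree turn
```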

3.3. Virtual reality-based interface

VR presents an interesting extension to cyberspace, offering an opportunity for compelling experiences and more possibilities, especially in training and education (Sinclair, 2016). We developed an interface that increases realism in the choreography process. Virtual environments present diverse interaction techniques; they allow users to use their eyes, ears, and hands, just as they would normally do in the real world when performing virtual interactions (Jung et al., 2014).

Interaction in virtual environments can be classified into three categories: physical manipulation, direct manipulation and virtual manipulation (Jung et al., 2014; Mine, 1995). The physical manipulation category includes mice and joysticks as interaction devices in the virtual world. Despite giving users much control, these methods hinder the natural feel. Direct manipulation includes Head Mounted Displays (HMDs) and whole-hand gloves; such methods allow users to interact with objects using their intuitive motions, which gives a natural feel of interaction. Virtual manipulation devices include input devices that are developed as virtual controls, such as the virtual hand (Tomozoe et al., 2004). These devices offer important flexibility when interacting with virtual environments; however, their major drawback is the lack of haptic feedback and other issues when interacting with virtual objects (Mine, 1995). With this background, we developed a VR based interface that combines direct and physical manipulation techniques. For direct manipulation, the application utilizes Samsung’s Gear VR headset³, whereas for physical manipulation the Gear VR headset’s touchpad is used. The decision to use this approach is motivated by the fact that the manipulation methods of these techniques are based on human intuitiveness; therefore, training users before use is not necessary.

Adapting 2D menus in 3D user interfaces allows users to retain their existing knowledge from the use of 2D based applications (LaViola et al., 2017). Therefore, we adapted 2D menus in the VR based interface. However, spatial requirements have to be taken into consideration to maintain the immersion effect. Furthermore, the placement of the menu must be such that it is easy for the user to locate as they move around the scene (Bowman and Wingrave, 2001). Our VR scene is set inside a virtual theatre; the 2D menu is placed at the top back of the scene, where it is visually accessible most of the time. Bowman and Wingrave (2001) describe another approach, which fixes the menu to the user’s head so that it moves with the head at all times. However, during the iterative design process, choreographers preferred the menu fixed at the back of the stage. The interactive buttons provide colour feedback, which is important visual feedback for the user (LaViola et al., 2017).

To interact with the virtual environment, a virtual pointer called the reticle helps the user identify the position of his/her eyes in the virtual world. The reticle is the critical pointer when interacting with virtual objects. The Gear VR headset, coupled with the gyroscope sensors in the smartphone, assists in tracking the user’s viewpoint as he/she moves the head. The touchpad is the core interaction method for menu items and other objects in the virtual world. When the reticle is over an item in the virtual environment, the user can interact with that object by tapping on the touchpad. The touchpad responds differently to a single tap, a double tap and swipes.

³ http://www.samsung.com/global/galaxy/gear-vr/


Fig. 5. A screenshot of the VR interface with actors added.

The VR based interface is a virtual environment that presents a 3D theatre with a stage where actors and different 3D props can be added. To maintain consistency across the different interfaces, the presented environment is similar to the PC based and mobile application's theatre. However, in the VR based interface the user is immersed in the virtual environment, which gives him/her the feeling of existing inside the theatre. The user can view the theatre simply by turning his/her head, just like in real life. The VR based interface gives real-time responses to the user's actions, further enhancing the experience in the synthetic world. Fig. 5 shows the VR interface with actors added to the scene, as observed on the smartphone's screen. The small red circle functions as the reticle, which is a vital tool when interacting with the interface.

The user is presented with menu items to add actors, play the scene, define movement, add different props and reset the scene. Whenever the reticle hovers over a certain menu item, the item changes colour to make the user aware that he/she is looking at that particular item. To add an actor, the user uses the “add actor” button, by positioning the reticle on top of the button and then tapping the touchpad, and then selects the actor from the presented grid. The “draw path” button sets the mode to define the choreography. The user selects an actor on the stage by looking at the actor and tapping the touchpad, then defines movement by moving the head in the way the motion of the actor is expected to appear. A trail utility shows a visual trail for the actor. To give the user more control over the task of defining a path, a voice command to undo a defined path can be used: the “Undo” command discards the defined path and allows the user to define another movement. To control the speed of the actors in the scene, the user swipes the touchpad upwards to increase the speed and downwards to reduce it. Whilst playing the scene, the user can view the scene at different zooming levels by swiping the touchpad forwards or backwards. Finally, a reset button allows the user to reset the scene and start over. The reset button is important to allow the choreographer to quickly reorganize the characters on the stage.
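Gaze-based selection of this kind boils down to a hit test along the head's viewing ray followed by a touchpad event. The sketch below illustrates that loop under simplifying assumptions (a plain ray-sphere test, hypothetical names); it is not the application's Unity3D code.

```python
# Sketch of gaze-based selection with a reticle, as described above:
# cast a ray along the head orientation, highlight the hit object, and
# select it when the touchpad is tapped. Names are hypothetical.
import numpy as np

def gaze_hit(head_pos, gaze_dir, center, radius) -> bool:
    """Ray-sphere test: does the gaze ray pass within `radius`
    of an object's center (its selection bounds)?"""
    d = np.asarray(gaze_dir, float)
    d /= np.linalg.norm(d)                       # unit gaze direction
    to_obj = np.asarray(center, float) - head_pos
    along = float(np.dot(to_obj, d))             # distance along the ray
    if along < 0:                                # object is behind the user
        return False
    closest = to_obj - along * d                 # perpendicular offset
    return float(np.linalg.norm(closest)) <= radius

def on_frame(head_pos, gaze_dir, menu_items, touchpad_tapped):
    for item in menu_items:
        hovered = gaze_hit(head_pos, gaze_dir, item["center"], item["radius"])
        item["highlighted"] = hovered            # colour feedback
        if hovered and touchpad_tapped:
            return item                          # selected item
    return None

# Example: user looks straight ahead at the "add actor" button.
button = {"center": (0, 0, 5), "radius": 0.5, "highlighted": False}
print(on_frame(np.zeros(3), (0, 0, 1), [button], touchpad_tapped=True))
```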

3.4. AR based interface

The AR based interface presents the same functionality as the previous interfaces. However, the graphical interface of the AR based application differs significantly due to the nature of AR. The AR application is a mobile AR tool that utilizes marker based detection; the marker serves as the stage set-up required to initiate the interface. Marker-based AR uses a camera and a visual marker to determine the centre, orientation and range of its spherical coordinate system (Craig, 2013). The marker used for this application is shown in Fig. 6. Once the marker is recognized by the camera, the user is presented with the interface superimposed onto the real world. Fig. 7 shows the AR interface overlaid on the AR marker.

Fig. 6. Template designed for the AR marker.

Fig. 7. AR interface.

Using AR, different real-world tangible objects can be added on top of the marker, apart from the computer-generated objects. This aspect gives the choreographer the ability to use different real-world tangible objects for illustration. Therefore, using AR the stage is not a fixed environment; rather, it allows the choreographer to creatively define different environments for choreographing. AR's inherent nature provides a unique opportunity to create authentic, extraordinary environments that make use of both digital and physical material (Dunleavy et al., 2009). In Fig. 8, two physical objects are utilized as props and an image as background scenery. In the mobile or PC based interfaces, only computer-generated imagery can be used in the background for the stage setup.

The AR user interface also uses a 2D menu. 2D menus for AR have been studied and used before (Höllerer, 2004; LaViola et al., 2017). Posture is an important consideration when designing interfaces for handheld devices, so that users can use the interface comfortably (LaViola et al., 2017). It is vital to consider where the buttons are placed: LaViola et al. (2017) assert that on a handheld AR interface, selection should be easy and menus should not overlap with the workspace. On the AR interface, buttons are positioned at the bottom so that they are easily reachable at a comfortable posture.

The user utilizes his/her mobile device's camera to view the marker and initiate the interface. To interact with the interface, the user utilizes the touch screen of the mobile device and gestures. Once the marker is detected, the user is presented with the application's interface on the mobile device's screen. The interface provides controls for adding a dancer, drawing the path, adding props to decorate the scene, controlling the speed of the characters and resetting the scene. To add characters to the scene, the “add” button is utilized. Fig. 9 shows a user interacting with the AR interface. To define the choreography, the user drags the 3D character along the path to follow using the finger. This approach utilizes the touchscreen, as opposed to the tangible units used to define choreography in the mixed reality application of Broll et al. (2004).
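As a rough illustration of marker-based registration, the sketch below maps points authored in marker coordinates into camera space once a tracker reports the marker's pose. This is a generic homogeneous-transform sketch, not Vuforia's API.

```python
# Generic sketch of marker-based registration as described above: the
# tracker yields the marker's pose (rotation R, translation t) relative
# to the camera; stage objects authored in marker coordinates are
# mapped into camera space with that pose. Not Vuforia's actual API.
import numpy as np

def marker_pose(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from rotation and translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def to_camera_space(T_marker: np.ndarray, p_marker) -> np.ndarray:
    """Map a point defined on the marker (e.g. a dancer's position on
    the stage) into camera coordinates for rendering."""
    p = np.append(np.asarray(p_marker, float), 1.0)   # homogeneous point
    return (T_marker @ p)[:3]

# Marker lying 40 cm in front of the camera, no rotation:
T = marker_pose(np.eye(3), np.array([0.0, 0.0, 0.4]))
print(to_camera_space(T, (0.1, 0.0, 0.0)))  # dancer 10 cm right of centre
```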


Fig. 8. A screen shot from the AR interface, which shows interactive buttons, physical and virtual objects as props and an image as background scenery.

Fig. 9. User interacting with the AR interface.

A play button for playing the defined choreography is presented to the user, as shown in Fig. 7. Whilst playing the defined choreography, the user can control the speed of the characters, view the choreography from different angles and zoom in or out. To achieve different zooming levels, the user moves the mobile device closer to or further from the marker, whilst keeping the marker at a fixed position on the screen. The same functionality can also be achieved by moving the marker closer to the device's camera whilst keeping the mobile device in a fixed position. To view the scene from different angles, the user can move the camera around the marker. The scene can also be reset to correct any errors that arise. This functionality enables the choreographer to explore the interface further without fear of making errors.
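Because zooming here is physical, the on-screen scale of the stage is simply a function of camera-to-marker distance. Under a pinhole-camera assumption, apparent size is inversely proportional to distance, as the illustrative sketch below shows (hypothetical values, not the application's code).

```python
# Illustration of the AR zoom behaviour described above: moving the
# device closer to the marker enlarges the scene on screen. Under a
# pinhole-camera model, apparent size is inversely proportional to
# distance. Hypothetical values, not the application's actual code.

def apparent_scale(marker_size_m: float, distance_m: float,
                   focal_px: float = 1500.0) -> float:
    """On-screen size (in pixels) of a marker of the given physical
    size, viewed from the given camera distance."""
    return focal_px * marker_size_m / distance_m

reference = apparent_scale(0.20, 0.50)   # marker 20 cm wide, 50 cm away
closer    = apparent_scale(0.20, 0.25)   # user moves device to 25 cm

print(reference, closer)   # 600.0 px -> 1200.0 px
print(closer / reference)  # 2.0: halving the distance doubles the zoom
```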

4. Experiments

Experiments were conducted to evaluate the performance of users on the four interfaces for choreography generation: the PC based interface, the touch-based mobile application interface, the VR based interface and the AR based interface. We investigated four of Nielsen's usability factors: satisfaction, effectiveness, ease of learning and efficiency (Nielsen, 1993). To obtain satisfaction levels, we used a post-test questionnaire, in which users reported how pleasant it was to perform the different tasks on the interfaces. Effectiveness is shown by how the test subjects completed the given tasks; we used the questionnaire and observations to identify how well the users completed them. The users' feedback on how easy it is to learn the system gives an indication of the time needed to learn the different interfaces, and observing the test subjects using the interfaces helped us form an idea of this effect. The time scores for completing the tasks give the efficiency of each interface. The experiments sought a clear understanding of the effects of different interaction methods on the same application. We also evaluated the physical and mental effort required to perform the tasks on the interfaces, since a combination of task completion time, physical stress and mental stress can explicitly show the efficiency of an interface (Winter et al., 2008). Furthermore, we sought to identify which of the four interfaces users preferred.

4.1. Software and hardware

To develop all four interfaces, we used Unity3D⁴, a cross-platform game engine used to develop video games for computers, consoles, mobile devices and websites. Unity is a powerful platform for developing 3D graphical applications with rich interfaces. The AR application makes use of an additional plugin for Unity3D named Vuforia⁵, which facilitates the creation of robust AR applications on the Unity3D platform. It uses computer vision technology to recognize and track planar images and simple 3D objects in real time. Vuforia and Unity3D combine seamlessly, enabling the development of rich interfaces.

The PC based platform is developed and published as both a web application and a standalone application; however, to use the web application, a Unity3D plugin is needed. The user interacts with the interface using a keyboard and a mouse. The mobile application interface is designed for any hand-held device; for the experiments, we published the application on Android devices. The users use the touch screen to interact with the interface, and the interface also allows voice commands for some of the tasks. The VR application is developed and published for the Samsung Gear VR headset, motivated by the variety of interaction techniques that the Gear VR offers compared to other VR headsets. The users interact with the interface using the touchpad on the Gear VR and gestures. The AR application was also published for the Android platform.

To test the PC application, the participants accessed it as a web application from the Mozilla Firefox web browser and as a standalone desktop application. To test the mobile AR application and the mobile application, we used an LG G4 Stylus and a Samsung Galaxy S6. These devices have high-quality cameras of 12 and 16 megapixels, respectively, which are effective for AR applications. The VR tests were conducted using the Samsung Galaxy S6, with the Gear VR as the VR headset. These mobile devices were adequate for these applications.

⁴ http://www.unity3d.com
⁵ http://www.vuforia.com

4.2. Participants

To conduct the experiments, we used both professional choreographers and non-professionals, 20 test subjects in total. Five choreographers served as test subjects. They gave us insight from a professional choreographing point of view; furthermore, since they are experts in the field of choreography, their views were vital to understanding the positives and negatives of the interfaces from a technical perspective. We also used fifteen unpaid non-professional test subjects: nine male and six female, aged between 19 and 29 with a mean age of 24. The participants were selected based on their ability to use computers and were drawn from a variety of fields. None of these users had experience with professional choreographing tools. We used the non-professional test subjects primarily to evaluate the general usability of the interfaces. Regarding expertise with AR, four participants had experience with the technology, eight knew of the technology but had never used it, whereas three did not know anything about it. Concerning VR, six participants had used VR headsets before, whereas the other nine had not used the devices but knew of the technology and had seen it being used before. The participants were motivated individuals who were willing to explore our interfaces.

4.3. Experimental conditions

The experiments were conducted in a user experience laboratory at Istanbul Technical University, using devices equipped with all the interfaces. We maintained the same environmental conditions for all users. During the experiments, our participants used the interfaces in a random order to avoid task adaptation. When comparing more than two different interfaces of the same application, this approach helps reduce bias.

4.4. Tasks

To compare the four interfaces, users were required to perform a set of four tasks on each interface. For this comparative study, we chose simple tasks that are easy to handle for users, since some were first-time users of a choreographing application. The comparative assessment examined the usability of the interfaces and the effects they have on user experience. Below are the four tasks that users were expected to perform:

• Task 1) Add two dancers to the left and right of the scene
• Task 2) Add two props to the left and right of the scene
• Task 3) Draw a simple path from the back to the front for the dancers
• Task 4) Play the choreography (free interaction time)

Firstly, the user had to add a dancer to the left and to the right of the scene. After the dancers were added, the user then added props; for the props, the user chose between a tree and a box. Once the props were added, the user had to define a simple path for the dancers to follow. Finally, the user had to play the defined choreography and view the scene. The last task allowed free time so that users could explore the system as much as they liked: users could zoom in and out and change the speed of the characters as they followed the defined path.

4.5. Procedure

The participants performed the four tasks in sequence on each of the four interfaces. Prior to the start of the experiment, each participant was given a brief description of the tasks to be performed. During the experiments, we also employed the think-aloud usability evaluation technique, which enabled us to understand what was going through the participants' heads as they performed the tasks. Jakob (2012) argues that think-aloud is the single most valuable usability engineering technique. Furthermore, it helped us obtain subjective results regarding users' psychological perception of the interfaces as they used them. To obtain objective results for efficiency, we recorded the time the users took to complete Tasks 1, 2 and 3 on each interface. For the last task, playing the scene, time scores were not recorded, so as to allow users to explore the system and yield comprehensive results. At the end of the experiment, the users were presented with a post-test questionnaire that assisted in obtaining a subjective analysis of the experiments. The users gave feedback on satisfaction, the pleasure they experienced, time to learn the interface and their frustration levels, and they also indicated their preferred interface. The effort required to perform the different tasks on each interface was also measured, in the form of mental and physical stress. We also asked users for feedback on the effects of the different interactive approaches on viewing the choreography from different angles and on the different zooming techniques.

The user experience test conducted on the interfaces investigated the following issues:

• Users' awareness of and experience with AR and VR technologies
• Users' preference for touch-based or pointer-based interactive techniques
• Users' reaction to the different zooming approaches
• How the users interact with the four interfaces
• Time taken to complete the tasks

In our experiments, we used an even 10-point Likert scale (1–10) to measure the different aspects of user experience. Furthermore, we limited the number of questions to keep our users engaged, since research by Revilla et al. (2014) showed that as the number of questions increases, users end up choosing values inattentively, mostly out of boredom or tiredness. Our questions were shaped by previously used user experience evaluation questionnaires such as the Questionnaire for User Interaction Satisfaction (QUIS) (Norman et al., 1998), the Computer Usability Satisfaction Questionnaire (Lewis, 1995) and the NASA Task Load Index (NASA TLX) (Hart, 2006).

5. Results

To analyse the statistical significance of the results, we used a within-subject repeated measures Analysis of Variance (ANOVA) test (Tabachnick and Fidell, 2007). A p value of less than 0.05 indicates a significant difference amongst the means. In the tables, the best results are shown in bold; furthermore, in the p value column, results in italics indicate that the differences are significant. In addition to the ANOVA test, we performed post hoc tests, using the Tukey HSD (Honestly Significant Difference) test (Montgomery, 2017) to compare the means between each pair of interfaces. In the Tukey test, a p value of less than 0.05 shows that the difference between the means is statistically significant. We utilised the Statistical Package for the Social Sciences (SPSS) to analyse the results.
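To make the analysis pipeline concrete, the sketch below reproduces the same two steps, a repeated-measures ANOVA followed by Tukey's HSD, in Python with statsmodels. The study's analysis was performed in SPSS, so this is an illustrative equivalent rather than the authors' script; the column names and toy scores are hypothetical.

```python
# Illustrative re-implementation of the analysis described above using
# Python's statsmodels instead of SPSS. Column names and toy scores
# are hypothetical placeholders for the recorded task-completion times.
import pandas as pd
from statsmodels.stats.anova import AnovaRM
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Within-subject design: every participant used all four interfaces,
# so the frame holds one row per (participant, interface) pair.
data = pd.DataFrame({
    "participant": [p for p in range(1, 6) for _ in range(4)],
    "interface": ["PC", "Mobile", "VR", "AR"] * 5,
    "time_s": [9.3, 8.3, 9.1, 7.5,  10.1, 8.0, 9.4, 7.9,
               8.8, 8.5, 8.9, 7.2,  9.6, 8.1, 9.0, 7.6,
               9.1, 8.2, 9.2, 7.4],
})

# Repeated-measures ANOVA: does the interface affect completion time?
anova = AnovaRM(data, depvar="time_s", subject="participant",
                within=["interface"]).fit()
print(anova.anova_table)  # F value and Pr > F; p < 0.05 -> significant

# Post hoc Tukey HSD: which pairs of interfaces differ significantly?
tukey = pairwise_tukeyhsd(endog=data["time_s"],
                          groups=data["interface"], alpha=0.05)
print(tukey.summary())
```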

5.1. Results on mean task completion time

The results in Table 1 indicate that completing the tasks was generally faster with the touch-based interactive approaches of the AR and mobile application interfaces than with the mouse-based approach of the PC based interface. The average times recorded for completing the first three tasks in sequence are 30.6 s for the PC based interface, 24.5 s for the mobile application interface, 23.1 s for the AR based interface and 26.7 s for the VR based interface. These averages show that the AR based interface produced the best task completion rates. Task 3, drawing the path, in particular shows a significant difference in the mean time values: it is performed faster on the mobile, AR and VR interfaces than on the PC based interface. From the ANOVA analysis, the p values for adding a character to the scene and drawing the path show that users performed significantly differently in terms of task completion rates, whereas for Task 2 the difference is insignificant across the interfaces. The Tukey HSD test results show that for Task 1 there was no significant difference between the PC based interface and the VR interface, with a p value of 0.8. Furthermore, for both Task 1 and Task 3 the mobile and AR based interfaces show no significant difference, with p values of 0.13 and 0.6 respectively, both above 0.05.

The results for the VR based interface are greatly influenced by its hybrid interaction method, which leverages both direct and indirect input. However, we noted that the number of errors associated with the VR based interface on the drawing path task negatively affects task completion time. Drawing the path on the touch-based interfaces using the finger was simpler and faster than defining the path with the mouse or with head movements. The time scores generally show that the touch-based approach is more efficient than the mouse-based approach, as it produces faster task completion rates. Although time was not measured for Task 4, we observed that zooming in and out and changing the viewing angle were faster on the AR based interface. This further highlights the efficiency of AR, as it gives users a quicker way of reaching different viewing positions. The VR interface's zooming technique produced significantly slower task completion, since the user had to continuously swipe the touchpad to achieve the required result.

The task completion time scores for the professional test subjects show a similar trend to those of the non-professional test subjects, as shown in Table 2. Professional test subjects emphasized that the AR interface gave them a faster technique for viewing the scene from different angles and zooming levels. The feedback from professional users was important during the analysis of the results.

Table 1. Mean task completion time (in seconds) for non-professional users. Best scores are shown in bold.

| Task | PC based interface | Mobile interface | VR interface | AR interface | p value |
|---|---|---|---|---|---|
| Task 1 | 9.3 ± 1.2 | 8.3 ± 1.0 | 9.1 ± 0.9 | **7.5 ± 1.2** | *3.8E-6* |
| Task 2 | **8.9 ± 1.2** | 9.0 ± 2.5 | **8.9 ± 1.1** | 9.1 ± 1.3 | 0.9 |
| Task 3 | 12.4 ± 2.1 | 7.2 ± 1.2 | 8.7 ± 0.9 | **6.5 ± 0.8** | *3.2E-16* |
| Task 4 | free time | free time | free time | free time | — |
| Average | 10.1 ± 1.1 | 8.1 ± 1.2 | 8.8 ± 1.0 | **7.8 ± 0.8** | — |


Table 2. Mean task completion time (in seconds) for professional users. Best scores are shown in bold.

| Task | PC based interface | Mobile interface | VR interface | AR interface |
|---|---|---|---|---|
| Task 1 | 9.2 ± 1.2 | 8.0 ± 1.2 | 9.2 ± 1.1 | **7.6 ± 0.9** |
| Task 2 | **9.0 ± 0.9** | 9.1 ± 1.1 | **9.0 ± 1.2** | **9.0 ± 1.3** |
| Task 3 | 11.5 ± 1.5 | 7.0 ± 1.0 | 8.4 ± 1.1 | **6.6 ± 1.0** |
| Task 4 | free time | free time | free time | free time |
| Average | 9.9 ± 1.2 | 8.0 ± 1.1 | 8.9 ± 1.1 | **7.7 ± 1.0** |

5.2. Results on mental stress

The level of mental stress on all four interfaces is generally low, owing to the simplicity of the tasks and the interfaces, as shown in Table 3. In the table, the p values for Task 1 and Task 2 show that the differences users experienced are not significant, as they are over 0.05. However, when drawing the path and viewing the scene, the difference between the mean scores is significant. The Tukey HSD test results show that for Task 3 the result for the PC based interface is significantly different from all the other interfaces; this reflects an approach that required more time to grasp. On the AR interface, users experienced the least mental stress on three of the four tasks. The results for Task 1 show the same score for VR and AR, highlighting the close relationship between the two technologies. The VR based interface also shows a low cognitive load on the users; however, analysis of the think-aloud results showed that users expressed significant mental load whilst playing the scene, especially with regard to zooming. The mental stress results from the professional test subjects show a similar trend to those of the non-professional test subjects, as shown in Table 4. Users generally experienced less mental stress on the AR based interface than on the three other interfaces, in particular for Task 4, viewing the scene. Generally, professional users expressed less mental stress than non-professional test subjects.

Table 3. Mental stress results (Likert scale 1–10) for non-professional users. Best scores are shown in bold.

| Task | PC based interface | Mobile interface | VR interface | AR interface | p value |
|---|---|---|---|---|---|
| Task 1 | 3.1 ± 0.9 | 3.0 ± 0.7 | **2.9 ± 1.1** | **2.9 ± 0.9** | 0.9 |
| Task 2 | 1.5 ± 0.8 | **1.4 ± 1.0** | 1.5 ± 0.9 | 1.7 ± 1.1 | 0.9 |
| Task 3 | 6.7 ± 1.0 | 2.9 ± 1.2 | 1.9 ± 0.8 | **1.3 ± 0.7** | *9.7E-24* |
| Task 4 | 1.5 ± 1.1 | 2.0 ± 1.0 | 4.1 ± 0.8 | **1.3 ± 1.2** | *1.67E-10* |
| Average | 3.2 ± 0.9 | 2.3 ± 1.0 | 2.6 ± 0.9 | **1.8 ± 0.9** | — |

Table 4. Mental stress results (Likert scale 1–10) for professional users. Best scores are shown in bold.

| Task | PC based interface | Mobile interface | VR interface | AR interface |
|---|---|---|---|---|
| Task 1 | 2.6 ± 0.8 | 2.3 ± 1.0 | 2.2 ± 0.9 | **2.1 ± 1.1** |
| Task 2 | 1.7 ± 0.6 | **1.3 ± 0.8** | **1.3 ± 1.1** | 1.9 ± 1.0 |
| Task 3 | 5.8 ± 1.1 | 2.4 ± 1.0 | 2.2 ± 1.0 | **1.2 ± 0.9** |
| Task 4 | **1.3 ± 1.0** | 2.0 ± 0.9 | 4.8 ± 1.0 | 1.4 ± 1.0 |
| Average | 2.9 ± 0.9 | 2.0 ± 0.9 | 2.3 ± 1.0 | **1.7 ± 1.0** |

5.3. Results on physical stress

Physical stress is high on the AR based and VR based interfaces, as the results in Table 5 show. This is influenced by these interfaces' use of gestures for interaction. The AR interface has an average physical stress of 4.1 and the VR based interface an average of 5.1, compared to 2.8 and 1.9 for the mobile application and PC based interface respectively. The p values in the table show that the differences among the interfaces are significant for all the tasks, with scores below 0.05. Interestingly, for Task 1 and Task 2 the Tukey HSD test shows that the VR and AR results are not significantly different, with p values of 0.7 and 0.9 respectively. In addition, the drawing path task shows an insignificant difference between the AR and mobile interfaces, with a p of 0.9; this reflects their shared means of drawing the path, which utilizes a touchscreen. Task 4, playing the scene, shows the highest level of physical stress experienced by users on the VR interface, demonstrating the negative side of VR, as users are subjected to a high physical load. The AR results also highlight one big disadvantage of mobile AR for user experience: it requires more physical effort over time. The high levels of physical stress on the AR interface are due to the movement of the hands, as the user moves the device closer to and further from the marker to achieve different angles and zooming levels. The same effect causes the physical stress scores of the mobile application, as the fingers need to constantly interact with the interface for any activity. The VR interface produces high physical load scores since users have to continuously move the head or hands to interact with the virtual environment; strain on the eyes was another effect. As shown in Table 5, the best scores are achieved on the PC based interface, highlighting that the mouse as an interaction device requires low physical effort. The main source of physical stress on the PC based interface is the movement of the mouse through small movements of the hand while the arm lies at rest. The results for professional users agree with those obtained from non-professional users, as shown in Table 6; the difference between the results of the professional and non-professional test subjects is small.

Table 5. Physical stress (Likert scale 1–10) for non-professional users. Best scores are shown in bold.

| Task | PC based interface | Mobile interface | VR interface | AR interface | p value |
|---|---|---|---|---|---|
| Task 1 | **1.5 ± 1.3** | 2.0 ± 0.7 | 4.7 ± 0.9 | 4.3 ± 1.0 | *5.2E-14* |
| Task 2 | **1.6 ± 1.2** | 2.8 ± 0.4 | 4.4 ± 1.0 | 4.2 ± 1.3 | *3.8E-11* |
| Task 3 | 2.9 ± 0.9 | 1.7 ± 0.6 | 3.9 ± 0.7 | **1.4 ± 0.7** | *2.9E-12* |
| Task 4 | **1.6 ± 1.0** | 6.8 ± 0.9 | 4.8 ± 0.7 | 6.6 ± 1.7 | *8.9E-21* |
| Average | **1.9 ± 1.0** | 2.8 ± 0.6 | 5.1 ± 0.9 | 4.1 ± 1.1 | — |


Table 6. Physical stress (Likert scale 1–10) for professional users. Best scores are shown in bold.

| Task | PC based interface | Mobile interface | VR interface | AR interface |
|---|---|---|---|---|
| Task 1 | **1.1 ± 1.0** | 2.1 ± 0.9 | 5.9 ± 1.0 | 5.7 ± 1.1 |
| Task 2 | **1.0 ± 0.9** | 2.0 ± 0.9 | 5.1 ± 1.0 | 5.1 ± 0.9 |
| Task 3 | **1.6 ± 0.9** | 1.7 ± 1.1 | 4.2 ± 1.1 | 2.0 ± 1.1 |
| Task 4 | **1.2 ± 1.1** | 3.9 ± 1.1 | 6.2 ± 1.2 | 6.1 ± 1.0 |
| Average | **1.2 ± 1.0** | 2.4 ± 1.0 | 5.4 ± 1.0 | 4.7 ± 1.1 |

5.4. Results on satisfaction

Users expressed high levels of satisfaction with completing their tasks on all the interfaces, as shown in Table 7. These results are attributed to the simplicity of the interfaces. The p values shown in the table for Task 1 and Task 2 indicate that the results were not significantly different amongst the interfaces: users were generally satisfied to almost the same degree. Although for Task 3 and Task 4 the p values are below 0.05, showing a significant difference, the Tukey results show that this difference lies between the PC and AR interfaces, with a Tukey HSD p value below 0.05. Overall, users expressed more satisfaction with the AR interface than with the three other interfaces. For Task 3, drawing the path, users were less satisfied on the VR interface, with a score of 6.5; this result reflects the lack of control over the path being defined when drawing with head movements. Users expressed satisfaction with the quicker and easier way they managed to move around the scene using the AR interface.

The results shown in Table 8 are in agreement with those obtained for non-professional test subjects. However, professional test subjects found completing Task 1 and Task 2 more satisfying on the PC based interface than on the three other interfaces, a result influenced by the stability of the PC based interface. The professional choreographers noted that the lack of stability of the AR interface can cause incomplete actions, especially when the user needs to add many characters. However, whilst playing the scene, they stated that AR would be very effective for their choreographing process, since it allows them to quickly view the scene from different angles and zooming levels.

Table 7. Satisfaction results (Likert scale 1–10) for non-professional users. Best scores are shown in bold.

| Task | PC based interface | Mobile interface | VR interface | AR interface | p value |
|---|---|---|---|---|---|
| Task 1 | **7.1 ± 1.1** | 6.5 ± 1.2 | 7.0 ± 0.7 | 6.8 ± 0.8 | 0.2 |
| Task 2 | 7.9 ± 0.9 | 7.3 ± 1.3 | 7.8 ± 0.6 | **8.0 ± 0.5** | 0.09 |
| Task 3 | 6.7 ± 2.1 | 7.6 ± 0.2 | 6.5 ± 1.1 | **8.2 ± 0.8** | *0.001* |
| Task 4 | 7.2 ± 0.5 | 7.0 ± 0.9 | 7.4 ± 0.9 | **8.6 ± 1.0** | *1.3E-5* |
| Average | 7.2 ± 1.2 | 7.1 ± 0.8 | 7.2 ± 0.8 | **7.8 ± 0.8** | — |

Table 8. Satisfaction results (Likert scale 1–10) for professional users. Best scores are shown in bold.

| Task | PC based interface | Mobile interface | VR interface | AR interface |
|---|---|---|---|---|
| Task 1 | **7.3 ± 1.0** | 7.1 ± 0.9 | 7.0 ± 1.0 | 7.1 ± 0.9 |
| Task 2 | **7.8 ± 1.0** | 7.1 ± 1.0 | 6.9 ± 1.0 | 7.6 ± 1.1 |
| Task 3 | 7.0 ± 1.1 | 7.3 ± 0.8 | 6.0 ± 1.2 | **7.8 ± 1.1** |
| Task 4 | 7.0 ± 1.0 | 6.9 ± 1.1 | 7.6 ± 0.8 | **8.7 ± 1.3** |
| Average | 7.3 ± 1.0 | 7.1 ± 0.9 | 6.9 ± 1.0 | **7.9 ± 1.1** |

5.5. Results on pleasure

Table 9 shows the scores recorded for the pleasure experienced by users when performing the tasks. The results show significant excitement when using the AR and VR interfaces. The main excitement with the AR based interface comes from the curiosity that the interface introduces, as stated by Chang et al. (2010); the VR interface's results are influenced by the fact that it immerses the user in a virtual world. Table 9 shows that the VR interface gave the highest pleasure scores for Task 1 and Task 2, whereas the AR interface gave the best pleasure scores for Task 3 and Task 4. The p values for Tasks 1, 3 and 4 show that the pleasure results differ significantly amongst the interfaces, as they are below 0.05. Of particular interest are the Tukey HSD results for Task 4: the p value between VR and AR is 0.8, showing no significant difference in the pleasure users experienced when playing and viewing the choreography. The Task 4 results are of particular interest, since in this task users were given free time to explore the system. The AR interface's higher score is attributed to its zooming and view-changing technique, which gave users a compelling effect. This approach to changing the viewing position gave users an easier way to move around the scene and reduces the time required to reach the required view or zooming position, compared to zooming with the touchscreen or the mouse. Overall, therefore, the AR interface is more effective than the other interfaces in the last task. The VR based interface's Task 4 results were negatively affected by its method of zooming in and out of the scene. The results also indicate that users experienced more excitement with the mobile application than with the PC based interface.

The results from the professional and non-professional test subjects show a similar pattern, as shown in Table 10. However, professional choreographers expressed less comfort with the AR interface, especially for tasks that require a stable environment, for example adding an actor or dancer.

Table 9. Pleasure results (Likert scale 1–10) for non-professional users. Best scores are shown in bold.

| Task | PC based interface | Mobile interface | VR interface | AR interface | p value |
|---|---|---|---|---|---|
| Task 1 | 7.0 ± 1.1 | 6.7 ± 1.6 | **8.3 ± 0.7** | 7.4 ± 1.0 | *0.0012* |
| Task 2 | 7.6 ± 1.0 | 7.6 ± 1.2 | **8.1 ± 0.7** | 8.0 ± 1.1 | 0.39 |
| Task 3 | 6.7 ± 1.2 | 8.2 ± 1.3 | 8.1 ± 0.9 | **8.6 ± 0.9** | *7.1E-5* |
| Task 4 | 6.6 ± 0.7 | 7.0 ± 1.1 | 8.0 ± 1.2 | **8.9 ± 0.6** | *6.2E-8* |
| Average | 6.9 ± 0.9 | 7.4 ± 1.1 | 8.1 ± 0.9 | **8.2 ± 0.9** | — |

Table 10 Pleasure results (in Likert scale 1–10) for professional users. Best scores are shown in bold.

Task 1 Task 2 Task 3 Task 4 Average

Table 8 Satisfaction results (in Likert scale 1–10) for professional users. Best scores are shown in bold. PC based interface

PC based interface

PC based interface

Mobile interface

VR interface

AR interface

6.8 6.9 6.5 6.4 6.7

6.8 7.0 7.9 6.9 7.2

8.4 8.3 7.1 8.0 8.0

7.2 7.3 8.4 8.8 8.1

± ± ± ± ±

1.2 1.0 1.0 0.9 1.0

± ± ± ± ±

1.0 1.3 1.0 0.8 1.0

± ± ± ± ±

0.7 0.8 0.9 1.0 0.8

± ± ± ± ±

1.0 0.8 1.1 0.8 0.9

move around the scene using the AR interface. Results shown in Table 8 are in agreement with the results obtained for non-professional test subjects. However, professional test subjects expressed that completing Task 1 and Task 2 was more satisfying on the PC based interface as compared to the three other interfaces. This result is influenced by the stability of the PC based interface. The professional choreographers expressed that the lack of stability for the AR interface has a potential of causing incomplete actions especially when the user needs to add many characters. However, whilst playing the scene they expressed that AR would be very effective for their choreographing process since it allows them to quickly view the scene from different angles and zooming levels.

0.9 1.1 1.1 1.3 1.1

based interface highlighting that the mouse as an interaction device requires low physical effort. The main source of physical stress on the PC based interface is the movement of the mouse through small movements of the hand while the arm lies in rest-up position. The results for professional users agree with the results obtained from non-professional users as shown in Table 6. The difference between the results for the professional and non-professional test subjects is small.

5.5. Results on pleasure Table 9 shows the scores recorded for the pleasure experienced by users when performing the tasks. The results show significant excitement when using the AR and VR interfaces. The main excitement for the AR-based interface is brought by the curiosity that the interface introduces as stated by Chang et al. (2010). VR interface’s results are influenced by the fact that it immerses a user into a virtual world. Table 9 shows that the VR interface gave the highest pleasure results for Task 1 and Task 2. However, the AR interface gives the best pleasure scores for Task 3 and Task 4. The p value scores for Task 1, 3 and 4 show that the obtained pleasures results are significantly different amongst the interfaces as they are less than 0.05. Of interest are the Tukey HSD results for Task 4. The p value between the VR and AR is 0.8 showing that there is no significant difference in pleasure to users when playing and viewing the choreography. Task 4 results are of particular interest since in this task users were allowed free time to explore the system. The AR’s higher score is attributed to its zooming and view changing technique which gave users a compelling effect. This approach of changing viewing position gave users an easier way to move around the scene and it reduces the time required to achieve the required view or zooming position as compared to using the touchscreen to zoom in or using the mouse to zoom. Therefore on the overall, the AR interface is more effective than the other interfaces in the last task. The VR based interface’s Task 4 results were negatively affected by the method to zoom in

5.4. Results on satisfaction Users expressed high levels of satisfaction from completing their tasks on all the interfaces as shown in Table 7. These results are attributed to the simplicity of the interfaces. The p value results shown in the table for Task 1 and Task 2 show that the results were not significantly different amongst the interfaces. Users were generally satisfied almost the same way. Although for Task 3 and Task 4 the p value scores are less than 0.05 showing a significant difference, the Tukey results show that this difference is between the PC and the AR interface. The Tukey HSD results between the PC and AR interfaces have p value of less than 0.05. Overall, Users expressed more satisfaction with the AR interface as compared to the three other interfaces. For Task 3 drawing the path users were less satisfied on completing this task on the VR interface with a score of 6.5. This result is influenced by the lack of control on the path being defined when a user uses head movements to draw. Users expressed satisfaction in the quicker and easier way they managed to 20
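The significance tests reported in these sections (a one-way ANOVA across the four interfaces, followed by Tukey HSD post-hoc comparisons) can be reproduced with standard statistical tooling. The following minimal Python sketch illustrates the procedure only; the scores below are hypothetical and are not the study's data:

import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical Likert-scale (1-10) satisfaction scores for one task,
# one value per participant, grouped by interface.
pc = np.array([7, 8, 6, 7, 8, 7, 6, 8])
mobile = np.array([7, 7, 8, 6, 8, 7, 7, 8])
vr = np.array([6, 7, 6, 5, 7, 6, 7, 6])
ar = np.array([8, 9, 8, 7, 9, 8, 8, 9])

# Omnibus test: do the interface means differ significantly?
f_stat, p_value = f_oneway(pc, mobile, vr, ar)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.3f}")

# If p < 0.05, Tukey's HSD locates which interface pairs differ.
scores = np.concatenate([pc, mobile, vr, ar])
groups = ["PC"] * 8 + ["Mobile"] * 8 + ["VR"] * 8 + ["AR"] * 8
print(pairwise_tukeyhsd(scores, groups, alpha=0.05))

A pairwise p value below 0.05 in the Tukey output corresponds to the kind of significant PC versus AR difference described above.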


Table 11
Frustration results (in Likert scale 1–10) for non-professional users. Best scores are shown in bold.

                      Task 1      Task 2      Task 3      Task 4      Average
PC based interface    1.9 ± 0.8   1.4 ± 1.3   3.1 ± 0.8   3.0 ± 0.9   2.4 ± 0.9
Mobile interface      1.4 ± 0.9   1.8 ± 1.1   2.8 ± 0.9   4.1 ± 0.8   2.5 ± 0.8
VR interface          1.8 ± 0.7   1.4 ± 0.6   4.0 ± 1.1   4.1 ± 1.2   2.8 ± 0.9
AR interface          2.3 ± 0.9   1.8 ± 0.9   1.0 ± 0.8   1.8 ± 0.9   1.7 ± 0.9
p value               0.3         0.2         0.2         0.2
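As an aside on how the per-cell summaries in Tables 11 and 12 can be produced, the short Python sketch below computes the mean ± standard deviation of Likert ratings per interface and task; the data frame is hypothetical and only illustrates the aggregation step:

import pandas as pd

# Hypothetical long-format responses: one row per participant, task and
# interface, each carrying a 1-10 Likert frustration rating.
df = pd.DataFrame({
    "interface": ["PC", "PC", "Mobile", "Mobile", "VR", "VR", "AR", "AR"],
    "task": ["Task 1"] * 8,
    "rating": [2, 1, 1, 2, 2, 1, 3, 2],
})

# Mean and standard deviation per interface and task, i.e. the
# "mean ± sd" cells reported in the tables.
summary = df.groupby(["interface", "task"])["rating"].agg(["mean", "std"])
print(summary.round(1))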

5.6. Results on frustration

The mobile based interface produced a high rate of errors when the user was zooming in and out of the scene, and thus more frustration, as shown in Table 11. This was largely due to the user rotating the scene instead of zooming. A number of errors were also experienced on the PC based interface, but fewer than on the mobile interface. This result is attributed to the level of control a pointer device affords, largely because of its small point of contact with the interface compared to the surface of a finger. With the AR based interface this error does not exist, as the AR interface allows a more natural way of zooming, analogous to moving closer to an object to view it clearly. The frustration scores for Task 4 demonstrate the effect of these errors, as users were frustrated on interfaces that produced many of them. Users were least frustrated on the AR interface whilst playing the scene, largely due to the ease of changing viewing angles and zooming levels. The VR based interface's zooming technique does not produce many errors; however, the time needed to reach the required zooming level caused high levels of frustration. When adding the characters, in contrast, the AR interface presents the highest frustration score. This result is caused by the tilting of the mobile device whilst the user is positioning the characters, which makes it difficult to place them in the scene exactly as intended; in the PC based interface and the mobile application this difficulty does not exist. For Task 3, the frustration results show that the VR based interface frustrated users the most, which is attributed to the low level of control a user has when drawing a path with head movements. However, the p values from the ANOVA test are greater than 0.05, showing that the differences in the mean frustration scores are not statistically significant.

The frustration results of the professional and non-professional test subjects show the same trend, as shown in Table 12. Professional users expressed frustration with the touchpad of the VR headset when trying to tap it to select different controls: since they are immersed in another world, pinpointing the touchpad proved to be a challenge.

Table 12
Frustration results (in Likert scale 1–10) for professional users. Best scores are shown in bold.

                      Task 1      Task 2      Task 3      Task 4      Average
PC based interface    1.1 ± 0.6   1.2 ± 0.9   2.9 ± 1.1   3.9 ± 1.2   2.2 ± 1.1
Mobile interface      1.5 ± 1.0   1.6 ± 1.0   3.0 ± 1.2   5.9 ± 0.7   3.0 ± 0.9
VR interface          2.0 ± 1.1   2.1 ± 0.7   5.9 ± 1.2   5.1 ± 1.0   3.8 ± 1.0
AR interface          3.3 ± 1.1   2.8 ± 0.9   1.6 ± 1.0   1.5 ± 0.7   2.2 ± 0.9

5.7. Time to learn results

The simplicity of the interfaces is further shown by the "easy to learn" scores obtained from the two types of users: 8.4 for the AR based application, 8.2 for the VR based interface, 8.0 for the mobile application interface and 8.1 for the PC based interface. The interaction approaches employed on the interfaces do not require prior learning before using the system. The AR interface has fewer visual buttons than the other three interfaces, which makes it easier to learn, as the results show. It is highly important to develop interfaces that are easy to grasp, especially for applications used in training and education; this also helps users retain their skills over time.

5.8. Results on user's preference

After completing the experiments, we asked users which interface for choreography generation they preferred overall, after having performed tasks on all four interfaces. Eight preferred the AR interface, four the mobile application, two the VR based interface and five the PC based interface, as shown in Fig. 10. These results show that the AR based interface was the preferred option. Of the users who already had prior experience with AR, only one did not choose the AR based application as the preferred interface for choreography generation. Of the five professional choreographers, two preferred the AR based interface, two the PC based interface and one the VR based interface. Users who chose the AR interface pointed to its natural approach to viewing the scene and changing zooming levels. Furthermore, the AR interface made it possible to use both tangible and virtual objects for illustration purposes, an ability emphasized by the choreographers, who viewed it as an effective feature, especially in training. The major reason given by users who did not prefer the AR interface was its lack of stability, especially for tasks like adding characters and props; in this regard, the PC based interface was the preferred alternative, since it provides that stability. Choreographers who chose the PC based interface said they felt more comfortable with it because they already perform their choreographing tasks on the PC. The VR interface's high physical load and cumbersome ways of reaching different zooming levels influenced its preference results.

Fig. 10. Interface preference results.

6. Discussion

Experiments were carried out to complete the same tasks on four different interfaces using different interaction techniques: 1) the PC based application, which used a pointer based interaction approach utilizing a mouse and keyboard, 2) the mobile application interface, utilizing a touchscreen based approach, 3) the VR based interface, which utilized a touchpad and head gestures, and 4) the AR application, which used a touch based interaction approach and gestures. The tasks used in the experiment were structured to capture the basic choreographing process as defined by the choreographers. The results obtained contribute to the study of how different interaction techniques affect user experience in a choreography application. Furthermore, the results also highlight the effects of VR and AR on user experience in choreography applications. The use of two different sets of users helped us understand the usability of the interfaces from the perspectives of both experts in choreography and general users.

6.1. Discussion on efficiency

The touch based approach produced faster task completion times than the mouse based approach. The touch based interaction of the AR interface and the mobile application yielded faster times when using the finger to draw the path, demonstrating the power of interfaces that provide a natural way of interaction (Besançon et al., 2016). The best overall task completion times were obtained with the AR based interface. The small difference between the AR based interface and the mobile application is due to the fact that both use touch based interaction, although the AR based interface also uses gestures to complete some tasks. This result shows that choreography tools can benefit from touch based interfaces. In addition, the AR touch interface can help improve the legacy mixed reality interface described in Broll et al. (2004), which depends on bulky hardware. Much as the VR based interface also introduces natural ways of interaction, drawing the path was a significant challenge because it depended on the user moving his or her head, which negatively affected task completion times. Our results differ from those of Boud et al. (1999), who compared AR with VR in an industrial training application; however, as those researchers note, the relative performance of VR and AR depends on the application area. In choreography design, VR's default interaction technique of head movements is not suitable for defining choreography, which is an essential function; other interaction techniques, such as motion tracked controllers, remain to be studied. In addition to the faster task completion times on the AR interface, the ability to zoom in and out of the scene by simply moving the handheld device closer to or away from the scene gave users a faster way to complete the task. This further demonstrates the effectiveness of interfaces that facilitate a natural way to complete tasks. In contrast, the VR interface relied on the touchpad, an indirect method of zooming that made reaching the required zooming level a lengthy process. Choreographers expressed excitement at being able to zoom efficiently with the AR based interface; they mentioned that the ability to move around the scene efficiently by simply moving around the marker improves their choreographing process. Thus, interfaces which facilitate a natural way to complete tasks hold a great advantage as far as user experience is concerned. Therefore, when we combine the think-aloud feedback with the time scores, as suggested by Winter et al. (2008), we conclude that the AR based interface is more efficient than the other interfaces for choreography. This result shows that choreography applications can benefit from mobile AR: choreographers can design and play their choreography efficiently. Furthermore, the results show that the touch based approach is efficient for defining choreography, including path definition and editing, which is an essential function in most choreographing applications (Broll et al., 2004; Calvert et al., 1998).

6.2. Discussion on effectiveness

From the experiments, the results also showed that when adding dancers and props to the scene, the error rate was higher on the AR based interface than on the other three interfaces. Mobile AR's greatest challenge is for users to keep the handheld device stable during use, which is attributed to natural hand tremor (Vincent et al., 2013); depending on the criticality of the application, such errors can be tolerated. The VR based interface and the PC based interface produced few errors when adding props and actors to the scene. This is because of the control a pointer based input device gives to users, which strengthens the findings of Forlines et al. (2007). The reticle acts as a virtual pointer in the VR device whilst the touchpad acts as a mouse, which explains the similarity to the PC based application on these tasks. Despite the hand tremor in the AR interface, character positions can be quickly corrected. On the other hand, the AR based interface had no errors associated with zooming or rotating the scene, since this was accomplished simply by moving the mobile device closer to or around the scene. On the mobile application interface, there were more errors when zooming or rotating the scene, due to the larger surface area of the finger compared to the mouse pointer or the reticle. The mouse's scroll wheel also affords a simpler zooming approach than using a finger on the touchscreen. Furthermore, changing the viewing angle by dragging the scene with a mouse pointer gives faster and more accurate results than using a finger on the touchscreen or the touchpad on the VR headset. The results for the mouse pointer based interface are influenced by the small contact area of the pointer, which gives the user more control when performing the task. Although the touch based approach is efficient, care has to be taken with interfaces that place many controls on the same line. This caused many errors on the touch based mobile application interface, where a user intending to zoom instead rotated the scene. Such errors are rarely experienced on the PC based interface because of the smaller contact area of the mouse pointer compared to the larger surface area of the finger.

6.3. Discussion on learnability

Designing and developing interfaces that are easy to use is of vital importance. The simplicity of the graphical user interface helps achieve quicker "time to learn" scores, as observed in the experiments. The time it takes to learn the basic functionality of an interface affects its overall usability, especially for tools used in training and education. Interfaces that are simple to comprehend also help boost users' retention. In this research, we employed interfaces that were easy to learn, which motivated our users and made them more interested in exploring the different functionalities of the interfaces. Furthermore, utilizing input devices that do not require prior learning is important when developing education and training applications; the NUI approach presents an opportunity to use input modalities that are intuitive for users. In the experiments, the AR interface produced the best "time to learn" score, which also contributed to it being the preferred choice of the users. The AR based interface has few interactive menu items, which boosts its time to learn results: the fewer the interactive menu items, the simpler the interface, and thus the easier it becomes to learn.

6.4. Discussion on physical and cognitive load

With the AR interface, users can interact with the characters simply by moving the mobile device around the AR marker. Users expressed pleasure and excitement at being able to change viewing angles and zooming levels with this approach. It parallels moving closer to or further from an object to view it from different angles, which is a natural way of changing viewpoints. With this approach, users were afforded an

easier and quicker way to move around the scene. However, the effect of this approach is continuous movement of the hands and body, which leads to fatigue; mobile AR thus places a high physical load on the user's hands. Another factor that adds to the physical stress of mobile AR is the screen size of the handheld device: devices with limited screen size tend to restrict the covered field of view (Bimber and Raskar, 2006), forcing the user to continuously adjust the viewing position. Choreographers noted that this strain can be a factor, especially in lengthy choreography designs. The VR interface also exerts a considerable physical load on users. Thus, the NUI approach presents a new challenge to interface developers: the need to avoid bad ergonomics. Since users interact with these interfaces through natural means such as movements of the hands, body or head, physical exhaustion builds up, and fatigue becomes a factor as interaction time increases. With a VR headset, the effects are even more severe, as it can strain the eyes, as mentioned by Mandal (2013); this is another major negative effect of using VR headsets.
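The natural zoom discussed above comes directly from marker-based pose estimation. As a minimal sketch (not the authors' implementation; function names vary slightly across OpenCV versions, and camera_matrix and dist_coeffs are assumed to come from a prior calibration step), the camera-to-marker distance can be recovered from each video frame and used to scale the rendered scene, so that moving the device closer zooms in without any explicit zoom control:

import cv2
import numpy as np

# Marker dictionary used for detection (ArUco module, opencv-contrib).
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def camera_to_marker_distance(frame, camera_matrix, dist_coeffs,
                              marker_length_m=0.1):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
    if ids is None:
        return None  # no marker in view
    # Pose of the first detected marker in the camera frame;
    # tvec is the translation from camera to marker in metres.
    _, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
        corners, marker_length_m, camera_matrix, dist_coeffs)
    # The norm of the translation is the physical camera-marker
    # distance; feeding it to the renderer scales the scene as the
    # user moves the device, so no explicit zoom control is needed.
    return float(np.linalg.norm(tvecs[0]))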

6.5. Virtual reality and augmented reality in training applications

The results show that AR is important for designing tools for training and teaching, as mentioned by Lee (2012). AR gives users a compelling and exciting effect, as it creates curiosity. In addition, AR is a flexible technology that affords users the ability to interact with both the physical and the virtual world at the same time. Trainers and teachers are therefore given a way to use different physical objects to illustrate concepts within the same interface, without the need to change the overall graphics. Dunleavy et al. (2009) found that the combination of the virtual and real worlds was engaging for students and made them more interested in learning. Our results show that, for choreography, the use of both virtual and real objects helps the choreographer express ideas easily; users also expressed engagement and interest. In addition, the user preference results show that AR has good potential, which strengthens the findings of Dunleavy et al. (2009). Our results also show that VR is indeed a fascinating technology, as mentioned by Mandal (2013); however, it is also important to consider the application domain and the input modalities, as these greatly influence its efficiency and effectiveness. The important features behind VR's fascination are immersion, presence and interactivity. In this research, we observed that much as users experienced great pleasure using the VR based interface, task completion rates were a major factor in its overall efficiency. Using head movements to interact in VR is an intuitive solution, but for choreography this approach makes composition hard, and the results showed that it imposes a high physical demand. Other VR input modalities can be implemented, but this comes with additional costs. To limit the pronounced drawbacks of mobile AR, wearable glasses and goggles hold great promise for AR in training and teaching tools. The use of wearable headsets such as Google's Cardboard limits the use of the hands when interacting with the interface; it removes the need to continuously adjust and position the mobile device by hand, although it requires other input modalities. We leave this methodology as future work; it will give a comprehensive understanding of the effects of wearable devices on mobile AR and training applications, particularly choreography. We further envision an AR interface that relies entirely on virtual buttons. With this approach, the mobile device screen is used only as a viewing screen rather than as the interaction medium. Furthermore, to enhance the natural feel of the approach, the use of voice commands is imperative. For now, we only used voice commands to enter text into speech bubbles, by means of the Google speech API. Consequently, in the future, we will complement the touch based mobile application interface and the AR based interface by adding voice commands that will be used to interact with the interface items and to control actors. This is an attempt to embrace the natural user interface approach fully by combining virtual buttons and voice commands in a single application, enabling users to interact with virtual objects superimposed on the real world using only voice commands and virtual buttons.

7. Conclusion and future work

In this study, we designed and implemented four interfaces for choreography generation: a PC based interface, a mobile application interface, a VR based interface and an AR based interface. We performed a comparative assessment in terms of user experience and usability. Our study investigated the effects of different interaction methods of the same application on user experience, and further investigated VR and AR in choreographing applications. The results of the experiments highlight the importance of the NUI approach in interface design; this approach presents an intuitive interaction technique in which users employ their everyday methods of interaction. The results show that touch based methods afford users a faster way to accomplish tasks than the traditional mouse and keyboard. The results also indicate the challenges and limitations of VR and AR in different domains. The VR based and AR based interfaces give users more excitement and pleasure. Although the VR interface gives the highest pleasure, when the results from other parameters such as time and physical stress are inspected, the AR based interface proves more effective and efficient for the choreographing process. The results of the experiments showed that AR has a big role to play in the development of training and educational applications. Therefore, more research on AR for choreography needs to be conducted to give choreographers flexibility in their choreographing process. The ability to use both computer-generated imagery and physical objects is an important factor in the choreographing domain. The major limitations of VR are the development time for a realistic choreography environment and its interaction techniques. The development time factor can adversely affect its adoption in choreography, as different choreography applications require different virtual environments; in the future, it is therefore important to find ways to decrease the time required to create virtual worlds. We plan to build upon this study to further investigate AR coupled with natural user interface aspects such as voice and gestures. Another direction we envision is the use of virtual buttons to interact with the interface, as well as drawing the path virtually on top of the marker instead of on the touchscreen; these approaches enhance natural interaction with the interface. It is also part of our future plans to increase the types of movement and animation in the choreography generator and to perform usability tests with experts in the field of arts.

Conflicts of Interest

None.

Acknowledgment

The authors would like to thank Gökhan Kurt, Güneş Karababa, Çağatay Koç and Ege Sarioğlu for the fruitful discussion sessions and their invaluable ideas.

References

Alaoui, S.F., Carlson, K., Schiphorst, T., 2014. Choreography as mediated through compositional tools for movement: constructing a historical perspective. In: Proceedings of the 2014 International Workshop on Movement and Computing, June 16–17, 2014, Paris, France. ACM, pp. 1–6.
Allen, R., 1983. The bionic dancer. J. Phys. Educ. Recreation Dance 54 (9), 38–39.
Besançon, L., Issartel, P., Ammi, M., Isenberg, P., 2016. Usability comparison of mouse, touch and tangible inputs for 3D data manipulation. CoRR abs/1603.08735, 1–10.
Bimber, O., Raskar, R., 2006. Modern approaches to augmented reality. In: ACM SIGGRAPH 2006 Courses, July 30–August 3, 2006, Boston, USA. ACM, pp. 1–88.
Boud, A.C., Haniff, D.J., Baber, C., Steiner, S.J., 1999. Virtual reality and augmented reality as a training tool for assembly tasks. In: 1999 IEEE International Conference on Information Visualization. pp. 32–36.
Bowman, D.A., Wingrave, C.A., 2001. Design and evaluation of menu systems for immersive virtual environments. In: Proceedings of IEEE Virtual Reality 2001. IEEE, pp. 149–156.
Broll, W., Grünvogel, S., Herbst, I., Lindt, I., Maercker, M., Ohlenburg, J., Wittkämper, M., 2004. Interactive props and choreography planning with the mixed reality stage. In: Proceedings of the 3rd International Conference on Entertainment Computing (ICEC), September 1–3, 2004, Eindhoven, The Netherlands. Springer Berlin Heidelberg, pp. 185–192.
Brooks, F.P., 1999. What's real about virtual reality? IEEE Comput. Graph. Appl. 19 (6), 16–27.
Bujak, K.R., Radu, I., Catrambone, R., MacIntyre, B., Zheng, R., Golubski, G., 2013. A psychological perspective on augmented reality in the mathematics classroom. Comput. Educ. 68, 536–544.
Calvert, T., Mah, S., Jetha, Z., 1998. Designing an interface for choreographers. IFAC Proc. Vol. 31 (26), 77–82. 7th IFAC Symposium on Analysis, Design and Evaluation of Man Machine Systems (MMS'98), Kyoto, Japan, 16–18 September 1998.
Calvert, T.W., Bruderlin, A., Mah, S., Schiphorst, T., Welman, C., 1993. The evolution of an interface for choreographers. In: Proceedings of the INTERACT '93 and CHI '93 Conference on Human Factors in Computing Systems. ACM, pp. 115–122.
Chang, G., Morreale, P., Medicherla, P., 2010. Applications of augmented reality systems in education. In: Proceedings of the Society for Information Technology and Teacher Education International Conference, March 29, 2010, San Diego, USA. AACE, pp. 1380–1385.
Cox, D.J., Patterson Jr., R.M., Thiebaux, M.L., 2000. Virtual reality 3D interface system for data creation, viewing and editing. US Patent 6,154,723.
Craig, A.B., 2013. Understanding Augmented Reality: Concepts and Applications. Morgan Kaufmann.
Cunningham, M., 1969. Changes: Notes on Choreography. Vol. 1. Something Else Press.
Davcev, D., Trajkovic, V., Kalajdziski, S., Celakoski, S., 2003. Augmented reality environment for dance learning. In: Proceedings of the Information Technology Research and Education International Conference (ITRE), August 11–13, 2003, New Jersey, USA. pp. 189–193.
DeLahunta, S., 2002. Virtual reality and performance. PAJ 24 (1), 105–114.
Dixon, S., 2015. Digital Performance: A History of New Media in Theater. MIT Press.
Dunleavy, M., Dede, C., Mitchell, R., 2009. Affordances and limitations of immersive participatory augmented reality simulations for teaching and learning. J. Sci. Educ. Technol. 18, 7–22.
Felice, M.C., Alaoui, S.F., Mackay, W.E., 2016. How do choreographers craft dance? Designing for a choreographer-technology partnership. In: Proceedings of the 3rd International Symposium on Movement and Computing. ACM, p. 20.
Forlines, C., Wigdor, D., Shen, C., Balakrishnan, R., 2007. Direct-touch vs. mouse input for tabletop displays. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, New York, NY, USA, pp. 647–656.
Gray, J.A., 1989. Dance Technology: Current Applications and Future Trends. ERIC.
Hart, S.G., 2006. NASA-Task Load Index (NASA-TLX); 20 years later. In: Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Vol. 50. Sage Publications, Los Angeles, CA, pp. 904–908.
Heim, M., 1998. Virtual reality. In: Kelly, M. (Ed.), Encyclopedia of Aesthetics. Oxford University Press, pp. 4–442.
Hendee, W.R., Wells, P.N., 1997. The Perception of Visual Information. Springer Science & Business Media.
Höllerer, T., 2004. User Interfaces for Mobile Augmented Reality Systems. Columbia University, New York, NY.
Jakob, N., 2012. Thinking aloud: the #1 usability tool. https://www.nngroup.com/articles/thinking-aloud-the-1-usability-tool/. Accessed: 10.04.2017.
Jana, P., 2016. Audience takes centre stage in pioneering virtual reality dance film. https://www.theguardian.com/stage/2016/mar/07/virtual-reality-film. Accessed: 01.07.2017.
Jung, J., Park, H., Hwang, D., Son, M., Beck, D., Park, J., Park, W., 2014. A review on interaction techniques in virtual environments. In: Proceedings of the 2014 International Conference on Industrial Engineering and Operations Management. pp. 1582–1590.
LaViola Jr., J.J., Kruijff, E., McMahan, R.P., Bowman, D., Poupyrev, I.P., 2017. 3D User Interfaces: Theory and Practice. Addison-Wesley Professional.
Lee, G.A., Yang, U., Kim, Y., Jo, D., Kim, K., Kim, J., Choi, J., 2009. Freeze-set-go interaction method for handheld mobile augmented reality environments. In: Proceedings of the 16th ACM Symposium on Virtual Reality Software and Technology, November 18–22, 2009, Kyoto, Japan. ACM, pp. 143–146.
Lee, K., 2012. Augmented reality in education and training. TechTrends 56 (2), 13–21.
Lewis, J.R., 1995. IBM computer usability satisfaction questionnaires: psychometric evaluation and instructions for use. Int. J. Hum.-Comput. Interact. 7 (1), 57–78.
Mandal, S., 2013. Brief introduction of virtual reality & its challenges. Int. J. Sci. Eng. Res. 4 (4), 304–309.
Mine, M.R., 1995. Virtual Environment Interaction Techniques. UNC Chapel Hill CS Dept.
Montgomery, D.C., 2017. Design and Analysis of Experiments. John Wiley & Sons.
Nielsen, J., 1993. Usability Engineering. Morgan Kaufmann Publishers Inc.
Noll, A.M., Hutchinson, A., 1967. Choreography and computers. Dance Mag. 41, 43–45.
Norman, K.L., Shneiderman, B., Harper, B., Slaughter, L., 1998. Questionnaire for User Interaction Satisfaction. University of Maryland.
Psotka, J., 1995. Immersive training systems: virtual reality and education and training. Instr. Sci. 23 (5), 405–431.
Revilla, M.A., Saris, W.E., Krosnick, J.A., 2014. Choosing the number of categories in agree–disagree scales. Sociol. Methods Res. 43 (1), 73–97.
Schuler, D., Namioka, A., 1993. Participatory Design: Principles and Practices. CRC Press.
Shneiderman, B., 1997. Designing the User Interface: Strategies for Effective Human-Computer Interaction, third ed. Addison-Wesley Longman Publishing Co., Inc., Boston, MA, USA.
Sinclair, B., 2016. The promise of virtual reality in higher education. EDUCAUSE Review.
Sutherland, I.E., 1965. The ultimate display. In: Multimedia: From Wagner to Virtual Reality.
Tabachnick, B.G., Fidell, L.S., 2007. Experimental Designs Using ANOVA. Thomson/Brooks/Cole.
Tomozoe, Y., Machida, T., Kiyokawa, K., Takemura, H., 2004. Unified gesture-based interaction techniques for object manipulation and navigation in a large-scale virtual environment. In: IEEE Virtual Reality 2004. pp. 259–260.
Van Krevelen, D.W.F., Poelman, R., 2010. A survey of augmented reality technologies, applications and limitations. Int. J. Virtual Reality 9, 1–20.
Vincent, T., Nigay, L., Kurata, T., 2013. Handheld augmented reality: effect of registration jitter on cursor-based pointing techniques. In: Proceedings of the 25th Conference on l'Interaction Homme-Machine (IHM), November 12–15, 2013, Bordeaux, France. ACM, pp. 1–6.
Wiklund Axelsson, S., Björklund, C., Wikberg Nilsson, Å., Normark, C.J., 2017. HealthCloud: Participatory Design of User Interfaces for Senior People's Active Aging. Luleå University of Technology.
Wingfield, N., Isaac, M., 2016. Pokémon Go brings augmented reality to a mass audience. http://www.nytimes.com/2016/07/12/technology/augmented-reality-to-a-mass-audience.html. Accessed: 01.07.2017.
Winter, S., Wagner, S., Deissenboeck, F., 2008. A Comprehensive Model of Usability. Springer Berlin Heidelberg, Berlin, Heidelberg, pp. 106–122.
Wojciechowski, R., Walczak, K., White, M., Cellary, W., 2004. Building virtual and augmented reality museum exhibitions. In: Proceedings of the Ninth International Conference on 3D Web Technology. ACM, pp. 135–144.
Yilmaz, R.M., 2016. Educational magic toys developed with augmented reality technology for early childhood education. Comput. Hum. Behav. 54, 240–248.