
“Ain’t Nobody’s Business But My Own”–Health update with Taj Mahal, Physical Therapy & The Pennsylvania Housing Research Center

“Man I don’t care what in the world that you do
As long as you do what you say you going to”

Today is Sunday August 30, 2015. How it got to be 1 PM I do not know.

  1. I do know that this ambitious posting will be under construction for a while. Consider the host of categories above, which includes everything from Health Crisis to the Department of Architectural Engineering at Penn State to Joy of Motion.
  2. Why I begin this full-disclosure [see footnote (1)] account with a lie: “Ain’t Nobody’s Business But My Own” can and cannot be explained.
  3. Here the overriding intent is to disclose my plan for the future, which I grandiosely refer to as “my life’s work.”

[youtube]https://www.youtube.com/watch?v=trcb0LZfuZA[/youtube]

This posting is under construction. Put on your hardhat and exercise caution.

Footnotes

Chapter 4. Dr. Kumar’s thesis on virtual reality modeling

Chapter 4

FRAMEWORK FOR EXPERIENCE-BASED VIRTUAL PROTOTYPING

This chapter discusses the approach to developing interactive virtual prototypes that enable experience-based design. First, there is a brief overview of how experience-based design can be combined with virtual prototypes by incorporating scenarios of tasks performed by end-users.

The second part of the chapter proposes a procedure to incorporate task-based scenarios into the virtual facility prototyping development process. The process is divided into four parts: requirements analysis, process design, process implementation and development, and process validation. (An overall experience-based virtual prototyping system plan is provided in Appendix A.) The virtual prototyping process was implemented on various project cases to test and refine the overall process.

In process design, system architecture is proposed for the development of interactive virtual prototypes. Based on this architecture and the requirement analysis, the experience-based design review process is defined and the model content and interactive media to be used for virtual prototyping are determined.

In prototype development and process implementation, workflows to transfer model content from design authoring and Building Information Modeling applications to the interactive programming environment are discussed. Project cases are used as examples to demonstrate lessons learned and refinement of the above procedures. Procedures to develop interactivity are discussed, and the implementation of the scenario framework for simulation in the prototype is presented. An approach to develop reusable model content is also discussed.

Finally, in process validation, the project cases used to develop the EVPS process are reviewed and a summary of lessons learned is discussed. In addition, informal interviews with domain experts to validate the development procedure are discussed.

4.1  METHOD TO DEVELOP EXPERIENCE-BASED VIRTUAL PROTOTYPES

Experience-based design can be combined with virtual prototyping by allowing end-users to simulate scenarios of the tasks that they perform in their facilities. The approach for developing the experience-based virtual prototyping system is to look at the design review process of healthcare facilities through the lens of experience-based design, using interactive virtual prototypes as the tool. Using game engine based applications allows scenarios of tasks to be embedded in the virtual prototypes and can help enhance the design review process, especially for end-users.

A method to develop virtual prototypes with embedded scenarios is shown in Figure 4-1. The process is adapted from requirements analysis methods used in software engineering (Robertson and Robertson 1999) and virtual reality system development methods (Sherman and Craig 2003). The detailed virtual prototyping process is adapted from Wilson (1999) and discussed at the end of this chapter. The following sections discuss the four overarching steps of Requirements Analysis, Concept and Process Design, Development, and Process Validation.

Figure 4-1. Experience-based virtual prototyping procedure 

4.2  REQUIREMENTS ANALYSIS 

In requirements analysis, a framework for mapping scenarios with the spaces involved and the level of detail required is proposed for use. This is an exploratory research phase that helps define a virtual prototyping procedure to extract end-user experience and the tacit knowledge of work tasks that are undertaken in a healthcare facility. Requirements analysis begins with identifying stakeholders and defining their overall goal for using the interactive virtual prototype for design review.

“A requirement is something that a product must do or a quality that the product must have.” A requirement exists either because the type of product demands certain functions or qualities, or because the end-user wants that requirement to be part of the delivered product (Robertson and Robertson 1999). Requirements can be functional or non-functional. Functional requirements are things the product must do to be useful within the context of the customer’s domain; non-functional requirements are properties or qualities that the product must have to enhance it.

Requirements gathering for design can sometimes be difficult, as the end-users are required to imagine what their needs in the future facility would be. The idea of using a prototype is to give people something that has the appearance of reality and is real enough that potential users can think of requirements that might otherwise be missed (Robertson and Robertson 1999).

4.2.1  Identifying Stakeholders

There are various stakeholders who play a key role in the design process of healthcare facilities, including project teams (designers, engineers, contractors), boards of trustees, financiers, vendors, suppliers, patients, caregivers, staff, community partners and donors (The Center for Health Design 2010). Stakeholders bring their own expertise, perspectives, and objectives to the design task (Kirk and Spreckelmeyer 1988). For the purpose of this study, stakeholders in the healthcare design process are divided into two categories: “Designers” of healthcare facilities from the architecture, engineering and construction (AEC) personnel, and “End-Users” from the healthcare context (see Figure 4-2). End-users of healthcare facilities include Care Receivers, Care Givers, and Facility Managers. Care Receivers comprise patients, their families and visitors. Care Givers include doctors, nurses, healthcare staff and home health aides. Finally, the healthcare facility managers encompass all the administration staff and maintenance personnel.

[Figure 4-2 image panels: Nurse, Patient, Facility Manager]
Figure 4-2. Stakeholders in the healthcare design process: Designers and End-Users.

4.2.2  Investigating and Documenting Scenarios

Scenarios can be documented as a list of tasks performed by the end-users of the facilities being designed. Documentation methods include concept mapping based on interviews and focus group discussions with end-users and subject matter experts involved in the design and, where possible, observations during design reviews of the facilities. Semi-structured interviews conducted with the project stakeholders and end-users of healthcare facilities provide an understanding of how they use the facility and the tasks they perform in it.

Healthcare facilities scenarios vary depending on the user, the type of task being performed and the issues they address. For instance, in a particular scenario, a nurse would receive a signal to respond to a patient call at the nurse’s station. The nurse would then navigate to the defined space, and finally perform a task within that space. Another scenario could involve facility management personnel who need to perform an inspection of the HVAC equipment installed. The worker navigates to the required area to identify the equipment’s location and then performs a task to complete the inspection.

The potential for the use and inclusion of scenario-based design in interactive virtual prototypes is discussed in Chapter 2. The following are some methods, tested in healthcare-related studies, that were explored for documenting scenarios of activities.

4.2.2.1  Envisioning experiences as scenarios of tasks

A pilot study was conducted in July 2010 in the Immersive Construction Lab (ICon Lab) at Penn State to review the design of a pharmacy in a medical office building (Leicht et al. 2010). Observations during design review and informal discussions with the pharmacists and owner’s representative helped establish the type of tasks performed in the pharmacy and the way different spaces are used in the facility.

During the design review session with the end-users (the pharmacists and the owner’s representatives), it was observed that the pharmacists would often envision scenarios of tasks that they would perform in the pharmacy (see Figure 4-3). They would talk about how they would go about administering prescriptions for their customers.

Figure 4-3. Design Review of the Kaiser Pharmacy Virtual Mockup in the ICon Lab.

One of the issues discussed during the design review (Figure 4-4) was the need to extend the partition walls between the pharmacy counters to provide more privacy for the customers ordering their prescriptions. Other concerns that were raised included determining the location of trash cans for hazardous medical waste, and ensuring that the cabinets for narcotic medicine had ample storage space and that they could be locked.

Figure 4-4. Virtual Mockup of a Medical Office Pharmacy.

During and after the design review session, the two pharmacists were interviewed to gain better understanding of the tasks performed and their exact locations within the pharmacy. Based on their feedback, a task-based scenario was constructed as follows:

Role: Pharmacist

Scenario: Filling a prescription order

Tasks:

a.      Get Order from customer (Pharmacy Counter)

i.     Urgent- needs to be done right away (Work Station)

ii.     Placed in the pipeline (Computer system)

b.      Fill prescription (Work Station)

i.     Requires narcotic drugs (Narcotics cabinet)

ii.     Not on the shelves (Get from stock in Receiving Area)

c.      Complete order and place it in storage cabinet

d.      Give customer the prescription (Pharmacy Counter) 
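Scenario lists like this one lend themselves to a simple machine-readable form. As a language-agnostic illustration (the thesis does not prescribe this structure; field names and the storage-cabinet location label are hypothetical), the pharmacist scenario might be captured as:

```python
# Illustrative sketch: the "fill a prescription" scenario as an ordered
# task list, each task tied to a location with optional branch locations.
scenario = {
    "role": "Pharmacist",
    "name": "Filling a prescription order",
    "tasks": [
        {"task": "Get order from customer", "location": "Pharmacy Counter",
         "branches": {"urgent": "Work Station", "queued": "Computer system"}},
        {"task": "Fill prescription", "location": "Work Station",
         "branches": {"narcotics": "Narcotics cabinet",
                      "not on shelves": "Receiving Area"}},
        {"task": "Complete order and place in storage cabinet",
         "location": "Storage cabinet", "branches": {}},
        {"task": "Give customer the prescription",
         "location": "Pharmacy Counter", "branches": {}},
    ],
}

def locations_visited(scenario):
    """Spaces the end-user touches -- useful for deciding what to model."""
    spots = []
    for t in scenario["tasks"]:
        spots.append(t["location"])
        spots.extend(t["branches"].values())
    return spots
```

A structure like this makes it straightforward to derive the list of spaces that must appear in the virtual prototype directly from the documented scenario.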

4.2.2.2  Concept mapping to extract scenarios 

Another way to elicit scenarios from end-users is through the use of concept maps. A concept map (or mind map) is a drawing and text combination that attempts to represent information the way that our brain stores it, by making associations and linking each new piece of information to something we know. Mind maps are beneficial for requirements work as they help spot connections in the information that the users put forth (Robertson and Robertson 1999). Knowledge elicitation through concept mapping enables the end-users to build up a representation of their domain knowledge (Crandall et al. 2006). Concept maps can help develop a hierarchical morphology structure that frames scenarios and helps inform model content.

Figure 4-5 shows an example for concept mapping to elicit and document scenarios from end-users. The concept mapping method was tested in a healthcare related independent housing facility for the elderly. The formal documentation of scenarios for EVPS development and implementation using concept mapping and focus groups is discussed in more detail in Chapter 5.

Figure 4-5. Example of eliciting and documenting scenarios.

4.2.2.3  Scenario Categories

After scenario documentation, scenarios can be categorized based on whether they relate to wayfinding in a facility or to performing a specific detailed task. Scenarios are categorized by the nature of the tasks being performed, with the steps involved defined in detail. This helps to identify the objects, and their corresponding behaviors, that will be required for the simulation of each scenario.

Table 4-1 shows scenario categories and defines the various tasks that can be simulated based on the level of detail. Scenarios are described with examples of the types of healthcare facilities that can benefit most from them. The table also identifies the participants involved and the objects required for each type of scenario.

Table 4-1. Scenario categories based on Level of Detail (LoD) required.

Movement-based
- Description: Way-finding and navigating through the facility from one location to another.
- Healthcare facility examples: Large hospital facilities; going through long corridors to reach a desired location.
- End-users/stakeholders: Patients, visitors, HC staff, nurses, doctors, new staff, facility managers.
- Objects involved: Facility to navigate; mini-map; First Person Controller (FPC); notes and alerts in HUD.
- LoD for geometry: Low to High.

Task-based
- Description: Accomplishing a specific task that could involve relocating or moving certain objects or people to a desired location.
- Healthcare facility examples: Nurse moving an IV to the patient bed; locating an electric outlet near the patient bed and using it.
- End-users/stakeholders: Nurses, HC staff, doctors, visitors, patients, patient’s family, facility managers, repair personnel.
- Objects involved: Facility; objects or other avatars; FPC; notes and alerts in HUD; hand movements of FPC*.
- LoD for geometry: High to Very High.

Process-based
- Description: Accomplishing a combination of tasks and movements to undergo a certain process.
- Healthcare facility examples: Doctor performing an operation and moving the patient from one area to another; admitting a patient in the ER and taking them for tests or to the patient room; home health aide assisting the elderly with daily life activities.
- End-users/stakeholders: Doctors, nurses, HC staff, visitors, facility management, repair personnel.
- Objects involved: Facility; objects or other avatars; FPC; notes and alerts; hand movements of FPC*.
- LoD for geometry: Medium to Very High.

Spatial organization
- Description: Examining the position of objects to ascertain if the layout is efficient, facilitates flow, meets requirements and follows anthropometric rules.
- Healthcare facility examples: HC staff examine whether there is enough space and a correct layout of equipment to perform certain tasks; equipment and furniture are relocated to improve the layout.
- End-users/stakeholders: Designers, engineers, nurses, doctors, HC staff, facility managers, patients*.
- Objects involved: Facility; objects or other avatars; FPC; notes and alerts in HUD.
- LoD for geometry: Medium to Very High.

3rd person view
- Description: Looking at and examining space from various locations and through the eyes of other user roles.
- Healthcare facility examples: View of the patient room from the bed; view and experience of going through an MRI scan*; view of the facility from a wheelchair.
- End-users/stakeholders: Designers, engineers, nurses, doctors, HC staff, new staff, medical students, facility managers.
- Objects involved: Facility; objects or other avatars; FPC; notes and alerts in HUD.
- LoD for geometry: Medium to High.

Inquiry-based
- Description: Retrieving information and data associated with certain objects.
- Healthcare facility examples: Designers and facility managers inspect the facility and retrieve building-related information (BIM); HC personnel inquire about certain equipment in the facility.
- End-users/stakeholders: Designers, engineers, facility managers, HC staff*, new staff*, medical students*.
- Objects involved: Facility; objects; BIM data; FPC; notes and alerts in HUD.
- LoD for geometry: Low to Very High.

* Optional

Categorization of scenarios based on their type can help identify the level of detail, model content and interactivity required. Most importantly, this also helps determine the level of detail (LoD) required for modeling the digital content (facility, objects, avatars) in various authoring tools, to be able to simulate these scenarios effectively. For instance, a wayfinding scenario that requires a hospital visitor to walk through the facility from one end to the other may not need the LoD that will be required for detailed tasks that take place in a fixed location, such as a scenario where the nurse needs to check a patient’s blood pressure.
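The LoD column of Table 4-1 can be read as a simple lookup from scenario category to the range of modeling effort. A minimal sketch (category keys and range labels follow the table; the dictionary and function are illustrative, not part of the EVPS):

```python
# Sketch of Table 4-1's mapping from scenario category to the range of
# level of detail (LoD) needed for the digital model content.
LOD_BY_CATEGORY = {
    "movement-based":       ("low", "high"),
    "task-based":           ("high", "very high"),
    "process-based":        ("medium", "very high"),
    "spatial organization": ("medium", "very high"),
    "3rd person view":      ("medium", "high"),
    "inquiry-based":        ("low", "very high"),
}

def lod_range(category):
    """Return the (minimum, maximum) LoD for a scenario category."""
    return LOD_BY_CATEGORY[category.lower()]
```

A lookup like this could, for example, drive which version of a facility model (coarse massing versus fully furnished) is loaded for a given review session.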

4.2.3  Framework of Scenarios

For requirements analysis, a framework for mapping scenarios with the spaces involved and the level of detail required is proposed for use (see Figure 4-6). The categorization and analysis of the documented scenarios helped develop a framework for structuring scenarios of tasks that end-users can perform in the virtual environments. This framework was initially developed to represent modeling needs and evolved through repeated testing on smaller healthcare-related facility projects such as patient rooms, small operating suites, independent housing for the elderly, medical laboratories and pharmacies.

Based on the scenario framework, several use scenarios can be identified, developed and documented for use during the virtual prototyping system development. The scenario framework can be leveraged to identify specific spaces and objects that need to be modeled, as well as any additional object and user interface representations that will be essential for depicting the scenario. For instance, a patient emergency scenario may require quick patient transport from the hospital entrance to a specific location within the Emergency Department (ED) to provide critical care to the patient. This scenario carries multiple forms of information that translate into a list of specifications to include in the experience-based virtual prototype that the end-user reviews. Based on the scenario, the specification elements could include location and path information shown in an abstract mini-map, temporary or moveable objects within the facility (e.g., patient bed, doors that open, and other elements that may not be included in the facility model), and representations of the people involved in the scenario (e.g., patient and patient transporter’s avatars).

Figure 4-6. Framework for incorporating scenarios in virtual prototypes.

The scenario framework facilitates structured organization of the scenarios in a hierarchical object tree setup and enables a formal representation of specifications for development of the experience-based virtual prototypes. 
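As an illustration of that hierarchical object tree, the framework could be sketched as nested mappings of space → user role → scenario → task list. The spaces, roles and tasks below are hypothetical examples in the spirit of the framework, not data from the thesis:

```python
# Hypothetical sketch of the hierarchical scenario framework: each design
# review space holds user roles, each role holds scenarios, each scenario
# an ordered task list.
framework = {
    "Emergency Department": {
        "Nurse": {
            "Respond to patient call": ["Receive alert at nurse station",
                                        "Navigate to patient room",
                                        "Perform task at bedside"],
        },
        "Facility Manager": {
            "Inspect HVAC equipment": ["Navigate to mechanical room",
                                       "Locate equipment",
                                       "Complete inspection checklist"],
        },
    },
}

def scenarios_for(space, role):
    """List scenarios a given user role can review in a given space."""
    return sorted(framework.get(space, {}).get(role, {}))
```

Organizing scenarios this way keeps the specification machine-readable: a prototyping tool can walk the tree to decide which spaces, roles and task props must be present in the model.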

4.3 EVPS DESIGN PROCESS

In the second phase of development, the design process of the EVPS proposes the system architecture and lays out the approach for translating specifications derived from requirements analysis into interactive virtual prototypes.

 4.3.1  System Architecture 

The system architecture for embedding scenarios in interactive virtual prototypes primarily consists of three components: the element library, the scenario engine and the user scene. The scenario engine is the main component of the EVPS application, as it combines the task-based scenarios with geometric data and user input in the virtual environment. The element library is associated with other applications, databases and libraries. The user scene displays the virtual prototype and obtains feedback through the user input. The system architecture for the EVPS application is shown in Figure 4-7.

Within the application, the 3D Element Library consists of three types of geometric models: the space (facility) models, the object (equipment) models and the avatar (user role) models. While the facility and equipment/object models are obtained from a geometric database, the user role models are retrieved from the Avatar Object library. The Scenario Engine allows for the addition of scenario and task tracking scripts by attaching various objects to these scripts from the Scripting Library. The User Scene of the EVPS application contains the 3D rendering module and a GUI widget that displays the virtual healthcare facility prototype scene, along with the objects and user roles, on the user’s screen or output device.

The objective is to allow the user to navigate through the space, interact with various objects, and perform specific tasks and scenarios within the virtual facility prototype dynamically. This generic system architecture could be implemented in various game engine environments, but the specific structure of the data files and formats will vary depending on the game engine.

 Figure 4-7.  System Architecture of Experience-based Virtual Prototyping System. 

4.3.1.1  Element Library 

The element library functions as a repository for the 3D geometric data of the facility, including healthcare equipment and other objects that need to be displayed in the virtual environment. Since most of the 3D geometric data is authored in various BIM authoring tools, the application needs to be interoperable with these tools, such that all the data is stored in file formats that are easily exported from the authoring tools and imported into the EVPS application. The element library communicates with the BIM database to get the relevant 3D geometric data of the facility that will be required for design review within the virtual environment. Apart from 3D geometric data, real-time rendering requires additional attribute information regarding texture, material and lighting. Another important attribute required during real-time walkthroughs of a facility in a virtual environment is collision detection on the walls, floors, ceilings and other similar elements within the facility.

In addition to geometric and attribute information, certain objects also have inherent behaviors that are included in the element library. For instance, all the door elements in the facility could be animated to slide or swing open, depending on the type of door and the direction of the hinge.

Healthcare equipment objects such as the patient bed could be animated so that they are configured for the user (patient) to sit or lie on them. The element library combines the 3D geometric data obtained from a BIM authoring application with all the additional behaviors described above, such as the attributes, physics and animations associated with various elements of a healthcare facility. The element library also contains information regarding the avatars of the user roles used in the scenario-based design review of healthcare facilities in the virtual environment: it stores 3D geometric data for the avatar representation of each user role, along with the behaviors, features, functions and constraints associated with each type of user role.

4.3.1.2  Scenario Engine 

The implementation of scenarios for the purpose of interactive design review in a virtual environment takes place in the scenario engine. The scenario engine links various behaviors and scenario and task tracking scripts from the scripting library to the 3D geometric objects and models in the space. The organization of this information is based on a hierarchical data structure, developed as a scenario framework, wherein each design review space of the healthcare facility can be reviewed by various user roles. These user roles range from AEC design professionals and facility managers to end-users such as nurses, patients and other healthcare staff. Since every user would perform a distinct task within the space and have a different agenda for design review, the functions and features afforded to the chosen user role within the virtual environment differ. For instance, nurses might be interested in knowing whether they are able to carry out certain duties, like placing a particular piece of patient monitoring equipment near the patient’s bed in a convenient manner. Facility managers, however, might be more interested in checking the location of, and ease of access to, various air filters that may need to be replaced in a fan coil unit.

Therefore, each user role should have a different Graphical User Interface (GUI) that can be customized for that particular user. The GUI displays a distinct set of scenarios, which are further broken down into a series of tasks.

The functions performed by the scenario engine are as follows:

- Load the appropriate healthcare space or facility model from the element library into the user scene.

- Based on the space chosen, receive user input of the role in healthcare facility design, and then load the first person controller (FPC) or avatar with the relevant behavior for the user role. The GUI elements with functions and features appropriate for the user role will be loaded based on the user role selection.

- Load scenarios from a number of available scenarios defined based on the user role. Display GUI elements, additional objects needed for the scenario, and the list of tasks that are performed during the scenario.

- Once a scenario is activated, keep track of the steps or tasks performed to complete that particular scenario by updating and retrieving scenario conditions from the behavior scripting library.
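The four functions above amount to a small amount of state tracking. An engine-independent Python sketch (class, method and scenario names are hypothetical, not from the EVPS code):

```python
# Minimal sketch of the scenario engine's role selection, scenario
# activation and task tracking.
class ScenarioEngine:
    def __init__(self, scenarios_by_role):
        # scenarios_by_role: {role: {scenario name: [task, ...]}}
        self.scenarios_by_role = scenarios_by_role
        self.active = None
        self.remaining = []

    def available(self, role):
        """Scenarios defined for the chosen user role."""
        return sorted(self.scenarios_by_role.get(role, {}))

    def activate(self, role, scenario):
        """Load a scenario and its ordered task list."""
        self.active = scenario
        self.remaining = list(self.scenarios_by_role[role][scenario])

    def complete_task(self, task):
        """Mark a task done; return True when the scenario is finished."""
        self.remaining.remove(task)
        return not self.remaining

engine = ScenarioEngine({"Nurse": {"Respond to patient call":
                                   ["Receive alert", "Navigate", "Assist"]}})
engine.activate("Nurse", "Respond to patient call")
engine.complete_task("Receive alert")
```

In an actual game engine implementation, the task-completion check would be triggered by in-world events (collisions, object interactions) rather than direct method calls, but the bookkeeping is the same.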

4.3.1.3  User Scene 

The user scene is the medium of communication or access point between the users and the EVPS.  As the user’s connection to the virtual prototype, the user interface affects the design of the virtual prototype itself (Sherman and Craig 2003). A User Interface is part of the application with which a user interacts in order to undertake his or her tasks and achieve his or her goals (Stone et al. 2005). 

4.3.2  Story boarding the Graphical User Interface 

The Graphical User Interface (GUI) presents to the user the features and widgets that are embedded in an application. The GUI for each project can be custom designed based on the scenarios and model content required. The EVPS application concept was designed using storyboards to allow scenarios to be loaded within the game environment, as shown in Figure 4-8.

Figure 4-8. Concept design for the Graphical User Interface (GUI) of the EVPS application.

Storyboards are sequences of sketches or screen layouts that focus on the main actions and interactions in a possible situation. Storyboards take textual descriptions of task flows (such as scenarios) and turn them into visual illustrations of interactions (Stone et al. 2005).

Storyboarding is a valuable tool for conveying the functionality of a proposed solution, and it also helps in collecting requirements and generating feedback (Gruen 2000).

Figure 4-9 shows a sequence of storyboards designed during conceptualization of the EVPS. Each storyboard begins with the scene, or facility, depicting particular healthcare spaces. Once a specific space is chosen, the GUI displays the various roles within the healthcare facility that can be loaded as interactive avatars. Lastly, having chosen the healthcare facility space and role, the EVPS application loads the relevant GUI components with corresponding scenario information, as well as furniture, fixtures and equipment.

Figure 4-9. Snapshots within the Unity game engine interface showing different steps for design review using scenarios for a healthcare facility model.

4.3.3  Identifying Media for Interaction 

The user interface is made up of hardware and software design components. Hardware design components, or interaction devices, are the input and output devices. Software design components are generated by the computer system or the scenario engine. Based on the media identified for interaction, the scenario engine needs to be programmed so that the software components respond to the user input and render on the chosen display output.

4.3.3.1  User Input media 

Virtual prototypes can only be interactive if they are able to accept real-time input from the user. There are many ways of getting user input into the system, ranging from simple mouse and keyboard input to more complex physical controls like wands, flying sticks, joysticks, data gloves and platforms. Virtual reality interfaces such as motion tracking and eye tracking are other, passive methods that can be used to input the user’s location and orientation in the virtual facility prototype (Sherman and Craig 2003). Latency between user input and display response is the product of many system components: input devices, computation of world physics, graphical rendering routines and formats.

4.3.3.2  Output display media 

Visual displays for virtual prototypes impact application development based on the ease of implementing the particular system. The media used for display can range from simple desktop, handheld and large-screen projection displays to more immersive virtual reality displays, including stereoscopic projection and head-mounted display systems. As with the input media, development of the EVPS needs to be in accordance with the chosen medium’s requirements for resolution and compatibility with file formats for real-time rendering (Dunston et al. 2010; Sherman and Craig 2003).

According to studies by Nikolic (2007) and Zikic (2007), the best configuration to use for the evaluation of designed spaces depends on the purpose and context of use of the virtual prototypes. Their studies confirm the usefulness of a large screen and wide field of view, because these provide a scale reference and sufficient spatial information. However, in the context of end-user presentation, they suggest that a useful and affordable combination is to display a highly detailed and highly realistic model on small screens with a narrow field of view.

4.3.4  Interactivity Environment Selection 

As discussed in the literature review chapter, game engine applications provide greater functionality than 3D modeling applications for embedding interactivity in virtual prototypes. Many game engines have been used in the AEC industry, such as the XNA game engine for construction simulation (Nikolic et al. 2010) and architectural walkthroughs (Yan et al. 2011), as well as C4 Engine, Torque and Unreal Editor (Shiratuddin and Fletcher 2007). Even virtual worlds such as the Second Life platform have been used for interacting with facility prototypes. Earlier attempts to incorporate interactivity in virtual prototypes were made by creating custom applications, which made development very cumbersome. However, the use of most commercially available applications also involves a steep learning curve, labor-intensive development and high costs.

Based on a review of several real-time rendering engines, the Unity game engine was considered a feasible option for the development of the Experience-based Virtual Prototyping System (EVPS) application. The Unity game engine was chosen because it has a fast rendering environment and a robust feature set that allows for customization, is affordable, and has a relatively easy interface with drag-and-drop ability, making it easy to learn and use (see Figure 4-10). The game engine can be used on both Mac and PC operating systems, and it supports JavaScript-style scripting with Just-In-Time (JIT) compilation through the Mono library. For physics and rendering, it uses the Nvidia PhysX physics engine, OpenGL and DirectX for 3D rendering, and OpenAL for audio. There are also various workflows for conveniently transferring geometric data from BIM and CAD authoring applications, such as Autodesk Revit, Google SketchUp and Autodesk 3ds Max, to the Unity game engine. These workflows helped preserve transferred information such as textures, as well as other intelligent information attached to the Building Information Model.

Figure 4-10. Screenshot of the Unity game engine application.

Within each Unity project folder, there is a default Assets folder that performs the function of the element library for the EVPS application and stores all the data associated with the healthcare facility. The user scene file is located in this Assets folder and includes the main levels as well as the different zones and spaces of the healthcare facility. Moreover, all elements that are ever used in the user scene, including game objects, prefabs (objects with attached behavior scripts), textures and other components along with their behavior scripts, are stored in the Assets folder. These elements, referred to as assets in the Unity game engine, can be reused from one project to another. Within the Unity game engine interface, the elements stored in the Assets folder are displayed in the Projects tab, as shown in Figure 4-10. Digital models of the facility and other object geometry are stored as assets within the projects database.

The next section outlines the preliminary development process, focused specifically on importing building information models and digital model content in the game engine environment. 

4.4 EVPS DEVELOPMENT PROCESS 

One of the objectives of the research is to streamline the EVPS development process so that the amount of time and effort required to create interactive virtual prototypes of healthcare facilities can be reduced. The steps for developing the information exchange workflows behind the EVPS are as follows:

        Identify modeling and interaction tools 

        Identify and list file format that can be exchanged between the applications 

        Test exchange of model information using different file formats 

        Note any issues and challenges with data exchange

4.4.1  Design Information Workflows 
The system and process design phase helps identify the ideal modeling and software implementation tools based on the goals and objectives established for developing the EVPS. While some projects may need to be modeled from scratch, others may already have highly detailed models used by the project team. Once the modeling application is chosen, it is important to identify the file formats (see Figure 4-11) that can be exchanged between the chosen software applications.


Figure 4-11. Interoperable file formats to transfer geometry content between tools.

The intent of testing various workflows is to transfer as much model content as possible into the Unity game engine so that only a limited amount of modeling is needed within it.

Experiments with transferring model data in different file formats tested how much data comes through from the various applications and noted any issues or missing information. The tests identified the most efficient workflow and documented the design information exchange process from 3D modeling software to interactive tools.

Various workflows were tested to best utilize 3D content from different BIM authoring tools in the Unity game engine. The advantage of using existing BIM authoring tools, such as Revit Architecture, was that existing building information models could be used for the development of the interactive virtual prototypes. For this purpose, workflows were tested to transfer model content from building information modeling tools such as Autodesk Revit, along with other visualization and 3D modeling tools such as Autodesk 3D Studio Max and Google SketchUp.

Another benefit of directly exporting models into Unity was that design changes made to the model in the native authoring tool (such as Autodesk Revit) could simply be exported again, with the changes automatically updated in Unity.

4.4.2  Information exchange challenges 

During workflow development, certain interoperability issues were encountered while importing models from Revit into Unity; these were primarily related to textures, lighting, and the overall organization of the model hierarchy. Interoperability is the ability of several systems, identical or completely different, to communicate without ambiguity and to operate together. Maldovan et al. (2006) and Dunston et al. (2010) note that the lack of interoperability inhibits the development of virtual mock-ups. Data are lost when models are transferred between applications and when models are sent to immersive virtual environments. Furthermore, depending on the type of display output, most applications do not support 3D stereoscopic visualization of models in real time.

While 3D meshes of the lighting fixtures did transfer successfully into Unity, the lighting characteristics did not. Lighting for the spaces can be added within the Unity engine. Typically, a large amount of time is required to make the lighting as realistic as possible; hence, successfully carrying these characteristics through can translate into potential time savings for the project team. One possible solution to this lighting issue could be incorporating baked textures into the lighting workflow. By importing a facility model into 3D Studio Max and using render-to-texture on the objects, baked textures depicting the lighting effect can be transferred to the Unity model. However, this raises the issue of redoing the entire process whenever revisions are made to the design of the model in Revit before it is transferred again to Unity.

4.4.3  Real-time rendering challenges 

Building or facility models typically contain walls, ceilings, and floors that partition space into rooms. The geometric content of these models comprises 3D meshes, textures, and lighting attributes. Larger models have higher polygon counts, which require more resources for rendering and can affect the performance of the real-time simulation.

According to Funkhouser et al. (1996), visual realism with short response times as well as a fast and uniform frame rate is desirable in virtual simulation prototypes of facility models. The level of detail needed is quite high to convey a sense of presence and realism comparable to the true constructed space (Nikolic 2007; Zikic 2007). The sense of realism is low when the response to user input is slow. Latency between user input and display response is the product of many system components: input devices, computation of world physics, and graphical rendering routines and formats.

The higher the frame rate (the number of images displayed per second), the smoother the real-time images appear, since lag is greatly reduced. This allows users to experience the virtual prototypes of facilities with a greater level of immersion and interaction (Shiratuddin 2007). Occlusion culling, in which large portions of the model hidden or occluded by polygons in front of the user's viewpoint are not rendered, can be used to improve the frame rate.
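As a rough illustration of the rendering budget behind these frame-rate figures, the time available to draw one frame shrinks quickly as the target frame rate rises. A small, engine-independent sketch in plain JavaScript (the function name is illustrative):

```javascript
// Time budget (in milliseconds) available to render one frame
// at a given target frame rate (frames per second).
function frameBudgetMs(fps) {
  return 1000 / fps;
}

// At 30 fps each frame must finish in ~33.3 ms; at 60 fps, ~16.7 ms.
// Techniques such as occlusion culling reduce the number of polygons
// that must be drawn within that fixed budget.
console.log(frameBudgetMs(30).toFixed(1)); // "33.3"
console.log(frameBudgetMs(60).toFixed(1)); // "16.7"
```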

4.4.4  Optimal information exchange workflow 

It was found that the most efficient workflow for embedding interactivity in virtual prototypes was importing files into the Unity game engine in the FBX file format (see Figure 4-12). One of the benefits of the FBX file format was that when files were transferred within the Autodesk suite, it retained information pertaining to 3D meshes, textures, and camera locations. However, when an FBX file was exported into Unity, it lost any textural information associated with the model, and the materials and textures had to be reapplied. Since the Unity engine has an extensive built-in library of textures and materials, basic realistic textures can be applied within Unity with relative ease, although this requires additional effort. Other workflows, particularly through other visualization applications, may not encounter this texture issue, although similar issues are not uncommon when moving from CAD/BIM authoring applications to interactive game engines.

Figure 4-12. Typical workflow adopted for transfer of model content to develop EVPS.

4.5 INCORPORATING INTERACTIVITY IN EVPS 
Interactive virtual prototypes can prove to be effective design communication tools between professionals and end-users by extracting domain-specific tacit knowledge from both parties, creating a better understanding of the facility that in turn leads to better design.

Interactive features are important for end-users because they help them relate what they are seeing in virtual environments to the real world (Wang 2002). Interactivity features not only enable end-users to interact with the virtual prototype during design review, but also give AEC professionals an opportunity to review the prototype through the roles and points of view of the end-users.

Interactivity is defined as the extent to which a user can participate in modifying the form and content of a mediated environment in real time (Steuer 1995). The role of interactivity is to generate greater involvement or engagement with content (Sundar 2007). Most, if not all, modern-day interfaces are interactive, empowering users to take action in highly innovative and individualized ways. Interactivity influences users by increasing or decreasing perceptual bandwidth, offering customization options, and building contingency into user-system exchanges. Combined, these factors contribute in different ways to user engagement in terms of cognition, attitude, and behavior.

Some interactive features and functionality were designed, developed, and tested for the EVPS within the Unity game engine. Figure 4-13 represents a framework of interactivity features that were developed and tested for implementation in healthcare facilities. The framework is flexible and can evolve so that more features can be added in the future. Features developed include different modes of navigation, interactive objects, and the user interface. The user interface also includes a scenario tracking system that enables end-users to keep track of activities that they perform in the EVPS. The following sections describe these features in detail.

Figure 4-13. Interactivity in virtual prototypes.

4.5.1  Navigation 

For real-time architectural visualizations, there is often a need for moveable objects and navigation through the virtual prototype. Architectural walkthroughs are usually non-interactive flythrough videos that give the user a virtual tour of a facility by displaying pre-choreographed views of the design. Real-time rendering applications enhance user interactivity by supporting several navigation modes, such as Walk, Fly, and Examine; almost all applications support the walk mode for architectural walkthroughs and game design.

In most applications that use interactive 3D graphics as a platform, navigation can be portrayed either from a first-person point of view (PoV) or from a particular character's point of view. Using the Unity game engine, feasible ways of incorporating both modes of navigation can be explored, along with providing greater customized camera control for the user.

4.5.1.1  Camera movement 

Observer viewpoint and movement can include turning and changing direction very easily. The observer can spin around quickly and look closely at any feature of the model. Attaching a camera to an object that can be controlled makes it possible to explore the facility from the first-person PoV. If the camera is attached to a character controller or avatar, positioned behind the avatar, it gives the appearance of a third-person PoV. Both navigation modes can also be combined if the camera object itself is controlled through user input.
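The third-person camera placement described above reduces to simple vector arithmetic: the camera sits a fixed distance behind the avatar and a fixed height above it. A framework-independent sketch in plain JavaScript (in Unity itself this logic would live in a camera script; the function and variable names here are illustrative assumptions):

```javascript
// Place a follow camera a fixed distance behind a target and a fixed
// height above it. "forward" is the unit vector the target is facing;
// all vectors are plain {x, y, z} objects.
function followCameraPosition(targetPos, forward, distance, height) {
  return {
    x: targetPos.x - forward.x * distance,
    y: targetPos.y - forward.y * distance + height,
    z: targetPos.z - forward.z * distance,
  };
}

// Avatar at the origin facing +z: the camera ends up behind it on -z
// and raised by the height offset.
const cam = followCameraPosition({x: 0, y: 0, z: 0}, {x: 0, y: 0, z: 1}, 4, 2);
console.log(cam); // { x: 0, y: 2, z: -4 }
```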

4.5.1.2  First Person Controllers 

The first-person character controller consists of a capsule geometry, a camera attached to view the scene, and a script that enables motion on user input. Some of the variables that can be controlled and modified by the user include speed, rotation, camera view rotation, and jump. In a first-person controller, the camera is the user's point of view, so making it follow another object around the scene (as in a third-person controller) is not required; the user controls the variables and attributes of the camera object directly. First-person cameras are therefore relatively easy to implement. Within the standard assets in the Unity game engine, there is a first-person controller that can be dropped into the Scene or Hierarchy window. While navigating, the character controller will not be able to pass through objects that have a Collider component attached, due to the collision detection of the underlying Unity physics engine.

4.5.1.3  Third Person Controllers 

After the character or avatar is imported into Unity as an asset, character controllers and other scripts can be attached to enable third-person navigation. Usually a camera is attached at a set distance behind the character controller within the hierarchy in a parent-child relationship. This enables the user to view the facility prototype through a camera that constantly follows the third-person controller. Figure 4-14 shows an example of a third-person controller developed for a healthcare facility project.

Figure 4-14. Character controller depicting a nurse, downloaded from www.mixamo.com.

Since developing end-user character avatars in 3D modeling applications and animating them can be very time-consuming, Mixamo (http://www.mixamo.com/), a website that provides digital content, was used; it allows downloading of custom characters with the required character animations, including walk, sit, idle, and many other character motions that can be used during the review of a virtual facility prototype. The website also allows custom characters created in other 3D modeling applications to be uploaded so that animations can be attached.

4.5.2  Interactive Objects 

Interaction with elements in the model can be classified into dynamic, controller, trigger, and animated objects based on how they are interacted with and how they react or behave on interaction. For dynamic objects, attributes of rotation, position, scale, and appearance can be modified through the addition of certain properties, components, or custom scripts. Almost all elements of the model can have colliders so that, during navigation, users do not walk through them, and the floors or ground plane can have gravity so that the user does not fall through the floors. Moveable objects can have physics applied to transform their position, rotation, or scale. Custom scripts with options for different colors, materials, and textures can be used to change the appearance of certain objects. Lastly, scripts can be used to obtain contextual information about an object when interacting with it, either through clicking or any other chosen mechanism.

4.5.2.1  Controller objects 

A good example of controller objects is the first-person or third-person point of view (PoV) navigation mode, where the capsule or avatar with an attached camera is moved within the prototype through the user's input. Other objects, such as wheelchairs, mobility devices, and patient beds, can also have controller scripts attached so that they can be moved around the facility.

4.5.2.2  Trigger objects 

Triggers are invisible components which, as their name implies, trigger an event. In Unity, any Collider can become a trigger by setting its "Is Trigger" property to true in the Inspector window. While navigating through the facility, users can pass through doors that open, or move a trolley or wheelchair from one space to another. This interactivity in the virtual prototype can be achieved by adding animation to doors so that they swing or slide open, and physics to objects so that they move when force is applied to them by another object, such as the character controller in the game. Alternatively, this movement or animation can be initiated by mouse clicks or triggers. Triggers can also take the form of objects that cause an action to take place on proximity, clicks, or certain keyboard commands.

The Unity game engine is set up to show visual assets; however, these also have to be connected to each other to provide the interactivity expected in the virtual prototype. These connections, displayed in an object hierarchy tree format, are known as dependencies. Objects have a parent-child relationship in which any changes made to the parent object also affect the attached sub-objects. Objects can also be connected to other objects through scripts, so that events on one object can trigger effects on a second object. The result is that the assets are tied to each other by myriad scripts that together make up a real-time gaming environment.

4.5.2.3  Automated Objects 

In a facility model, a user can usually walk straight through a door that does not have a Collider component attached to it. However, in real-time navigation of a virtual facility prototype, the ability to make the door swing open can increase the level of realism and the experience of the user reviewing the facility. This can be achieved by adding an animation to the door object in the Unity game engine that enables it to swing open. Additionally, to ensure that the door only opens when a user approaches it, triggers can be used, either by detecting the proximity of the character controller or through user input. An invisible collider trigger object of the required dimensions can be superimposed on the door object such that when the user collides with the trigger object, the door animation plays and the user can walk between spaces once the door has swung open, as shown in Figure 4-15. During repeated trials to make the door swing open, it became apparent that the door and doorframe were combined as a single object and could not be split within Unity. To simplify the incorporation of swinging doors in the virtual facility prototype, it was therefore necessary to split the door frame and door panel in the authoring application, such as Autodesk Revit Architecture, where the door family is edited to create two distinct objects: the door and the door frame.

Figure 4-15. Door prefab with the door trigger collider.
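The door behavior described above can be sketched as a small UnityScript (Unity's JavaScript dialect of that era) behavior attached to the invisible trigger collider. This is a sketch, not the thesis's actual script: the clip name "DoorOpen" and the "Player" tag are illustrative assumptions, and the code requires the Unity runtime with the legacy Animation component.

```javascript
// Attached to the invisible trigger collider placed over the doorway.
// Assumes (illustratively) that the door object has an animation clip
// named "DoorOpen" and that the character controller is tagged "Player".
var door : GameObject;

function OnTriggerEnter(other : Collider) {
    if (other.CompareTag("Player")) {
        door.animation.Play("DoorOpen"); // swing the door open
    }
}
```

The same trigger could instead check proximity or a keyboard command before playing the animation, matching the alternatives discussed above.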

4.5.3  User Interface Development 

A Graphical User Interface (GUI) represents the information and actions available to a user through graphical icons and visual indicators. Based on the storyboards developed during the system design phase, menus, icons, and buttons can be developed with interactivity scripts that enable various actions to take place in the prototype. These actions include loading specific spaces and avatar models, loading websites, and keeping track of scenarios of tasks performed by the user. The user interface can also be used to provide context awareness and other related information to the end-user through heads-up displays and mini-maps. In Unity, the GUI system is called UnityGUI, which allows a huge variety of GUIs, complete with functionality, to be created quickly and easily.

4.5.3.1  Menu 

A start menu is developed to launch when the virtual prototype is opened. As in video game development, start menus enable the user to change options and enter levels that contain the virtual prototype of the facility (Kumar et al. 2011). The start menu for the virtual prototype of the facility can generally be saved as a separate scene with buttons or textures that allow users to choose which spaces they want to explore or what role they want to take.

Based on their choices, different levels or Scenes are loaded in the Unity application for the user to explore. Figure 4-16 shows an example of a start menu with interactive icons from a healthcare-related project.

 Figure 4-16.  Start Menu with interactive buttons to load levels and change options.

 4.5.3.2  Heads-Up Displays 

The heads-up display (HUD) is a feature used in video games to provide live, constantly updated information, such as scores, to the user. In virtual prototypes developed in Unity, textual information can be customized and displayed based on the requirements analysis. The HUD can be used to display names and other information for specific objects when they are clicked. Additionally, when the controller object navigates into trigger objects placed in different areas, the HUD can display the name and information of that space.
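A minimal HUD of this kind can be sketched in a few lines of UnityGUI code. This is an illustrative sketch rather than the thesis's actual implementation: the variable spaceName is an assumption, set elsewhere (for example, by a trigger object when the user enters a space), and the script requires the Unity runtime.

```javascript
// Displays the name of the current space as a heads-up label.
// spaceName would be updated by trigger objects as the user
// moves between spaces (illustrative assumption).
var spaceName : String = "Patient Room";

function OnGUI() {
    GUI.Label(Rect(10, 10, 300, 30), "Current space: " + spaceName);
}
```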

4.5.3.3  Way-finding 

Studies show that the use of "you are here" maps aids spatial cognition by using abstract representations of the large-scale environment to provide orientation information and guide way finding (Dutcher 2007). A mini-map with a tracker is used to locate the position of the user in real time within the virtual prototype and to aid way finding and spatial awareness (Klippel et al. 2010). Using an orthographic camera that views the prototype from the top plan view, the mini-map is placed on the user display according to the design and storyboard requirements. Figure 4-17 shows how both HUDs and mini-maps provide customized information about the environment to the user exploring the virtual facility prototypes.

Figure 4-17. "Heads-up displays" and mini-maps to aid in way finding and spatial awareness.

4.5.4  Scenario Scripting 

The concept of quests and level design in video games provides players with objectives and guides gameplay (Smith et al. 2011). Taking a cue from quest design, which uses a tracking mechanism to monitor the progress of players within their quests, task-based scenarios embedded in the virtual prototype use the same approach to keep track of how the user explores the prototype. Figure 4-18 shows the approach for scripting the scenario menu.

 Figure 4-18. Scenario Menu and Scripting approach. 

To manage the scenario tracking process, two scripts were developed: the scenario monitoring script and the task monitoring script. The scenario monitoring script observes which step of a given scenario the user is on and ensures that the proper instructions are displayed on the user interface. The task monitoring script is responsible for checking whether the user (controller object) is interacting with the correct object (dynamic, trigger, or automated) at a given step of a specific scenario. Each game object is assigned a unique identity, and the task monitoring script is applied to each object via a simple drag-and-drop interface in Unity 3D.

The Task Monitoring Script is governed by three chief variables defined within its code and a series of if statements that determine which step of a scenario task the user is on (Table 4-2). When certain conditions have been met, the script allows the user to progress to the next step in the task. All three of these variables are also passed on to the Scenario Monitoring Script, which is responsible for displaying the current task information on the user interface.


Table 4-2. Scenario Tracking script with its variables and functions. 

With reference to the three variables from Table 4-2, a set of three if statements must be modeled in code for every possible step of a scenario task. If any of the three if statements is false, then the step is incomplete and the user cannot progress to the next task. In summary, these if statements analyze the following conditions:

1.      Has the user completed all of the prerequisite steps prior to the current step? 

2.      For a given step, is the user interacting with the correct object? 

3.      Has the user completed the current task step?

If each of these three conditions is met with the code equivalent of a 'yes,' then the value of the Step Count variable is increased by one, signifying that the user can progress to the next step in the scenario task.
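The three-condition check described above can be expressed independently of the game engine. The following plain-JavaScript sketch models the step-progression logic under stated assumptions: the names (advanceStep, stepCount, objectId) are illustrative, not the thesis scripts' actual identifiers, and the "step completed" condition is simplified to interacting with the correct object.

```javascript
// Step-progression check for a scenario task. A step is complete only
// when (1) all prerequisite steps are done, (2) the user is interacting
// with the correct object, and (3) the step's action has been performed
// (simplified here to the correct interaction). Only then does
// stepCount advance by one.
function advanceStep(state, interactedObjectId) {
  var step = state.steps[state.stepCount];
  if (state.stepCount > 0 && !state.steps[state.stepCount - 1].done) {
    return state.stepCount; // condition 1 failed: prerequisites not met
  }
  if (interactedObjectId !== step.objectId) {
    return state.stepCount; // condition 2 failed: wrong object
  }
  step.done = true;         // condition 3: current step performed
  state.stepCount += 1;     // progress to the next step
  return state.stepCount;
}

var scenario = {
  stepCount: 0,
  steps: [
    { objectId: "door-101", done: false },    // hypothetical object IDs
    { objectId: "patient-bed", done: false },
  ],
};

advanceStep(scenario, "patient-bed"); // wrong object: still on step 0
advanceStep(scenario, "door-101");    // correct object: advance
console.log(scenario.stepCount);      // 1
```

In the EVPS itself, the equivalent checks run inside the task monitoring script attached to each game object, with the variables passed on to the scenario monitoring script for display.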

The next section describes the overall process for experience-based virtual prototyping, and discusses process validation, framework for developing reusable model content and strategies for development. 

4.6 PROCESS VALIDATION 

In the final step of the process, the framework for structuring scenarios and the information architecture for the experience-based virtual prototyping system were validated. Validation of the EVPS development process was achieved through initial testing of the design and concepts in various project cases related to healthcare facility design, as well as informal interviews with industry professionals and subject matter experts who have the required domain-specific knowledge of developing applications with game engines. Revisions were made to the framework based on the expert feedback, and the required changes were implemented in the development process of the Experience-based Virtual Prototyping System (EVPS).

Throughout the application development process, the research intent was to constantly assess the capabilities and limitations of the programming environment and game engine used, as well as the procedure employed to develop the scenario-based interactive virtual environment for end-user testing, as part of internal validation. Some of the strategies identified for rapid virtual prototyping were the use of reusable model content and determining the level of effort required for EVPS development.

4.6.1  Framework to rapidly develop reusable model content 

A significant amount of time-consuming effort is required to develop model content and transfer it, using adequate file formats, to work in the required virtual environment outputs. For the facility model, Leite et al. (2011) discuss the modeling effort associated with generating building information models (BIM) at higher levels of detail (LoD). Most of the time, the BIMs of facilities are not modeled to an adequate LoD, and other interactive objects are often either not modeled to the desired LoD or not modeled at all. Modeling the other interactive objects required by the specifications can be very time-consuming and may not be worth the effort or use of resources.

Online Digital Content Creation (DCC) vendors can be used to access 3D model content. Access to DCC websites and other resources can enable faster design development cycles. Some examples are avatar websites, the Google SketchUp warehouse, the Unity Asset Store, and Mixamo. Digital model content can be combined with interactive behaviors and packaged for reuse among other healthcare and related projects.

Examples of packages include a nurse avatar with a zoom-and-rotate camera, moveable patient beds and medical equipment, scripts for custom menus and widgets, an avatar on a mobility device, and doors that swing open. Figure 4-19 shows an example of reusable model content in which an avatar of an elderly person was designed and downloaded from the Mixamo website, a digital model of a mobility device was supplied by the scooter manufacturer, and customized controller and moveable-object scripts were applied to create a package that can be reused between different virtual prototyping projects.

Figure 4-19. Reusable model content package: avatar of an elderly person on a mobility device.

++++

[Publisher’s note: Dr. Kumar told the publisher that he is the model for the avatar shown here.]

++++

Finally, an element library can help organize and store developed content and make it available for use in later projects. The objective of developing this library is to gradually grow the amount of reusable model content and make it available on open-source websites. In a similar effort, Dunston et al. (2010) are working with the Technology HUB initiative, which leverages cyberinfrastructure to share resources for the design of virtual healthcare environments. The intent is to let partners across universities share 3D model content for objects such as furniture and medical equipment, along with computer code to utilize these models in their respective projects.

4.6.2  EVPS Development Strategy 

The scenario framework from the requirements analysis phase and the system architecture from the system design phase can be used together to develop a strategy for implementing and developing the EVPS on healthcare projects. A template strategy plan (Appendix A) for experience-based virtual prototyping was created to aid EVPS development. The proposed EVPS plan helps identify specifications and modeling requirements by asking the following questions related to model content and the media used for interactivity:

Model Content 

  • What elements and properties should be included?
  • What is the required level of detail?
  • What is the required level of realism or abstraction?
  • What is the required level of interaction with end-users?

Media Used for Prototype Interaction 

  • What display system(s) are to be supported? E.g., desktop, large immersive display
  • What interactive devices will be used? E.g., mouse, joystick, data glove
  • What display resolution will be supported?
  • Will you aim to support stereoscopic visualization or surround-sound audio?

Based on the answers to the above questions about the amount and type of model content required and the selection of media for interaction, detailed steps for EVPS development can be laid out. The EVPS development process was implemented on various healthcare projects to determine the challenges, resources, and time-consuming tasks involved in accomplishing the virtual prototyping requirements. Figure 4-20 shows a conceptual representation of the level of effort (LOE) required for EVPS development, which increases with the complexity of the model content, the level of detail, the level of realism, and the level of interactivity desired.

Figure 4-20. Conceptual representation of level of effort (LOE).

However, repeated testing of the EVPS development process on various projects also indicated that leveraging reusable interactive model content, clearly defined design information workflows, and greater knowledge of or experience with the virtual prototyping process may reduce the level of effort required.

During EVPS development, it has been important to "prototype the prototype" to ensure that the knowledge and resources available can sufficiently meet the requirements set during the requirements analysis phase. Based on the challenges and limitations encountered during initial development, requirements and specifications can be revisited and renegotiated against the available resources. To illustrate, Table 4-3 shows an example matrix for specifying the requirements while comparing them with the available resources in order to effectively scope the development process. Based on the end-user specifications, the level of effort required for each aspect of model content and level of interactivity can be mapped.

The example in the table shows that the EVPS specification for a given project may require a low level of effort in model content detailing and lighting but a medium level of realism with textures for a large model. Similarly, from the interactivity standpoint, a low level of effort will be required for the movement-based scenario and navigation point of view, but a medium level for interaction with objects and the graphical user interface. This matrix can help determine the complexity of the EVPS for any specific project. Then, based on the developer's knowledge, experience, and access to available reusable content and EVPS development workflows, the level of effort or time required for EVPS development can be determined.


Table 4-3. Matrix of example specifications mapped based on level of effort. 

4.6.3  Experience-based Virtual Prototyping Procedure 

This research activity was focused on refining, improving, and expanding the procedure to rapidly convert design models from BIM authoring tools into the virtual prototyping system while incorporating task-based scenarios. Figure 4-21 shows the overall experience-based virtual prototyping procedure that gradually emerged from iterations of developing virtual prototypes using various case study projects, as well as methods adapted from other virtual reality system implementation studies (Wilson and D'Cruz 2006; Wilson 1999).

As discussed throughout this chapter, the steps for developing the EVPS include 1) requirements analysis, 2) system design, 3) system development, 4) system validation, and finally 5) implementation on a project. Requirements analysis encompasses identifying stakeholders, establishing goals, and undertaking user needs analysis. The framework of scenarios proposed in Section 4.2.3 can be leveraged to identify the model content and interactivity features required.

The second phase, system design, begins with concept design through storyboarding, followed by the use of the system architecture to identify the appropriate programming environment and media for interaction. The strategies for development and the EVPS development plan (Appendix A) can also be used to detail the steps for development. The third phase, system development, includes transferring BIM and other digital model content into the programming environment and incorporating interactive features and functionality in the EVPS.

Finally, the system and process can be validated, and lessons learned can inform the development process until the EVPS is ready for implementation on a healthcare facility project for end-user design review.


Figure 4-21.  Experience-based Virtual Prototyping procedure. 

4.7 SUMMARY 

This chapter described the procedure for developing experience-based virtual prototyping systems. The steps for EVPS development included requirements analysis, system design, system development, and finally system validation for implementation. Several project examples were used to illustrate and test these processes. Finally, strategies for rapid EVPS development were proposed and the overall EVPS procedure was presented. The next chapter introduces the Hershey Children's Hospital case study, where the EVPS process is implemented and assessed.

++++

Copyright 2013 by Sonali Kumar. All rights reserved. Thesis published on this site by the express permission of Sonali Kumar.

Note: Under construction link to Chapter 5.


Chapter 3. Dr. Kumar’s thesis on virtual reality modeling

Chapter 3 RESEARCH METHODOLOGY

Research is defined as a "systematic, intensive study directed toward fuller scientific knowledge of the subject studied" (Blake 1978). Research can either be pure, undertaken to develop knowledge and contribute to existing theories, or applied, which seeks to address issues of application and solve practical problems (Fellows and Liu 2003). A research methodology consists of a combination of processes, methods, and tools that are used in conducting research in a research domain. According to Blake (1978), methodology includes the assumptions and values that serve as a rationale for research and the standards or criteria the researcher uses for interpreting data and reaching conclusions. Research methods are at the basis of the production of knowledge in any given field (Dubé and Paré 2003).

A research process involves understanding research domains, asking meaningful questions, and applying valid research methodologies to address those questions. Research follows a pattern of "problem, hypothesis, analysis, and argument." During research design, one has to decide on the methodological approach for finding solutions to the research problems or questions (Fellows and Liu 2003).

3.1 RESEARCH APPROACH

Research methods most applicable to the construction domain include action research, surveys, case studies, experiments, and ethnographic research (Fellows and Liu 2003). With the advent of information technology in the AEC domain, construction research needs to adopt research approaches and methods that better serve its research purposes. Research domains such as information systems are sufficiently broad that they require a wide range of methodologies. Information systems research approaches were therefore explored for adoption in this study.

3.1.1  Information Systems Research

Information systems research is generally interdisciplinary, concerned with the study of the effects of information systems on the behavior of individuals, groups, and organizations (Galliers et al. 2007). An applied research discipline, information systems benefits from employing a "plurality of research perspectives to investigate information systems phenomena" (Orlikowsky and Baroudi 1991). Action research methods are also highly applicable in IS research (Baskerville and Wood-Harper 1996).

According to Hevner et al. (2004), two paradigms characterize much of the research in the Information Systems (IS) discipline: behavioral science and design science. The behavioral science paradigm seeks to develop and verify theories that explain or predict human or organizational behavior. The design-science paradigm seeks to extend the boundaries of human and organizational capabilities by creating new and innovative artifacts (Hevner et al. 2004).

March and Smith (1995) proposed a framework for researching different aspects of Information Systems, including the outputs of the research and the activities to carry it out. They identified the research outputs as follows: 1) constructs, which are "concepts to characterize phenomenon"; 2) models, which "describe tasks, situations, or artifacts"; 3) methods, as "ways of performing goal directed activities"; and 4) instantiations, which are "physical implementations intended to perform certain tasks". They defined the research activities as: 1) build an artifact to perform a specific task, 2) evaluate the artifact to determine if any progress has been achieved, and 3) theorize and justify theories about the artifacts developed.

Similarly, Nunamaker et al. (1991) advocated the integration of system development into the research process, by proposing a multimethodological approach that would include 1) theory building, 2) systems development, 3) experimentation and 4) observations.

Figure 3-1 presents Hevner et al.’s (2004) conceptual framework for understanding, executing, and evaluating IS research combining behavioral-science and design-science paradigms. The framework focuses on development of artifacts by applying theoretical knowledge and knowledge of requirements from the environment or context of application.


Figure 3-1.   IS Research Framework (Source: Hevner et al. 2004).

3.1.2  Systems Development

Systems development is a multimethodological approach to information systems (IS) research (Nunamaker et al. 1991). System development serves both as a "proof of concept" for the fundamental research and provides an artifact that becomes the focus of expanding and continuing research. The multimethodological approach to IS research proposed by Nunamaker et al. (1991) consists of four research strategies: theory building, experimentation, observation, and systems development (Figure 3-2).

 


Figure 3-2. Research approach (Source: Nunamaker et al. 1991).

Theory building includes development of new ideas, concepts, and construction of conceptual frameworks, new methods or models. Theories may be used to suggest research hypotheses, guide the design of experiments, and conduct systematic observations.

Observations include research methodologies such as case studies, field studies and sample surveys. It may help researchers to formulate specific hypotheses to be tested by experimentation, or to arrive at generalizations that help focus later investigations. Since research settings are more natural, more holistic insights may be gained and research results are more relevant to the domain under study.

Experimentation includes research strategies such as laboratory and field experiments as well as computer simulations. It bridges the gap between theory building and observation as experimentation helps validate underlying theories or issues of technology transfer and acceptance. Experimental designs are guided by theories and facilitated by systems development. Results from experimentation may be used to refine theories and improve systems.

Systems development consists of five stages: concept design, constructing the architecture of systems, prototyping, product development, and technology transfer. Systems development is the hub of research that interacts with other research methodologies to form an integrated research process (Nunamaker et al. 1991).

3.1.3  Adopted Research Approach

 

Advancement of information systems (IS) research and practice often comes from new systems concepts. In engineering system domains, a concept is viewed for its application value rather than its intrinsic value. A concept with wide-ranging applicability goes through a research life cycle of the form concept – development – impact. Much IS research demonstrates such a life cycle (Nunamaker et al. 1991). Their paper suggests that "theories" are needed to identify broad classes of things that can be done more efficiently or effectively, that "instantiations" are needed to provide a continuing test bed for theories, and that "evaluations" of particular instantiations (systems) are needed to quantify the success or failure of the system in both technical and social terms. Systems development provides the exploration and synthesis of available technologies that produces the artifact, which functions as a bridge between technological research, referred to in the study as the "concept" stage, and social research, referred to as the "impact" stage.

 3.1.3.1  Concept

Initially, literature from various domains of experience-based design in healthcare and virtual facility prototyping was studied to find opportunities for developing systems that could improve the current design review process. Observations during a pilot study of an interactive virtual prototype in an immersive virtual environment showed that end users envision tasks they perform in their respective facilities while giving feedback during design reviews (Leicht et al. 2010). A concept emerged of developing interactive virtual prototypes that allow end users to visualize performing tasks while reviewing the space. This concept was termed the "Experience-based Virtual Prototyping System," or EVPS.

EVPS = Interactive virtual prototype + end user’s task-based scenarios

A literature review helped define the research problem by finding gaps in prior research. A review of multiple studies showed that game engines could be used effectively to develop interactive virtual prototypes and possibly embed task-based scenarios. However, the studies also showed that it is quite cumbersome and time-consuming to develop these virtual prototypes, and that their impact had not been assessed in detail (Figure 3-3).


Figure 3-3.  Research process.

Based on the literature review, it can be summarized that the research problem concerns the efficient development and subsequent testing of the EVPS. Although interactive virtual prototypes using game engines have been used for design review, there is a lack of streamlined procedures to make development easier. Moreover, because interactive virtual prototypes are so difficult to develop, it is imperative to know whether end users would actually benefit from using task-based scenarios embedded in virtual prototypes during design reviews.

These gaps and problems raise pertinent research questions on both the development and impact of the concept of experience-based virtual prototyping system.

1) How can we extract end-user scenarios of tasks to embed in virtual prototypes?

2) How can we more effectively develop interactive virtual prototypes with task-based scenarios?

3) How can we use experience-based virtual prototypes (EVPS) in a healthcare setting?

4) How can we evaluate the effectiveness of experience-based virtual prototypes?

The research questions led to the formulation of objectives. Each objective was broken down into a set of tasks and sub-tasks that are discussed in the research steps.

3.1.3.2  Development

The development part of the Concept – Development – Impact model dealt with investigating procedures to develop the concept further. This step explored methods and procedures that could take the EVPS concept into the development stage. The first two research questions focus on developing a process to, first, understand which tasks should be incorporated in virtual prototypes and, second, find an effective way to develop the interactive virtual prototypes.

The development process was enabled through the study of relevant disciplines for discovering new ideas and approaches. Domains of interest included virtual prototyping, gaming engines, and scenario-based design. Next, processes to rapidly transfer model content from existing applications into an interactive virtual prototyping environment were tested by "prototyping the prototype." Chapter 4 gives a detailed description of the scenario framework and the EVPS development process.

Various healthcare-related projects and small case studies were used as test beds throughout the development process to streamline design information workflows. While the EVPS concept originated in healthcare facility design, the process to embed scenarios in an interactive virtual prototype was tested through a project on independent living facilities for the elderly (Kumar et al. 2011). Once an efficient procedure to develop the EVPS concept was defined, it was implemented on a healthcare facility case study to assess its impact.

 3.1.3.3  Impact

Research questions related to how the EVPS can be implemented in a healthcare setting, and how it would affect the design review process for end users, were studied within the impact phase of the "Concept – Development – Impact" model. Evaluating the impact of the developed concept, the EVPS, was done using a multi-method approach of case study and experimentation. The case study chosen for the evaluation phase was the new Hershey Children's Hospital (Chapter 5), which was under construction during the study. The EVPS development process was tested on the Hershey Children's Hospital by first using focus groups to elicit requirements and then developing interactive virtual prototypes for use by end users of the facility. Evaluation of the use of the EVPS was done through an observational field study during design review meetings with the pharmacy staff of the hospital.

Next, a user study tested the effect of using task-based scenarios embedded in the interactive virtual prototype on end user feedback during design review. The evaluation study is described in detail in Chapter 6.

3.2  RESEARCH STEPS

Once the research approach was defined, objectives for the research were broken down into specific research tasks and activities. The following describes specific research activities assigned to accomplish the objectives.

Research Activities for Objective 1

 Objective 1:  Develop a virtual prototyping procedure to extract end user experience of healthcare activities in interactive virtual prototypes.

  1. Develop a procedure to document scenarios of activities within the healthcare facility context.
  2. Develop a hierarchical data structure for task-based scenarios that characterizes attributes of the interactive objects used and specific tasks performed within these scenarios.
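The hierarchical data structure called for in these activities can be sketched as nested records: a scenario performed by an end-user role, containing tasks, which in turn reference interactive objects with attributes. This is a minimal illustrative sketch; the field names and example values are assumptions, not the thesis's actual schema.

```python
# Hypothetical sketch of a hierarchical task-based scenario structure:
# Scenario -> Tasks -> InteractiveObjects. Names are illustrative only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class InteractiveObject:
    name: str                                   # e.g., "wheelchair"
    behaviors: List[str] = field(default_factory=list)

@dataclass
class Task:
    description: str
    objects: List[InteractiveObject] = field(default_factory=list)

@dataclass
class Scenario:
    actor: str                                  # end-user role
    location: str                               # space in the facility
    tasks: List[Task] = field(default_factory=list)

# Example instance (invented for illustration):
scenario = Scenario(
    actor="nurse",
    location="patient room",
    tasks=[Task("administer medication",
                [InteractiveObject("medication cart", ["open drawer"])])],
)
print(scenario.tasks[0].objects[0].name)  # medication cart
```

Structuring scenarios this way makes it straightforward to enumerate the interactive objects (and hence the model content) each scenario requires.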

Research Activities for Objective 2

Objective 2: Design a framework for structuring end-user activities into scenarios that can be simulated in an interactive virtual prototyping system.

  1. Design the overall system architecture for interactive virtual prototyping, defining the components, databases, and libraries that will be used for development of the virtual prototyping system within a rendering/gaming engine.
  2. Classify design review requirements, along with additional features and functionality that can be added to the virtual prototyping system for interactive design review.
  3. Develop an interactive interface that allows the end user to carry out specific task-based scenarios in the virtual environment through scenario definition, scripting, and the use of interactive objects.

Research Activities for Objective 3

Objective 3: Develop an interactive computing platform titled the Experience-Based Virtual Prototyping System (EVPS) for implementation in healthcare design reviews.

  1. Utilize an appropriate programming environment / gaming engine for the development of an interactive virtual prototyping system, termed the Experience-based Virtual Prototyping System (EVPS).
  2. Investigate design information workflows for importing facility models from various BIM authoring tools (e.g., Autodesk Revit) into the interactive virtual prototyping system. This includes identifying challenges and limitations in the workflow to facilitate rapid conversion of facility models into the virtual prototyping system.
  3. Generate reusable interactive virtual content models, such as avatars of user roles and objects with dynamic behaviors (e.g., wheelchairs), that are needed for the development of interactive virtual prototypes.
  4. Validate the procedure used to develop and implement task-based scenarios in the interactive virtual prototyping system, and review the capabilities and limitations of the EVPS against its functional requirements.

Research Activities for Objective 4

Objective 4:  Assess the developed EVPS to evaluate effectiveness of interactive virtual prototyping for enhancing the experience-based design review process of healthcare facilities.

  1. Identify a suitable facility within the healthcare context that will benefit from the implementation of an experience-based design review process using interactive virtual prototypes. Investigate and document several specific scenarios of tasks undertaken in the healthcare facility.
  2. Evaluate the use of the EVPS application for design review of the healthcare facility, implemented in an immersive virtual environment, to assess its effectiveness in the collaborative design review and decision-making process. Obtain feedback through observations during field study meetings with end users during the design review process.
  3. Implement a user study to evaluate the effect of using task-based scenarios in interactive virtual prototypes for design review of healthcare facilities by end users.

Table 3-1 shows the research methods adopted for each specific research task.

 

[Publisher's note: Table 3-1 in the original thesis is a matrix matching each research task to the research methods used and the chapters in which each appears. The column headings are: Literature Review, Systems Development, Requirement Analysis, Concept Mapping, Case Study, Focus Group, Field Observation, User Study, Talk Aloud Protocol, Survey Questionnaire, and Protocol Analysis. The rows are the twelve research tasks, grouped under Objectives I–IV: 1. Scenario concept; 2. Scenario documentation and structure; 3. System architecture; 4. Model and media requirements; 5. Interactivity interface; 6. Programming environment; 7. Design information workflows; 8. Reusable model content; 9. Validate procedure; 10. Hershey Children's Hospital; 11. EVPS application; 12. Evaluation of EVPS. The cell-level layout of the matrix did not survive reformatting.]

Table 3-1. Research methods adopted for each research task.

3.3  RESEARCH METHODS AND TOOLS

While research methodology refers to the principles and procedures of the logical thought processes applied to a scientific investigation, methods concern the techniques available for data collection and analysis, as well as the particular techniques employed in the research project. In this study, the systems development approach was adopted for the EVPS system design. For EVPS implementation, the case study approach was adopted, which included data collection through focus groups using concept mapping as a tool for requirements analysis. Finally, evaluation adopted the field study method for observation of the EVPS application and experimentation for EVPS assessment. Data were collected through surveys and the talk-aloud protocol and analyzed using protocol analysis.

The following are brief descriptions of the research methods adopted in the research project.

3.3.1  Systems Development

The research process outlined in Figure 3-4 is a systems development methodology that includes elements of both social and engineering research approaches. This methodology is adapted from Nunamaker et al. (1991) and adopted to develop the concept of the Experience-based Virtual Prototyping System (EVPS). Chapter 4 describes the system architecture for the EVPS and the process for incorporating task-based scenarios in interactive virtual prototypes. [Publisher's note: The following is a reformatting of what was originally a figure; hence, it is labeled Figure 3-4.]

Construct a Conceptual Framework

  • State a meaningful research question
  • Investigate system functionalities and requirements
  • Understand the system building process/ procedures
  • Study relevant disciplines for new approaches and ideas

Develop System Architecture

  • Develop a unique architecture design for extensibility and modularity
  • Define functionalities of system components and interrelationships among them

Analyze and Design the System

  • Design the knowledge base schema and process to carry out system functions
  • Develop alternative solutions and choose one solution

Build the (Prototype) System

  • Learn about the concepts, framework, and design through system building process.
  • Gain insight about problems and the complexity of the system.

Observe & Evaluate the System

  •  Observe the use of the system by case studies and field studies.
  • Evaluate the system by laboratory or field experiments.
  • Develop new theories/ models based on the observation and experimentation of the system’s usage
  • Consolidate experiences learned.

Figure 3-4.  Systems development methodology (Source: Nunamaker et al. 1991).

3.3.2  Case Study

The case-study research method is defined as "an empirical inquiry that: (1) investigates a contemporary phenomenon within its real-life context, especially when (2) the boundaries between the phenomenon and the context are not clearly defined" (Yin 2003). Within the Construction Engineering and Management (CEM) domain, researchers often approach case studies as mixed-methods projects with both qualitative and quantitative aspects (Taylor et al. 2011). Some limitations of case studies are that they may be limited to particular samples that cannot be generalized, and they may lack precision, quantification, objectivity, or rigor in execution.

According to Taylor et al. (2011), case-study research can be a rigorous research method that can lead to new insights, open up new lines of inquiry, and yield rich theoretical models that can enhance and expand research in the field of AECM. However, they must meet a “burden of proof”, an obligation that can be met in two ways, referred to as “burden of going forward”.

Taylor et al. (2011) provide a research strategy checklist to improve the consistency and comprehensiveness of case study research, which includes longitudinal data collection, utilizing multiple researchers and/or raters, and triangulation. This research study involved multiple researchers during data collection to reduce bias.

3.3.2.1 Criteria for selection

The criteria for case study selection included four factors: 1) the project should be in a healthcare setting; 2) the project should be ongoing (in the design or construction phase); 3) a BIM or digital model of the project facility should be available; and 4) the researcher should have access to end users of the project for data collection and assessment. The Hershey Children's Hospital was selected as the case study for data collection. The final procedure to develop the EVPS was tested using digital models of the new, then under-construction Children's Hospital. Chapter 5 discusses the new Hershey Children's Hospital case study in detail.

3.3.2.2 Data Collection – Focus Groups

Focus groups were used as the primary means of data collection for requirements analysis. End users were asked to describe scenarios of activities they would like to see embedded in the interactive virtual prototypes. The focus group method is an established, rigorous technique for collecting interviews aimed at eliciting and exploring in-depth opinions, judgments, and evaluations expressed by professionals, experts, or users/clients about specific topics (Morgan 1997). The key difference between one-to-one interviews and focus-group discussions is that the latter are far more appropriate for the generation of new ideas formed within a social context (Breen 2006).

 

 3.3.2.3 Data Analysis – Concept Mapping

 

During focus group discussions, a concept-mapping tool was used to record and validate the scenarios collected and to discuss them in more detail. Concept maps are diagrams used to represent and convey knowledge (Klein and Hoffman 1993). Concept maps traditionally use paper and pencil, posters, or stick-on notes, and are often used in brainstorming sessions. In concept-mapping knowledge elicitation, the researchers help domain practitioners build up a representation of their domain knowledge, in effect merging the activities of knowledge elicitation and representation. Concept maps help form knowledge models that support knowledge preservation, knowledge sharing, and the creation of decision support systems (Crandall et al. 2006). This research used concept maps to develop a hierarchical structure that frames scenarios and helps inform model content.
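One way to see how a concept map informs model content is to represent it as a mapping from each concept to its child concepts and walk it depth-first: the leaves suggest the concrete objects to model. This is a hypothetical sketch with invented node names; the thesis's actual concept maps differ.

```python
# Hypothetical concept map (node -> child concepts) from a focus group;
# node names are invented for illustration.
concept_map = {
    "medication round": ["prepare dose", "deliver dose"],
    "prepare dose": ["medication cart", "pharmacy window"],
    "deliver dose": ["patient bed", "IV stand"],
}

def leaf_concepts(node, cmap):
    """Depth-first walk; leaves of the map suggest model content to build."""
    children = cmap.get(node, [])
    if not children:
        return [node]
    leaves = []
    for child in children:
        leaves.extend(leaf_concepts(child, cmap))
    return leaves

print(leaf_concepts("medication round", concept_map))
# ['medication cart', 'pharmacy window', 'patient bed', 'IV stand']
```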

Both qualitative and quantitative research approaches were adopted for evaluating the developed EVPS in a healthcare context. The first part of the evaluation was done through an observational field study (Chapter 5) and the second part was a field experiment (Chapter 6). Both evaluation studies involved end users of the Hershey Children's Hospital as participants.

 

 3.3.3.1  Research Method – Field Study

 

For the first part of the analysis, an observational field study was conducted during design review meetings for the new Hershey Children's Hospital. The pharmacy staff at the hospital utilized the EVPS developed during the case study research for their transition planning review meetings. Interviews and observations from the design review meetings helped inform how end users can apply the EVPS during all phases of the facility life cycle. The design review meetings and their findings are described in more detail in the latter half of Chapter 5.

3.3.3.2  Research Method – User Study

A user study was designed using a posttest-only control-group experimental design, in which nurse participants were randomly assigned to either the walkthrough-only control group or the task-based-scenario treatment group. A pretest was used only to collect demographic data from the participants.
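The group assignment in a posttest-only control-group design can be sketched as a simple random split. This is an illustrative assumption: the thesis does not specify its randomization mechanism, and the participant labels below are invented.

```python
# Sketch of posttest-only control-group assignment via random shuffle.
# The randomization mechanism and labels are assumptions for illustration.
import random

def assign_groups(participants, seed=None):
    """Randomly split participants into a control (walkthrough only)
    and a treatment (task-based scenario) group of near-equal size."""
    rng = random.Random(seed)       # seed allows a reproducible split
    pool = list(participants)
    rng.shuffle(pool)
    half = len(pool) // 2
    return {"control": pool[:half], "treatment": pool[half:]}

groups = assign_groups([f"nurse_{i}" for i in range(10)], seed=42)
print(len(groups["control"]), len(groups["treatment"]))  # 5 5
```

Random assignment is what lets the posttest difference between the two groups be attributed to the task-based-scenario treatment rather than pre-existing differences.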

During the user study, participants were videotaped as they navigated through their assigned stimulus and performed specified tasks based on the condition to which they were assigned. A cognitive task analysis approach, the talk-aloud protocol, was employed during the study for knowledge elicitation. Chapter 6 gives a detailed description of the user study design and procedures.

 3.3.3.3  Data Collection – Survey

Surveys can be divided into two broad categories: questionnaires and interviews. Surveys vary from highly structured questionnaires to open-ended questions (Fellows and Liu 2003). During the user study, posttest questionnaires were administered asking about ease of use, design layout, model content, and overall experience in using the EVPS. Chapter 6 describes the data collection procedure using surveys in detail.

3.3.3.4  Data Collection – Talk Aloud Protocol

The Talk Aloud Protocol (TAP) originated in the classic research domain of the psychology of problem solving. In such studies, research participants are instructed to speak their thoughts as they work on problems, doing so as if they are "speaking to themselves" (Ericsson and Simon 1993). TAP helps capture what end users know about their domain: its concepts, principles, and events. Participants were encouraged to explicate their tasks. In addition to thinking aloud, participants could also be probed with questions afterward (Van Someren et al. 1994).

3.3.3.5  Data Analysis – Protocol Analysis

The analysis of TAP data falls in the middle of the analytic spectrum, drawing from both qualitative and quantitative techniques. Data from TAP consist largely of a record of the participant's verbalizations. The recordings, either audio or video, have to be transcribed and then coded in some way. The procedure for coding a protocol is referred to as Protocol Analysis (PA). In traditional PA, every statement in the protocol is coded according to an a priori scheme that reflects the goal of the research. The coding scheme depends on the task domain and the purposes of analysis, and begins with indexing common themes and patterns while breaking the data down into units.

It is often valuable to have more than one coder conduct the protocol coding task; in some cases this is necessary for demonstrating the soundness of the research method and the conclusions drawn from it. For research in which TAP data are used to make strong claims about reasoning processes, especially reasoning models that assert cause-effect relations among mental operations, assessing the inter-coder reliability of protocol coding is regarded as a critical aspect of the research.
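Inter-coder reliability of protocol coding is commonly quantified with a chance-corrected agreement statistic such as Cohen's kappa. The sketch below is illustrative: the thesis does not state which reliability statistic it used, and the codes and counts here are invented.

```python
# Cohen's kappa between two coders' labels of the same protocol units.
# Illustrative only: codes and data are invented examples.
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Chance-corrected agreement between two coders."""
    assert len(codes_a) == len(codes_b)
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    labels = set(codes_a) | set(codes_b)
    # Expected agreement if both coders labeled independently at random
    # according to their own label frequencies.
    expected = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)
    return (observed - expected) / (1 - expected)

coder1 = ["task", "task", "object", "layout", "task", "object"]
coder2 = ["task", "object", "object", "layout", "task", "object"]
print(round(cohens_kappa(coder1, coder2), 2))  # 0.74
```

Values near 1 indicate strong agreement beyond chance; values near 0 indicate agreement no better than chance, which would undermine claims built on the coding.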

3.4 SUMMARY

This chapter began by describing the adopted research approach of systems development and the rationale for its selection. Next, the research steps were listed, and the corresponding research methods undertaken to accomplish each research task were described. The next chapter describes the development process and framework for the experience-based virtual prototyping system in detail.


Note: See Chapter 4. http://www.joelsolkoff.com/chapter-4-dr-kumars-thesis-on-virtual-reality-modeling/

 

 

Chapter 2. Dr. Kumar’s thesis on virtual reality modeling

Link to abstract, Chapter One, etc. http://www.joelsolkoff.com/dr-sonali-kumars-thesis-on-virtual-reality-modeling/

Chapter 2

ROLE OF VIRTUAL PROTOTYPING IN DESIGN REVIEW

This chapter evaluates the current state of the literature and identifies gaps that inform areas for further research. The role of virtual prototyping in design reviews of healthcare facilities is determined by exploring three broad topics: a) design review as a process, which takes place within b) the context of healthcare facility design, and finally through the use of c) virtual prototypes as a tool, as shown in Figure 2-1. The literature sources reviewed are primarily from the AEC domain, comprising virtual prototyping theories and design theories, including healthcare design.


The first two sections of this chapter introduce design review concepts in the healthcare field. The third section focuses on virtual facility prototyping as an effective tool for experience-based design review with end users. This section describes how virtual prototyping allows for better visualization of facilities, through case studies of successful implementation both within and outside the healthcare context. Finally, it identifies the potential for engaging design reviewers by incorporating greater interactive experiences in the virtual facilities.

Based on the above findings, the final section explores the possibility of portraying virtual prototyping through game environments to enable interactive review of virtual healthcare facilities.

2.1 DESIGN REVIEW

Design is the process by which the needs, wishes, and desires of the owner are defined, quantified, qualified, and communicated to the builder. As such, it is the particular phase of the project where many key decisions are made (Sanvido and Norton 1994).

The traditional approach to the planning, design, construction, and operation of a facility in the AEC industry favors a sequential, "over the wall" approach to project development, where many participants often work independently while making decisions that may affect others.

Decision-making during design reviews is usually dominated by the perceptions of the "expert" decision makers (e.g., planners, architects, and design engineers) and focuses mainly on the technical elements of a project (Isaacs et al. 2011). According to Anumba et al. (1997), this often leads to inadequate capture, analysis, and prioritization of client requirements; poor communication of design intent and rationale; and poor integration, coordination, and collaboration between the functional disciplines involved in the project.

2.1.1  Design Review Process

Design Review is defined as a process in which design is reviewed for constructability, coordination of systems and visualization of spaces and building details (Computer Integrated Construction Research Group 2010). 

The Design Review Process usually occurs during the design phase of the facility life cycle. The design process is traditionally broken into the following phases: programming, schematic design, design development and construction documents (Figure 2-2).

Figure 2-2. Design Review during the Facility Life Cycle

Fig2_2

Design review is usually done between participants that include the Architecture, Engineering and Construction (AEC) team and the clients or end-users of the facility. The purpose of design review is to evaluate the proposed design of the facility against the programmatic requirements and needs of the client.

Kagioglou et al. (2000) employ a “stage gate” approach to design review that applies a consistent planning and review procedure throughout the whole ‘life’ of a facility project, from recognition of a need to the operation of the finished facility and, finally, to its demolition. They propose the Generic Design and Construction Process Protocol (GDCPP), which divides the project into four major phases: 1) Pre-Project Phases, comprising demonstrating the need, conception of need, outline feasibility, and substantive feasibility study and outline financial authority; 2) Pre-Construction Phases, including outline conceptual design, full conceptual design, and coordinated design, procurement and full financial authority; 3) Construction Phases, which include production information and construction; and finally 4) Post-Construction Phase, comprising operations and maintenance. Each of the above phases requires a review from project stakeholders (Kagioglou et al. 2000).
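The stage-gate structure described above lends itself to a simple representation. The following is a minimal Python sketch, not part of the GDCPP itself: the phase and sub-phase names follow Kagioglou et al. (2000), while the gate-checking helper and its name are assumptions made purely for illustration.

```python
# Illustrative sketch of the GDCPP stage-gate structure (Kagioglou et al. 2000).
# Each sub-phase is a "gate" that requires a stakeholder review before
# the project can advance.

GDCPP_PHASES = {
    "Pre-Project": [
        "Demonstrating the Need",
        "Conception of Need",
        "Outline Feasibility",
        "Substantive Feasibility Study and Outline Financial Authority",
    ],
    "Pre-Construction": [
        "Outline Conceptual Design",
        "Full Conceptual Design",
        "Coordinated Design, Procurement and Full Financial Authority",
    ],
    "Construction": [
        "Production Information",
        "Construction",
    ],
    "Post-Construction": [
        "Operations and Maintenance",
    ],
}

def next_gate(completed_reviews):
    """Return the first (phase, sub-phase) whose stakeholder review is pending."""
    for phase, stages in GDCPP_PHASES.items():
        for stage in stages:
            if stage not in completed_reviews:
                return phase, stage
    return None  # all gates passed

phase, stage = next_gate({"Demonstrating the Need"})
```

A project tracker built this way makes the protocol's core idea explicit: review is not a single event but a recurring gate at every sub-phase of the facility life cycle.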

2.1.2  Design Communication and Visualization

Communication of design information is of vital importance in the facility design process as each building project involves the collaboration of several disciplines and project stakeholders. Traditional design information communication took the form of two-dimensional (2D) drawings, with designers using plans, elevations and perspectives to represent their design intent.

With the emergence of new media and the rapid changes in information technology, digital design has become a leading trend, and design thinking in the digital world has changed accordingly. Representation in digital design has been modified to meet different design situations. Newly emerged virtual environments, with the advantage of visualizing the virtual world through perception, have created different dimensions of representation for design. Thus, representation and perception are two important human cognitive faculties involved with design in virtual environments (Chan 2011).

Based on these studies, Figure 2-3 shows a conceptual model of design communication between the AEC professionals/project team and the client/end-users. While the AEC professionals use media to represent design intent, end-users are expected to perceive the design from the media, engage with it, and provide design review feedback.

Fig2-3

Figure 2-3. Design Communication between participants

2.1.2.1 Design Representation

In design, designers use suitable means to mentally create design concepts, apply communication channels (media) to express those concepts, and turn the concepts into external, visible artifacts (products), so that designers and other viewers (or clients) can visualize the design in progress. The various means used for creation are internal representations, whereas the artifacts are external representations of the design.

The entire design process consists of the representation of design concepts and the utilization of appropriate media to make those concepts visible until the final solution is reached. These complicated mental processes usually generate some external representation: a drawing, video, physical model, digital model, virtual model, or a combination of these (Kalay 2004).

2.1.2.2 Design Perception

Consequences of design alternatives are often difficult to envision during early phases of design, and communication techniques can ensure that all team members reviewing the design clearly understand the nature and content of those alternatives (Kirk and Spreckelmeyer 1988).

3D visualizations can fulfill various functions in participatory planning workshops. These can be divided into three main groups: functions to support (1) individual information processing, (2) participant discussions, and (3) achieving the objectives of information transfer in different phases of the planning process (Chan 2011).

2.1.3  Review of Design Visualization Media

Architectural design progresses through different stages where representations take different forms depending on the level of information that needs to be communicated. Thus, the nature of design representation varies from more abstract forms in the conceptual stage to become more detailed and more realistic as design evolves.

Five categories of design communication media are used. These include pencil and paper (usually abstract drawings, quick sketches, or even construction/working drawings); physical models that depict a scaled-down 3D representation for study; digital models developed using computer software; film and video, which create animation for demonstrating design concepts; and lastly, Virtual Reality (VR), an advanced medium for visualization and simulation of design (Chan 2011).

Figure 2-4 shows two categories of physical (top) and digital (bottom) design visualization media used in the AEC domain. From left to right, these visualization media are arranged from lowest fidelity (most abstract) to highest fidelity (most realistic).

Fig2-4

Figure 2-4. Spectrum of AEC representation tools

 2.1.3.1  Plans and Elevations or 2D drawings

 Traditionally, two-dimensional floor plans of buildings and elevations or perspective projections have been the basic communication media between architects and their clients.

Particularly with respect to the interior of buildings, architects relied on the client’s imagination to visualize a proposed building from its architectural plan views, assuming that clients are familiar with architectural symbols, and have training and experience to construct three dimensional images from two dimensional plan views (Funkhouser et al. 1996).

2.1.3.2  Scaled Models and Physical Mock Ups

Clients and architects often discuss design proposals by studying scaled models from an overhead perspective, which forces viewers to imagine themselves looking at and moving within the model and can lead to misperceptions.

Full-scale models, known as mock-ups, allow viewers to experience an artifact as close to reality as one can come without constructing the facility itself. Physical mock-ups at full scale are essential for verifying constructability and functional performance, especially for complex façade assemblies (Pietroforte et al. 2012) or more costly, unique construction such as operating rooms.

2.1.3.3  Building Information Modeling

Building Information Modeling (BIM) is a process that provides a means for owners, designers, contractors, and operators to generate, organize and use detailed information throughout a project lifecycle. Over the past several years, BIM implementation has increased substantially within the AEC Industry. The National BIM Standards (NBIMS) Committee defines BIM as: “… a digital representation of physical and functional characteristics of a facility. A BIM is a shared knowledge resource for information about a facility forming a reliable basis for decisions during its lifecycle from earliest conception to demolition. A basic premise of BIM is collaboration by different stakeholders at different phases of the life cycle of a facility to insert, extract, update or modify information in the BIM to support and reflect the roles of that stakeholder. The BIM is a shared digital representation founded on open standards for interoperability” (buildingSMART alliance 2007).

The Computer Integrated Construction (CIC) Research Group at Penn State has developed a BIM Project Execution Plan that describes an execution strategy to implement BIM on a project, which comprises the following four steps: 1) Identify BIM Goals and Uses; 2) Design the BIM Project Execution Process; 3) Develop Information Exchanges; and 4) Define Supporting Infrastructure for BIM Implementation (Computer Integrated Construction Research Group 2010).
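The four planning steps above can be pictured as a minimal record structure. The following Python sketch is purely illustrative: the class and field names are assumptions made for this example and are not part of the published BIM Project Execution Planning Guide.

```python
# Hypothetical sketch of the four-step BIM Project Execution Planning procedure
# (CIC Research Group 2010). Field names are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class BIMExecutionPlan:
    goals: list = field(default_factory=list)          # Step 1: BIM goals and uses
    process_maps: list = field(default_factory=list)   # Step 2: execution process design
    exchanges: list = field(default_factory=list)      # Step 3: information exchanges
    infrastructure: list = field(default_factory=list) # Step 4: supporting infrastructure

    def completed_steps(self):
        """Count how many of the four planning steps have any content yet."""
        return sum(bool(s) for s in
                   (self.goals, self.process_maps, self.exchanges, self.infrastructure))

# E.g., a plan that has so far only identified "Design Reviews" as a BIM use:
plan = BIMExecutionPlan(goals=["Design Reviews"])
```

The point of the sketch is simply that the Guide's four steps build on one another: goals and uses are identified first, and the process maps, exchanges and infrastructure are then developed to support those uses.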

The BIM Project Execution Plan defines BIM Use as a process in which a project team member utilizes building information for the purpose of improving the planning, design, construction, or operation of a facility. The Plan identifies Design Reviews as a BIM Use (Figure 2-5) and defines design reviews as: “… a process in which stakeholders view a 3D model and provide their feedback to validate multiple design aspects. These aspects include evaluating meeting the program, previewing space aesthetics and layout in a virtual environment, and setting criteria such as layout, sightlines, lighting, security, ergonomics, acoustics, textures and colors, etc.”

Of the 25 BIM uses defined by the BIM Project Execution Planning Guide, design reviews ranked second in both the frequency of use and perceived benefit of use by AEC organizations (Kreider et al. 2010). 

 fig2-5

Figure 2-5. Design Reviews as a BIM Use (Source: CIC Research Group 2010)

The Building Information Modeling (BIM) use for “design review” can employ either desktop computer software or advanced virtual environment facilities, such as CAVEs (Cave Automatic Virtual Environments) and immersive screens. Further, the virtual mock-ups (prototypes) can be developed at various levels of detail depending on project needs.

2.1.4  End-Users in the Design Process

Successful construction projects are designed, built and equipped to meet users’ needs. Whether it concerns the function and expression of an entire building or the design of a single space, users hold unique knowledge, which should be integrated properly into the design to ensure a successful building project.

Designers engage in conversations with clients and users at various stages of the design process, in part to make sense of the information gathered and then to make decisions and generate ideas for the design of the space. Aesthetic and functional design decisions are made on the spot by designers engaged with stakeholders as they define how the space should be occupied and for what purposes (Poldma 2010). There is a service relationship that develops between the designers and users as they participate together in both design and production processes (Tzortzopoulos et al. 2006).

End-users are defined as those who use/occupy the building; they are not experts in managing it, but have knowledge and opinions, nonetheless, about its performance in relation to their own objectives. The end-users of a building are typically building inhabitants, external service providers, operation and maintenance personnel, and building administration. They may have conflicting wishes and expectations on building performance in many cases.

Building end-users are a source of experience and expertise that can be brought into the briefing stage for the evaluation of design proposals. End-users are often a source of new ideas promoting innovation (Carvajal 2005).

2.1.4.1  Design Decisions

In creative collaboration such as design, teams are formed to find a novel solution to the problem.  They are required to have shared understanding and joint decision-making skills (Kalay 2004). According to Akin (1986), the design process needs to be decomposed explicitly into smaller phases to make possible the contributions of a large number of participants, such as engineers, planners, and clients or user groups. Since the integration of each participant in this complex process is essential, these interest groups can participate in design meaningfully if they are informed about the relevant decisions and alternatives during the course of the design process.

Furthermore, experts from diverse fields may be responsible for parts of the design, and they need to be able to communicate their ideas to other people in the design team (Schön 1991). The design process can, in itself, become a common artifact, around which communication takes place (Davies 2004).

Drawings are still predominantly used in design and engineering, though computers have been adopted in the last fifteen years to support design and drafting mostly in 2D. BIM is making an enormous shift in approach to design since it is replacing 2D drawings with 3D models. 3D visualization is one of the most important aspects of the design optimization because it improves communication with all stakeholders and speeds up design decisions (Zikic 2009).

2.1.4.2  Participatory Design Approach

Participatory design is an advantageous approach when the tacit knowledge invested in the people who work day-to-day in a particular situation must be captured (Davies 2004). Since users evaluate the built environment differently from designers (Zimmerman and Martin 2001), participatory design approaches attempt to bridge a gap in understanding between users and designers. This can be extremely useful in understanding the tasks performed by healthcare practitioners by incorporating their knowledge of how things work in their setting into the virtual environment. Furthermore, when large changes are to take place in a work situation, involving the people who are to work in the new environment in the design process increases the acceptance of those changes.

2.1.4.3  User-Centered Design

User-centered design (UCD) is an approach to design that grounds the process in information about the people who will use the product. UCD processes focus on users throughout the planning, design and development of a product. The concept is derived from the domains of information sciences and technology and Human-Computer Interaction (HCI) theory, specifically human-centered design processes that provide guidance for the design and development of computer systems.

User-Centered Design (UCD) is defined as: “An Approach to user interface design and development that views the knowledge about the intended users of a system as a central concern, including, for example, knowledge about users’ abilities and needs, their tasks, and the environments in which they work. The users would also be actively involved in the design process.” (Stone et al. 2005)

Users are typically experts in some domain of activity relevant to the product (facility) being designed (Erickson 1995).

2.2 EXPERIENCE-BASED DESIGN APPROACH TO HEALTHCARE FACILITIES

Many studies suggest strong links between the physical environment and patient and staff outcomes in the following areas: reducing staff stress and fatigue and increasing effectiveness in delivering care; improving patient safety; reducing patient stress and improving outcomes; and improving overall healthcare quality (Ulrich et al. 2004). Therefore, there is a need for innovative approaches to design, such as experience-based design (EBD), that focus on end-user and staff experiences in a facility to identify creative design solutions.

2.2.1 Healthcare Design: Complexity, Challenges and Present State

Hospitals are the most complex of building types. Each hospital comprises a wide range of services and functional units. These include diagnostic and treatment functions, such as clinical laboratories, imaging, emergency rooms, and surgery; hospitality functions, such as food service and housekeeping; and the fundamental inpatient care or bed-related function (Carr 2009). In addition to the wide range of services that must be accommodated, hospitals must serve and support many different users and stakeholders. Ideally, the design process incorporates direct input from the owner and from key hospital staff early on in the process. The designer also has to be an advocate for the patients, visitors, support staff, volunteers, and suppliers who do not generally have direct input into the design. Good hospital design integrates functional requirements with the human needs of its varied users within the planning process.

The design of healthcare facilities is governed by many regulations and technical requirements. Healthcare facilities encompass a wide range of types, from small and relatively simple medical clinics to large, complex, and costly, teaching and research hospitals (Carr 2009).

Healthcare facilities must be specially designed to fulfill the needs of medical care and to maximize the efficiency of the whole medical system. It is also crucial to plan all spaces well, so as to minimize patient travel within the facility and maximize the efficiency of integration among related facilities. Designing healthcare facilities is always a challenge for architects; knowledge of medical care and medical systems is essential to begin.

It is estimated that $100 billion in inflation-adjusted dollars has been spent on new healthcare construction in the past 5 years and $250 billion will be spent in the next 10 years (Clancy 2008). The entire healthcare system is under great pressure to reduce costs, and at the same time, be more responsive to “customers”. This can be challenging as the design of the healthcare facility has to address the human needs of each of the defined user groups which includes patients, caregivers, employees, visitors as well as the community.

2.2.1.1 Need for tools in healthcare

Healthcare facilities not only require specialized and complex functions to be performed in them, but they must also address the needs of the end-users such as the patients, staff and healthcare practitioners using the facility. This makes them exceedingly difficult to design, build and operate, because they must incorporate the interdisciplinary knowledge and input of various stakeholders such as the design professionals, engineers, facility managers and the end-users (patients, staff and medical professionals) of the building. Hence, there is a need to create innovative tools and procedures that facilitate high levels of participatory design and allow better visualization of these complex facilities to aid the decision-making process during their planning and design.

 2.2.3  Evidence-based Design Approach in Healthcare

The Center for Health Design (CHD) was formed in 1993 with the express intention of serving as a consortium for knowledge in the many different fields that contribute to the creation of healing environments for both patients and staff (Center for Health Design 2010). Evidence-based design has evolved from other disciplines that have used evidence-based models to guide decisions and practices in their respective fields.

Evidence-based design is the process of basing decisions about the built environment on credible research to achieve the best possible outcomes; it measures the effect of specific design features on patient outcomes, productivity, and employee and patient morale and stress levels. Hamilton and Watkins (2009) define evidence-based design as:

“A process for the conscientious, explicit, and judicious use of current best evidence from research and practice in making critical decisions, together with an informed client, about the design of each individual and unique project.”

Evidence-based design is used to persuade decision-makers to invest the time and money to build better hospital buildings that are based on current research, so they can realize specific advantages (Stankos 2007).

2.2.4  Defining Experience-based Design

Experience-based design is defined as a user-focused design process with the goal of making user experience accessible to the designers, to allow them to conceive of designing experiences rather than designing services (Bate and Robert 2006). Figure 2-6 shows the co-productive relationship between designers and users to create optimum value in healthcare facility design.

Fig2-6

Figure 2-6. Co-productive relationship between designers and users in EBD. (Source: NHS 2008).

Using experience to design better healthcare is unique in its focus on capturing and understanding patients’, caregivers’ and staff’s experience of services at crucial points in the care pathway, and not just their views of the process, such as the speed and efficiency with which they travel through the system (NHS 2008).

The term has emerged from the UK National Health Service’s (NHS) Institute for Innovation and Improvement. According to Bate and Robert (2007), experience in experience-based design (EBD) is designated as “how well people understand it, how they feel about it while they are using it, how well it serves its purpose, and how well it fits into the context in which they are using it.” By identifying the key moments and places where people come into contact with the service and where their subjective experience is shaped, and therefore where the desired emotional and sensory connection needs to be established, and by working with the front-line people who bring alive those various touch points in the journey, it is possible to begin designing experiences rather than processes. The task for experience-based design is to gain access to that knowledge and use it in the service of a better design and a better experience for the user.

2.3 VIRTUAL PROTOTYPING FOR DESIGN REVIEW

Design evolves, ideally, through an iterative process of prototyping, involving actual users, designers, engineers, and other experts until a satisfactory result has been achieved (Norman 1988) and can be used for “facilitating meaningful innovation” from a combination of “understanding what people do and think” and “innovative technology” (Rheinfrank et al. 1994).

“Prototypes” are representations of design ideas created before final artifacts exist. In some industries or companies, the term prototype is reserved for highly resolved and close-to-launch versions that in essence “stand for” a final product or offering (Coughlan et al. 2007).

Prototyping, the process of developing prototypes, is an integral part of iterative user-centered design because it enables designers to try out their ideas with users and to gather feedback. Prototyping as a process involves moving from the world of abstract ideas, analysis, theories, plans, and specifications to the world of concrete, tangible, and experiential things (Rudd et al. 1996). In the human-computer interaction context, the main purpose of prototyping is to involve users in testing design ideas and to get their feedback in the early stages of development, thereby reducing time and cost. It provides an efficient and effective way to refine and optimize interfaces through discussion, exploration, testing and iterative revision. Early evaluation can be based on faster and cheaper prototypes before the start of a full-scale implementation. The prototypes can be changed many times until a better understanding of the user interface design has been achieved with the joint efforts of both the designers and the users (Rosson 2002; Rudd et al. 1996).

Prototyping can be divided into low-fidelity prototyping, medium-fidelity prototyping and high-fidelity prototyping. The determining factor in prototype fidelity is the degree to which the prototype accurately represents the appearance and interaction of the product. Low-fidelity prototypes are quickly constructed to depict concepts, design alternatives, and screen layouts, rather than to model the user interaction with a system. Low-fidelity prototypes provide limited or no functionality. In contrast, high-fidelity prototypes are fully interactive, simulating much of the functionality in the final product. Users can operate on the prototype, or even perform some real tasks with it (Rosson 2002).

In our (AEC domain) use of the term, and more typically within the design profession, prototypes can be usefully thought of as “design representation/ communication tools” and consequently may exist at any level of resolution—from very rough to highly refined—and may be used at any stage in the design process to explore, evolve, and/or communicate ideas (Coughlan et al. 2007).

2.3.1  Definition of Virtual Prototypes

A virtual prototype is defined as “A digital model (mock-up) of a structure or product used for testing and evaluating form, design fit, performance and manufacturability as well as used for study and training” (Wang 2002). It can also be defined as “A computer-based simulation of a system or sub-system with a degree of functional realism comparable to a physical prototype.”

Virtual prototyping is defined as “The process of using a virtual prototype in lieu of a physical prototype, for test and evaluation of specific characteristics of a candidate design” (Schaaf and Thompson 1997). Markham (1998) identified three factors contributing to effective visualization using virtual prototyping: 1) immersion, 2) interaction, and 3) engagement. These factors provide many advantages of using virtual prototyping for design reviews in the AEC field.

2.3.2  Virtual Prototyping in AEC Domain

With recent advances in technology, many cases of using virtual prototypes during the facility life cycle have emerged in the Architecture, Engineering and Construction industry. The use of virtual prototyping in the building industry began in the 1990s and has built up over the last ten years (Whyte 2003). Some of the earlier applications of virtual prototyping used expensive immersive virtual prototyping environments such as CAVEs (Cruz-Neira 1998) or relied on custom-built virtual reality suites that enabled real-time visualization of building models (Funkhouser et al. 1996).

Table 2-1 shows a list of studies that have applied virtual prototyping in the AEC context, either to assess its effectiveness for design review or to develop the tools and technology required for virtual prototyping. For each study, Table 2-1 lists the year, authors, facility type, and the type of virtual reality (VR) system/software and VR technology/hardware used. Although the list of studies is not comprehensive, the table shows a shift in trend toward off-the-shelf real-time rendering/gaming engines, digital 3D file formats such as VRML, and virtual reality plug-ins, away from the earlier use of highly customized VR suites. The studies in bold relate to the use of virtual prototyping in healthcare facilities.

Table 2-1. List of studies that used Virtual Prototyping for Design Review.

Year | Author | Facility Type | VR System/Software | VR Technology/Hardware
2000 | Fröst and Warren | Collaborative design of laboratory layouts | ArchiCAD v6, dVISE from Division | CAVE
2002 | Patel et al. | Home designs | InfiniteReality 2E graphics | Reality Room suite
2002 | Shiratuddin and Thabet | Office building | Unreal Tournament | Desktop
2004 | Davies | Foundry | Superscape | Projector and PC
2004 | Palmon et al. | Home/office for the disabled | EON Reality | Desktop
2005 | Carvajal | Home design (3D vs. 2D) | Video – Studio Max, Architectural Desktop | Laptop and projector
2006 | Maldovan et al. | Courtrooms | VRML | 3-screen immersive display
2006 | Majumdar et al. | Courtroom design | Panda3D | Curved front projection
2007 | Lu and Riley | Patient exam room | Unreal Tournament | Desktop
2007 | Dunston et al. | Hospital patient rooms | OSGExp plug-in | CAVE – Fakespace FLEX VR theatre system
2007 | Shiratuddin | Student designs | BuildITC4 – C4 Engine | Desktop
2008 | Mobach | Community pharmacies | OSG-RC (developed in-house) | 3 projectors and cylindrical screens
2009 | Tang et al. | Wayfinding | Quest3D | Desktop
2009 | Wahlström et al. | Patient rooms | VR4MAX | CAVE
2010 | Bullinger et al. | Concept design | IAO VRfx | Powerwall
2010 | Leicht et al. | Pharmacy | VR4MAX | 3-screen immersive display
2010 | Dunston et al. | Healthcare facilities | OSGExp plug-in | CAVE
2011 | Christiansson et al. | Office building and other case studies | VIC-MET | CAVE, HMD, Wii remote, etc.
2011 | Zhang and Edelstein | Healthcare environments | CAVE-CAD | StarCAVE, UCSD Calit2
2011 | Kumar et al. | Healthcare facilities | Unity | Scalable – immersive and desktop
2011 | Yan et al. | House for egress and operations | XNA Game Engine | Desktop
2011 | Isaacs et al. | Urban planning | XNA – HIVE | Immersive and desktop
2011 | D’Souza et al. | Children’s zoo | Second Life | Desktop
2011 | Shiratuddin and Thabet | 3BR house model | Torque Game Engine | Desktop


Recently, apart from applying virtual prototyping for design and constructability review, studies have also explored issues related to construction safety processes (Lin et al. 2011). These studies use virtual prototyping to visualize the construction process for hazard identification in the early phases of design (Chun et al. 2012). The next section focuses on the use of virtual prototyping specifically in healthcare facilities.

2.3.3  Virtual Prototyping for Healthcare Facilities

Virtual Prototypes can be extremely useful in understanding the tasks performed by healthcare practitioners by incorporating their knowledge of how things work in their setting into the virtual environment. For large complex projects such as hospitals and healthcare facilities, virtual prototypes can be used to tailor environments to user needs (Whyte 2002). Furthermore, when large changes are to take place in a work situation, involving the people who would work in the new environment in the design process increases the acceptance of those changes (Davies 2004).

Healthcare facilities can benefit significantly from the application of virtual prototyping in the design process, as it enables the evaluation of a range of essential criteria. These could include, but are not limited to, evaluating mobility of equipment and furnishings; dimensions and placement of doors, windows and cabinetry; accommodation of flow into, out of, and within the room; accessibility and safety of bathroom facilities; assessment of noise levels filtering from outside the patient room; identification of architectural features for infection control; and intensity of various light sources. Figure 2-7 shows a patient room display in a virtual reality CAVE (Dunston et al. 2007).

fig2-7

Figure 2-7.  Patient room display in a virtual reality CAVE system. (Source: Dunston et al. 2007).

A study by Wahlström et al. (2009) employed an immersive CAVE system to examine how end-users perceive the use of virtual environments to analyze patient rooms (Figure 2-8). The study showed that virtual prototyping was convenient for evaluating most issues identified by the study participants in the actual hospital wards. Participants of the study included both nurses and patients, who assessed a range of issues including aesthetics, correct location of equipment, supplies and materials, window/door positions, and the living/workspace size. Participants also identified certain healthcare facility features that could not be evaluated in virtual environments, such as temperature, air circulation and noise. Other limitations highlighted were the inability to touch modeled objects and to accurately evaluate lighting levels.

Figure 2-8. Patient interview within an immersive virtual environment. (Source: Whalström et al. 2009).

2.3.4  Advantages of Virtual Prototyping

Some of the advantages of using virtual prototyping for design reviews are that it involves end-users and experts (Norman 1988) and externalizes thoughts to spark innovation (Davies 2004; Schrage 2000) by helping end-users understand the design space and the tasks they would perform in it (Carvajal 2005; Mobach 2008). Furthermore, virtual prototypes can be developed relatively rapidly and offer greater interactivity and functionality than other design visualization media. The following are some advantages of virtual prototyping:

2.3.4.1  Collaboration, Creativity, Innovation

According to Schrage (2000), prototypes can foster collaborative creativity by externalizing thought and sparking conversations to make knowledge more explicit. Virtual prototypes can be effective tools to extract tacit knowledge of the end-user during the design process, thereby enhancing creativity. User involvement can take a variety of forms, from appraisal of an expertly modeled and animated 3-D virtual model with ensuing discussion to active design using virtual prototypes as a design tool.

2.3.4.2  User Engagement, Interactivity and Functionality

The advantage of user involvement and engagement with virtual prototypes is that it leads non-AEC professionals to understand the design intent and imagine the consequences of the design on their workplace, making them committed to the decision-making process (Mobach 2008). Moreover, designs can be worked on over a longer period of time and discussed among a larger group than is possible in traditional design situations (Davies 2004).

Virtual prototypes can also provide a degree of functionality, allowing further interactivity including multiple viewpoints, the ability to zoom in and out, and the ability to selectively view components. It has also been noted that virtual prototyping and 3D modeling are a means of rapidly developing designs (Gopinath 2004; Schaaf and Thompson 1997). This ability to rapidly develop and modify designs makes the use of virtual prototyping more alluring.
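To make these interactive capabilities concrete, the viewpoint zoom and selective component visibility described above can be sketched as simple viewer state. The following Python sketch is purely illustrative and not taken from any particular tool; the class name and layer names are hypothetical:

```python
class PrototypeViewer:
    """Minimal viewer state: camera zoom and per-layer visibility.

    Hypothetical sketch of the interactivity a virtual prototype offers:
    zooming in/out and selectively showing or hiding model components.
    """

    def __init__(self, layers):
        self.zoom = 1.0
        self.visible = {layer: True for layer in layers}

    def set_zoom(self, factor):
        # Clamp zoom to a sensible range so the view stays usable.
        self.zoom = max(0.1, min(factor, 10.0))

    def toggle(self, layer):
        # Show/hide one component layer during a design review.
        self.visible[layer] = not self.visible[layer]

    def visible_layers(self):
        return [name for name, on in self.visible.items() if on]


viewer = PrototypeViewer(["architecture", "MEP", "furniture"])
viewer.toggle("MEP")          # hide mechanical/electrical/plumbing
viewer.set_zoom(2.5)          # zoom in on the patient room
print(viewer.visible_layers())  # → ['architecture', 'furniture']
```

In a real review session this state would drive the rendering engine; here it only illustrates how little bookkeeping the selective-viewing feature conceptually requires.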

2.3.4.3  Quality, Efficiency and Cost Savings

Virtual prototyping effectively communicates the design intent to the owner, construction team and end-users, providing instant feedback on whether the design meets program requirements, the owner's needs, and aesthetic aspirations for the building or space (Leicht et al. 2010). These opportunities for early feedback increase coordination and communication between the different parties, which is more likely to generate better design decisions, thereby reducing the cost of changes and increasing the scope of influence on the design, as shown in Paulson's (1976) cost influence curve in Figure 2-9.

Figure 2-9. Cost Influence Curve. (Source: Paulson 1976).

Studies have also shown that the use of virtual prototypes for design review can eliminate costly and time-consuming traditional construction mock-ups (Majumdar et al. 2007; Messner et al. 2007; Leicht et al. 2010). Different design options and alternatives may be easily modeled and changed in real time during design review based on end-user and/or owner feedback (Gopinath 2004), which helps create shorter and more efficient design review cycles.

Additionally, virtual prototypes can be used to evaluate whether the design meets building program criteria and the owner's needs, enhancing the health, safety and welfare performance of projects. For instance, virtual prototypes of BIMs can be used to analyze and compare fire-rated egress enclosures, sprinkler system designs, and alternate stair layouts (Yan et al. 2011).

2.4 EXPERIENCE-BASED VIRTUAL PROTOTYPES: NEEDS AND OPPORTUNITIES

Based on the literature reviewed, it can be inferred that virtual prototyping enhances the design review process by engaging end-users to interact with the facility during design reviews. In pilot studies where virtual facility prototypes were used for the design review of a medical pharmacy (Leicht et al. 2010), it was observed that project teams, especially end-users, would frequently envision tasks that they would perform within the space under review. This phenomenon triggered the idea of incorporating a greater level of interactivity within virtual prototypes, allowing end-users to move objects and navigate through spaces to simulate typical activities they perform while virtually reviewing the designed space. Project team members could also perform these tasks collaboratively with end-users to identify potentially creative design solutions.

2.4.1  Addressing the Research Gap

The current research literature lacks a defined framework and methodology to make the virtual prototyping process more efficient for developers, while also engaging healthcare end users in the design review process.

The ability to virtually perform tasks and interactively review designs requires the development of an experience-based design system combined with interactive virtual prototyping, one that facilitates collaboration between design disciplines and enables experience-based design with end-user feedback. The concept of the Experience-based Virtual Prototyping System (EVPS) can be seen as a combination of experience-based design, involving end-users in the design review, and interactive virtual prototyping.

To realize these complex interactive virtual prototypes with end-user activities, it is important to develop and evaluate a virtual prototyping procedure for their efficient and rapid development. The literature indicates that real-time rendering engines and gaming environments could potentially be used to develop interactive virtual prototypes that engage end-users through interactive simulation scenarios of activities. However, at present there is little integration between game engines and standard architectural or design visualization tools, which seldom offer the real-time rendering and simulation capabilities that game engines do.

Therefore, to employ gaming environments in interactive virtual prototype development, it is important to study 1) game engine development, and 2) theories related to scenario-based design.

2.4.2  Game Engines to develop Virtual Prototypes

Game engines are the core software components that provide the underlying technology, simplify development, and incorporate all the elements vital to a game, such as physics, collision detection, the graphical user interface (GUI), artificial intelligence, network functionality, sound and an event engine (Eberly 2007; Fritsch and Kada 2004). Most game engines have a built-in physics engine that supports basic physics, collision detection, rigid body dynamics and vehicle physics.
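As an illustration of the physics and collision-detection elements mentioned above, the following hypothetical Python sketch shows one engine "tick": integrating motion, then running the broad-phase axis-aligned bounding box (AABB) overlap test that most physics engines perform first. The bodies and dimensions are invented for the example and are not from any real engine:

```python
from dataclasses import dataclass

@dataclass
class Body:
    """A rigid body approximated by an axis-aligned bounding box (AABB)."""
    x: float          # box corner position
    y: float
    w: float          # box dimensions
    h: float
    vx: float = 0.0   # velocity
    vy: float = 0.0

def overlaps(a: Body, b: Body) -> bool:
    """Broad-phase AABB collision test: boxes overlap on both axes."""
    return (a.x < b.x + b.w and b.x < a.x + a.w and
            a.y < b.y + b.h and b.y < a.y + a.h)

def step(bodies: list[Body], dt: float) -> list[tuple[int, int]]:
    """One engine tick: integrate motion, then report colliding pairs."""
    for b in bodies:
        b.x += b.vx * dt
        b.y += b.vy * dt
    collisions = []
    for i in range(len(bodies)):
        for j in range(i + 1, len(bodies)):
            if overlaps(bodies[i], bodies[j]):
                collisions.append((i, j))
    return collisions

# A gurney moving toward a stationary wall section:
gurney = Body(x=0.0, y=0.0, w=2.0, h=1.0, vx=1.0)
wall = Body(x=2.5, y=0.0, w=0.5, h=3.0)
print(step([gurney, wall], dt=1.0))  # → [(0, 1)]: the gurney now hits the wall
```

Production engines add narrow-phase tests, constraint solvers and spatial partitioning on top of this loop, but the tick-integrate-detect structure is the same idea.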

Gaming consists of “interaction among players placed in a prescribed setting and constrained by a set of rules and procedures” (Hsu 1989). Contemporary developments in gaming, particularly interactive stories, digital authoring tools, and collaborative worlds, suggest powerful new opportunities for educational media.

While gaming environments and simulations are becoming increasingly widespread in education, very little is known about how they work (Squire 2006). Similarly, in the design context, gaming environments can go beyond the passive review of virtual facility prototypes, offering extensive possibilities to engage end-users through interactive simulations of task scenarios within the virtual prototype. However, in order to employ gaming environments in the design review of virtual prototypes, it is important to understand how games and simulations can be developed. Furthermore, game engines allow multiple simultaneous users to explore the designed environment (Shiratuddin et al. 2004; Wang 2002), opening possibilities for collaborative design reviews.

2.4.3  Simulating Experiences as Scenarios in Gaming Environments

To truly experience the tasks that are performed in healthcare facilities, the tasks can be simulated as scenarios within virtual prototypes. In the gaming world, the core of Massively Multiplayer Online Role-Playing Games (MMORPGs) revolves around completing quests, a series of clearly outlined tasks given to the player to complete for in-game rewards (Karlsen 2008; G. Smith et al. 2011). In the context of experience-based design simulations, specific healthcare tasks can be categorized as scenarios that are movement, task, or inquiry based. The scenarios can vary depending on the user, the type of task being performed and the issues it addresses. For instance, a movement-based scenario could involve a nurse moving a patient from the Emergency Department (ED) to the patient room. Large-scale healthcare facilities could benefit from the simulation of such scenarios, as it could help address issues of wayfinding in large spaces and also check whether there are adequate architectural clearances to move equipment, wheelchairs and patient beds through all the corridors of the facility. Similarly, scenarios for design professionals could involve spatially reorganizing the architectural model in the virtual environment and evaluating different design options.
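As a hypothetical illustration, a movement-based scenario and the corridor clearance check it supports might be represented as follows in Python. The scenario fields, corridor names and dimensions are invented for the example and do not come from the EVPS implementation:

```python
from dataclasses import dataclass, field
from enum import Enum

class ScenarioType(Enum):
    """The three scenario categories discussed above."""
    MOVEMENT = "movement"
    TASK = "task"
    INQUIRY = "inquiry"

@dataclass
class Scenario:
    """A task scenario attached to a virtual facility prototype."""
    name: str
    kind: ScenarioType
    actor: str                                  # e.g. "nurse"
    waypoints: list[str] = field(default_factory=list)

def check_clearances(equipment_width_m: float,
                     corridor_widths_m: dict[str, float]) -> list[str]:
    """Return the corridors too narrow for the equipment to pass through."""
    return [name for name, width in corridor_widths_m.items()
            if width < equipment_width_m]

# A nurse moving a patient bed (roughly 1.05 m wide, an assumed figure)
# from the Emergency Department to a patient room:
move_patient = Scenario("Transfer patient to room", ScenarioType.MOVEMENT,
                        actor="nurse",
                        waypoints=["ED", "Corridor A", "Room 12"])
print(check_clearances(1.05, {"Corridor A": 2.4, "Service passage": 0.9}))
# → ['Service passage']  (flagged as too narrow for the bed)
```

In an interactive prototype the same check would be driven by geometry queried from the BIM rather than a hand-entered table, but the scenario structure, an actor, a category and a route, is the essential part.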

2.4.4  Scenario-based Design Theories

Scenarios are a narrative description of what people do and experience and can be couched at many different levels of description and many grains of detail (Carroll 1995). They are defined as “concrete description of activity that the user engages in when performing a specific task, a description sufficiently detailed so that design implications can be inferred and reasoned about” (Carroll 2003).

Scenarios have been gaining immense popularity in designing systems for both human-computer interaction (HCI) and software engineering (Kuutti 1995). However, scenarios can also be instrumental in the design of specialized facilities, such as healthcare facilities, by extracting specific end-user knowledge. Scenarios can provide insights into how end-users of healthcare facilities behave in their environments and the types of specialized tasks they perform in them.

Scenarios simulated in a virtual prototype would help designers and users of healthcare facilities envision the outcomes of a design. Hence, scenarios can further open possibilities for new and innovative alternatives, both in the way facilities are designed and in the way tasks are performed in them. The application of scenario-based design to architectural design review feedback can be viewed as part of the overarching design rationale theory within the HCI domain. The theory of design rationale couples theoretical concepts and methods with the designed artifacts that instantiate them (Carroll 2003). The development of the EVPS concept is an example of theory-based design, demonstrating the role that models and theories can play in the invention, development and evaluation of new technology.

2.5 SUMMARY

This chapter provided an overview of the literature related to design reviews, emerging healthcare design theories such as experience-based design, virtual prototyping and its application in the AEC domain, and finally the potential use of game engines to develop interactive virtual prototypes. Based on the literature reviewed, the EVPS concept was introduced, combining theories of scenario-based design, virtual prototyping and healthcare design reviews. The literature review indicates that developing the EVPS would enhance the design review with end-users of healthcare facilities. The next chapter lays out the research methodology adopted to design, develop, implement and assess the EVPS.

++++

link to Chapter 3. http://www.joelsolkoff.com/chapter-3-dr-kumars-thesis-on-virtual-reality-modeling/

Note: Under construction link to Chapter 3.

Copyright 2013 by Sonali Kumar. All rights reserved. Thesis published on this site by the express permission of Sonali Kumar.