Contents
2.2 State of the Art
  2.2.1 User interaction with objects in the VE
  2.2.2 Tracking with the MTx Sensors
Chapter 3 Analysis and Planning
  3.1 Project Analysis
    3.1.1 Context of Use
    3.1.2 Stakeholders
    3.1.3 Requirements
    3.1.4 Use Cases
  3.2 Project Planning
    3.2.1 Development Process
    3.2.2 Project Plan
    3.2.3 Resources
Chapter 4 Design and Implementation
  4.1 Modeling Virtual Human
    4.1.1 Body Animation
    4.1.2 Facial Animation
  4.2 Implemented Module
    4.2.1 BaseApplication and DemoApp
    4.2.2 Sensors Integration (SensorInteract)
    4.2.3 Physics Engine (PhysicsInteract)
    4.2.4 Interaction with Objects (ObjectInteract)
    4.2.5 Skeleton Information
Chapter 5 Conclusion and Future Work
Acronyms
[Appendix B.1, class diagram continued. Base: ChooseSceneManager, Configure, CreateCamera, CreateFrameListeners, CreateResourceListener, CreateScene, CreateViewports, CyclePolygonMode, CycleTextureFilteringMode, DestroyScene, Dispose, Go, LoadResources, ReloadAllTextures, RootOnFrameRenderingQueued(FrameEvent), RootOnFrameStarted(FrameEvent), Setup, Shutdown, TakeScreenshot, UpdateScene(FrameEvent). Base.Input: joystick (JoyStick), keyboard (Keyboard), mouse (Mouse); InitializeInput, OnKeyPressed, OnKeyReleased, OnMouseMoved, OnMousePressed, OnMouseReleased, UpdateInput. ZSightHandler: _baseOrientation (Quaternion), _baseOrientationValues (List<Quaternion>), _oldZSightOrientation (Quaternion), _timer (Timer), _zSight (JoyStick), CalibrationBufferSize (int, 50); ActivateZSight, FromTwos, GetOrientation, OutsideFilterLimits, ReadDataFromSensor, StopCapture, TimerElapsed.]
[Figure 14 Upper Limbs Hierarchy: the Hand's parent is the Forearm, the Forearm's parent is the Arm, and the Arm's parent is the Shoulder.]

The animation that uses the skeleton is called Skeleton Animation. This method attaches the vertices of the mesh to the bones of the skeleton, and this process is called skinning. Each vertex has up to four influences from independent bones, and each influence is assigned a weight together with the bone, so when the bone moves the deformation of the affected vertices can be weighted by that amount. This technique is useful to determine the appropriate positioning of a bone's parents when the bone itself is moved. This kind of influence of a child on its parents is called Inverse Kinematics (IK). IK is a kinematics technique that describes body movements, widely used in games and 3D animation. IK is available in 3D modeling programs such as Daz Studio Pro 4 and 3DS Max. OGRE does not have IK, only Direct Kinematics (DK). With DK, when the arm is moved, it is only possible to see the movement on its children, in this case the forearm and the hand. This is not a problem, since OGRE plays pre-recorded animations. As previously explained, to export the animations to OGRE it is necessary to indicate manually to the exporter which are the keyframes of the animations and their type. The skin option comes from skinning, the animation technique described earlier. It is required to select a bone as the root bone.
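OGRE applies this weighted blending internally when it plays a skeletal animation. Purely as an illustration of the computation just described, here is a minimal sketch of linear blend skinning; the type and member names are simplified assumptions, not OGRE's actual code:

    using System.Collections.Generic;
    using System.Numerics;   // Vector3, Matrix4x4

    // One bone influence on a vertex: a bone index and its weight.
    struct BoneInfluence
    {
        public int BoneIndex;
        public float Weight;   // the weights of one vertex sum to 1
    }

    static class SkinningSketch
    {
        // Linear blend skinning: the deformed vertex is the weighted sum of the
        // positions the vertex would take if it rigidly followed each influencing
        // bone. OGRE allows up to four such influences per vertex.
        public static Vector3 DeformVertex(
            Vector3 bindPosition,              // vertex position in the bind pose
            IList<BoneInfluence> influences,   // at most four entries
            IList<Matrix4x4> boneMatrices)     // current bone transform times inverse bind transform
        {
            Vector3 result = Vector3.Zero;
            foreach (BoneInfluence inf in influences)
                result += inf.Weight * Vector3.Transform(bindPosition, boneMatrices[inf.BoneIndex]);
            return result;
        }
    }

Because the weights of one vertex sum to one, the result is a convex combination of the rigidly transformed positions, which is what makes the deformation around a joint smooth.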
[Appendix B.1, class diagram continued: ZSightHandler also exposes the event ZSightData (Action<Quaternion>).]

B.2 DemoApp

[Appendix B.2, class diagram. InitTrackers: camera (Camera), interaction (SensorInteraction), mogreQuatSensor (Quaternion), xsensTracker (XSensTracker), zSightHandler (ZSightHandler); methods InitTrackers, InitZSight(JoyStick), XsensConnect, XSensOnSensorDataUpdated(List<Sensor>), XsensStop, ZSightOrientationChanged(Quaternion). PhysicsWorld: sceneManager (SceneManager), worldBody (Body); methods CreatePhysicsNonStaticWorld, CreatePhysicsStaticWorld, PhysicsWorld(SceneManager). Scene members: bodyCylinder, bodyFireExtinguisher, cylinderScene, dropAreaBodyCandle, dropAreaBodyFireExtinguisher, dropAreaCandleBox, dropAreaFireExtinguisher, dropAreaFireExtinguisherCylinder, ent, fireExtinguisher, fireExtinguisherCylinder, materialID. InitLibraries: actorInteraction (ActorInteraction), camera (Camera), entHV (Entity), initIODevices, interaction (SensorInteraction), joystick (JoyStick), mogreNewtHandler (MogreNewtHandler), nodeHV (SceneNode), objectInteraction (ObjectInteraction), DropAreaCandle (SceneNode).]
This chapter presented the analysis and planning of the project. The analysis sections presented the context of the project, the requirements and the corresponding use cases. The planning sections described the chosen development method and the project phases. Finally, the material and human resources were enumerated.

Chapter 4 Design and Implementation

This chapter describes the developed work. The first section explains how to obtain a complete VH model compatible with the ErgoVR system. The second section explains how the implemented module, ErgoInteract, works: it gives a general idea of how the developed system works, followed by a separate, detailed explanation of each library.

4.1 Modeling Virtual Human

One of the goals of this project is to find a standard method to create a VH model usable in the ErgoVR system. As ErgoVR uses OGRE, the VH model must be OGRE compatible. At the beginning, while still considering only the VH that reflects the participant's movements, upper and lower limbs with a skeleton were enough to make possible the association of the necessary bones to the sensors. However, along the project more VH models were needed to insert in the VE, for instance as bystanders. As such, these VH should have:
• Body animation, to give more realism to the model
• Facial animation, for example to simulate a dialogue or have natural facial expressions
Figure 22 OgreMax Object Settings
Figure 23 Add Mesh Animations
Figure 24 Add Root Bone
Figure 25 Add Mesh Animation
Figure 26 OgreMax Export Scene

1 Introduction

The goal of this manual is to help modeling a Virtual Human (VH) to use in the ErgoVR system. This VH has a skeleton, clothes, hair, and facial and body animations.

2 Requirements

To complete this tutorial, it is necessary to have the following software installed:
• DAZ Studio 4.0 Pro
• 3DS Max 2009
• Autodesk FBX exporter v2011.3
• OgreMax exporter v2.1.2

3 Overall workflow

To create a VH for the ErgoVR system, a VH with clothes, hair and animations is required, so that it is as credible as possible. For that, Daz Studio 4.0 Pro is used, which has pre-made models with skeleton, pre-made animations, and clothes and hair that can be added to the VH simply by dragging and dropping them onto the model. However, Daz Studio 4.0 Pro cannot export to the OGRE format, so an intermediate step is needed, in which the model is exported to the FBX format and imported into 3ds Max 2009, because this tool has an exporter to the OGRE format. In 3ds Max 2009 it is only necessary to separate and save the animations that are going to be exported to the OGRE format.

4 Modeling the VH

Step 1: Open DAZ Studio 4.0 Pro. A human model immediately appears in the main window.
5 Imports and Exports
  Export from Daz Studio 4.0 Pro
  Import in 3DS Max
  Export to OGRE format
6 Final Result

List of Figures
Figure 1 Main Window
Figure 2 Skin
Figure 3 Hair
Figure 4 Clothes
Figure 5 Fit To Window
Figure 6 Timeline of Animations
Figure 7 Put Clap Animation
Figure 8 Bake To Studio Keyframes
Figure 9 Timeline Animations
Figure 10 Add Subtrack to Timeline
Figure 11 Add Empty Block in Timeline
Figure 12 Rename Empty Block
Figure 13 Add Face Deformation
Figure 14 Remove Face Deformation
Figure 15 Bake to Studio Keyframes
Figure 16 Export FBX Options
Figure 17 Import FBX in 3DS Max
Figure 18 Group Menu
Figure 19 Named Group
Figure 20 Attach Object to Group
Figure 21 OgreMax Menu
Figure 3 Sensors ID
Figure 4 Connecting the 6 Sensors
Figure 5 Straps with Sensors
Figure 6 Connected Sensors with the Straps
Figure 7 Place Sensors, Front View
Figure 8 Place Sensors, Side View

1 Introduction

The goal of this manual is to help connecting the sensors and placing them on the participant. An XM-B User Manual exists; it shows how to install the software, how the software works, what information it gives us, how the sensors work, and how to connect them through the cables to the wireless receiver. The problem is that it does not show the order in which the sensors should be connected nor their correct placement on the participant.

2 Requirements

To complete this tutorial, it is necessary to have the following documentation and material:
• XM-B User Manual
• XM-B Technical Documentation
• Xbus Master (Figure 1)
• MVN Mounting Straps (Figure 2)
• MVN Mounting Straps User Manual

[Figure 1 Xbus Master]
[Figure 2 MVN Mounting Straps]

3 Connecting the Sensors

All the sensors have their ID written on them, as it is possible to see in Figure 3. It is necessary to connect them in ascending order. In total it is possible to work with six sensors, three on each arm (Hand, Forearm and Shoulder).
List of Sample Code
Code Sample 1 Example of Euler Angles Conversion
Code Sample 2 UserData Example

Chapter 1 Introduction

1.1 Motivation

Virtual Reality (VR) tries to simulate reality by using three-dimensional (3D) reproductions of objects and real environments. Computers are used to create those environments, in which you can navigate and interact with devices (Gutierrez, Vexo and Thalmann 2008). The use of VR has increased, and Virtual Environments (VEs) became a useful and effective tool in several areas, such as:
• In the automobile industry, VEs are created to do impact tests that evaluate the resistance and the safety of the automobile.
• In Medicine, VR is used, for example, in teaching anatomy and simulating operations.
• In military training, scenarios of virtual war are created to evaluate the soldiers' reaction to certain combat situations or just to practice techniques.
• In the entertainment industry, there are games with interaction in real time, with the help of Head Mounted Displays (HMD) and motion trackers.
• In education, VR is applied in distance learning, with remote meetings and virtual participation in events.

This work was developed in the context of a course of the second year of the Master's degree in Informatics Engineering at the Faculty of Sciences of the University of Lisbon, called Project in Informatics Engineering (PEI).
The last difficulty was easily overcome due to the presence of an Ergonomist Professor in the team.

The physics library took more time than expected. The physics engine used is an incomplete wrapper for the latest version of Newton Game Dynamics, which caused some difficulties in understanding the complete features of the wrapper: there was no documentation, and it was unclear which features of the physics engine are truly implemented. Despite these problems, the wrapper used seemed the most complete one to use with MOGRE. The positive side is that, through these problems, it was possible to learn more about the library and the limitations of the wrapper.

The gesture library took the predicted time. At this moment it is possible to grab, drop, push and press objects in the scene.

The participation in the scientific paper "Using space exploration matrices to evaluate interaction with Virtual Environments" (L. Teixeira et al. 2012) was very useful. This collaboration brought more knowledge in several areas. Also, it was important to get acquainted with the ErgoVR system and the team.

In this project, the high cost associated with the materials means that not everyone can buy the motion trackers used (XSens XBus Kit). In addition, there are some possible side effects with the use of VR, mainly simulator sickness.

The advantage to the research area is that this work uses VR, which, as referred above, is very useful in many areas. The advantage to me was the
VE, to make it possible to have physical behavior in the scene. The first step, Physics Engine Integration, took more time than expected. There were some problems with the physics engine, because an open source library was used that is still incomplete and does not have clear documentation, which makes comprehension and further implementation in the system difficult.

Fourth Phase: Actions Recognizer. The goal of this phase was to implement in the system the possibility for the participant to interact with smart objects.

Final Report. As predicted, this report has been written continuously since the delivery of the preliminary report. It was updated every week with the performed work. The completed Gantt Chart is in the Appendices.

3.2.3 Resources

The resources for this project were defined in Phase 1, Familiarization, and they are divided into material and technological resources (Software, SW, and Hardware, HW) and human resources. As a result of this step, the following decisions were taken.

Material and technological resources, chosen by the laboratory team:
• MOGRE (SW): It is a wrapper for OGRE (Object-Oriented Graphics Rendering Engine), which is an open source 3D graphics engine. As rendering systems it supports OpenGL and DirectX, and it runs on Windows, Linux and Mac OS. MOGRE (Managed OGRE) was developed for using OGRE from .NET languages.
• Newton Game Dynamics (SW): It is an open source physics engine that simulates physical behavior in real time.
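To illustrate how such a wrapper is typically used, the sketch below creates a physics body for a scene node. It follows the OgreNewt API that MogreNewt mirrors; the exact .NET signatures in the wrapper version used in this project may differ, and the inertia computation is a deliberately rough assumption:

    using Mogre;
    using MogreNewt;

    class PhysicsSetupSketch
    {
        // Builds a box-shaped physics body and ties it to a scene node, so the
        // node follows the simulation. Method names follow OgreNewt and are
        // assumptions about the MogreNewt wrapper, not the project's code.
        public static Body MakeBoxBody(World world, SceneNode node, Vector3 size, float mass)
        {
            // Collision shape that approximates the node's geometry.
            MogreNewt.CollisionPrimitives.Box shape =
                new MogreNewt.CollisionPrimitives.Box(world, size);

            Body body = new Body(world, shape);
            body.AttachNode(node);                       // keep node and body in sync
            body.SetMassMatrix(mass, size * mass / 6);   // rough inertia for a box
            body.SetPositionOrientation(node.Position, node.Orientation);
            return body;
        }
    }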
because it was a complementary goal of this project, and it is now considered future work.

First Phase: Familiarization. This phase is composed of five steps: Theme Familiarization, Performed Tutorials, Modeling a Virtual Human, Develop Test Application, and State of the Art Study. The first three steps and the last one were done within the expected time, but the fourth step, Develop Test Application, took more time than estimated in the plan. The task of defining a Virtual Human with mesh, skeleton, hair, clothes, and body and facial animation in a format compatible with the ErgoVR system was more difficult than expected, consuming considerably more time than scheduled. It was necessary to work with a considerable number of modeling software tools until the right one was found.

Second Phase: Gesture Recognizer. The main goal of this phase was to implement in the ErgoInteract system the possibility to reflect the real movements of the participant's upper limbs in the upper limbs of the VH. This phase was divided into three steps: Sensors Integration, Testing, and Demo. The first step was the one with the longest duration, and all three steps were executed in the scheduled time interval.

Preliminary Report. The preliminary report was written and delivered on schedule.

Third Phase: Physics Engine. This phase had no changes in its main goal or in the steps it is composed of. The main goal is to implement physics in the objects in the VE.
pushing and pressing. Four VH models were modeled and animated to inhabit any VE loaded by ErgoVR. These models are capable of reflecting the natural movements of the participant, using sensors attached to the participant's upper limbs. It was necessary to develop a library that implements physics attributes in the VE, to give more realism to the simulation. A gesture library was created to implement smart objects in the VE. This library and the previous one support the interaction activities of the participant with objects in the VE. The last and complementary goal was to add support in the ErgoInteract module for the Data Glove. It was not possible to reach this goal within the project duration, but it is considered future work.

To create a VH it was necessary to work with two tools: one for modeling and the other to export to the OGRE format. However, during the export process many tools lost fundamental model information. Therefore, it was necessary to test different modeling tools with distinct file formats. This work stage consumed a considerable amount of time, but in the end it was a good investment, because now it is possible to have a VH to represent the participant and three more VH models to include in any VE.

The library of the sensors took less time than expected. However, some difficulties appeared, related to Euler angle transformations and to anatomy questions about the movements of the hand, elbow and shoulder.
Two of the developed libraries, SensorInteract and PhysicsInteract, need this information. The SkeletonInformation library works with the skeleton file of the mesh, which specifies the bones' positions and their hierarchy. This information is necessary to reproduce the movements of the sensors on the VH's bones, to place the physics bodies at the right position, and to calculate the exact size of each bone. Only one class composes this library, InformationSkeleton (Figure 31; the complete class diagram is in Appendix B.6).

[Figure 31 SkeletonInformation Class Diagram]

The first section of this chapter described issues related to the modeling and animation of VH models compatible with ErgoVR. The second section was dedicated to the module ErgoInteract, developed in this project, giving a comprehensive explanation of each library.

Chapter 5 Conclusion and Future Work

Virtual Reality is used in many areas nowadays. For instance, it can be used for treating phobias and to simulate situations that would demand a higher cost or that are dangerous in real life. ErgoVR supports VEs that can simulate dangerous situations. The main purpose of this project was to enable the possibility of representing the upper body movements of the participant on the VH, so that it would be possible for the participant to interact with objects in the VE performing natural gestures, using their own hands, such as grabbing, dropping, pushing and pressing.
the elements in the scene, it is necessary to attach the elements to the mesh. To do that, select the desired objects and, with the right mouse button, select the option Fit To (Figure 5). In the new window that appears, select Genesis (the name of the model) and press the Accept button.

[Figure 5 Fit To Window]

Body Animation

To add body animations it is necessary to open the tab Pose & Animate and open the menu at the bottom of the application with the name aniMate2. AniMate2 is a timeline where the animations are placed (Figure 6).

[Figure 6 Timeline of Animations]

Step 1: There are several pre-made animations (for example high five A, high five B, shoulder pat A, shoulder pat B), and it is only necessary to drag and drop them into the timeline (Figure 7).

[Figure 7 Put Clap Animation]

Attention 1: If you want to add facial animation, go to the section Facial Animation; if not, do Step 3.

Attention 2: It is advisable to save first in the daz format before doing Step 2, because after doing this step it is not possible to change the animations.

Step 2: To export the model it is necessary to prepare the animations.
the upper body movements of the participant on the VH, so that it would be possible for the participant to interact with objects in the VE performing natural gestures such as grabbing, dropping, pushing and pressing. To accomplish this, an ErgoVR-compatible (L. Teixeira et al. 2010) module named ErgoInteract, including several libraries, was created in this project. Besides, four VH models with different genders, types of hair and clothes were modeled and animated, ready to inhabit any VE loaded by ErgoVR. These models are capable of reflecting the natural movements of the participant, which are input to the system through sensors placed on the participant's upper limbs. Three of the VH models have facial and body animation to interact with the participant in a VE.

ErgoInteract is useful and can be important in several application areas. It can be used in the psychology area, to help people that have phobias interact with the VE, or in industrial design, to visualize and interact with virtual prototypes of a product. In architecture it can be used to interact with architectural 3D models.

There was also a participation in a scientific paper, "Using space exploration matrices to evaluate interaction with Virtual Environments" (L. Teixeira et al. 2012). This collaboration was very useful in the familiarization phase to understand the ErgoVR system and to ease the integration in the team.

1.4 Document Structure

This document is divided into
this class with a physics visual debugger. The debugger of MogreNewt shows the physics bodies in the VE (Figure 29).

[Figure 29 Physics Objects in the Scene]

The class Callback is divided in two parts: WorldCallback and TriggerCallback. WorldCallback treats the collision between physics bodies that do not have physical effects when they collide. TriggerCallback treats the collision between the hand and all the smart objects. The class ActionEventArgs defines the information that should be sent to the receiver of the event notification when one of the callbacks is activated.

The class Actor is responsible for calling the class InteractionActor to give the initial position of the scene node that holds the VH. NormalCamera and VRCamera are two classes of the ErgoVR system, and they are used for preparing the system to use an HMD (Head Mounted Display) or a projection on a screen.

4.2.4 Interaction with Objects (ObjectInteract)

The ObjectInteract library has three classes: ObjectInteraction, PlayAction and Trigger (Figure 30; the complete class diagram is in Appendix B.5).

[Figure 30 ObjectInteract Class Model]

The interaction with the objects in the VE can be accomplished with one or both hands, depending on the type of the object and the action to be performed. All the objects in the VE are inside the scene file. Inside this XML file there is a node named Interaction.
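Returning to the callback notification described above, the sketch below illustrates the event pattern that ActionEventArgs supports. The field names are inferred from the class diagrams in Appendix B; the constructor shape and the raising logic are assumptions, not the project's exact code:

    using System;
    using Mogre;   // Vector3, SceneNode

    // Placeholder for the project's Trigger class (see Section 4.2.4).
    class Trigger { }

    // Payload sent to subscribers when TriggerCallback detects a contact
    // between a hand and a smart object.
    class ActionEventArgs : EventArgs
    {
        public Vector3 Hand;           // current hand position
        public SceneNode ObjectNode;   // scene node of the touched object
        public Trigger TriggerInfo;    // smart-object data read from userData

        public ActionEventArgs(Vector3 hand, SceneNode node, Trigger trigger)
        {
            Hand = hand;
            ObjectNode = node;
            TriggerInfo = trigger;
        }
    }

    class TriggerNotifier
    {
        // Receivers subscribe to be told when a hand/smart-object contact occurs.
        public event EventHandler<ActionEventArgs> ContactMade;

        // Called from the physics contact callback when the two bodies touch.
        protected void OnContact(Vector3 hand, SceneNode node, Trigger trigger)
        {
            EventHandler<ActionEventArgs> handler = ContactMade;
            if (handler != null)
                handler(this, new ActionEventArgs(hand, node, trigger));
        }
    }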
[Appendix B.4, class diagram continued. NormalCamera: camera (Camera); Init; properties CameraNode (SceneNode), CamerasList (List<Camera>), FOVx (Radian), FOVy (Radian), Orientation (Quaternion), Position (Vector3), ReferenceCamera (Camera), BodyOrientation (Quaternion); events ContactDropArea (DropAreaEvent), ContactMade (ActionEvent). ActionEventArgs (EventArgs): objectHand (Vector3, readonly), objectInteraction (SceneNode, readonly), triggerName (Trigger, readonly); constructor ActionEventArgs(Vector3, SceneNode, Trigger); properties Hand (Vector3), ObjectInt (SceneNode), Trigger (Trigger).]

B.5 ObjectInteract

[Appendix B.5, class diagram. ObjectInteraction: action (String), arrayID (int, 0), dropArea (string), firstDelimiter (Char, readonly), id (string), informationScene (string, readonly), listOfTriggers (List<Trigger>, readonly), name (String), necessaryHand (int), phrase (string), sceneAreaDrop (SceneNode), secondDelimiter (Char, readonly), thirdDelimiter (Char, readonly), trigger (Trigger); methods GetListTrigger (List<Trigger>), InitObjectInteraction, ObjectInteraction(string), ReadDataScene.]
which allows changing the gender of the VH to be loaded by the system. If it is intended to load a man's VH, the value should be man; if it is intended a woman's VH, the value should be woman. For example:

    Sex of the model you want to load
    LoadHV
      VHModel woman

6 Section ConfigSensor

In the section ConfigSensor there are two properties, Arm and NumSensores. In the Arm property it is possible to write left, right or both, depending on the arm(s) that the motion sensors will control. In the NumSensores property it is possible to define the number of sensors to use. This property depends on what was defined in the Arm property:
• Left or Right: you can only have 2 or 3 sensors.
• Both: you can choose 4 or 6 sensors.

    Configure the number of sensors you want to load
    ConfigSensor
      Arm Both
      NumSensores 6

E Complementary User Manual for the Xsens

Complementary User Manual for the Xsens Motion Sensors: how to connect the sensors and place them on a participant.

Contents
1 Introduction
2 Requirements
3 Connecting the Sensors
4 Place the Sensors in the Participant's Body

List of Figures
Figure 1 Xbus Master
Figure 2 MVN Mounting Straps
years later, Heilig launched the Experience Theater (Morton L. Heilig 1969), which was a version of the Sensorama Simulator for large-scale audiences. However, the term VR only came to public knowledge in the beginning of the 1980s, with the computer scientist Jaron Lanier, who coined the term Virtual Reality.

Virtual Reality involves three "I" aspects: Immersion, Imagination and Interaction (Burdea and Coiffet 2003) (Figure 1).

[Figure 1 The Three I's of Virtual Reality]

The performance of the participant depends on these three I's. In an immersive VR system the participant feels that he is completely inside the VE, using devices that stimulate the human senses. Participant interaction is mandatory in VR systems: the participant must feel that he is really a part of the VE and that he is able to change something inside it. Of course, the participant's imagination also performs an important role in the sense of immersion.

The architecture of a VR system is usually composed of five components (Figure 2).

[Figure 2 General Architecture of Virtual Reality Systems: User, Task, Virtual Environment, VR System (VR Simulator), Software. Images and diagram adapted from Burdea and Coiffet 2003.]

The User performs some Task in the Virtual Environment. The Virtual Environment is loaded by the VR System, which is composed of the VR Simulator and several Devices.
21. 1 Select the model go to the OgreMax menu and choose the option Object Settings Figure 21 In the new window select the label Mesh Animations put a check in the option Always Export Skeleton in the field Skeleton Settings Figure 22 Help Export Scene Settings Object Settings Global Settings Convert to OgreMax Materials Create Master Slave Materials Revert to Master Materials ein da Tain TA Figure 21 OgreMax Menu r M E P3 OgreMax Object Settings Tankini for Genesis Shape CU General User Data Node Animations Mesh Mesh LOD Mesh Animations Mesh Animation Type fd p Skeleton Settings JV Always Export Skeleton Remove Bones With No Influence T Export Animations to Separate Skeleton Files Link Separate Animations to Main Skeleton Animated Root Skeleton Animation Sampling Use Global Setting p Pose Settings T Always Export Poses p Morph Settings Vertex Animation Sampling Use Global Setting p Mesh Animations Add Edit Copy Remove Name Track Frames Length Samping DK Figure 22 OgreMax Object Settings Step 2 In the same window Figure 22 in the Mesh Animations section click in the button Add to add an animation to the mesh it is necessary to add the animations manually to the exporter since it cannot distinguish correctly between Morph and Skin animations You will see a
Cutti, Andrea Giovanni, et al. 2010. "Outwalk: a protocol for clinical gait analysis based on inertial and magnetic sensors." Medical & Biological Engineering & Computing 48 (1): 17-25. http://www.ncbi.nlm.nih.gov/pubmed/19911214

Duarte, Emília. 2004. Informação de segurança: Avaliação da eficácia de sinais pictóricos de segurança. Faculdade de Motricidade Humana, Universidade Técnica de Lisboa.

Gutierrez, Mario, F. Vexo, and Daniel Thalmann. 2008. Stepping into Virtual Reality. 1st ed. Santa Clara, CA, USA: Springer-Verlag TELOS.

Heilig, M. L. 1962. Sensorama simulator. US Patent 3050870.

Heilig, Morton L. 1969. Experience Theater. US Patent 3469837.

Kozlov, Michail D., and Mark K. Johansen. 2010. "Real Behavior in Virtual Environments: Psychology Experiments in a Simple Virtual-Reality Paradigm Using Video Games."

Magnenat-Thalmann, N., and Daniel Thalmann. 2004. Handbook of Virtual Humans. Wiley.

Norman, Donald A. 2002. The Design of Everyday Things. New York: Basic Books. http://www.amazon.com/dp/0465067107

Teixeira, João Marcelo, et al. 2006. "GeFighters: an Experiment for Gesture-based Interaction Analysis in a Fighting Game."

Teixeira, Luís, et al. 2010. "ErgoVR: An approach for automatic data collection for Ergonomics in Design studies." In International Symposium on Occupational Safety and Hygiene (SHO 2010), eds. P. Arezes et al., 505-509. Portuguese Society of Occupational Safety and Hygiene. http://www.sposho.pt/sho2010/proceedings2010.pdf

Tori, Romero, Claudio Kirner, and Robson Siscoutto. 2006. Fundamentos e Tecnologia de Realidade Virtual e Aumentada. Ed. Romero Tori. http://romerotori.org/Sumario-Livro-RV2006.pdf
Interaction in Virtual Environments with the Upper Body was developed in the scope of Virtual Reality and is part of the project "Future Safety Warnings: Virtual Reality in the study of technology-based warnings", financed by Fundação para a Ciência e a Tecnologia (PTDC/PSI-PCO/100148/2008), which belongs to the area of Ergonomics. This work was carried out at the Ergonomics Laboratory (ErgoLAB) of the Faculdade de Motricidade Humana of the Universidade Técnica de Lisboa, more specifically in the Virtual Reality unit called ErgoVR, and it relied on the work of a multidisciplinary team composed of Ergonomists, Psychologists, Computer Engineers, Architects and Designers, among others. To support all the projects carried out at ErgoVR, there is a Virtual Reality system with the same name. In the project "Future Safety Warnings: Virtual Reality in the study of technology-based warnings", Virtual Reality is used to evaluate the behavioral compliance of participants with safety warnings in emergency situations inside buildings. The project described in this document solves two aspects that can affect immersion: 1) the fact that the user was not represented by any Virtual Human, and 2) the fact that the interaction with the objects of the Virtual Environment was limited and unnatural. An interaction system existed that worked as follows: a sensor is placed on the hand
24. LoArm 35 rollSensor boneHandL Orientation pitchLoArmL 36 37 matrix FromEulerAngles Y XZ yawSensor rollSensor pitchSensor 38 39 mogreQuatSensor FromRotationMatrix matrix 40 return mogreQuatSensor 4 42 43 endregion Code Sample 1 Example of Euler Angles Conversion The data comes from the sensor as a quaternion To work with the Euler angles it is necessary to transform that quaternion into a rotation matrix line 1 From that rotation matrix it is possible to take the Euler angles line 3 42 After that it is necessary to verify from which sensor is the data The first data comes from the sensor ID 0 it is for deposit in the shoulder As we talk previously in the shoulder we can see that the axis Y and Z are changed in the sensor Figure 26 but the Euler angles are at the right place so it is not necessary to change the coordinates line 14 The roll is 0 because as mentioned that movement does not exist The second data comes from the sensor ID 1 that information it is for the elbow As with the shoulder the elbow has Euler angles at the right place so it is not necessary to change the coordinates Figure 25 line 26 The pitch is O because as mentioned that movement does not exist The third data comes from the sensor ID 2 that information it is for the hand In the hand the Euler angles that change are pitch and roll Figure 24 That way it is necessary to change the values when the matr
M., Su, among many others. To my friends from FCUL: thank you for the nights in the DI laboratories, the coffees and brainstormings on the C1 bridge, the breaks in the garden, the parties at Mocho and, above all, the friendship: Tijolinho, Xico, André, Portimão, Reis, Tap, Rute, Milena, among others. To my godparents and godchildren who, near or far, marked this phase with immensely happy moments. To Pombal, for the precious help in the writing of this report. To my dear supervisor, Professora Doutora Ana Paula Boler Cláudio, who stood by my side, professionally and personally, throughout my whole academic path: thank you, you are a great teacher. To my supervisor, Prof. Dr. Francisco dos Santos Rebelo, a thank you for the excellent opportunity to carry out this PEI with you, and thank you for all the support and knowledge you passed on to me. To Luís Teixeira, a huge thank you for the readiness and availability during this project; the help you gave me and the knowledge you made me acquire in the most diverse areas were very important. To the whole ErgoLab team, for the knowledge they passed on to me and for having welcomed me so openly. And finally, to her aunt's most beautiful little thing, who with her majestic smile alone managed to give me strength every time I thought about giving up: my dear and Essential niece Teresa. It is with longing that I leave this life. Thank you to all of you who made
References
Appendices
  A Gantt Chart
  B Class Diagram
    B.1 BaseApplication
    B.2 DemoApp
    B.3 SensorInteract
    B.4 PhysicsInteract
    B.5 ObjectInteract
    B.6 SkeletonInformation
  C User Manual to Create VH
  D User Manual to Configure FileConfig.cfg
  E Complementary User Manual for the Xsens

List of Figures
Figure 1 The Three I's of Virtual Reality
Figure 2 General Architecture of Virtual Reality Systems
Figure 3 Mechanical Device
Figure 4 Electromagnetic Device
Figure 5 Ultrasonic Device
Figure 6 Optical Device
Figure 7 Inertial Device
Figure 8 Stakeholders Areas
Figure 9 Ergonomist Use Cases
Figure 10 Designers/Architects Use Cases
Figure 11 …
Figure 12 PDCA Cycle
Figure 13 MTx Sensor
Figure 14 Upper Limbs Hierarchy
Figure 15 System Modules
27. Run the application 2 User moves the upper limbs to perform the desired movements Post Conditions e User can see their movements reflected in the VH in the VE 17 Use Case 3 Use Case User Interacts with Objects in VE Actor Ergonomist Pre Conditions e User must be informed about the application e The user needs to have the sensors connected Body 1 Run the application 2 User tries to interact with an object in the VE Post Conditions e The user can interact with objects with the upper limbs in the VE Ergolnteract Choose Number o Sensors Designers Architects Place Sensors in the User Figure 10 Designers Architects Use Cases Use Case 1 Use Case Choose VH Actor Designers Architects 18 Pre Conditions e The group needs to be informed about the use of the application e The group needs to have the User Manual Configure FileConfig cfg 1 Open configuration file ConfigFile cfg 2 In the LoadHV section write in the VHModel key man or woman 3 Save file 4 Run the application Extensions la Error opening the configuration file lb Create a new configuration file according with the User Manual Configure ConfigFile cfg 2a If nothing is written in the field a female VH is chosen 3a Error saving the file create a new one start in the step 1b 4a Error running the application consult ogre log Post Conditions e User has the VH wit
The SensorInteract class is in charge of reading the configuration file ConfigFile.cfg, creating the class UpdatePositionBones with the right attributes, and sending the data from the sensors to that class.

The best way to work with the upper limb movements is with yaw, pitch and roll. The sensors use the Cartesian system: the roll is the rotation about the X axis, the pitch is the rotation about the Y axis, and the yaw is the rotation about the Z axis. Sensors and the VH's skeleton both use right-handed coordinate systems. However, the sensors have the Z axis pointing up (Figure 21), while the skeleton has the Y axis pointing up (Figure 22). As we can see from the two mentioned figures, the coordinate system is different in the sensor and in the bone, which causes different movements.

[Figure 21 Sensor Coordinate System]
[Figure 22 Virtual Human Coordinate System]

Note that the sensors are placed along the arm of the participant in different positions, which alters the orientation of the coordinate system. Figure 23 shows the correct placement of the sensors. The Appendices contain a complementary user manual for the XSens with more details about this question.

2 http://eris.liralab.it/viki/images/8/82/XsensMtx.pdf
3 http://en.wikipedia.org/wiki/Euler_angles

[Figure 23 Sensors Placed in the Upper Limbs]

Therefore, due to the differences in the orientations of the coordinate systems involved, data from the sensors must be converted.
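The mismatch can be pictured as a single basis change: rotating the sensor frame by -90 degrees about X maps its Z-up axes onto the skeleton's Y-up axes. Below is a minimal sketch of that remapping for a vector; the real correction in SensorInteract also has to account for where each sensor sits on the arm, so this is an illustration, not the project's code:

    using Mogre;

    static class FrameRemapSketch
    {
        // Maps a vector from the sensor's right-handed Z-up frame to the
        // skeleton's right-handed Y-up frame. Y and Z swap and one sign flips,
        // which is a -90 degree rotation about X, so the frame stays right-handed.
        public static Vector3 ZUpToYUp(Vector3 v)
        {
            return new Vector3(v.x, v.z, -v.y);
        }
    }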
1. The user has a corresponding VH in the VE.
2. The VH reproduces the upper limb movements performed by the user.
3. The user can interact with specific objects in the VE using natural gestures, like grabbing, dropping, pushing and pressing.

Requirements of the Designers and Architects team:
1. To be able to choose one VH from a set of predefined female and male models.
2. To be able to choose the number of sensors to work with.
3. To be able to choose a specific VE from a set of predefined ones.
4. To be able to make the previous choices just by editing a configuration file.
5. To be able to create new VH models, with skin, skeleton, animations and textures, and make them available in the set of predefined models.
6. To be able to place properly the sensors in the upper limbs of the user.

Non-Functional Requirements

As previously mentioned, the computer engineer (system developer) is responsible for the non-functional requirements, which are related to quality attributes. The system was developed as a module compatible with the ErgoVR system. Comprehensibility and Usability were identified as requirements. To ensure these requirements, the system has been developed keeping in mind that it is important for the team to know how the system works and how it can be modified or updated. To make this easier, there is an editable configuration file, all the developed code is documented, and there are three user manuals. Reliability is another requirement.
[Figure 11 Add Empty Block in Timeline: context menu with Add Audio Block, Delete, Layout, Add Start Mask, Swap Track with selected object]

[Figure 12 Rename Empty Block: options Rename, Save as new, Save for legacy aniMate]

Step 3: In the tab Pose & Animate, go to the Editor menu, Pose Controls menu and Head menu. Select the head of the model and then, in the Editor menu, select which part of the face you want to animate (Eyes, Mouth, Brow, etc.). Choose a position in the sub-track timeline and move the slider of the deformation you want to apply (Figure 13). To remove the animation, select another position in the sub-track timeline and move the slider back to the initial position (Figure 14).

Warning: If you do not remove the expressions that you add, the model will always have that expression. If you want only a momentary animation, you have to add and remove the animation as was done in Figure 13 and Figure 14.

[Figure 13 Add Face Deformation]
[Figure 14 Remove Face Deformation]

Attention: It is advisable to save first in the daz format before doing Step 4, because after doing this step it is not possible to change the animations.

Step 4: To export the model it is necessary to prepare the animations. This is done by clicking with the right mouse button on the play bar and selecting the option Bake to Studio Keyframes (Figure 15).
UNIVERSIDADE DE LISBOA
Faculdade de Ciências
Departamento de Informática

INTERACTION IN VIRTUAL ENVIRONMENTS WITH THE UPPER BODY

Mariana Barros Vital

MESTRADO EM ENGENHARIA INFORMÁTICA
Especialização em Engenharia de Software

2012

UNIVERSIDADE DE LISBOA
Faculdade de Ciências
Departamento de Informática

INTERACTION IN VIRTUAL ENVIRONMENTS WITH THE UPPER BODY

Mariana Barros Vital

PROJETO
Trabalho orientado pela Prof.ª Dr.ª Ana Paula Boler Cláudio e co-orientado pelo Prof. Dr. Francisco dos Santos Rebelo

MESTRADO EM ENGENHARIA INFORMÁTICA
Especialização em Engenharia de Software

2012

Acknowledgments

This project marks the end of a gratifying stage. For the fantastic academic path I have had, I want to try to record my sincere thanks to the people who marked it. I say "try" because what I felt over these years with those people is impossible to describe. To the great Family I have, which was always present, supported me in the worst moments and celebrated the best ones: Mother, Father, Sisters and Aunt, a big thank you. It was you who raised me and made me who I am. My Essentials. To my fantastic boyfriend, for the strength he always gave me; the strength of your words was always a light in the dark. Thank you, Nelson, for always being by my side; you are Essential. To my friends, who have accompanied me for years and always showed their interest and concern for my path: Filipa, BIP,
[Appendix A, Gantt chart continued: Phase 5 Data Glove Integration, 131 days; Gesture and Actions Recognizer, 11 days; Physics Engine, 14 days; Testing, 5 days; Demo, 1 day; Final Report, 213 days]

B Class Diagram

B.1 BaseApplication

[Appendix B.1, class diagram. CameraMan: mCamera (Camera, readonly); CameraMan(Camera), MouseMovement(int, int), UpdateCamera(float); properties FastMove, Freeze, GoingBack, GoingDown, GoingForward, GoingLeft, GoingRight, GoingUp (bool). DebugOverlay: mGuiAvg, mGuiBest, mGuiCurr, mGuiTris, mGuiWorst, mModesText (OverlayElement, readonly), mWindow (RenderWindow, readonly), timeSinceLastDebugUpdate (float); DebugOverlay(RenderWindow), Update(float); property AdditionalInfo (string). Base (IDisposable): camera (Camera), cameraMan (CameraMan), ContinueRendering (bool, true), debugOverlay (DebugOverlay), PluginsCfg (string, "plugins.cfg"), renderMode (int), RenderWindow (RenderWindow), ResourcesCfg (string, "resources.cfg"), root (Root), SceneManager (SceneManager), textureMode (int).]
This is done by clicking with the right mouse button on the play bar and selecting the option Bake to Studio Keyframes (Figure 8).

[Figure 8 Bake To Studio Keyframes: aniMate2 context menu with Create aniBlock From Studio Keyframes, Bake To Studio Keyframes, Preferences]

Facial Animation

If you have already done the body animations, you are familiar with the timeline. If not, you have to open the tab Pose & Animate and open the menu at the bottom of the application with the name aniMate2 (Figure 9). If you added body animations, you can do the facial animation in the same timeline as the body animations, but it is better to do them separately, in order to make them easier to understand.

[Figure 9 Timeline Animations]

Step 1: In the timeline, click on the plus sign to add a sub-track to Genesis (Figure 10).

[Figure 10 Add Subtrack to Timeline]

Step 2: In the sub-track timeline, click the right mouse button and choose Add Empty Block (Figure 11). A Properties window will appear; choose Start, with no properties, click Accept, and an empty block will appear in the sub-track timeline. Click again with the right button on the empty block and rename it; if you add more than one animation, it is good to rename all of them so you can distinguish them (Figure 12).
3a. Error saving the file: create a new one, starting at step 1b.
4a. Error running the application: consult ogre.log.
Post-Conditions
• User has the VH in the VE that will reflect movement with the chosen number of sensors.

Use Case 4
Use Case: Create VH
Actor: Designers/Architects
Pre-Conditions
• The group needs to have the User Manual Create a VH for the ErgoVR system.
Body
1. Follow the steps detailed in the User Manual Create a VH for the ErgoVR system.
Post-Conditions
• User has the VH to load in the VE.

Use Case 5
Use Case: Place Sensors in the User
Actor: Designers/Architects
Pre-Conditions
• The group needs to be informed about the use of the application.
• The group has to have the XM-B User Manual.
• The group has to have the Complementary User Manual to the Sensors.
• The ConfigFile.cfg must be updated in the ConfigSensors section.
Body
1. Open the Xsens box and connect the sensors.
2. Place the sensors on the user, on the bones where they are needed.
3. Turn on the sensors.
4. Run the application.
5. Move the upper limbs to test.
Extensions
1a. If you do not know how to connect the sensors, read the XM-B User Manual and the Complementary User Manual to the Sensors. (Xsens did not give permission to reference the manual in this document.)
35. c Occupational Safety amp Hygiene http www sposho pt sho2010 proceedings2010 pdf Tori Romero Claudio Kirner and Robson Siscoutto 2006 Virtual Reality Fundamentos e Tecnologia de Realidade Virtual e Aumentada ed Romero Tori SEC http romerotori org Sumario Livro RV2006 pdf 54 Appendices A Gantt Chart Phase 1 Familiarization Phase 1 Familiarization 15 Dias ES Famiianzation with the Theme Familiarization with the Theme T s gt m Performing Tutorials Performing Tutorials lee EXE Modeling a Viual Human Modeling a Virtual Human E EE Develop Application Test Develop Application Test NE Search Related Work ira mi Search Related Work 15 Dia s Phase 2 Gestures Recognitio RRS Sp iesus psp gnition 123 Discs OO GEES Integration Three Sensors ETT Testing ME estro 8 Dia s E ES Demo 3 Dix Preliminary Report E eliminar Repos 81 Dia Phase 3 Physics Engine M M Phase 3 Physios Engine Las 21 Physics Engine Integration EDS Pes is Engine Integration 25 Dia Testing EN Testing 4 Dia s Demo ES Demo 4 Dia Phase 4 Actions Recognition Phase 4 Actions Recognition 135 Dia s Recognition of actions with Sensor Hand BEEN Recognition of actions with Sensor Hand 31 Dia Testing Ej Testing 2 Dia s Demo E Demo 12 Diag Phase 5 Data Glove Integration Phase 5 Dat
During the initial phase of the project there was involvement in teamwork, which was fundamental for the initial familiarization stage and for understanding how the ErgoVR system works. From this involvement came the collaboration, as co-author, in a scientific paper written by the laboratory team, entitled "Using space exploration matrices to evaluate interaction with Virtual Environments". This paper describes a study carried out with the ErgoVR system that evaluates the users' decisions under the influence of the safety information placed in the environments when they are confronted with emergency situations. During the simulation, data are collected regarding, among others, the participant's position, the distances covered and the times of the route. With these data, matrices are generated from which it is possible to identify specifically the flows and the zones of the environment that are most visited by the participants, and in this way evaluate their behavior. In the modeling of the Virtual Human, still in the initial phase of the project, exhaustive tests were performed using several modeling tools, with the aim of identifying the tools that export the model to a format accepted by the ErgoVR system while keeping correct all the necessary associated information: mesh, textures, hair, body and facial animations, and clothes. It was concluded that the best approach would be to work with 3DS Max and with Daz 3D. In the
configuration files that hold information about which plugins are loaded and where the resources and media are located, and it creates the necessary elements to present the OGRE graphic interface: camera, viewports and ambient lights.

The Base.Input partial class deals with the interaction with the VE using the keyboard, the mouse and the joystick. DebugOverlay is the class that specifies the information presented on the graphic interface that is not part of the scene; it gives us information such as, for example, the frames per second. CameraMan is responsible for moving the created camera in the VE through the keyboard or mouse. ZSightHandler is the class that allows a simpler interaction with the motion tracker that is part of the Sensics ZSight HMD. Team members had previously implemented this class, and it was adapted to ErgoInteract. This sensor allows the participant to move the head, giving a more natural point of view of the scene and a higher degree of realism to the simulation.

Six classes compose the DemoApp: RunApplication, InitMogre, ObjectInformation, InitLibraries, InitTrackers and PhysicsWorld (Figure 17; the complete class diagram is in Appendix B.2).

[Figure 17 DemoApp Class Model: RunApplication, InitMogre, ObjectInformation, InitLibraries, InitTrackers, PhysicsWorld]
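For readers unfamiliar with MOGRE, the following minimal sketch shows the kind of start-up sequence that BaseApplication encapsulates. It follows the standard Mogre tutorial pattern; resource setup, input handling and error handling are omitted, and it is not the project's actual code:

    using Mogre;

    class MinimalMogreApp
    {
        Root root;
        RenderWindow window;

        public void Go()
        {
            // Read plugins.cfg / ogre.cfg, as BaseApplication's Setup and Configure do.
            root = new Root("plugins.cfg");
            if (!root.ShowConfigDialog())
                return;
            window = root.Initialise(true, "ErgoInteract demo");

            // Scene manager, camera and viewport (compare CreateCamera / CreateViewports).
            SceneManager sceneMgr = root.CreateSceneManager(SceneType.ST_GENERIC);
            Camera camera = sceneMgr.CreateCamera("MainCamera");
            Viewport viewport = window.AddViewport(camera);
            camera.AspectRatio = (float)viewport.ActualWidth / viewport.ActualHeight;

            // Per-frame hook, as in RootOnFrameRenderingQueued (Appendix B.1).
            root.FrameRenderingQueued += evt =>
            {
                // Update input, sensors and physics here; return false to stop.
                return true;
            };

            root.StartRendering();
        }
    }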
38. dia amp Entertainment and click OK Figure 17 Current Preset Autodesk Media amp Entertainment Y Jurema fbx Cleto M aiena ED Nona FBX SDK FBX Plugins version 2012 1 2012 5 31 19 48 28 Y up Centimeters Zup Inches File 77 Elements 35 Materials 16 Textures File content fada and Update scene elements Y FBX Plugin version 2011 3 1 Web updates Edit Figure 17 Import FBX in 3DS Max Step 2 Run the animation to ensure that has been well imported Step 3 Select the model s body and go to the Group menu Figure 18 create a new group and click OK Figure 19 Next choose an object clothes and hair one at a time and go to the Group menu and click Attach Figure 20 The animations are made with the skeleton The mesh has skeleton but the hair and clothes do not so they do not have animations If those objects were attach to the main mesh they become part of the mesh and they will have animations File Edit Tools Group Views Create Modifiers 15 kE Perspective Group Ungroup Open Close Attach Detach Explode Assembly Figure 18 Group Menu Group Ungroup Open Close Attach Detach Explode Assembly Figure 20 Attach Object to Group Export to OGRE format To export to the OGRE format it is necessary to install the OgreMax plugin in 3DS Max After that you can complete the following steps Step
of the participant: when the participant was within a predefined minimum distance of the button, it was enough to stretch out the hand for the system to detect that a movement had occurred and to fire the event associated with the action. However, this system has some limitations, since the only visual feedback the participant had was a two-dimensional cursor, which limits the perception of the distance to the object the participant wants to interact with. Besides this limitation, the system only allowed the action of pressing. Thus, the main goals of this project are: 1) the creation of a Virtual Human; 2) the possibility of reproducing the movements of the participant and reflecting them in the Virtual Human; and 3) allowing the system to support more actions, such as, for example, grabbing or dropping. With the creation of the Virtual Human and the reflection of the participant's movements, there will be a greater perception of distance in the Virtual Environment. To fulfill this goal, motion sensors were used that capture the 3D orientation and kinematic data of the upper limbs. These motion sensors are placed on the arm, forearm and hand, to capture the real movements and orientations of the participant and pass them to the limbs of the Virtual Human, whose model was previously created. Physical characteristics, such as mass, were associated with this model and with all the elements of the Virtual Environment, to give realism and credibility to the simulation.
the most complete, and they have the plugins to export to the OGRE format. Each software tool has its own importers and exporters, supporting specific import and export formats (Table 1).

[Table 1: conversion tests of the file formats Obj, Mhx, Dae, 3ds and Fbx, exported from Make Human, Poser, Daz and Evolver and imported into Blender and 3DS Max. Legend: V = successful conversion, X = failed conversion, dash = nonexistent conversion.]

(This software was used before being bought by Autodesk.)

At the end of the conversion tests, the choice was to work with the Daz Studio Pro 4 and 3DS Max 2009 tools.

The reasons for choosing Daz Studio Pro 4 were:
• There are pre-made basic VH models with skeleton.
• There is a library of textures, clothes and other objects to add to these pre-made models.
• There are skeleton and vertex animations already made; it is only required to drag and drop them in the User Interface (UI) to apply them to the model.
• There is an exporter to the file format used by Autodesk's programs, the FBX format.

The reasons for choosing 3DS Max were:
• It is useful to work with this tool because it is one of the most used by the ErgoVR team.
• It has support for the FBX file format, which supports all the characteristics that are needed for the VH: mesh, texture, skeleton, animations.
The node named Interaction contains a field called userData, where it is possible to add customized information about an object, changing it into a Smart Object. An example of the userData field:

    <node name="Interaction">
      <userData><![CDATA[
        name lamp action grab NumNecessaryHands 1 drop dropAreaLamp
        name button action press ID 1 2 3 NumNecessaryHands 1
        name fireExtinguisher action grab NumNecessaryHands 1 drop dropAreaExtinguisher
      ]]></userData>
      <position x="0" y="0" z="0"/>
      <scale x="1" y="1" z="1"/>
      <rotation qx="0" qy="0" qz="0" qw="1"/>
    </node>

Code Sample 2 UserData Example

That information has a field name, which is the name of the object; the field action, which contains the type of action that is possible to do with it (grab, drop, press or push); and the field NumNecessaryHands, which holds the number of hands necessary for that interaction. Regarding the action of pressing, in the information there is also an array of elements named ID that enumerates the type of behavior of the pressed object. For an object that has the action grab, a field drop is defined, giving the name of the area where the object is to be dropped. For example, if we press a button, we can change the material to become brighter and change the position of the button to give the sensation that it is pressed. This information present in the scene file is accessed and treated in the class ObjectInteraction.
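As an illustration of how one entry of such a userData string could be turned into a trigger record, the sketch below uses simple token splitting. The field names follow Code Sample 2; the delimiters and the record type are assumptions (the real ObjectInteraction class uses its own delimiters, as listed in Appendix B.5):

    using System;
    using System.Collections.Generic;

    // One smart-object entry read from the userData field (see Code Sample 2).
    class TriggerRecord
    {
        public string Name;           // e.g. "lamp"
        public string Action;         // grab, drop, press or push
        public int NecessaryHands;    // hands required for the interaction
        public string DropArea;       // drop target, only for grabbable objects
    }

    static class UserDataParserSketch
    {
        // Parses one "key value" entry such as:
        //   "name lamp action grab NumNecessaryHands 1 drop dropAreaLamp"
        // The space delimiter is illustrative, not the project's actual separator.
        public static TriggerRecord ParseEntry(string entry)
        {
            string[] tokens = entry.Split(new[] { ' ' }, StringSplitOptions.RemoveEmptyEntries);
            TriggerRecord record = new TriggerRecord();
            for (int i = 0; i + 1 < tokens.Length; i += 2)
            {
                switch (tokens[i])
                {
                    case "name": record.Name = tokens[i + 1]; break;
                    case "action": record.Action = tokens[i + 1]; break;
                    case "NumNecessaryHands": record.NecessaryHands = int.Parse(tokens[i + 1]); break;
                    case "drop": record.DropArea = tokens[i + 1]; break;
                }
            }
            return record;
        }
    }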
Afterwards, this information is saved in an object named Trigger. All the smart objects that are read from the scene are saved in a list of triggers. As explained in Section 4.2.3, there is an event named Callback that sends a message when two physics objects come into contact with each other. One of the defined callbacks is executed when there is interaction between one hand and one object in the scene. When that event happens, a condition verifies whether the object touched by the hand is a smart object in the trigger list and, if it is, the action related to that object is performed.

If the action is grabbing, when the two bodies have physical contact the object's position is changed to the hand's position. An object that was grabbed has to be dropped in some specific area at some point, depending on the task. For example, if the participant wants to grab a lamp, he has a specific place to drop it. To make it possible to drop the object, an invisible box was created that limits a zone specifying the drop location. When the object passes through that zone and stays longer than a specific time threshold (e.g. two seconds), the position of the object is no longer attached to the hand and it stays in that zone. If the action is to push or pull (for example, a door), it is necessary to change the position of the door or run an animation to give a dynamic sensation.

4.2.5 Skeleton Information

The information about the skeleton is necessary in two of the developed libraries.
To ensure that the data provided by the sensors is continually updated by the system and immediately reflected in the VH, so that it exhibits fluid movements. To ensure Scalability, ErgoInteract has been designed and implemented so that all modules are as independent as possible from one another. This is important to reduce dependencies and problems with future changes, in case it needs to be ported to another environment. Modifiability is a requirement that depends on the Comprehensibility, Usability and Scalability requirements.

3.1.4 Use Cases
This section describes the use cases for the functional requirements of the Ergonomist and of the Designers/Architects. Figure 9 presents the Use Cases diagram of the Ergonomist and Figure 10 presents the Use Cases diagram of the Designers and Architects; both figures are followed by the corresponding specification.

Figure 9 – Ergonomist Use Cases (View the VH; User Performs Movements in VE; User Interacts with Objects in VE)

Use Case 1 – View the VH
Actor: Ergonomist
Pre-Conditions:
• User must be informed about the application.
Body:
1. Run the application.
2. User moves the head and sees the body of the VH.
Post-Conditions:
• The user can see himself represented in the VE.

Use Case 2 – User Performs Movements in VE
Actor: Ergonomist
Pre-Conditions:
• User must be informed about the application.
• The user needs to have the sensors connected.
Body:
1.
There are several exporters to the OGRE format, but OgreMax was chosen because the team already used it. The convention for modeling a VH went through the steps of choosing an available model from the Daz Studio Pro 4 library, choosing a texture for the skin, clothes and shoes, and then applying some animations. To apply facial animations, the software has a pose-controls menu that separates the face areas that can be deformed, such as the mouth, eyes, eyebrows and nose; each one of these areas has a slider to control the required deformation. For the body deformation there are pre-made animations, such as walking, running and jumping, among others, and it is only needed to drag and drop them onto the body. When the VH model is finished, it is exported to the FBX format and imported in 3DS Max. Before exporting to OGRE, it is required to verify the type of animation (vertex or skeleton), which will be detailed in the next sections.

Facial and Body Animation
The VH is exported to the OGRE format, represented by a mesh file and a skeleton file. The next two sections are related to the creation of the animations in the VH model to be inserted in the VE.

4.1.1 Body Animation
The animations of the body are made through the skeleton. The skeleton is defined as a hierarchy of bones; that is, the shoulder is the parent of the arm, the arm is the parent of the forearm and the forearm is the parent of the hand, as it is possible to see in Figure 14.
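To make the hierarchy concrete, the sketch below rotates the arm bone and relies on the parent-child relation to carry the forearm and hand along. It assumes an existing sceneManager and MOGRE's usual mapping of the OGRE skeleton API; the mesh and bone names are illustrative and exact member spellings may differ.

using Mogre;

// Minimal sketch: rotating a parent bone so that its children follow.
// "virtualHuman.mesh" and the bone name "Arm" are placeholders.
void RotateArm(SceneManager sceneManager)
{
    Entity vh = sceneManager.CreateEntity("VH", "virtualHuman.mesh");
    SkeletonInstance skeleton = vh.Skeleton;

    Bone arm = skeleton.GetBone("Arm");
    arm.ManuallyControlled = true; // detach the bone from pre-recorded animations

    // Rotate the arm 30 degrees around Z; the forearm and hand bones,
    // being descendants of the arm, move together with it.
    arm.Rotate(new Quaternion(new Degree(30), Vector3.UNIT_Z));
}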
(Appendix B.4, PhysicsInteract class model, continued: the MaterialPair fields of MogreNewtHandler and members of ActorInteraction such as ChangeHeadOrientation, ChangePosition, CreateBodyNode, InitVectors, MoveTo and Update.)
expressions to give more realism;
• Realistic clothes and hair.
To model the VH, many 3D modeling programs were tested. To obtain the model in the OGRE format with the desired characteristics, many importers/exporters and file formats were tested until a good conversion process was achieved. While performing these tests some problems emerged, due to the fact that several of these 3D modeling programs are open source and, occasionally, what works in an older version stops working in the next one. Because of this, it was necessary to experiment with many versions of the same software. Another reason is that the exporters and importers are also open source, which sometimes makes them incomplete tools. These exporters and importers were usually conceived for particular purposes and sometimes do not cover general conversions: the exporters do not always export all the information, and similar situations happen with the importers. Therefore it is impossible to guarantee that all the information is correctly exported and imported. To find out which was the appropriate 3D modeling/animation tool to produce a VH model, the following set of tools was tested:
• Make Human 1.6 (Alpha and Nightly Builds)
• Poser PRO 2012
• Daz Studio Pro 4
• www.Evolver.com
• 3DS Max 2009
• Blender 2.49, 2.56 Beta and 2.57
It was necessary that the last software in the conversion pipeline be 3DS Max or Blender because, from this list, they are the
Kinetics of the Technical University of Lisbon, more specifically in the research unit ErgoVR. In the project Future Safety Warnings: Virtual Reality in the study of technology-based warnings, VR is used to evaluate the participants' behavior towards safety warnings in emergencies inside buildings. In that project the interaction was weak and, because a participant did not have a virtual representation, the immersion was lower. Therefore, the main goal of the described project is to give the participant the possibility of seeing his upper-limb movements reflected in a Virtual Human (VH) inside the VE, and the possibility of interacting with virtual objects, giving a greater sensation of immersion. To accomplish this goal, sensors were used and placed on the arms to capture the participant's movements. A VH was created to perform the participant's movements in the VE. Physics elements were added to this VH and to the VE to give more credibility to the simulation and to make the interaction with objects in the scene possible.

Keywords: Virtual Environment, Virtual Reality, Movement, Human Interaction, Immersion

Contents
Chapter 1 Introduction
1.1 Motivation
1.2 …
1.3 Contributions
1.4 Document Structure
Chapter 2 Context and State of the Art
2.1 Fundamental Concepts of Virtual Reality
Projeto de Engenharia Informática. This project, Interaction in Virtual Environments with the Upper Body, fits within another one, called Future Safety Warnings: Virtual Reality in the study of technology-based warnings, which is funded by the Portuguese Science Foundation (PTDC/PSI-PCO/100148/2008) and is developed in the Ergonomics Laboratory (ErgoLAB) of the Faculty of Human Kinetics of the Technical University of Lisbon, more precisely in the research unit ErgoVR. VR appears in the ErgoLAB with the goal of evaluating the participants' behavior in experimental VEs. To be able to evaluate the behavior of the participant there must be immersion in the VE, which is provided by navigation and interaction. To support this, a VR system called ErgoVR has been created. This project focuses on the interaction that the participant can have with virtual objects, like grabbing or pushing. This kind of interaction happens only on the virtual side, because no haptic feedback system is available at the Ergonomics Laboratory. To accomplish the task, sensors placed on the arms and hands of the participant are used in order to capture his movements and pass that information to the Virtual Human (VH). This VH reproduces in the VE the actions of the participant. Providing the ErgoVR system with interaction modes that are more natural will improve the participant's sense of immersion, and therefore he will behave naturally, as if he were experiencing the real situation
devices. The VR Simulator is the software: a set of applications that includes software libraries such as a graphics engine, an Integrated Development Environment (IDE) and 3D modeling software. Devices are hardware components used to perform specific tasks (input devices) and/or to provide output feedback (output devices). Input devices are classified as passive or active. Passive devices are usually sensors that capture the movement of a participant and can be supported by different technologies:
• Mechanical: consists of a kinematic structure composed of links that contain sensors (Figure 3).
• Electromagnetic: uses magnetic fields for the detection of the orientation and position of the sensor (Figure 4).
• Ultrasonic: uses ultrasonic signals produced by a transmitter to determine the real-time position of a moving element (Figure 5).
• Optical: uses a system of cameras that capture the location of markers (Figure 6).
• Inertial: uses electromechanical instruments to detect movement by measuring changes in inclination, acceleration and gyroscopic forces (Figure 7).
• Hybrid: composed of two distinct systems; the most common is the combination of optical and inertial tracking simultaneously.

Figure 5 – Ultrasonic Device. Figure 6 – Optical Device. Figure 7 – Inertial Device.

Active input devices allow interactive change based on decisions of the participant, such as navigation, selection and manipulation
If it is intended to work with the six sensors, it is required to connect Sensor ID 0, Sensor ID 1 and Sensor ID 2 on one side of the Xbus connector of the wireless receiver, and Sensor ID 3, Sensor ID 4 and Sensor ID 5 on the other side, in order to have three sensors for each arm. This can be seen in Figure 4 (Connecting the 6 Sensors). If you want to work with fewer sensors, you just need to remove the sensors with the highest IDs. Example: if you want to work with four sensors, you need to remove Sensor ID 5 and Sensor ID 4. On the side from which you removed both sensors, you need to place Sensor ID 2; you need to change Sensor ID 2 to the secondary connection in order to have two sensors on each side. Attention: it is necessary to work with the same number of sensors on each side.

4 Place the Sensors on the Participant's Body
Figure 5 (Straps with Sensor) shows how to place the sensors in the Mounting Straps. There are six straps; each strap has a label, and on that label is written the limb and an arrow indicating the direction in which the sensor should be inserted. Two straps are labeled L-Leg and another two F-Arm. The L-Leg straps are used for the shoulder, because they are bigger; the F-Arm straps are for the forearm; and for the hand there are two gloves. The sensors need to have the cables in the direction indicated on the label by the arrow
of the Art
This chapter presents the history and some fundamental concepts related to Virtual Reality, and also a state of the art covering some research projects in similar areas.

2.1 Fundamental Concepts of Virtual Reality
Virtual Reality can be defined as an advanced interface for computer applications that allows the participant to navigate and interact using sensory channels: vision, hearing, touch and even smell and taste. According to Gutierrez et al. (2008), a VR system simulates reality using a computer to create 3D environments. The main goal is to give the participant the illusion of being inside a virtual environment that reacts and changes according to his interaction. In 1950, a system was developed by the U.S. Air Force that provided some immersion to a participant in a VE, consisting of a flight simulator for testing (Tori, Kirner and Siscoutto 2006). In 1962, Morton Heilig, considered the father of VR, created the Sensorama Simulator (M. L. Heilig 1962), which simulated a motorcycle ride through Manhattan. It consisted of multiple sensors, and the participant was subjected to various physical sensations that created a sense of immersion: holes in the pavement simulated through the vibration of the seat, the body position corresponding to the inclination of the motorcycle, the wind in the face and hair, different smells in specific locations of the city (e.g., food smells near a restaurant) and the surrounding sounds. A few
of the elbow needs to rotate to do both movements. Considering that the calibration position is with the internal part of the elbow turned to the front, only the yaw is considered; therefore, the roll and pitch of the elbow are not passed to the bone, in order not to reproduce that movement. The shoulder does not have roll.

UpdatePositionBones is the class that makes this conversion and places the right data in the right bone.

Code Sample 1 – Example of Euler Angles Conversion
1   Matrix3 matrix = mogreQuatSensor.ToRotationMatrix();
2   Radian yawSensor, pitchSensor, rollSensor;
3   matrix.ToEulerAnglesZYX(out yawSensor, out pitchSensor, out rollSensor);
4
5   #region 3 Sensors
6   if (numberSensors == 3)
7   {
8       if (indexSensor == 0) // UP ARM
9       {
10          yawUpArm = yawSensor;
11          pitchUpArm = pitchSensor;
12          rollUpArm = rollSensor;
13
14          matrix.FromEulerAnglesYXZ(yawSensor, pitchSensor, 0);
15          mogreQuatSensor.FromRotationMatrix(matrix);
16          return mogreQuatSensor;
17      }
18      if (indexSensor == 1) // LO ARM
19      {
20          yawLoArm = yawSensor;
21          pitchLoArm = rollSensor;
22
23          yawSensor = yawSensor - yawUpArm;
24          pitchSensor = pitchSensor - pitchUpArm;
25
26          matrix.FromEulerAnglesYXZ(yawSensor, 0, pitchSensor);
27
28          mogreQuatSensor.FromRotationMatrix(matrix);
29          return mogreQuatSensor;
30      }
31      if (indexSensor == 2) // HAND
32      {
33          yawSensor = yawSensor - yawLoArm;
34          pitchSensor = pitchSensor - roll
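To make the parent subtraction concrete with assumed numbers: if the shoulder (up-arm) sensor reports a yaw of 30 degrees and the elbow (lo-arm) sensor reports a yaw of 50 degrees, line 23 assigns the forearm bone only the relative 50 - 30 = 20 degrees, because the other 30 degrees are already contributed by the parent bone; the same correction is then applied between the forearm and the hand (line 33).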
fact that, with this project, knowledge of areas such as physics, ergonomics, anatomy and informatics was acquired and refreshed. Another advantage was getting to know people from different areas, which brought me more information and culture. Despite all the problems, as in every project, there is a functional prototype. This prototype can be improved in the future with the implementation of support for a data glove and a gesture recognition library: the data glove because it can capture finger flexion, allowing more definition in the hand movements, and the gesture recognition library because it offers more possibilities for gestures. For example, the movement of the hand from left to right in a straight line could be associated with a slide action; as another example, the action of keeping the hand still for more than x seconds could be associated with selecting something, like an option in a menu. Another future work that can be implemented in the ErgoVR system is reactive intelligent agents, in order to introduce human behavior into the simulation, obtain more natural behavior from the participant and thus give more immersion, and additionally to make more kinds of studies and simulations possible in ErgoVR. In the end, this project is a positive point for the ErgoLAB. The simulations are now richer in terms of interactivity because, besides navigating, it is possible to grab, drop, push and press in the VE.
running the application: consult ogre.log to find the error and repeat step 4.
5a. Upper limbs do not move: verify the command line to see if the sensors are being recognized; repeat step 3.
Post-Conditions:
• The sensors are working on the user's upper limbs.
• The user sees the upper-limb movements reflected on the VH in the VE.

3.2 Project Planning
3.2.1 Development Process
The development strategy of this project follows an iterative and incremental model. The model used in this project is based on the Scrum model, as depicted in Figure 11. Scrum was chosen because this model is the most adequate for long projects and for projects whose requirements change frequently. The justifications are still the same, but there is another one: the planning meeting and the review of the accomplished work.

Figure 11 – Scrum (Product Backlog, Sprint Backlog, Planning Meeting, Review Meeting)

The Product Backlog is the set of all global requirements of the project, and the Sprint Backlog contains the requirements selected to be done in a Sprint. A Sprint is a unit of the development process that can last between one week and one month and is done following the Plan-Do-Check-Act (PDCA) cycle, which is going to be explained in more detail later in the document. A Daily Scrum is a short daily meeting; a Sprint Planning Meeting is where the Sprint Backlog is selected for the next
(Appendix B.2 – DemoApp class model: the App, InitLibraries, InitTrackers, ObjectInformation and OgreMaxSceneCallback classes, with fields referencing the PhysicsWorld, SceneManager, RenderWindow, XSensTracker, MogreNewtHandler, the triggers list and the VH scene node.)
second stage, the library responsible for reading the data from the sensors and transposing them onto the bones of the Virtual Human was implemented. This library is composed of four classes, each with different functions. At the end of this phase it was already possible to place the sensors on the user and see the Virtual Human reflect the movements in the environment. In the third stage, to build the physics library, it was necessary to carry out a survey of physics questions. So that the user does not lose immersion in the environment, it is necessary to prevent situations such as (1) the Virtual Human passing through walls or (2) the Virtual Human reaching a hand towards an object and the hand going through it. To that end, every object has to have mass, gravity and forces associated with it in order to move in the environment. In the last stage, to develop the object interaction library, it was necessary to survey which actions a user could perform in the system. It was concluded that it would be necessary to be able to grab, drop, pull and press. While building this library it was decided that some virtual objects would have to carry extra information about the possible interaction modes the user can perform on them. The libraries developed in this project constitute a module that integrates seamlessly into the ErgoVR system and allows the use of sensors on the user's upper limbs, whose movements
Figure 16 – BaseApplication Class Model
Figure 17 – DemoApp Class Model
Figure 18 – Scene
Figure 19 – Virtual Human
Figure 20 – SensorInteract Class Model
Figure 21 – Sensor Coordinate System
Figure 22 – Virtual Human Coordinate System
Figure 23 – Sensors Placed on the Upper Limbs
Figure 24 – VH and Sensor Euler Angles at the Hands
Figure 25 – VH and Sensor Euler Angles at the Forearm
Figure 26 – VH and Sensor Euler Angles at the Shoulder
Figure 27 – PhysicsInteract Class Model
Figure 28 – Body's Physics
Figure 29 – Physics Objects in the Scene
Figure 30 – ObjectInteract Class Model
Figure 31 – SkeletonInformation Class Diagram

List of Tables
Table 1 – V: Successful Conversion; X: Failed Conversion; –: Nonexistent Conversion
Table 2 – Possible Number of Sensors in One Arm
Table 3 – Possible Number of Sensors in Both Arms
with the gender defined in the configuration file in the VE.

Use Case 2 – Choose the number of sensors
Actor: Designers/Architects
Pre-Conditions:
• The group needs to be informed about the use of the application.
Body:
1. Open the configuration file (ConfigFile.cfg).
2. In the ConfigSensor section: in the Arm key, write which arm(s) you want to work with; in the NumSensores key, put the number of sensors intended to be used; and, in the case of one sensor, in the Member key indicate the bone to work with.
3. Save the file.
4. Run the application.
Extensions:
1a. Error opening the configuration file.
1b. Create a new configuration file according to the User Manual – Configure ConfigFile.cfg.
3a. Error saving the file: create a new one, starting at step 1b.
4a. Error running the application: consult ogre.log.
Post-Conditions:
• The user has the VH in the VE, which will reflect movement with the chosen number of sensors.

Use Case 3 – Choose the VE
Actor: Designers/Architects
Pre-Conditions:
• The group needs to be informed about the use of the application.
Body:
1. Open the configuration file (ConfigFile.cfg).
2. In the LoadScene section, write in the PathFile key the physical location of the scene to load.
3. Save the file.
4. Run the application.
Extensions:
1a. Error opening the configuration file.
1b. Create a new configuration file according to the User Manual – Configure ConfigFile.cfg.
3a. Error saving
This allows for more complex environments and different kinds of studies to be performed, with more focus on interaction.

Acronyms
VE – Virtual Environment
CIDA – Chaotic Interaction Device Abstraction
DOF – Degree of Freedom (6 degrees in total: 3 of rotation and 3 of translation)
FCT – Fundação para a Ciência e Tecnologia
GNSS – Global Navigation Satellite Systems
GPS – Global Positioning System
HMD – Head-Mounted Display
VH – Virtual Human
MEMS – Micro-Electro-Mechanical Systems
MOGRE – Managed Object-oriented Graphics Rendering Engine
PEI – Projeto de Engenharia Informática
AR – Augmented Reality
VR – Virtual Reality
VS – Microsoft Visual Studio

References
Bergmann, Jeroen H. M., Ruth E. Mayagoitia, and Ian C. H. Smith. 2009. "A portable system for collecting anatomical joint angles during stair ascent: a comparison with an optical tracking device." Dynamic Medicine 8(3). http://www.pubmedcentral.nih.gov/articlerender.fcgi?artid=2684094&tool=pmcentrez&rendertype=abstract
Brand, Richard A. 2008. "Elbow and Shoulder Lesions of Baseball Players." Clinical Orthopaedics and Related Research 466(1): 62–73.
Buchmann, Volkert, et al. 2004. "FingARtips: gesture based direct manipulation in Augmented Reality." In Virtual Reality, ACM, pp. 212–221. http://portal.acm.org/citation.cfm?id=988871
Burdea, Grigore, and Philippe Coiffet. 2003. Virtual Reality Technology, Volume 1. Wiley-IEEE
application initializes the InitMogre object. InitMogre is the class that overrides some BaseApplication methods in order to load the configuration file (ConfigFile.cfg), add the VH, create another viewport and create new keyboard shortcuts to move the VH. The VH moves in the VE through keys that make it move forward and backward and rotate. Each key press causes a force to be applied to the VH in a specific direction, allowing the physics engine to move the VH based on physical properties. The introduction of physical forces to induce the VH movements allows, in the future, the abstraction of the navigational interfaces used in the ErgoVR system (e.g., joysticks, Nintendo Wii Balance Board, motion trackers, among others). InitLibraries is responsible for creating an instance of each of the system libraries (SensorInteract, PhysicsInteract, ObjectInteract and XSensWrapper) and for getting all the trackers used in the simulation (glasses and sensors) through the class InitTrackers. ObjectInformation is responsible for reading the information in the scene file about the Smart Objects, explained in section 4.2.4. The test scene loaded by these two libraries is an existing VE provided by the team, composed of two adjacent rooms. In the first room (Figure 18) there are a table with a lamp, a fire extinguisher, a pillar and the VH (Figure 19). Every object can be integrated in the scene file or can be loaded into the system separately. The VH is loaded separately.
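As an assumed illustration of this key-to-force mapping: the OnKeyPressed signature appears in the BaseApplication class model, but the body below and the names vhBody, bodyNode and moveForce are hypothetical.

// Illustrative only: pressing a key applies a force to the VH body and
// the physics engine turns that force into motion (mass, friction, etc.).
bool OnKeyPressed(MOIS.KeyEvent evt)
{
    if (evt.key == MOIS.KeyCode.KC_UP)
    {
        // Push the VH along its current facing direction.
        Mogre.Vector3 forward = bodyNode.Orientation * Mogre.Vector3.NEGATIVE_UNIT_Z;
        vhBody.AddForce(forward * moveForce); // assumed MogreNewt-style call
    }
    return true;
}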
it is convenient to run the animation to verify that it was correctly imported from Daz Studio Pro 4. After that, the model is exported to OGRE, and it is required to choose whether the animation should be exported as poses or as morph. As mentioned before, pose animations have to be run one by one, while morph animations are called once for each animation created. All the information related to the vertex animation is in the mesh file. To help with the process of modeling and animating, a user manual was produced that guides, step by step, the creation of a VH compatible with the ErgoVR system. This user manual is in the Appendices.

4.2 Implemented Module
The ErgoInteract module was designed and developed based on two factors:
• The system has to have as few dependencies as possible on external libraries (Low Coupling);
• The responsibilities are well divided between the classes (High Cohesion).
These two factors enable code reuse and make the code easy to understand. There are seven libraries in this system. One of them, the XSensWrapper, was provided, and the other six were developed in the scope of this project: BaseApplication, DemoApp, SensorInteract, PhysicsInteract, ObjectInteract and SkeletonInformation. The XSensWrapper, the provided library, is responsible for the connection to the sensors and the data capture. BaseApplication is a library with the necessary
matrix is being created (line 37), but in this case all the values are passed, because the hand has all the movements. There is another question to keep in mind: when, for example, a roll movement is passed to the shoulder, it has to be considered that the elbow is also going to suffer that roll movement. Therefore, to correct that problem, it is necessary to subtract the shoulder's values from the elbow's values, and the elbow's values from the hand's values. As can be seen in Code Sample 1 (Example of Euler Angles Conversion), in each if statement the values are saved (lines 10, 11 and 12) and those values are subtracted in the next if statement (lines 23 and 24).

4.2.3 Physics Engine – PhysicsInteract
In order to have more realism in the VE, it is necessary to have credible physical behavior in the VH and in all the objects present in the scene. If an object passes through walls, tables or other objects in the scene, the sense of realism and immersion is obviously lost. The physics engine deals with gravity, mass and forces, among others. As referred in section 3.2.3, the physics engine used is Newton Game Dynamics (2). It is an open-source library which works with OGRE and has a wrapper for MOGRE. This physics engine provides collision detection and dynamic behaviors. The PhysicsInteract library is divided into eight classes (Figure 27).

2. http://newtondynamics.com/forum/newton.php
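The class model in Appendix B.4 lists a callback ApplyGravityForceCallback(Body, float, int). The body below is a minimal sketch of what such a per-body Newton force callback typically does, not the actual ErgoInteract code; the MogreNewt accessors are assumptions.

// Sketch: on every physics update, apply the body's weight F = m * g.
void ApplyGravityForceCallback(Body body, float timeStep, int threadIndex)
{
    float mass;
    Vector3 inertia;
    body.GetMassMatrix(out mass, out inertia); // assumed MogreNewt accessor
    body.AddForce(new Vector3(0.0f, -9.81f * mass, 0.0f)); // gravity along -Y
}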
of virtual objects, through devices like a mouse, joysticks and others. Output devices provide active feedback to the participant, such as sound, touch, odor and/or visual information, which can be a response to changes in the VE. VR also has its disadvantages:
• The high costs associated with VR devices and with the production of the VEs themselves;
• The large limitations of force and tactile feedback devices;
• Some people experience discomfort, such as simulator sickness and disorientation, when using VR devices.

Images taken from http://www.fakespacelabs.com/tools.html
Images taken from Burdea and Coiffet (2003)
Images taken from http://digitalcortex.net/work/academic/virtual-reality
Images taken from http://www.intersense.com/categories/18

2.2 State of the Art
Virtual Reality is applied in several areas, such as design, architecture, education, entertainment, industry and medicine. Within these areas, several research projects that use various devices are carried out. In this chapter some projects related to the scope of this project are presented. We divided this presentation in three parts. First, studies are presented that involve participants' real-time interaction with objects in the VE using various types of devices. The second part presents some projects that use the same sensors used in this project (Xsens MTx).

2.2.1 User interaction with objects in the VE
Two of the works presented next focus
(Appendix B.4, PhysicsInteract class model, continued: the TriggerCallback and WorldCallback contact handlers, with UserAABBOverlap and UserProcess methods, and the fields of ActorInteraction, MogreNewtHandler and Debugger.)
made this journey WONDERFUL. To my Essentials.

Resumo
Virtual Reality can be considered an interface between the user and a computer system whose main goal is to simulate a realistic Virtual Environment, in which it is possible to navigate and with which one can interact in real time. The goal is to give the user a strong sensation of immersion in the Virtual Environment, that is, the sensation of physical and effective presence in that environment. To achieve the maximum immersiveness in Virtual Environments, several devices are used so that navigation and interaction are as credible as possible: devices such as Head-Mounted Displays, tracking data gloves and devices that generate sensations of touch and force (haptic feedback). Some systems have large display surfaces, such as the CAVE. Nowadays Virtual Reality is used in several areas, because it is a way of simulating an experience close to reality, reducing in some cases the danger that exists in the real world and easily allowing the repetition of situations or experiences. Virtual Reality is applied in areas as diverse as Medicine, for surgical training on virtual patients; entertainment, with three-dimensional games and movies; and Psychology, in the treatment of phobias and traumas, among others. This project, entitled
Step 3 – In the same menu of the window in Figure 21, go to the Export option, Export Scene (Figure 26 – OgreMax Export Scene), and give a name to the file.

Now the file is ready to be loaded in the ErgoVR system.

6 Glossary
Morph: Morph is a vertex animation. It uses information about the movement of the vertices directly to animate the mesh. This technique stores the absolute position of the vertices in a certain keyframe, stores another, different absolute position in another keyframe and, when playing the animation at runtime, interpolates the intermediate positions.
Skin: Skin (from skinning) is a skeleton animation. This animation technique consists of a mesh that represents the skin attached to the bones; when a bone moves, the skin follows the movement of the bone it is attached to.

D User Manual to Configure FileConfig.cfg
User Manual – Change the Configurations of the ErgoInteract CONFIG
Contents
1 Introduction
2 Overall Workflow
3 Template of Configuration File
4 Section LoadScene
5 Section LoadHV
6 Section
model with incorporated skeleton (Figure 1 – Main Window).
Step 2 – To assign a skin texture, select the tab Actors, Wardrobe & Props and then, inside the folder Materials, choose the intended material (Figure 2 – Skin).
Step 3 – To add hair to the model, go to the option Hair and select the desired hair (Figure 3 – Hair).
Step 4 – To add clothes to the model, go to the option Clothing, in Content, and select the desired clothes from the ones available (Figure 4 – Clothes).
Step 5 – In order that the animations affect all
simulation, and to make the user's interaction with certain objects present in the environment possible. The project described in this thesis involved four stages. The first stage was one of familiarization with the ErgoVR system and with the development tools used in it, and a survey of the state of the art was carried out. In this stage a Virtual Human was also created with the 3D modeling tools, which allowed the creation of a model with skeleton, animations and textures and made it possible to export it to the format used in the ErgoVR system. The second stage dealt with all the questions related to the sensors: reading the data, transforming the Euler angles and transposing the data coming from the sensors onto the Virtual Human. The third stage concerned the simulation of the rules of Newtonian physics inside the Virtual Environment. The fourth stage concerned the forms of interaction of the user with the objects of the Virtual Environment, such as grabbing, dropping, pulling and pressing. In the familiarization stage it was decided that this project would be developed on the Microsoft .NET (DotNET) platform, since the ErgoVR system had been developed on that same platform. This way, the integration of the result of this project into the ErgoVR system was guaranteed in a simpler form. At the same time, the characteristics of the .NET platform fitted the needs of this project. In the initial
ConfigSensor

1 Introduction
This manual has the goal of instructing how to modify the ErgoInteract system configuration via a configuration file. The use of a configuration file allows changing some properties of the system without the need to recompile the complete system. This configuration file is placed in the folder where the system is going to be used, near the executables of the application.

2 Overall Workflow
The configuration file is divided in sections. Each section has keys that hold the information to be read by the system. The file is divided in three sections, named LoadScene, LoadHV and ConfigSensor. LoadScene has the information about the scene, LoadHV has information regarding the VH (i.e., gender), and the ConfigSensor section has information regarding the sensors and the arms to be used.

3 Template of Configuration File
The configuration file has the name ConfigFile.cfg and looks like this:
# Comment
[Section]
Key=Value

4 Section LoadScene
In the section LoadScene there is a property called PathFile that represents the location, on the hard drive, of the scene file to be loaded into the system. For example:
# Path of the scene to load
[LoadScene]
PathFile=C:\Users\Mariana Vital\Desktop\Corredores\Corredor Sinais com Node Interaction teste.scene

5 Section LoadHV
In the section LoadHV there is a property with the name VHModel
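Putting the three sections together, a complete ConfigFile.cfg could look like the following sketch. The values are illustrative; the key names (PathFile, VHModel, Arm, NumSensores, Member) are the ones referenced in this manual and in the use cases, and the INI-like comment/section syntax is assumed.

# Path of the scene to load
[LoadScene]
PathFile=C:\Scenes\TestRoom.scene

# Virtual Human to load (i.e., gender of the model)
[LoadHV]
VHModel=Male

# Sensor configuration: which arm(s), how many sensors and,
# when a single sensor is used, which bone it drives
[ConfigSensor]
Arm=Both
NumSensores=6
Member=Hand   # only read when NumSensores=1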
images through cameras. To capture gestures, this game uses ARToolkit, an AR library that uses fiducial markers. A marker is placed on each hand to capture the gesture, and there is a set of figures, previously given to the user, that explains all the positions corresponding to game commands.

Real Behavior in Virtual Environments: Psychology Experiments in a Simple Virtual Reality Paradigm Using Video Games
Virtual Reality has been widely used in Psychology because it makes it possible to create situations that are impossible or difficult to generate in real life, and it is an easy way to treat difficult problems in some people. The paper "Real Behavior in Virtual Environments: Psychology Experiments in a Simple Virtual Reality Paradigm Using Video Games" (Kozlov and Johansen 2010) discusses the usefulness of a simple video-game-based virtual environment for psychological research on real-world behavior. Its main focus is helping behavior. In this game, the participants are instructed to get to the exit of a building in a short time. While they are on their way out, they are faced with virtual persons asking for help, in the presence or absence of virtual bystanders. The conclusion was that bystanders and time pressure reduce helping in a virtual game; the participants' behavior was similar to previous real-life experiments with human actors. The results of the study support the usefulness of a VR game as a psychological experimental
another window in which you will assign a name to the animation and choose the track as Morph or Skin, according to the type of animation (facial or body, respectively).
Step 2.1 – If the track is of type Skin (body animation):
Step 2.1.1 – In the Skeleton Animation field, put a check in the Copy First Animation Key to Last option (Figure 23).
Step 2.1.2 – In the Start/End Frames, put the time span of the specific animation you are saving.
Step 2.1.3 – Click the Add Bone option and choose the bone Hip. There, select the Include and Apply to Bone and Descendants options, click OK (Figure 24 – Add Root Bone), OK in the Add Mesh Animation window (Figure 23), and finally OK again in the Object Settings (Figure 22).
Step 2.2 – If the track is of type Morph (facial animation):
Step 2.2.1 – Put a check in the Copy First Animation Key to Last and Export as Morph options.
Step 2.2.2 – In the Start/End Frames, put the time span of the specific animation you are saving. Click OK (Figure 25) and again OK in the Object Settings (Figure 22).

Figure 25 – Add Mesh Animation
on the area of Augmented Reality (AR). Although the area of this work is not AR, it is interesting to talk about the interaction modes and styles investigated and used in these studies, since they can also be applied in VR.

FingARtips – Gesture Based Direct Manipulation in Augmented Reality
FingARtips (Buchmann et al. 2004) is a project that studies a technique for interaction with virtual objects through gestures, using the fingers. Markers are used to track the user's gestures. This technique was applied to an urban planning interface, in order to make it possible for architects to build virtual cities, modifying them as they wish. Users can interact with the application by grabbing, pointing at and pressing virtual objects, such as houses, and even by navigating through the scene.

GeFighters – an Experiment for Gesture-based Interaction Analysis in a Fighting Game
Nowadays the biggest focus in the gaming world is trying to give the user the chance to perform commands without the need to press several buttons; this paradigm applies to the Nintendo Wii or to the PlayStation Move. The GeFighters game (J. M. Teixeira et al. 2006) is a project that tries to give the user the possibility of interacting with the game through gestures, using tracking sensors. The article discusses how a game can implement the use of gestures. There are several ways of capturing gestures, such as, for example, with gloves or by capturing

http://www.xsens.com
one of the skeleton; otherwise the software does not export the skeleton and, since the animations need the skeleton, they are not exported either. The information of the animations is inside a skeleton file that is connected to a mesh file, both specific to OGRE.

4.1.2 Facial Animation
The facial animation is created through deformations of the mesh, without using the skeleton. This happens because the deformations of the face are not caused by bones: they are caused by movements of the muscles. This is named vertex animation, and it has two animation modes: morph and pose. Pose animation is a technique that stores the offsets to the original vertex data. In each keyframe in the track of the timeline, it is possible to blend one or more poses. Each animation track refers to a single set of geometry; when this animation is played, it is necessary to call these tracks one by one. Morph animation is a technique that stores the absolute position of the vertices in a certain keyframe, stores another, different absolute position in another keyframe and, when playing the animation at runtime, calculates the intermediate positions using interpolation. As mentioned previously, to do the facial animation Daz Studio Pro 4 has a pose-controls menu, so it is only necessary to select a time in the timeline and choose the deformation value in the slider of the selected area. When that animation is imported in 3DS Max, it is possible to run and see the animation. In addition,
option Bake to Studio Keyframes (Figure 15 – Bake to Studio Keyframes).

5 Imports and Exports
Export from Daz Studio 4.0 Pro
Step 1 – To export the model to the FBX format, go to the File menu, click on the Export option and choose the Autodesk FBX (.fbx) format. The FBX options window will appear; on that window, leave the default options and click Accept (Figure 16 – Export FBX Options).

Import in 3DS Max
Open 3DS Max and do the following steps:
Step 1 – Go to the File menu, choose the Import option and choose the .fbx file exported from Daz Studio 4.0 Pro. A window will appear; in the Presets section, select the option Autodesk Media & Entertainment.
Table 2 – Possible Number of Sensors in One Arm
Two Sensors: Shoulder, Hand
Three Sensors: Shoulder, Forearm, Hand

Table 3 – Possible Number of Sensors in Both Arms
Four Sensors: Shoulder and Hand of each arm
Six Sensors: Shoulder, Forearm and Hand of each arm

After the number of sensors is chosen, the sensors can be attached to the participant's body. To help placing the sensors there is the XM-B User Manual and a Complementary User Manual for the Xsens Motion Sensors, which is presented in the Appendices. The library SensorInteract is responsible for the sensors and has three classes: SensorInteraction, UpdatePositionBones and SkeletonConfigFile (Figure 20 – SensorInteract Class Model; the complete class diagram is in Appendix B.3). The class SkeletonConfigFile reads the sensors section of the configuration file, named ConfigFile.cfg, and saves the number of sensors used, the chosen arm and the selected member. The orientations of the VH skeleton are given through the articulations of the shoulder, elbow and wrist, so it is necessary to place the sensors near these zones. Each sensor gives us 3D orientation, 3D acceleration, 3D rate of turn and 3D earth magnetic field data. The orientation is the only information used.
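The mapping of Tables 2 and 3 can be summarized in code. The helper below is hypothetical (not part of SensorInteract) and simply returns which segments of one arm are covered for a given per-arm sensor count:

// Hypothetical helper mirroring Tables 2 and 3: segments of one arm
// covered by a given number of sensors on that arm (2 or 3).
static string[] SegmentsForSensorCount(int sensorsPerArm)
{
    switch (sensorsPerArm)
    {
        case 2: return new[] { "Shoulder", "Hand" };
        case 3: return new[] { "Shoulder", "Forearm", "Hand" };
        default: throw new System.ArgumentException("2 or 3 sensors per arm");
    }
}
// For both arms (Table 3), the same mapping applies to each side,
// giving four or six sensors in total.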
Ergonomist and a stakeholder of this project. VEs are developed by Designers and Architects, who are another group of stakeholders. VHs are necessary to inhabit these VEs and to reflect the movements captured by the sensors attached to the upper limbs of the user. The task of modeling and animating these VHs, as well as the development of a library called ErgoInteract (a new module in the ErgoVR system), were both performed by a Computer Engineer stakeholder in the context of the PEI. Another computer engineer is responsible for the whole ErgoVR system that supports the simulations; this is another stakeholder, and he is responsible for the non-functional requirements of the system. Therefore, this project has four main groups of stakeholders:
• Ergonomist
• Designers and Architects
• Computer Engineer (PEI)
• Computer Engineer (System Developer)

Figure 8 – Stakeholders' Areas (Ergonomist; Designers/Architects; Computer Engineer; Computer Engineer PEI; Virtual Environment; ErgoVR System; ErgoInteract; Virtual Humans)

3.1.3 Requirements
Functional Requirements
There are several ways to interact with the VE, and the features desired for this part of the system are specified here. Functional requirements are defined by two groups of stakeholders: the Ergonomist, and the Designers and Architects. The requirements are presented separately, according to each group.
Requirements of the Ergonomist:
1.
SkeletonInformation is a library used by the SensorInteract and the PhysicsInteract. The system was developed trying to ensure that the libraries' dependencies were injected instead of created inside the libraries. This guarantees low coupling within the system, and there is a centralized location in the system responsible for creating the dependencies and passing them to the libraries; that is done in the DemoApp initialization code.

4.2.1 BaseApplication and DemoApp
These two libraries are responsible for running the OGRE graphics engine and presenting the OGRE graphical interface. As mentioned before, the DemoApp overrides some BaseApplication methods. The decision to create two different libraries for similar responsibilities was taken because the BaseApplication has the minimum information necessary to run any MOGRE application outside the ErgoVR system. ErgoVR already has a kind of base application, but the DemoApp is necessary because there are elements that are needed only in this system, for example the VH. Five classes compose the BaseApplication: Base, Base Input, DebugOverlay, CameraMan and ZSightHandler (Figure 16 – BaseApplication Class Model; the complete class diagram is in Appendix B.1). Base and Base Input are partial classes: they split the information about input and output issues. The Base partial class is responsible for loading the OGRE
(Appendix B.4, PhysicsInteract class model, continued: remaining ActorInteraction members, such as ApplyGravityForceCallback, CreateBodyNode, CreateCallbacks, CreateEllipsoid, CreateMaterialPairs, CreatePhysicsArmsHV and UpdatePhysics, and the VRCamera class.)
The sensor orientations are subjected to referential transformations before being applied to the corresponding bones. Figure 24, Figure 25 and Figure 26 show the original Euler angles on the VH bones and the Euler angles on the sensors; comparing them, it is possible to note the referential transformation. However, what is most important after that comparison is to see that the Euler angles are equal in both elements (bones and sensors), despite the coordinate systems being different.

Figure 24 – VH and Sensor Euler Angles at the Hands
Figure 25 – VH and Sensor Euler Angles at the Forearm
Figure 26 – VH and Sensor Euler Angles at the Shoulder

After comparing, it is possible to conclude that in the hand all the axes are changed: the Y is transformed into the Z, the X into the Y and the Z into the X (Figure 24). However, the only Euler angles that change are the pitch and the roll; the yaw is made on a different axis but in the same direction. In the elbow and shoulder we can see that the Y and Z axes are swapped in the sensor (Figure 25 and Figure 26), but the Euler angles are in the right place. Observe that the Euler angle roll, on the X axis, is positive in the VH but negative in the sensor. Besides, not every bone has the three movements, and there are some restrictions in some degrees. The hands have the three movements: roll, pitch and yaw. In the elbow, the pitch and yaw are the same movement, because the internal part
are reflected in the corresponding Virtual Human. The user can interact in a natural way with elements of the environment, performing gestures for the actions of grabbing, dropping, pulling and pressing. The developed module is considered an asset for ErgoVR, because it allows this Virtual Reality system to be applied to different scenarios in several domains, whenever the interaction of a human user with objects present in the Virtual Environment is required.

Keywords: Virtual Reality, Virtual Environments, Movement, Human Interaction, Immersion

Abstract
Virtual Reality (VR) is an interface between the user and a system, and its main goal is to simulate a Virtual Environment (VE) close to reality. The advantage of using VR is the possibility of simulating the dangers that exist in the real world, or of allowing the repetition of situations and experiences. Nowadays VR is used in many areas, from Medicine, for surgical training on virtual patients, to the army, in which soldiers do virtual training. The project Interaction in Virtual Environments with the Upper Body is developed in the context of Virtual Reality and is a part of the project Future Safety Warnings: Virtual Reality in the study of technology-based warnings, funded by the Portuguese Science Foundation (PTDC/PSI-PCO/100148/2008). This project was developed in the Ergonomics Laboratory (ErgoLAB) of the Faculty of Human
using VR. With VR it is possible to introduce a user into a realistic and immersive world where he can perform a task which, in real life, could involve some risks. This led to the project Using Virtual Reality to Evaluate the Safety Information Effectiveness, financed by FCT (PTDC/PSI/69462/2006), whose main objective is to use VR to evaluate compliance behavior towards safety warnings in workplaces. Later, in order to proceed with the work, the project Future Safety Warnings: Virtual Reality in the study of technology-based warnings arose, also financed by FCT (PTDC/PSI-PCO/100148/2008). This project focuses on the issue of technology-based safety warnings and intends to study the ability of these warnings to counteract the effect of the environment's affordances. Affordance refers to the perceived and actual properties of an object, essentially the fundamental properties that determine how the object can be used (Norman 2002). To support these two projects, a VR system called ErgoVR has been created. It is in this ErgoVR system that this project, Interaction in Virtual Environments with the Upper Body, for user interaction with the VE, is inserted.

3.1.2 Stakeholders
The stakeholders of this project belong to the ErgoVR team. Observing the diagram in Figure 8, it is possible to distinguish the different stakeholders. The projects made in ErgoVR have their focus on Ergonomics, and the person responsible for ErgoVR is an
Sprint, and the Sprint Review is where the work done is reviewed. In this project the Product Backlog was divided in four groups: Familiarization, Gestures Recognition with motion sensors, Integration of the Physics Engine, and Actions Recognition. Each of these general requirements was divided into more specific requirements that could be done in Sprints of one week. At the beginning of each week the Sprint Planning Meeting takes place, and at the end of the week there is the Sprint Review. Every day there are short brainstorming sessions; normally these meetings are held with different members, according to the level of specificity of the topic. These brainstorming sessions are considered the Daily Scrum. Three of the four groups referenced in the previous paragraph follow the PDCA cycle; the one that does not is Familiarization, in which a theoretical study and some tutorials were completed. In the other three groups, the first step is the definition of the goal and requirements in more detail (Plan), followed by the implementation of the defined requirements (Do), testing (Check) and, at the end, the validation and creation of the demo (Act). This can be seen in Figure 12 (PDCA Cycle).

3.2.2 Project Planning
The initial planning suffered some changes, and all the phases took more time than estimated to be completed, mainly due to technical constraints. The last phase, Data Glove Integration, has been removed
in free living conditions. This tool helps in the rehabilitation process (11). Walking can be expressed in a generalized way due to its simple kinematics. There are analyses of movement by stereophotogrammetry that provide useful and detailed data; the problem is the high cost associated and the fact that the use of the stereophotogrammetry equipment is limited to the laboratory. That limitation restricts the movement to a few meters, leads to an exhausting environment (since the patient has to walk back and forth to cover the meters) and, because the floor of the laboratory has different characteristics from exterior floors (e.g., street, garden), makes the data less realistic. To overcome those problems it was chosen to analyze the motion through inertial sensors. That way the analysis can be done in external environments, being closer to the typical gait pattern. The MTx sensors are placed on the trunk, pelvis, hip, knee and ankle, and the data related to certain requested positions are recorded.

11. http://www.xsens.com/en/movement-science/sports-science/rowing-with-mvn

• Stair climbing: a comparison with an optical tracking device
In studying the action of climbing stairs, "Stair climbing" (Bergmann, Mayagoitia and Smith 2009) starts from the idea that movement is the most important activity for health. The reports made by older people show the difficulty in climbing stairs and are useful to evaluate and define the functional
84. tExtraData void RunApplication 57 B 3 SensorInteract class Sensorinteract UpdatePositionBones enumeratio boneHandLOrientation Radian Se boneHandROrientation Radian listBones List lt Bone gt readOnly nameArm string readOnly numberSensors int readOnly pitchLoArmL Radian 0 pitchLoArmR Radian 0 pitchUpArmL Radian 0 pitchUpArmR Radian 0 rollLoArmR Radian 0 rollUpArmL Radian 0 arm Sensorinteraction SelectedArm rollUpArmR Radian 0 member string yawLoArmL Radian 0 _numberSensores int yawLoArmR Radian 0 _selectedArm string yawUpArmL Radian 0 SensorConfigFile string ConfigFile cfg yawUpArmR Radian 0 GetNumberSensors int ChangeEulerAngles Quaternion int Quaternion GetSelectedArm string ChangeOrientationBones Quaternion int void ReadConfigurationFile void UpdatePositionBones List lt Bone gt int string SkeletonConfigFile updatePositionBones getSkeletonConfigFile _getSkeletonConfigFile SkeletonConfigFile updatePositionBones UpdatePositionBones informationSkeleton InformationSkeleton readOnly SensorCoordinate Quaternion int void Sensorlnteraction Entity 58 B 4 PhysicsInteract class PhysicslInteract ContactCallback ContactCallback WorldCallback TriggerCallback bodyO string action string bodyt string hand string i ushort trigger Trigger triggersList List
The purpose of this study is to verify the anatomical angles, using a portable system, during stair climbing, and to compare them with the data acquired by the optical trackers. The portable system consists of MTx sensors placed on the legs and active markers (Codamotion, http://www.codamotion.com) placed on the stairs.
Chapter 3 Analysis and Planning
The analysis and planning of this project are presented in this chapter. The analysis describes the context of use of this project, the functional and non-functional requirements, and the use cases of the project. The planning section presents the development process, the project planning and the chosen resources.
3.1 Project Analysis
3.1.1 Context of Use
This project was developed in the Ergonomics Laboratory of the Faculty of Human Kinetics of the Technical University of Lisbon. The project Informação de Segurança: Avaliação da eficácia de sinais pictóricos de segurança (Safety Information: Evaluating the effectiveness of pictorial safety signs; Duarte, 2004) studied the comprehension of safety signs in different populations through the completion of questionnaires. During the analysis of the results it was found that the context highly influences the participants' responses. From the need to provide contexts of use that are more realistic and dynamic, and from the possibility of evaluating the behavior and not only the safety materials, arises the possibility of using Virtual Reality.
All the objects have their own functions: the table is an object to interact with by pushing, the lamp and the fire extinguisher are objects to grab, and the pillar is there to test collision situations.
Figure 18: Scene
Figure 19: Virtual Human
A small viewport is created and displayed in the lower-left corner of the screen. The larger viewport displays the first-person perspective and the smaller one displays the VH.
4.2.2 Sensors Integration (SensorInteract)
The initial position of the VH, necessary to calibrate the system, is a T-pose (Figure 19). The developed module is prepared to receive a maximum of six sensors, three for each arm. In the configuration file (ConfigFile.cfg) it is possible to select the desired arm, to choose the number of sensors to activate and, when using only one sensor, to select which bone it will affect; this last possibility is valuable for testing purposes, as the example below illustrates. For a single arm, ErgoInteract accepts two or three sensors: with two sensors it moves the arm and the hand; with three sensors it moves the three bones (arm, forearm and hand). If both arms are chosen, four or six sensors can be used: the four sensors are placed on the arms and hands, and if six are chosen they cover the three segments on both sides. Table 2 summarizes the possible sensor placements.
Table 2 (excerpt). One Arm. Two Sensors: shoulder (arm) and hand; Three Sensors: shoulder (arm), forearm and hand.
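As a purely hypothetical illustration of such a file (the actual key names are documented in Appendix D and are not reproduced here), ConfigFile.cfg could contain entries along these lines:

    # ConfigFile.cfg - hypothetical layout; see Appendix D for the real keys
    SelectedArm = Left      # Left, Right or Both
    NumberSensors = 3       # 2 or 3 for one arm; 4 or 6 when both arms are used
    Bone = Forearm          # read only when a single sensor is active (testing)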
simulates, in a realistic way, interactions between rigid bodies in games and other real-time applications (http://newtondynamics.com).
• Xsens XBus Kit with 10 MTx sensors (HW): hybrid trackers that provide orientation and kinematic data (acceleration, earth's magnetic field and angular velocity) with three degrees of freedom (DOF) (Figure 13).
• 3DS Max (SW): a commercial 3D modeling and animation tool.
(http://www.ogre3d.org/tikiwiki/MOGRE)
Figure 13: MTx Sensor
Material and technological resources chosen to be used in PEI:
• Daz Studio Pro 4 (SW): a free 3D modeling software that has posing and animation tools.
• MakeHuman (SW): an open-source software that generates 3D Virtual Humans with a skeleton (http://www.makehuman.org).
• Microsoft Visual Studio 2010 (SW): a Microsoft IDE dedicated to the .NET Framework (http://www.microsoft.com/visualstudio).
• Microsoft Office Visio 2007 (SW): an application to create technical and professional diagrams of several types.
• Ogre Command Line Tools 1.7.2 (SW): OGRE software that converts the exported XML files to the native format of OGRE (see the example after this list).
Human Resources
This PEI work unfolds in collaboration with a multidisciplinary team from the Ergonomics Laboratory that includes:
• One Ergonomist
• One Architect
• Five Designers
• One Computer Engineer
• One Professor of Numerical Methods
• One Psychologist
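For example, converting an exported mesh and its skeleton to OGRE's native format is done from the command line with OgreXMLConverter (the file names here are only illustrative):

    OgreXMLConverter vh.mesh.xml vh.mesh
    OgreXMLConverter vh.skeleton.xml vh.skeleton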
The complete class diagram is in Appendix B.4. The classes are ActionEventArgs, Actor, ActorInteraction, MogreNewtHandler, NormalCamera, VRCamera, WorldCallback and TriggerCallback.
Figure 27: PhysicsInteract Class Model
The ActorInteraction is the main class: it creates the physics bodies and places them, with the right size, in the right position, using the skeleton file to access that information. A very detailed physics definition is not necessary for the whole VH body; it is only needed for the arms. Thus the VH's body has a single physics shape, while each arm has two, in order to obtain more realistic physics behavior (Figure 28).
Figure 28: Body's Physics
It is not necessary to place physics bodies on the shoulders because, as can be inferred from this figure, an object is unlikely to be touched with the shoulder alone.
The MogreNewtHandler is the class that initializes the MogreNewt library. In order to have physics behavior in the ActorInteraction, it is necessary to create a physics world, which is also created in this class.
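As a rough sketch of this initialization (assuming the MogreNewt API mirrors OgreNewt, so the exact member names may differ between builds, and using placeholder sizes instead of the values read from the skeleton file), creating one arm body in an existing world could look like:

    using Mogre;
    using MogreNewt;

    public static class PhysicsSketch
    {
        // Sketch: one arm-segment body in a physics world created elsewhere
        // (as MogreNewtHandler does with "new World()").
        public static Body CreateArmBody(World world, SceneNode armNode,
                                         Vector3 pos, Quaternion orient)
        {
            // Box shape roughly matching an arm segment; the real sizes would
            // come from the skeleton file, as described above.
            Collision shape = new MogreNewt.CollisionPrimitives.Box(
                world, new Vector3(0.1f, 0.3f, 0.1f));
            Body body = new Body(world, shape);
            body.AttachToNode(armNode);               // the body follows the node afterwards
            body.SetPositionOrientation(pos, orient); // initial placement
            return body;
        }
    }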
2.2.2 Tracking with the MTx Sensors
The studies referred to in this section focus on tracking lower or upper limbs using MTx sensors, the same sensors used in this study.
• Baseball pitchers analysis: An experiment was conducted (Brand, 2008) with pitchers of a baseball team in Chicago in August 2009. The purpose of this study was to track in real time the movement of the scapula in pitchers, with the objective of proving that, after the launch and the stress suffered by the joints of the upper limbs, the scapula loses the ability to follow the movement of the humerus. For this study four MTx sensors were used, on the thorax, scapula, humerus and forearm.
• Rowing with MVN: Working in the areas of rehabilitation, ergonomics and sports training, the project Rowing with MVN focuses on competitive rowing. Its ultimate goal is to develop a tool that provides relevant and accurate information to the coach, leading him to make decisions about training and activities that are beneficial for the competition. This study used the MVN device, which consists of a lycra suit with incorporated MTx sensors.
• Outwalk, a novel protocol for clinical gait analysis based on inertial and magnetic sensors: Outwalk (Cutti et al., 2010) is a walking protocol that tries to measure the 3D kinematics of the thorax, pelvis and lower limbs in children with cerebral palsy, or with amputations, during gait
into the following chapters:
Chapter 1 presents the motivation, the objectives and the contributions of the project.
Chapter 2 presents VR concepts, explaining their origin and evolution. A state of the art in the area is also presented.
Chapter 3 presents the analysis and planning of the project. The analysis presents the context of the project, followed by the requirements and the use cases. The planning presents the chosen development method and the project phases. Finally, the resources needed to fulfill the goals are enumerated.
Chapter 4 presents the developed work. The first section presents the work related to the VH's modeling and body/facial animation. The second section gives a general explanation of the developed system, followed by an explanation of each library separately.
Chapter 5 presents conclusions and draws some reflections about future work.
Appendices: this document has five appendices. The first appendix is the Gantt chart that presents the preliminary project planning. The second appendix contains the complete class diagrams of all libraries. The third appendix is a user manual that helps with the creation of a VH model compatible with the ErgoVR system. The fourth appendix is a user manual that helps with the creation/modification of a specific configuration file called ConfigFile.cfg. The last appendix is a complementary user manual for working with the MTx sensors.
Chapter 2 Context and State of the Art
B.5 ObjectInteract: class diagram of the ObjectInteract library, including the Trigger and PlayAction classes, with properties such as ActionName, DropAera, Hand, IDAction, ObjectName, Position, LeftHandPosition, RightHandPosition and SceneManager, and methods such as GetIDAction and TypeOfAction.
B.6 SkeletonInformation: class diagram of the InformationSkeleton class, with fields for the skeleton instance, the VH entity and the six upper-limb bones (hand, forearm and upper arm, left and right); methods GetListBones, GetSizeBone, GetSkeleton, GetSkeletonHeight and GetSkeletonWidth; constructor InformationSkeleton(Entity).
C User Manual to Create a VH
User Manual: Create a VH for the ErgoVR system. Contents include: Introduction; Overall Workflow; Facial Animation.
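Based only on the accessors recorded in the B.6 listing above, a hypothetical use of InformationSkeleton (for instance, to feed the physics-body sizing described in Chapter 4; the bone name is an example) could be:

    using Mogre;

    public static class SkeletonSketch
    {
        // Hypothetical usage; vhEntity is the VH's Mogre entity.
        public static void InspectSkeleton(Entity vhEntity)
        {
            InformationSkeleton info = new InformationSkeleton(vhEntity);
            float height = info.GetSkeletonHeight();      // e.g., to size the body's physics shape
            Vector3 handSize = info.GetSizeBone("HandL"); // "HandL" is an assumed bone name
        }
    }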
1.2 Objectives
The VEs of the project Future Safety Warnings: Virtual Reality in the study of technology-based warnings consist of interior spaces of buildings, composed of rooms and corridors, where there are safety warnings and certain tasks to perform. In these VEs, the behavior and attitudes of the participant will be evaluated with respect to compliance with the indications given by the warnings and the completion of the tasks.
The objective of this project is to allow the participant to interact in a more natural way with virtual objects in a virtual scenario, in order to perform some required tasks (e.g., open a door, press a fire-alarm button or pick up a fire extinguisher).
To accomplish this, the participant must have a corresponding VH model in the environment. Since the view of the participant will be from the point of view of the VH, when moving the upper limbs the participant will see the same movements reflected in the VH.
1.3 Contributions
This project was developed in the Faculty of Human Kinetics of the Technical University of Lisbon, in the Ergonomics Laboratory's VR research unit called ErgoVR (http://www.fmh.utl.pt/ergovr/index.htm). In this research unit, several different VEs have been created to perform tests related to Ergonomics studies. However, a VH representing the participant was still missing. This project addresses this limitation and enables the possibility of representing the participant.
and placed on the user with the arrow pointing up. The series of Sensor IDs 0, 1 and 2 goes, in order, to the shoulder, forearm and glove, as does the series 3, 4 and 5. This can be seen in Figure 6 (Connected Sensors with the Straps).
After that, the sensors just need to be placed on the participant's body, as shown in Figure 7 (Place Sensors, Front View) and Figure 8 (Place Sensors, Side View).
information needed to run the graphics engine MOGRE, and it was developed based on the tutorials present in the OGRE Wiki (http://www.ogre3d.org/tikiwiki/tiki-index.php?page=Mogre+Tutorials). It deals with everything related to the graphics engine: it creates cameras, viewports, lights and materials, and reads several OGRE configuration files.
SensorInteract receives all the data coming from the sensors' library and handles that data so it can be associated with the appropriate bone. PhysicsInteract is the library that deals with the physics part of the scene. The ObjectInteract library deals with the interaction of the VH with the objects in the scene, using the concept of Smart Objects. As explained in Magnenat-Thalmann and Thalmann (2004), the idea behind the concept of a smart object is that each interactive object contains a complete description of its functionality and interaction capabilities. Figure 15 shows the Module Diagram of the system.
Figure 15: System Modules (BaseApplication, DemoApp, XsensWrapper, SensorInteract, PhysicsInteract, ObjectInteract, SkeletonInformation)
DemoApp is a type derived from BaseApplication, so that DemoApp can override some of its methods (a minimal sketch is given below). In addition, this class also creates an instance of the XSensWrapper, SensorInteract, PhysicsInteract and ObjectInteract. The SkeletonInformation
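A minimal sketch of the DemoApp override pattern just described, using method names from the BaseApplication listing in Appendix B (the sceneMgr field, the access modifiers and the scene content are assumptions, not the thesis code):

    using Mogre;

    class DemoApp : BaseApplication
    {
        // Scene-creation hook declared by BaseApplication (Appendix B).
        protected override void CreateScene()
        {
            // Mesh and entity names are examples only.
            Entity vh = sceneMgr.CreateEntity("VirtualHuman", "vh.mesh");
            sceneMgr.RootSceneNode.CreateChildSceneNode().AttachObject(vh);
        }

        // Per-frame hook, listed in Appendix B as UpdateScene(FrameEvent): bool.
        protected override bool UpdateScene(FrameEvent evt)
        {
            // Sensor, physics and object updates would be forwarded from here.
            return true;
        }
    }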