Views: 316 · Category: Education · License: All Rights Reserved · Added: May 02, 2008

Presentation Transcript

VR Hardware & Software: VR Hardware & Software

Haptic Devices: Haptic Devices Haptic interfaces are devices that stimulate the sense of touch, such as the sensory capabilities within our hands. The surge in computer capability and the desire for better ways to connect to computer-generated worlds have driven the creation and development of practical devices for haptic interaction. Force feedback gaming devices, such as joysticks and computer mice, have become available, while in the medical field, telesurgery, or surgeon-directed robotic surgery, has been gaining recognition.

Haptic Devices: Haptic Devices Haptic technology refers to technology that interfaces with the user via the sense of touch by applying forces, vibrations and/or motions to the user. This mechanical stimulation is used to create haptic virtual objects. Haptic technology has made it possible to investigate in detail how the human sense of touch works, by allowing the creation of carefully controlled haptic virtual objects. These objects are used to systematically probe human haptic capabilities.
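A canonical haptic virtual object used in such probing is the "virtual wall": every servo cycle the device reads the probe position and, if the probe has penetrated the wall, pushes back in proportion to the penetration depth. A minimal one-dimensional sketch (the stiffness value and geometry are illustrative, not from any particular device):

```python
def wall_force(probe_x, wall_x=0.0, stiffness=800.0):
    """Force (N) pushing a haptic probe out of a virtual wall at wall_x.

    The wall occupies x < wall_x; the stiffness (N/m) is an assumed value.
    """
    penetration = wall_x - probe_x
    if penetration <= 0.0:
        return 0.0                   # probe in free space: no force
    return stiffness * penetration   # spring-like push along the normal
```

Run once per servo cycle, this simple spring law is enough to make a flat surface feel solid; a higher stiffness feels harder but demands a faster update rate.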
Haptic Devices: Haptic Devices One of the earliest forms of haptic device is found in large modern aircraft that use servo systems to operate control surfaces. Such systems tend to be "one-way", in that forces applied aerodynamically to the control surfaces are not perceived at the controls; the missing normal forces are simulated with springs and weights. In earlier, lighter aircraft without servo systems, as the aircraft approached a stall, the aerodynamic buffeting was felt in the pilot's controls, a useful warning of a dangerous flight condition.

Haptic Devices: Haptic Devices This control shake is not felt when servo control systems are used. To replace this missing cue, the angle of attack is measured, and when it approaches the critical stall point a "stick shaker" (an unbalanced rotating mass) is engaged, simulating the effects of a simpler control system. This is known as haptic feedback. Alternatively, the servo force may be measured and this signal directed to a servo system on the control. This method is known as force feedback.

Haptic Devices: Haptic Devices Teleoperators are remote-controlled robotic tools; when contact forces are reproduced for the operator, it is called "haptic teleoperation". The first electrically actuated teleoperators were built in the 1950s at the Argonne National Laboratory, USA, to remotely handle radioactive substances. Since then, the use of force feedback has become more widespread in all kinds of teleoperators, such as underwater exploration devices controlled from a remote location.

Haptic Devices: Haptic Devices In 1988, researchers at Cybernet Systems first developed devices that generated arbitrary forces from computer models or simulations in lieu of actual physical slave devices. When such devices are simulated using a computer (as they are in operator training devices), it is useful to provide the force feedback that would be felt in actual operations.
Since the objects being manipulated do not exist in a physical sense, the forces are generated using haptic (force-generating) operator controls.

Haptic Devices: Haptic Devices Data representing touch sensations may be saved or played back using such haptic technologies. Cybernet licensed its force feedback patents to Immersion Corporation in 1998, and Immersion licensed Logitech, Microsoft, Sony and others to manufacture force feedback joysticks, wheels, and other devices worldwide. Haptic simulators are currently used in medical simulators and flight simulators for pilot training.

Haptic Devices: Haptic Devices Some low-end haptic devices are already common in the form of game controllers, in particular joysticks and steering wheels. At first, such features and/or devices used to be optional components; now many of the newer-generation console controllers and some joysticks feature built-in devices. Example: simulated automobile steering wheels that are programmed to provide a "feel" of the road.

Haptic Devices: Haptic Devices Another concept of force feedback was the ability to change the temperature of the controlling device. This would prove especially effective for prolonged usage of the device. However, due to the high cost of such a technology, the closest many manufacturers have come to realizing this concept has been to install air holes or small fans into the device to provide the user's hands with ventilation while operating the device.

Haptic Devices: Haptic Devices A major leap forward for the haptic gaming experience was enabled with the Novint Falcon by Novint Technologies. It was released in 2007 bundled with a number of games and demos to showcase the technology, and is available at a breakthrough price of under $200 US.
It enables the user to have the simulated experience of touching a sphere that can selectably be made smooth, bumpy, made of molasses, covered in sandpaper, made of rubber, full of sand, magnetic, icy, or made of honey.

Haptic Devices: Haptic Devices Other haptic simulation experiences include feeling the inertia of a weighted ball attached to the user's hand with a rubber band that can be stretched and spun around in any of the three spatial dimensions. Many other sensations are simulated, such as hitting and catching a baseball, firing a gun and feeling it recoil, or stretching back a bow and arrow to release the tension and let it fly. While the price of the Novint Falcon puts this device at the low end of haptic interfaces, it is actually a full 3-axis robotic device whose haptic simulation is a significant step toward fully immersive virtual reality.

Haptic Devices: Haptic Devices Haptics is gaining widespread acceptance as a key part of virtual reality systems, adding the sense of touch to previously visual-only solutions. Most of these solutions use stylus-based haptic rendering, where the user interfaces with the virtual world via a tool or stylus, giving a form of interaction that is computationally realistic on today's hardware. Video of Novint Falcon demo: Novint Falcon Haptics Controller Could Be Wii 2.0.

Haptic Devices: Haptic Devices Some research has been done into simulating different kinds of tactile sensation by means of high-speed vibrations or other stimuli. One device of this type uses a pad array of pins, where the pins vibrate to simulate a surface being touched. While this does not have a realistic feel, it does provide useful feedback, allowing discrimination between various shapes, textures, and resiliencies. For developing research applications there are commonly available haptics APIs such as Chai3D, OpenHaptics and H3DAPI (open source).
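The selectable sphere materials above come down to swapping the force model evaluated in the haptic loop; a "bumpy" surface, for example, can be faked by rippling the sphere's effective radius so the push-out force varies as the probe slides over it. A sketch under assumed constants (this is not Novint's actual implementation):

```python
import math

def sphere_force(px, py, pz, radius=0.05, k=600.0, bumps=0):
    """Push-out force for a probe at (px, py, pz) penetrating a virtual
    sphere centred at the origin. bumps > 0 ripples the surface radius
    to fake a bumpy texture; all constants are illustrative."""
    r = math.sqrt(px * px + py * py + pz * pz)
    if r == 0.0:
        return (0.0, 0.0, 0.0)       # undefined normal at the centre
    theta = math.atan2(py, px)       # angle around the sphere
    ripple = 0.002 * math.sin(bumps * theta) if bumps else 0.0
    penetration = (radius + ripple) - r
    if penetration <= 0.0:
        return (0.0, 0.0, 0.0)       # outside the surface: free space
    scale = k * penetration / r      # spring force along outward normal
    return (px * scale, py * scale, pz * scale)
```

Smooth, icy or rubbery variants would swap in different stiffness, friction or damping terms inside the same loop.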
Haptic Devices: Haptic Devices Various haptic interfaces for medical simulation may prove especially useful for training in minimally invasive procedures (laparoscopy/interventional radiology) and for remote surgery using teleoperators. A particular advantage of this type of work is that the surgeon can perform many more operations of a similar type, and with less fatigue. It is well documented that a surgeon who performs more procedures of a given kind will have statistically better outcomes for their patients.

Haptic Devices: Haptic Devices In ophthalmology, "haptic" refers to a supporting spring, two of which hold an artificial lens within the lens capsule (after surgical removal of cataracts). A Virtual Haptic Back (VHB) is being successfully integrated into the curriculum at the Ohio University College of Osteopathic Medicine. Research indicates that the VHB is a significant teaching aid in palpatory diagnosis (detection of medical problems via touch). The VHB simulates the contour and compliance (reciprocal of stiffness) properties of human backs, which are palpated through two haptic interfaces (SensAble Technologies PHANToM 3.0).

Haptic Devices: Haptic Devices The use of haptic devices in entertainment appeared in the 1932 futurist novel Brave New World by Aldous Huxley. The author described a future entertainment theater where the armrests of the seats had positions for the hands to rest that gave haptic stimulation. The programs exhibited were of an erotic nature, and rather than "the movies", these theaters and shows were called "the feelies". Haptic devices, including self-propelled haptics, feature prominently in Vernor Vinge's 2006 novel Rainbows End.

Haptic Devices: Haptic Devices The Shadow Dextrous Robot Hand uses the sense of touch, pressure, and position to reproduce the human grip in all its strength, delicacy, and complexity.
The SDRH was first developed by Richard Greenhill and his team of engineers in Islington, London, as part of The Shadow Project (now known as the Shadow Robot Company), an ongoing research and development program whose goal is to complete the first convincing humanoid.

Haptic Devices: Haptic Devices The Dextrous Hand has haptic sensors embedded in every joint and in every finger pad, which relay information to a central computer for processing and analysis. Carnegie Mellon University in Pennsylvania and Bielefeld University in Germany in particular have found the Dextrous Hand an invaluable tool for advancing our understanding of haptic awareness, and are currently involved (2006) in research with wide-ranging implications.

Haptic Devices: Haptic Devices Touch is not limited to passive feeling; it allows real-time interactivity with virtual objects. Thus haptics is commonly used in the virtual arts, such as sound synthesis or graphic design/animation. A haptic device allows the artist to have direct contact with a virtual instrument that produces real-time sound or images.

Haptic Devices: Haptic Devices One example is physical modelling synthesis, an efficient modelling approach for implementing cross-modal interaction between sound, image, and physical objects. For instance, the simulation of a violin string produces real-time vibrations of the string under the pressure and expressivity of the bow (a haptic device) held by the artist. Designers and modellers may use high-degree-of-freedom input devices that give touch feedback relating to the "surface" they are sculpting or creating, allowing a faster and more natural workflow than traditional methods.

Haptic Devices: Haptic Devices Haptic technology is difficult to achieve convincingly because the human sense of touch is far more sensitive than the senses of sight or sound.
In visual systems, a picture needs to be refreshed at only 30 frames per second to trick the eye into seeing continuous motion. The sense of touch, however, requires that a sensation be updated 1000 times per second or more to convey a convincing tactile experience.

Haptic Devices: Haptic Devices There are two main areas of application and research currently being developed in the haptics field: Virtual Reality/Telerobotic systems and Force Feedback/Tactile Display systems. Although both types of systems seek to simulate force information, there are differences in the types of forces being simulated.

Haptic Devices: Haptic Devices Virtual reality and telerobotics researchers seek to develop technologies that allow the simulation or mirroring of virtual or remote forces by conveying large-scale shape and force information. Researchers in the field of force feedback and tactile displays seek to develop methods of conveying more subtle sensory information between humans and machines, using the sense of touch.

Haptic Devices: Haptic Devices Using the criterion of large- versus small-scale force generation, haptic devices can broadly be categorized as follows. Virtual Reality/Telerobotics: a) exoskeletons and stationary devices; b) gloves and wearable devices; c) point-sources and specific task devices; d) locomotive interfaces. Feedback devices: a) feedback input devices / force feedback devices; b) tactile displays.

Haptic Devices: Haptic Devices Many haptic devices must be miniaturized so that they are lighter, simpler and easier to use. Accurate and high-speed devices must be perfected in order to create real-world simulations. For example, a remote surgical system must be able to provide instantly updated information so that a surgeon can perform his or her work instantaneously.
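These rates imply a hard per-cycle budget: at 1000 Hz the whole sense-compute-actuate cycle must finish within 1 ms, while the graphics only needs a new frame roughly every 33 ms, so haptic and visual rendering typically run at different rates. A minimal counting sketch of such a dual-rate loop (the rates come from the text; the scheduler structure is illustrative):

```python
def run_dual_rate(sim_ticks, haptic_hz=1000, visual_hz=30):
    """Count the work done by a dual-rate loop: a force update on
    every 1 ms tick, a visual frame only every ~33 ticks."""
    haptic_updates = visual_frames = 0
    ticks_per_frame = haptic_hz // visual_hz   # ~33 haptic ticks per frame
    for tick in range(sim_ticks):
        haptic_updates += 1                    # force computed at 1 kHz
        if tick % ticks_per_frame == 0:
            visual_frames += 1                 # scene redrawn at ~30 Hz
    return haptic_updates, visual_frames
```

One simulated second yields 1000 force updates but only about 31 frames, which is why force rendering is usually given its own high-priority loop or thread.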
The slightest delay may be the difference between life and death for the patient.

Haptic Devices: Haptic Devices A sensation must be read remotely, converted into digital form, sent over a communications link to the user, converted back into a form usable by the actuation system, and the force must be generated for the user based on that information; the user's response must then be read and the process repeated to send instructions back to the remote site. This system requires high-speed communications links and methods in order to ensure that the actions of the remote robot mirror the instructions of the operator accurately and in a timely fashion.

VR Input Devices: VR Input Devices Gloves have played a role in the VR craze from the very beginning, even though the original designers did not necessarily intend for them to be used in VR systems. Using a wired glove, you can interact with virtual objects by making various hand gestures. Not all gloves work the same way, though all share the same purpose: allowing the user to manipulate computer data in an intuitive way.

VR Input Devices: VR Input Devices Data gloves offer a simple means of gesturing commands to the computer. Rather than punching in commands on a keyboard, which can be tricky if you're wearing a head-mounted display or are operating the BOOM, you program the computer to change modes in response to the gestures you make with the datagloves.

VR Input Devices: VR Input Devices Pointing upwards may mean zoom in; pointing down, zoom out. A shake of your fist may signal the computer to end the program. Some people program the computer to mimic their hand movements in the simulation; for instance, to see their hands while conducting a virtual symphony.
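A gesture-to-command mapping like the one just described can be as simple as thresholding calibrated bend values per finger. In this sketch the bend scale (0.0 straight to 1.0 fully curled), the thresholds and the gesture set are all invented for illustration, not taken from any real glove driver:

```python
def classify_gesture(bends):
    """Map per-finger bend values [index, middle, ring, little]
    (0.0 = straight, 1.0 = fully curled) to a command string."""
    index, others = bends[0], bends[1:]
    if all(b > 0.8 for b in bends):
        return "end_program"   # closed fist ends the simulation
    if index < 0.2 and all(b > 0.8 for b in others):
        return "zoom"          # pointing gesture triggers zoom
    return "none"              # anything else is ignored
```

A real driver would add hysteresis and hold times so that transient sensor noise does not fire commands.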
One type of dataglove has a web of fiber optic cables along its back.

VR Input Devices: VR Input Devices Changes in the amount of light transmitted to the computer by the cables signal how the joints of your fingers are bent. Light passes through the cables from an emitter to a sensor. The amount of light that makes it to the sensor changes depending on how the user holds his fingers -- if he curls his fingers into a fist, less light will make it to the sensor, which in turn sends this data to the VR system's CPU. In general, this sort of glove needs to be calibrated for each user in order to work properly.

VR Input Devices: VR Input Devices Once the dataglove has been calibrated to your hand, your gestures trigger pre-programmed commands. Other gloves use strain sensors over the joints to detect movement. Yet others rely on mechanical sensors to measure your hand movements.

VR Input Devices: VR Input Devices Other gloves use strips of flexible material coated in an electrically conductive ink to measure a user's finger position. As the user bends or straightens his fingers, the electrical resistance along the strips changes. The CPU interprets the changes in resistance and responds accordingly. These gloves are less accurate than fiber-optic gloves, but they also tend to be much less expensive.

VR Input Devices: VR Input Devices Some computer users have elaborated on the dataglove concept by creating facial sensors and even body suits. Not many scientists have climbed into these get-ups, but animators have. Already, facial movement sensors hooked to computers are simplifying their job: animating cartoons.

VR Input Devices: VR Input Devices Wands, the simplest of the interface devices, come in all shapes and variations. Most incorporate on-off buttons to control variables in a simulation or in the display of data. Others have knobs, dials, or joysticks. Their design and manner of response are tailored to the application.
For example, biologists sometimes use wands like scalpels to slice tissue samples from virtual brains.

VR Input Devices: VR Input Devices Most wands operate with six degrees of freedom; that is, by pointing a wand at an object, you can change its position and orientation in any of six directions: forward or backward, up or down, or left or right. This versatility coupled with simplicity is the reason for the wand's popularity.

VR Input Devices: VR Input Devices DataSuit (VPL Research): Using the same fiber optic flex-sensing technology made popular in the VPL DataGlove, this full body suit can track the movement of the arms, legs, feet, and torso with up to fifty different sensors on the user's joints and with four Polhemus trackers (position sensors). The DataSuit is not yet commercially available due to the complex calibration process the suit must go through for each new user.

VR Input Devices: VR Input Devices Suits measure the positions of the ankles, hips, shoulders, arms, and hands. There are wired and wireless versions. They were initially developed for NASA applications such as measurement of biomechanics during space missions.

VR Input Devices: VR Input Devices A treadmill is useful because the user remains stationary with respect to the real world, but feels as if he is actually walking through the virtual environment. Researchers have found it relatively simple to link a treadmill to a computer system so that a user's steps result in an appropriate adjustment in the system's graphics. An obvious limitation of normal treadmills is that you can only walk in two directions: backward or forward.

VR Input Devices: VR Input Devices Some companies have developed omni-directional treadmills. These devices allow a user to step in any direction. Normal treadmills use a single motor, which exerts force either forward or backward relative to the user. Omni-directional treadmills use two motors -- from the user's perspective the treadmill can exert force forward, backward, left or right.
With both motors working together, the treadmill can allow a user to walk in any direction he chooses on a walking surface wrapped around a complex system of belts and cables.

Treadmill: Treadmill

VR Input Devices: VR Input Devices An alternative to a treadmill is a pressure mat, as used in video games like "Dance Dance Revolution". There are many kinds of pressure sensors, though the most common are electromechanical pressure sensors. An electromechanical pressure sensor is a relay that activates when pressure is applied to the sensor. When the circuit closes, an electric current runs through it, signaling the CPU to make changes to the graphic output sent to the user.

VR Input Devices: VR Input Devices A Japanese company has developed a unique omni-directional treadmill called the String Walker. The device is ring-shaped, with eight strings strung across the diameter of the device. The user can walk on the strings in any direction. The String Walker debuted in America at SIGGRAPH 2007, a conference focusing on computer graphics and interaction devices.

VR Input Devices: VR Input Devices The company VirtuSphere, Inc. offers a unique way for users to move around inside a virtual environment. It looks like a human-size hamster ball -- the user gets inside the sphere and walks around in it. The sphere rests on a stable platform that has several wheels resting against the sphere, allowing it to roll in any direction while staying in the same fixed position. Sensors in the wheels tell the CPU which way the user is walking, and the view within the user's HMD changes accordingly.
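In devices like the VirtuSphere, the wheel sensors effectively report how far the sphere has rolled along two axes, and the system integrates those readings into a position in the virtual world. A sketch with an assumed encoder calibration constant:

```python
def walk_path(wheel_counts, metres_per_count=0.001):
    """Integrate (x, y) encoder counts from two orthogonal wheels
    pressed against the sphere into virtual-world positions.
    metres_per_count is an assumed calibration constant."""
    x = y = 0.0
    path = []
    for cx, cy in wheel_counts:
        x += cx * metres_per_count   # roll along the x axis
        y += cy * metres_per_count   # roll along the y axis
        path.append((x, y))
    return path
```

Each new position would then drive the viewpoint rendered in the user's HMD.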
VR Input Devices: VR Input Devices 3D pointing devices support menu item and object selection and offer a number of programmable buttons. This is similar to conventional 2D pointing devices used as standard human-computer interface devices. The choice of device type depends on how appropriate the device is for a particular user interface.

VR Input Devices: VR Input Devices A 3D mouse has 6 degrees of freedom (position coordinates and angles). Example: Ascension 6DOF mouse. A 3D digitizer digitizes the 3D coordinates of points in space, which is useful for creating 3D objects when building virtual worlds. Example: Immersion Corp. Microscribe 3D.

VR Input Devices: VR Input Devices Exoskeleton devices use an external mechanism attached to the user's hand or arm. They are used for tracking the movements of the hand or arm. Exoskeleton devices can be constructed to provide force feedback (e.g. when reaching a virtual object). Example: Dexterous Arm Master (Sarcos, Inc.).

VR Input Devices: VR Input Devices VR Nose: Technologies for collecting and interpreting odors are developed for military applications in biological and chemical warfare, and for product quality control. Three basic sensing technologies are: gas chromatography, mass spectrometry, and chemical sensor arrays.

Graphics Pipeline: Graphics Pipeline Application-specific processing; scene processing (scene graph, display list); polygon processing (per primitive); pixel processing (rasterization, per fragment); supported by the depth buffer, frame buffer and texture buffer.

OS: OS Requirements on the operating system: real-time, multi-modal operation; very high resolution time slicing; atomic, transparent distribution of tasks; a large number of lightweight processes communicating by means of shared memory; support for time-critical computing with negotiated, graceful degradation; guaranteed frame rate and lag time.

CPU: CPU This multithreading generally occurs by time slicing, wherein a single processor switches between
different threads, in which case the processing is not literally simultaneous, for the single processor is really doing only one thing at a time (uni-core vs. multi-core processor). Threading can also be achieved via multiprocessing, wherein different threads and processes run literally simultaneously on different processors or cores.

Slide68: AMD released its dual-core Opteron server/workstation processors on 22 April 2005, and its dual-core desktop processors, the Athlon 64 X2 family, on 31 May 2005. AMD has also recently released the Athlon 64 FX line as FX-60, FX-62 and FX-64 for high-performance desktops, and the Turion 64 X2 for laptops. AMD announced that its quad-core processors would be produced in 2007. International Business Machines (IBM)'s POWER4, the first dual-core module processor, was released in 2000; the POWER5 dual-core chip is now in production, and the company has a PowerPC 970MP dual-core processor in production that was used in the Apple Power Mac G5. Intel launched its quad-core processor on 13 December 2006 and is currently shipping Core 2 Quad and Xeon microprocessors with quad-core technology codenamed "Kentsfield", as well as Core Duo, Core 2 Duo, and Xeon (x1xx series) microprocessors with dual-core technology. These chips, based on the Pentium M (Core Duo) and Core (Core 2 Duo and Xeon) designs, replaced the earlier Pentium D chips, which were based on the Pentium 4. Itanium 2 multi-core processor: Montecito. Intel has developed an 80-core processor prototype with each core running at 3.16 GHz, which it says will be released within the next five years. Microsoft's Xbox 360 game console uses a triple-core SMT-capable PowerPC microprocessor, Xenon, made by IBM. Motorola/Freescale has dual-core ICs based on the PowerPC e200, e500 and e600 cores, with the e700 core under development.
Sun Microsystems: UltraSPARC IV, UltraSPARC IV+, UltraSPARC T1 (eight cores, 32 threads).

VR Software: VR Software Software is used for: design of virtual environments (general-purpose 3-D CAD programs or specialized VE development applications); and simulator software packages for running VR applications in real time. Real-time simulation requires powerful computers that can perform the real-time computations needed for generation of visual displays.

VR Software: VR Software VR software exploits two decades of CAD and computer graphics development. What is challenging is making the integration of hardware, software and the user interface run in real time. Minimizing latency and maximizing update rate are the prime considerations of the system designer. Software determines the overall framework for coordinating events within the processing envelope of the hardware; with multiple processors and parallel processing, software also reflects the level of complexity required to make these systems function in real time without compromising reliability.

VR Software: VR Software Three of the most common pieces of VR software that you'll run into are: VRWorx, which is part of the VR Toolbox collection (cross-platform); the QuickTime VR Authoring Suite (not cross-platform); and Ulead Cool 360 (not cross-platform).

Slide72: VR Toolbox announces the next generation: The VR Worx 2.6. Faster, more powerful than ever before. Engineered for Mac OS™ X Tiger and Microsoft Windows™ XP, with a new streamlined and simplified user interface, it delivers powerful technology along with fresh advancements to its famous feature/function set. The ultimate in QuickTime VR for the most discriminating user.

VR Worx 2.6: VR Worx 2.6 The VR Worx 2.6 lets users create their own camera presets with its new Lens Tuner utility. It now allows users to create cylindrical panoramic movies, object movies and multi-node scenes (a.k.a. virtual tours), all in the QuickTime format, easier and faster than ever.
Version 2.6 has the ability to create object movies with a panoramic movie as a moving background. And v2.6 has the capacity for transitions within a multi-node scene, like standard wipes, dissolves, explodes and others, as well as actual linear video as a transition. The VR Worx 2.6 can construct multi-node environments with cylindrical panoramas, cubic VRs, multi-row objects, absolute objects, still images and linear QuickTime movies.

Ulead® COOL 360™: Ulead® COOL 360™ Ulead® COOL 360™ is photo panorama software that combines ease of use, power and flexibility to allow even novice digital imaging enthusiasts to move beyond traditional photos into immersive imaging.

VR Development Software: VR Development Software 3D development software is required to create virtual environments. Conventional CAD programs can be used for this purpose. Specialized VE development software is also available.

VR Simulation Software: VR Simulation Software Support for a wide spectrum of VR interfaces; contains control programs for VR simulations. Examples of VR development software packages: World Tool Kit (Sense8), CDK (Autodesk), dVise, Superscape VRT, VREAM.

VR Software: VR Software

Functionality Comparison: Functionality Comparison Criteria: input & scheduling; scene graph; primitive pipeline; kinematic and dynamic simulation; autonomous behavior; avatars and sharing. Systems compared: Performer, Inventor, WTK, SuperScape, VREAM, CDK, VEGA, Coryphaeus, SARAH, BRender, RenderWare, IRIS GL, NPSNET, Spline, DirectX, OpenGL.

VREAM Inc.: VREAM, VR Creator: VREAM Inc.
: VREAM, VR Creator 3D world editor; import/export DXF. World: I/O, hand, gravity. Object: 3D drawing; links: condition, response; attributes (sound, motion, etc.); environment description (the world in one ASCII file). Runtime system: virtual hand interaction; MS-DOS, VGA, HMD, tracker, gloves.

Dimension Inc.: Superscape: Dimension Inc.: Superscape Superscape VRT. Visualizer. World editor: position, size, color, dynamics (weight, velocity). Shape editor. SCL: Superscape Command Language (interpreted), C-like, ~400 functions. Superscape Networks: ~8 users in a shared virtual world. Superscape Developer's Kit: C language binding.

Superscape: Superscape

JACK: JACK

Sense8: Sense8 Sense8's tools are cross-platform, available on Win95/98, WinNT, Sun, HP and the whole range of SGIs. This gives users maximum flexibility to utilize a variety of platforms, demonstrating the capabilities of each, adapting to an ever-changing hardware landscape and to limited university resources. Sense8's tools are known as the most robust and mature tools available on the market. Sense8 also supports most major geometry file formats, including AutoCAD, Wavefront, ProE, VRML, 3DS, MultiGen and Videoscape.

Sense8: Sense8 Sense8 provides a flexible programming paradigm, including both a GUI interface in WorldUp and a C/C++ toolkit in WorldToolKit, enabling its tools to be used by a wide range of programmer skill sets. Additionally, its multi-user client-server solution, World2World, integrates applications developed in either WorldUp or WorldToolKit over the internet, an intranet, or a LAN/WAN to provide a complete 3D/VR interactive solution.
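WorldToolKit applications are organized around a simple simulation loop: read the sensors, run each object's task, then render the frame. A minimal Python sketch of that structure (WTK itself is a C library; all names here are invented for illustration):

```python
class World:
    """Toy WorldToolKit-style universe: sensors feed object tasks,
    then the frame is rendered, once per pass through the loop."""
    def __init__(self, sensors, objects):
        self.sensors = sensors    # callables returning fresh input data
        self.objects = objects    # anything with a .task(inputs) method
        self.frames = 0

    def render(self):
        self.frames += 1          # stand-in for drawing the scene

    def run(self, n_frames):
        for _ in range(n_frames):
            inputs = [read() for read in self.sensors]  # 1. read sensors
            for obj in self.objects:                    # 2. object tasks
                obj.task(inputs)
            self.render()                               # 3. render image

class Spinner:
    """Example object: accumulates the first sensor value each frame."""
    def __init__(self):
        self.angle = 0

    def task(self, inputs):
        self.angle += inputs[0]
```

For example, `World([lambda: 5], [Spinner()]).run(3)` runs three passes of the loop, updating the object from the sensor before each frame.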
Sense8: WorldToolKit: Sense8: WorldToolKit ~400 C library functions; a simple simulation loop; universe, objects, sensors, tasks; viewpoint, light, path, portal. Platforms: PC (Windows), SGI (GL/Performer), Sun (XGL, ZX), E&S Freedom.

WTK: Simulation loop: WTK: Simulation loop Universe, action function; create the world; application code; simulation loop over objects, tasks and sensors: read sensors, move objects using sensor data, perform object tasks, render the image.

World Tool Kit: World Tool Kit

WTK: WTK

WTK - Traversal: WTK - Traversal

WTK - Separator Node: WTK - Separator Node

WTK - Separator: WTK - Separator

WTK - Propagation: WTK - Propagation

SGI: Inventor: SGI: Inventor An object-oriented 3D toolkit for interactive graphics programming. Inventor (IRIS GL): SGI only; Open Inventor (OpenGL): SGI, Win95. C++ class libraries. Scene graph nodes: group, separator, geometry, transform, light, camera, sensor, material. (Mathematical) constraints between nodes (Inventor engines). The .iv file format for 3D data interchange led to the VRML 1.0 format. Browser class; 2D GUI: mouse/menu-driven scene graph manipulation.

DIVISION: dVS, dVISE: DIVISION: dVS, dVISE Platform (UNIX): PROVISION PV100 (VRX/VTX/VPX, Pixel Planes 6), PROVISION 10 + HP (300k polygons/sec, 997M pixels/sec), SGI, IBM RS/6000. dVS: a distributed VR runtime environment with a multi-server architecture. dVISE: a virtual world authoring and simulation software tool.

Division dVISE: Division dVISE Division's dVISE environment employs the concept of actors (servers) that are responsible for various system activities. For instance, there are actors to support collision detection, audio, tracking and image generation.

Modelling Virtual Worlds: Modelling Virtual Worlds We may need objects modelled at different levels of detail that are automatically selected depending upon their range. A medical application might call for voxel-based models that have to be rendered by making transparent those voxels with a specific attribute. We will have to depend upon a range of different systems to address these
various applications, but original modelling tools will have to be designed in-house.

Modelling Virtual Worlds: Modelling Virtual Worlds Imported Models Model formats such as AutoCAD, 3D Studio, Wavefront, Alias, and Computer Vision can normally be imported into most VR systems, saving considerable rebuilding time. Modelling Toolkit Features Libraries Modelling systems must provide the user with a variety of tools to select shapes and objects from a system library. A shape library holds parametric definitions of circles, ellipses, arcs, parabolas, triangles, etc., while an object library contains cubes, boxes, spheres, etc. Interactive commands are available to delete or insert vertices, reverse a vertex sequence in a facet, or delete a facet.

Modelling Toolkit Features: Modelling Toolkit Features Object Sorting In order to speed up the rendering process, some modelling environments make the modeller determine the rendering sequence of the facets and objects. For example, when the objects are stored as a binary tree, the tree is rapidly traversed to reveal an optimum rendering sequence.

Modelling Toolkit Features: Modelling Toolkit Features Surface Attributes Objects can be coloured in a variety of ways. One colour could be assigned to the entire object, or it can be decorated using texture maps scanned in from photographs. These require positioning, scaling and orienting, and some rendering systems may be able to process a nest of texture maps to cope with very large viewing ranges.

Modelling Toolkit Features: Modelling Toolkit Features Lighting The VE will have to be illuminated by a mixture of ambient light and individual light sources. These could be directional, point or spot lights. To avoid the use of virtual light sources, the virtual world can be prelit by colouring objects with colour intensities that suggest a directional light source. For example, to simulate an overhead ceiling light, all surfaces pointing towards the ceiling are made bright, while others are
made darker depending on their vertical orientation.

Modelling Toolkit Features
Physical Simulation
- A VR system must work in real time and will not be able to support every type of physical behaviour used in stop-frame animation.
- Designers of VR software are including procedures that animate the VE with credible simulated behaviours.

Rigid Bodies
- Deformable models can be included when processor performance supports their simulation in real time.
- Mass: homogeneous, symmetric objects are often represented as a point mass, where the object's inertial mass is assumed to be concentrated at a point.

Forces due to Direct Intervention
- External forces can be applied to an object at any point in its frame of reference, together with torques about the object's centre of mass.

Forces due to Collisions
- The physics actor implements simple reflective collisions, using a coefficient of restitution to simulate energy loss.
- Its state consists of an object's linear position, velocity and acceleration; its angular position, velocity and acceleration; and a scale factor.

Collision Detection
- Various types of collision can be supported by the physics actor, involving interactions between polygons, edges and vertices.
- All interactions are subject to a user-defined tolerance level.

Steady-state Behaviour
At the start of a VR session the physics actor is initialized and then exhibits a steady-state behaviour, consisting of the following steps:
1. Lock the database.
2. Extract all the information in which the actor is interested.
3. Unlock the database.
4. Simulate the physics for the next time step.
5. Run the collision-detection code.
6. Report the change in collision state to the rest of the database.
7. Rewind the simulation to the time of the change in collision state.
8. Repeat steps 4 to 7 until the entire time period is used up.
9. Check the dead-reckoned positions of all objects and update them if necessary.

The Physics Library
It provides a range of functions to set an object's physical properties, including: inertial mass, centre of mass, inertia tensor, translation tensor, gravitational mass, gravitational field, restitution, mass and external forces.

Software Challenges
- Is VR different from desktop computer graphics? Yes (remember the I3 model).
- Does VR require different technology? It can be done with existing technology, but the result is hard to implement, hard to extend, and not of very good quality.

(1) Interaction
- Task-level interaction is not well-defined (as opposed to the driver level).
- Is it an extension of 2D interaction (e.g., 3D widgets)?
- Complicating matters: continuous interaction; no distinction between interface objects and application objects; active objects; multi-modal interaction; multiple participants.
- Little formalism: operational semantics.

(2) Operations in the VE
- "Main loop" tasks: sensing, virtual perceiving, interaction, simulation, perceptualization, rendering.
- Problems: latency; uniform performance.
- Methods: a simulation loop; a separate process for each task; fine-grain objects rather than heavy-weight processes.

(3) Object Systems
- OOP; display lists; sharing; object collaboration.

(4) Object Frameworks
- Graphics library vs. framework: a library is a procedural API; a framework is a protocol to integrate and connect multiple software modules.
- Requirements: multiple continuous inputs; multiple outputs; time-varying behavior and simulation; time-critical computing.

Computational Challenges: A New Time Model
- Time has been the subject of intensive study in distributed and parallel computing.
- Time-varying behavior and simulation involve time in the real world, time in the virtual world, and time in another (virtual) world.
- Synchronization is needed among the interaction, simulation and rendering processes.

Computational Challenges: Time-critical Computing
- Philosophy: fast but slightly inaccurate vs. slow but very accurate. What inaccuracies are acceptable?
- Time budgets are assigned to tasks, and algorithms choose the best compromises in order to meet the time budget.

Real-time Computing vs. TCC
- Real-time computing provides a guaranteed response in a fixed time, in a fixed computational environment.
- Time-critical computing chooses an appropriate level of degradation in a user-defined time, in a flexible, unpredictable environment.

Basic Approach
- The user specifies an overall desired frame time, and each task within that frame is assigned a time budget. Issue: allocating the time budget to each task.
- Each task determines how to do the best job within that time budget. Issue: meeting the time budget.
- Approach: how much is to be processed, and how accurately?

(1) Time-critical Rendering
Determine the following, while maximizing the ratio benefit/cost:
- How much to be processed? The number of polygons.
- How accurately to be processed? The rendering techniques used.

Benefits and Costs
- Benefit: image quality (or something else), which depends on the size of the object on screen, its location on screen, and the focus of interaction.
- Cost: number of polygons × cost per polygon.
- Maximize the benefit/cost ratio.

(2) Time-critical Simulation
Determine the following, while maximizing the ratio benefit/cost:
- How much to be processed?
in terms of the objects under computation.
- How accurately to be processed? In terms of the algorithm complexity (e.g., the number of iterations).

Benefits and Costs
- Benefit: the accuracy of the computation and the number of objects being computed.
- Cost: time complexity.
- Maximize the benefit/cost ratio. Benefits and costs are hard to define, and there is no universally accepted approach yet.

Some Virtual Reality Software
- AC3D (free 14-day download; "To create 3D models for: games - virtual reality and flight simulation - scientific, medical and general data visualisation - rapid prototypes of 3D designs - high resolution 3D renderings.") http://www.ac3d.org/index.html
- Alice ("Free, Easy, Interactive, 3D Graphics for the WWW"), Carnegie Mellon University, originally developed at the University of Virginia. http://www.alice.org/
- Anim8or (free 3D animation software by R. Steven Glanville) http://www.anim8or.com/
- Blender 3D (free download; "The open source software for 3D modeling, animation, rendering, post-production, interactive creation and playback. Available for all major operating systems under the GNU General Public License.") http://blender.org/cms/Home.2.0.html
- Cortona Viewer (VRML viewer; works with vrml, wrl and 3-D HTML websites; free download), ParallelGraphics, 6 Wilton Place, Dublin 2, Ireland. http://www.parallelgraphics.com/products/cortona/
- Cosmo Player (VRML; free, distributed "as is" by Computer Associates International, Inc.) http://www.karmanaut.com/cosmo/player/
- DesignWorkshop; DesignWorkshop Lite ("Design-Oriented 3D Modeling complete with Live Walkthrough, Presentation Rendering, and lots more!"; the Lite version is a free download) http://www.artifice.com
- Eon (VR world-building software), Eon Reality. http://www.eonreality.com
- Internet Space Builder (professional web authoring for virtual 3D worlds; free demo available), ParallelGraphics. http://www.parallelgraphics.com/products/isb/
- Internet Scene Assembler (professional web authoring for virtual animated 3D worlds; free demo available), ParallelGraphics. http://www.parallelgraphics.com/products/isa/
- Mandala(R) VR System (VR world-authoring software using a video-camera system), The Vivid Group, 317 Adelaide Street West, Suite 302, Toronto, Ontario, Canada M5V 1P9. http://www.vividgroup.com
- Strata 3D (rendering, modeling, animation) http://www.strata.com
- VistaPro (landscape rendering) http://store.spiritekproducts.net/vistapro40.html
- VrmlPad (VRML programming; free demo available), ParallelGraphics. http://www.parallelgraphics.com/products/vrmlpad/
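The time-critical rendering approach described above assigns each frame a time budget and spends it where the benefit/cost ratio is highest. The following is a minimal sketch of that greedy selection, not the API of any real VR toolkit; the `Lod`, `ObjectLods` and `chooseLods` names, and all cost/benefit numbers, are invented for this example.

```cpp
#include <cstddef>
#include <vector>

// Time-critical rendering sketch: each object offers several levels of
// detail (LODs); each LOD has a rendering cost (estimated time in ms) and a
// benefit (an image-quality score, e.g. weighted by on-screen size). We
// greedily upgrade whichever object's next LOD gives the largest benefit
// per unit cost, until the per-frame time budget is exhausted.
struct Lod {
    double cost;     // estimated render time for this LOD, in ms
    double benefit;  // image-quality contribution of this LOD
};

struct ObjectLods {
    std::vector<Lod> lods;  // ordered from cheapest to most detailed
};

// Returns the chosen LOD index per object (-1 means cull the object).
std::vector<int> chooseLods(const std::vector<ObjectLods>& objects,
                            double frameBudgetMs) {
    std::vector<int> choice(objects.size(), -1);
    double spent = 0.0;
    struct Upgrade { std::size_t obj; int level; double dCost; };
    while (true) {
        Upgrade best{0, -1, 0.0};
        double bestRatio = 0.0;
        for (std::size_t i = 0; i < objects.size(); ++i) {
            int next = choice[i] + 1;
            if (next >= static_cast<int>(objects[i].lods.size())) continue;
            // Marginal cost/benefit of upgrading object i to its next LOD.
            double prevCost = choice[i] >= 0 ? objects[i].lods[choice[i]].cost : 0.0;
            double prevBen  = choice[i] >= 0 ? objects[i].lods[choice[i]].benefit : 0.0;
            double dCost = objects[i].lods[next].cost - prevCost;
            double dBen  = objects[i].lods[next].benefit - prevBen;
            if (dCost <= 0.0 || spent + dCost > frameBudgetMs) continue;
            double ratio = dBen / dCost;
            if (ratio > bestRatio) { bestRatio = ratio; best = {i, next, dCost}; }
        }
        if (best.level < 0) break;  // no affordable upgrade remains
        choice[best.obj] = best.level;
        spent += best.dCost;
    }
    return choice;
}
```

Each pass upgrades the single object whose next level of detail yields the largest quality gain per millisecond, a simple stand-in for the benefit/cost maximization the slides describe; a production system would also re-estimate costs per frame as object screen sizes change.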