ARES Augmented Reality System

Developed by CGSD and Hokkaido Electric Power

The concept of superimposing computer-generated imagery on top of real-world objects is known as augmented reality. The augmented reality system offered here (called ARES [tm], pronounced "aireze") was initially developed by CGSD and Hokkaido Electric Power Company (HEPCO). HEPCO is the utility on the north island of Japan. The HEPCO R&D Department has an interest in visualizing data measured relative to an object, such as electromagnetic field strength, sound level, or temperature, in position relative to the actual object. Display of the electromagnetic field pattern surrounding a microwave oven is one such use.

Applications and Availability

CGSD continues to work with HEPCO in developing exciting new applications for ARES. CGSD itself has taken the lead in applying the system to the visualization of CAD data. For data visualization, the system includes software that provides alternate representations of vector data in three-dimensional space.

For design visualization, an image of a new design is made to hang in space, viewed in the correct perspective by each person on the design team; the 3D object can be critiqued by pointing at features of the object, and the whole team will observe the action using the see-through display. In such applications, the system offers advantages over traditional projection-based systems such as virtual workbenches and CAVE systems. The augmented reality system works in normal room light.

Designated the ARES-100, the system as described here is offered by CGSD, including the data visualization and viewing software. The display can be upgraded to the ruggedized stereo Sony HMD. The system includes complete hardware documentation and a comprehensive User's Manual. Source code licensing is available as an option. Contact P.Y. Cheng pcheng@cgsd.com, tel: 650-903-4924, fax: 650-967-5252 for order information. Delivery is 45 days ARO.

The system was developed with cost-effectiveness as a goal. We can provide options to increase the area of tracked coverage, the graphics performance and image quality, and other aspects of system performance. Please contact us with your requirements.

System Design

Augmented reality systems present certain design challenges. Both the room objects and superimposed data must be clearly visible. Tracking of the user's head must be good enough to keep the generated imagery stable and aligned to the room. The tracking must work over the desired area, in our case nominally a 2.5 meter cube. For the visualization of abstract data, the generated imagery must provide an easily interpreted representation of the data, being rich enough to provide value without being so complex as to defy comprehension. For either CAD visualization or abstract data visualization, the user must have reasonable mechanisms for interacting experimentally with the data while operating under the HMD. Finally, the system must be implemented at reasonable cost.

Configuration

The hardware for the ARES-100 system includes a PC with network and graphics cards, a tracked see-through HMD, and a tracked wand. The network card is used to receive ASCII files of the 3-D magnetic field or CAD data. The PC processes this data for display and routes imagery via the graphics card to the see-through HMD, so that the graphics representation of the data appears superimposed on the real world. To allow the user to move within the scene and to see perspectively correct imagery, the position and attitude of the HMD are constantly tracked. The HMD provides true stereo imagery at S-VHS resolution. The wand is also tracked and enables the user to interact with the data.

The system is initially set up via the keyboard and trackball. The main control functions are menu driven with a Windows interface. For the user's convenience, some control functions may also be operated with the wand and wand button.

The development and runtime software for the system includes: Sense8 WorldToolKit, Microsoft Visual C++, and Windows NT.

Display and Tracker

We selected the I-O Displays i-glasses X2 for the baseline ARES system. It provides approximately 30% transmission of the room lighting and full stereo viewing. The X2 display provides 360K dots of resolution. Optionally, a Sony LDI-D100 see-through stereo display is available. The Sony provides 600 x 800 display resolution with 1.55 M dots, but adds $4800 to the price of the system.

We modified the X2 display in several ways. We added plastic light shields both above and below the original display. Without light shields, stray light entering the eye around the display can cause the user's eyes to adapt to a higher brightness level than if the stray light were shielded. The user would then turn up the HMD display brightness, making the room objects even more difficult to see. When stray light is blocked with the light shields, the room illumination may be made quite bright, if desired, to provide good visibility through the display.

The top light shield is designed with a mount for three tracking diodes. The diodes are used with a ceiling-mounted optical tracker. Infrared emitting diodes are lit in sequence to provide the spot targets required for tracking.

Two sensors are used in the tracker to obtain a distance measurement from the stereo disparity. With no filtering of the data to reduce noise errors, the accuracy is approximately one centimeter at a distance of four meters, improving to about 0.2 mm at closer distances. Our experience is that this is roughly an order of magnitude better than extended range magnetic trackers, under equal conditions with no filtering being applied.
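The two-sensor distance measurement is ordinary triangulation, and the way accuracy falls off with distance follows directly from it. The sketch below is illustrative only, not CGSD's actual algorithm; the baseline, focal length, and disparity figures are assumed values for the example.

```python
def depth_from_disparity(baseline_m, focal_px, disparity_px):
    """Triangulated range from two sensors: z = f * b / d."""
    return focal_px * baseline_m / disparity_px

def range_error(z_m, baseline_m, focal_px, disparity_err_px):
    """First-order range error: dz ~= z^2 / (f * b) * dd.
    Error grows with the square of distance, consistent with accuracy
    degrading from sub-millimeter close in to about 1 cm at 4 m."""
    return (z_m ** 2) / (focal_px * baseline_m) * disparity_err_px

# Hypothetical sensor geometry (not the actual tracker's parameters):
z = depth_from_disparity(baseline_m=0.5, focal_px=1000.0, disparity_px=125.0)
# z == 4.0 meters; a 0.25-pixel disparity error then gives 8 mm of range error.
```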

The optical tracker is immune to electromagnetic field interference, and for our particular application there is not much risk of line-of-sight to the diodes being lost. The tracker was obtained with an optional 70 degree field of view and is mounted on a high ceiling (up to 4 m).

Unlike a magnetic tracker, the optical tracker cannot measure the roll, pitch, and yaw of the HMD directly. The three diodes on the HMD permit indirect measurement of the angles from three position measurements. However, a fair amount of filtering must be applied to the angle measurements to reduce the noise, and filtering always introduces lag.
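The indirect angle measurement works because three non-collinear diode positions define a rigid frame. A minimal sketch of the idea, assuming a ZYX Euler convention and idealized noise-free points (the real measurements require the heavy filtering described above):

```python
import math

def sub(a, b): return [a[i] - b[i] for i in range(3)]
def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]
def norm(a):
    m = math.sqrt(sum(c * c for c in a))
    return [c / m for c in a]

def hmd_angles(p0, p1, p2):
    """Yaw, pitch, roll (radians) from three tracked diode positions.
    The diodes span a plane, so an orthonormal body frame can be built
    and Euler angles read off the resulting rotation matrix."""
    x = norm(sub(p1, p0))             # body x: toward the second diode
    z = norm(cross(x, sub(p2, p0)))   # body z: normal to the diode plane
    y = cross(z, x)                   # body y completes the frame
    # The rotation matrix columns are the body axes in room coordinates.
    yaw = math.atan2(x[1], x[0])
    pitch = -math.asin(x[2])
    roll = math.atan2(y[2], z[2])
    return yaw, pitch, roll
```

Because each angle is assembled from three noisy position fixes, position noise is amplified in the angles, which is why the filtering (and its lag) is unavoidable for this path.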

We added a commercially-available inertial tracker to improve the responsiveness of the measurements of the angles. The inertial tracker senses gravity to establish which direction is down, and integrates a two-axis rate gyro measurement to provide yaw. There is an assumption that there is no significant user acceleration, which is reasonable for the data visualization system. The yaw measurement requires an initial angle and is subject to both drift of the gyroscope and to a hysteresis effect in which the gyro measurement can be significantly biased after a large rapid rotation in yaw. We observed hysteresis of ten degrees or more with our gyro unit.

The filtered angular measurements of the optical tracker can be used to correct the drift and hysteresis. The fast response of the inertial tracker thus complements the stability of the optical tracker. We wrote simple exponential filtering software to estimate and remove the gyro errors. Note that tracking accuracy is important in a see-through system, because the superimposed imagery is viewed relative to the absolute environment of the room. Advances are being made rapidly in inertial sensors, and we plan to upgrade ARES as new technology becomes available.
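The correction amounts to a simple complementary filter: the gyro is trusted over short intervals, while an exponential estimate of its slowly varying error is formed against the optical tracker and subtracted. This is an illustrative reconstruction, not CGSD's actual filter code, and the gain value is an assumption:

```python
class GyroYawCorrector:
    """Exponentially estimate the slowly varying gyro yaw error
    (drift plus hysteresis bias) against the optical tracker,
    then subtract it, keeping the gyro's fast response."""
    def __init__(self, gain=0.02):
        self.gain = gain      # small gain: trust the gyro short-term
        self.bias = 0.0       # current estimate of the gyro error

    def update(self, gyro_yaw_deg, optical_yaw_deg):
        error = gyro_yaw_deg - optical_yaw_deg
        # Exponential filter: track the bias slowly, so optical noise
        # is averaged out rather than passed to the display.
        self.bias += self.gain * (error - self.bias)
        return gyro_yaw_deg - self.bias

# After a large rapid rotation the gyro might read ten degrees high;
# over repeated updates the corrector converges on the bias and removes it.
f = GyroYawCorrector()
for _ in range(500):
    corrected = f.update(gyro_yaw_deg=10.0, optical_yaw_deg=0.0)
```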

Graphics Accelerator

The objective of the system is to see graphics displayed superimposed on room objects. The graphics ought not to fill the display, so that space is left to see the room objects. This concern with leaving "blank space" between objects leads to the use of data representations that have many small features, but which do not usually cover much of the screen space of the display. Similarly, CAD models tend to contain large numbers of small polygons.

The i-glasses X2 we selected uses an unusual stereo format in which the image for one eye is written on the odd numbered scan lines, and the image for the other eye is written on the even numbered scanlines. To support this mode of operation, the graphics card must support, in fast hardware, the stencil mode in OpenGL.
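In OpenGL the stencil buffer would be filled once with scanline parity and each eye's render pass masked by the stencil test. This minimal software sketch (not the product's code, and arbitrarily assigning the left eye to the odd lines, since the text does not say which eye gets which) just shows the composite the stencil selects:

```python
def interleave_stereo(left_rows, right_rows):
    """Compose a line-interleaved stereo frame: one eye's image on the
    odd-numbered scan lines, the other eye's on the even-numbered ones,
    mimicking what a per-scanline stencil test would select."""
    assert len(left_rows) == len(right_rows)
    return [left_rows[y] if y % 2 == 1 else right_rows[y]
            for y in range(len(left_rows))]

frame = interleave_stereo(["L0", "L1", "L2", "L3"],
                          ["R0", "R1", "R2", "R3"])
# frame == ["R0", "L1", "R2", "L3"]
```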

Finally, the potential for using transparency in various data representations requires that the graphics accurately render overlapping transparent objects.

Data Representations

Data values are represented in space by corresponding changes in values of graphical characteristics. A functional relationship between data values and graphical representation values is a mapping. The following mappings take data values over a specified range, from a minimum to a maximum, to graphical representation values:

Brightness maps the minimum to zero brightness (black) and maximum to full brightness (white). The default brightness is full brightness.

Color maps the minimum to blue and the maximum to red, with green as the default. The linear Munsell chromaticity scale is used for the intermediate values. Shades of purple (RB) and violet (BR) are not used.

Transparency maps the minimum to completely transparent and the maximum to completely opaque. The default transparency is completely opaque.

Size maps the minimum to a graphics object width of zero, maximum to an object width of 0.25 meter.

Volume is like size, except that the width varies as the cube root of the data value, so that the volume of the graphics object represents the data value.
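The mappings above can be sketched as functions of a data value normalized to its specified range. The color function here interpolates hue linearly from blue through green to red as a stand-in for the Munsell chromaticity scale the product actually uses; the other constants mirror the ranges stated above:

```python
import colorsys

def normalize(v, vmin, vmax):
    return (v - vmin) / (vmax - vmin)

def brightness(v, vmin, vmax):
    """Minimum -> 0.0 (black), maximum -> 1.0 (white)."""
    return normalize(v, vmin, vmax)

def color(v, vmin, vmax):
    """Minimum -> blue, maximum -> red, green at the midpoint.
    Hue runs from 240 deg down to 0 deg, skipping purples and violets."""
    hue = (1.0 - normalize(v, vmin, vmax)) * (240.0 / 360.0)
    return colorsys.hsv_to_rgb(hue, 1.0, 1.0)

def opacity(v, vmin, vmax):
    """Minimum -> fully transparent (0.0), maximum -> fully opaque (1.0)."""
    return normalize(v, vmin, vmax)

def size(v, vmin, vmax):
    """Minimum -> object width 0, maximum -> object width 0.25 m."""
    return 0.25 * normalize(v, vmin, vmax)

def volume_size(v, vmin, vmax):
    """Width varies as the cube root, so object volume tracks the value."""
    return 0.25 * normalize(v, vmin, vmax) ** (1.0 / 3.0)
```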

Display Modes

One display mode is to show a representation of the raw sampled data. Each data point in the data set is represented by a cube centered at the x, y, z coordinate of the point and aligned with the axes.

[Image: sampled data displayed as cubes coded by color and brightness.]

Check points are used to check the alignment of the data set with the room. Each check point is represented by the tip of a six-sided pyramid. Each pyramid is 0.2 m high with a base 0.1 m across. The sides of the pyramid are violet, a color not used in the data display. The pyramids are oriented so their axes pass through the center of the room.

Other data display modes present interpolated data points, contour lines, and textured transparent planes.

Interaction with the Wand

The system uses a see-through display so that the graphical representations of data are superimposed over objects in the room. Objects in the room may include the object (like a microwave oven or television) that generated the field pattern being visualized. We would like to represent the object in the graphical imagery so that it can occlude the field representation that is within the object or out-of-view behind the object. Noting that black objects correspond to a transparent "hole" for a see-through display, the method is to make a black polygonal model of the object.

A tracked pointer (or wand) provides the occlusion input. At the start of the viewing session, the user touches the various corners of the real object (the oven) and presses the pointer button at each corner. This tells the computer the size, orientation, and location of the real object in 3-D space. The computer can then create a series of black polygons to represent the object. The black polygons are used as described above, eliminating the confusing imagery that would otherwise be shown to the user.
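The document does not specify how the touched corners are fitted to a model; assuming the simplest case of an axis-aligned box, the occlusion geometry could be built along these lines (the coordinates in the usage example are hypothetical):

```python
def occlusion_box(corner_samples):
    """Fit an axis-aligned box to wand-sampled corner points and return
    its 8 vertices. Rendered as black polygons, the box reads as a
    transparent 'hole' on the see-through display, occluding the field
    representation that lies inside or behind the real object."""
    xs, ys, zs = zip(*corner_samples)
    lo = (min(xs), min(ys), min(zs))
    hi = (max(xs), max(ys), max(zs))
    return [(x, y, z) for x in (lo[0], hi[0])
                      for y in (lo[1], hi[1])
                      for z in (lo[2], hi[2])]

# Four wand touches on a microwave oven (hypothetical coordinates, meters):
verts = occlusion_box([(0.0, 0.0, 0.9), (0.5, 0.0, 0.9),
                       (0.5, 0.35, 1.2), (0.0, 0.35, 0.9)])
# 8 vertices spanning x in [0, 0.5], y in [0, 0.35], z in [0.9, 1.2]
```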

Other wand functions are selected using the buttons on the wand. The functions include display mode selection, data editing, placement of a dynamic clipping plane, and many other functions.


Copyright CGSD Corp. 1999