RENCI Vis Group Multi-Touch Blog

Developing multitouch hardware and applications for research and experimentation.



Multi-Touch Calibration Device

November 10, 2008 (13:35) | Duke Multi-Touch Wall, Multi-Touch | No comments

Multi-Touch Calibration Tool

Aligning the detected touches with the projected image requires a calibration application built into Touchlib. The calibration tool we developed makes it easier to create precise touches: since the illumination behind (or under) the projection screen can vary, it's easier to turn off the illuminators and use an infrared laser to create the touches.

Calibrating using the utility built into TouchLib.

We used two lasers: one visible (to aim at the desired point) and one infrared (which the IR cameras can see), triggered by a push button. The body of the device was designed in SolidWorks and built as a rapid prototype using stereolithography (SLA). The electronics were developed and built in-house.
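The calibration step produces a grid of correspondences between camera space and screen space. As an illustrative sketch only (not Touchlib's actual code; the grid-cell layout and coordinates here are assumptions), a raw touch can then be mapped to screen coordinates by bilinear interpolation over one cell of that calibration grid:

```python
# Sketch: map a camera-space touch into screen space by bilinear
# interpolation within one cell of a calibration grid. A tool like
# Touchlib gathers such grid points during the calibration step.

def bilinear_map(u, v, cell):
    """Map normalized camera coords (u, v in [0, 1] within a grid cell)
    to screen coords using the cell's four calibrated corner points.

    cell: ((x00, y00), (x10, y10), (x01, y01), (x11, y11)),
          the screen-space corners at (u, v) = (0,0), (1,0), (0,1), (1,1).
    """
    (x00, y00), (x10, y10), (x01, y01), (x11, y11) = cell
    x = (x00 * (1 - u) * (1 - v) + x10 * u * (1 - v)
         + x01 * (1 - u) * v + x11 * u * v)
    y = (y00 * (1 - u) * (1 - v) + y10 * u * (1 - v)
         + y01 * (1 - u) * v + y11 * u * v)
    return x, y

# A unit cell mapped onto a 100 x 100-pixel screen region:
cell = ((0, 0), (100, 0), (0, 100), (100, 100))
print(bilinear_map(0.5, 0.5, cell))  # centre of the cell -> (50.0, 50.0)
```

Interpolating per cell (rather than fitting one global transform) lets the calibration absorb local lens and projection distortion, which is why the built-in utility asks for many points.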

Duke Multi-Touch Wall Development - Hardware Part 3

November 7, 2008 (17:11) | Duke Multi-Touch Wall | No comments

Installation

Building the frame.

Construction included building the 80/20 frame and mounting the projectors, cameras, and illuminators. Then came the screen installation…

Installing the screen.

The screen is a single piece of 3/8″ acrylic mounted into an extruded aluminum frame. This assembly is clamped onto the 80/20 frame and supported by aluminum plates.

Backstage...

A look backstage: cameras, illuminators, and projectors… We also wrapped the front half of the frame with Mylar to help improve the distribution of IR light from the illuminators.

Finished Wall...

Here’s a video…

Duke Multi-Touch Wall Development - Hardware Part 2

September 22, 2008 (16:24) | Duke Multi-Touch Wall | No comments

Cameras

Touch detection for our direct illumination (DI) wall is based on the Prosilica GC660 GigE camera. At VGA resolution (659 x 493), each GC660 can capture images at up to 120 fps. Eight of these cameras are connected via a network switch to cover the entire 13.5-foot wide x 5-foot tall screen. Each camera is fitted with a Tamron 5-50mm lens and an IR filter.

Prosilica GC Series camera

IR Illumination

The IR illumination is provided by a large number of Lorex VQ-2120 night vision illuminators designed for IR security cameras. We also tried the Arm Electronics IR40 IR illuminator, but it wasn't as bright. Dealing with hot spots on the screen was also a challenge.

Lorex VQ-2120 Night Vision Illuminator

We prototyped the wall by building the middle third with 2 projectors and 4 cameras:

Test bed for the Duke Multitouch Wall.

This allowed us to experiment with the cameras, illuminators and different screen materials.

Dr. Xunlei Wu testing the wall...

Projection

For the projectors, we are using the Epson PowerLite Home Cinema 1080 which gives us HD resolution at a reasonable price. With 6 projectors (3 across x 2 high) in the finished wall, we have a final resolution of 5,760 x 2,160 pixels (over 12.4 Megapixels) for a screen measuring about 13.5′ wide x 5.2′ tall.

Cameras, illuminators and a projector.

Screens

Our experiments with sample screens led us toward an acrylic with higher gain, to keep the image read through the screen by the cameras as sharp as possible. We received a number of samples in both acrylic and glass, but we decided against glass due to the risk of breakage, and because tempered glass didn't transmit IR as well. Here's our test set-up:

Screen material test set-up.

Screen samples.

We experimented with the emulsion (projection surface) facing the user as well as facing the cameras. While we had better contrast with the emulsion toward the user, we were worried about damage: most manufacturers use a fragile sprayed-on coating for their projection surface, and there was a significant risk of damage from scratching and/or oils from users' hands. The screen we ended up using is 3/8″ Cinepro Ultra Diffused Screen with a 1.3 gain from RPVisuals, with their Tek Satin coating on the front. In addition to reducing glare from other lights in the room, the Tek Satin coating turned out to be easier to touch - your fingers glide along it much more easily than on un-coated acrylic.

On to Part 3…

Duke Multi-touch Wall Development - Hardware Part 1

September 15, 2008 (11:00) | Duke Multi-Touch Wall | No comments

Dr. Xunlei Wu using the Duke Multi-Touch Wall. (Photo credit: Josh Coyle)

RENCI's initial foray into this area of research has taken the form of the 13.5-foot wide x 5-foot tall multi-touch visualization wall installed at the RENCI Duke Engagement Center at the beginning of May 2008. The wall sits at the end of the Engagement Center's primary collaboration space:

Floorplan for the RENCI Duke Engagement Center showing the location of the Multi-Touch Wall.

The Wall is positioned at the end of the primary collaboration space. (Photo credit: Josh Coyle)

This 6-projector (2 tall x 3 wide) rear-projection display provides over 12 million pixels of resolution and uses direct illumination (DI) with 8 infrared cameras to track touches across its coated acrylic screen. A RENCI-modified version of TouchLib and a custom gesture library allow users to interact with large, high-resolution images for medical and disaster-response applications.

Resolution Statistics

  • Visual Resolution: 6 HD projectors ~= 12.4M pixels across 13.5-foot x 5-foot span
    ~= 1330 rendering pixels/sq inch
  • Touch Resolution: 8 VGA cameras ~= 2.46M pixels across 13.5-foot x 5-foot span
    ~= 262 touch pixels/sq inch
  • Approx. 5 visible pixels per touch pixel
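As a quick sanity check, the figures above can be roughly reproduced from the raw component specs. This sketch counts raw pixels only and ignores camera overlap and projector edge blending, so its totals land near, but not exactly on, the quoted numbers:

```python
# Back-of-the-envelope check of the resolution statistics above,
# ignoring camera overlap and projector edge blending.

screen_w_in = 13.5 * 12            # 13.5 ft wide, in inches
screen_h_in = 5.0 * 12             # 5 ft tall, in inches
area_sq_in = screen_w_in * screen_h_in

# Visual: 6 HD (1920 x 1080) projectors tiled 3 across x 2 high.
vis_pixels = (3 * 1920) * (2 * 1080)       # 5760 x 2160
print(vis_pixels)                          # 12441600 (~12.4M)
print(round(vis_pixels / area_sq_in))      # ~1280 rendering px / sq in

# Touch: 8 cameras at VGA-class resolution (659 x 493).
touch_pixels = 8 * 659 * 493
print(touch_pixels)                        # 2599096 (~2.6M raw)
print(round(touch_pixels / area_sq_in))    # ~267 touch px / sq in

# Roughly the "5 visible pixels per touch pixel" quoted above:
print(round(vis_pixels / touch_pixels, 1)) # 4.8
```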

The Build

The physical setup was developed and designed using SolidWorks and constructed using 80/20 framing, which allowed us to adjust the component positions.

SolidWorks model of the Duke Multi-Touch Wall.

Setting the size of the screen… By laying out the modeled projector frustums, we could see where the cameras and illuminators could be placed without casting shadows.

Projector frustums were modeled to set the size of the screen.

See Hardware Part 2 for more…

Introduction to Multi-Touch

September 1, 2008 (17:32) | Multi-Touch | 5 comments

The multi-touch interface allows a user to interact with an application or system using more than one finger at a time, as in chording and bi-manual operations. In addition to multiple touches, such sensing devices can also accommodate multiple users simultaneously, which is especially useful for larger shared-display systems such as interactive walls and tabletops.
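To give a sense of what multi-touch sensing involves under the hood, here is a minimal, illustrative sketch (not Touchlib's actual tracker; the `max_dist` threshold is an arbitrary assumption) of how blobs detected in each camera frame can be matched to persistent touch IDs, so that two fingers moving at once remain two distinct touches:

```python
# Sketch: nearest-neighbour assignment of per-frame blobs to touch IDs.
import math

def track(prev, blobs, max_dist=30.0, next_id=0):
    """prev: {touch_id: (x, y)} from the last frame; blobs: list of
    (x, y) detected this frame. Returns ({touch_id: (x, y)}, next_id)."""
    current = {}
    unmatched = list(prev.items())
    for bx, by in blobs:
        best = None
        for i, (tid, (px, py)) in enumerate(unmatched):
            d = math.hypot(bx - px, by - py)
            if d <= max_dist and (best is None or d < best[0]):
                best = (d, i, tid)
        if best is not None:                  # continue an existing touch
            current[best[2]] = (bx, by)
            unmatched.pop(best[1])
        else:                                 # a brand-new touch
            current[next_id] = (bx, by)
            next_id += 1
    # anything left in `unmatched` lifted off the surface this frame
    return current, next_id

touches, nid = track({}, [(10, 10), (200, 50)], next_id=0)
touches, nid = track(touches, [(12, 11)], next_id=nid)  # one finger lifted
print(touches)  # {0: (12, 11)} -- touch 0 persists, touch 1 has ended
```

Stable IDs across frames are what make chording and bi-manual gestures possible: the gesture layer reasons about each touch's trajectory, not isolated per-frame points.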

By transferring the interface from a physical device (such as a mouse, pointer, or joystick) to a programmable virtual environment, the interaction between user and application can adapt to the user, rather than the other way around. To paraphrase Bill Buxton in Multi-Touch Systems that I Have Known and Loved, the multi-touch interface has the potential to become a kind of chameleon: a single device that can transform itself into whatever interface is appropriate for the task at hand.

The application of multi-touch together with gesture recognition has taken off in recent years, largely due to the increase in computational power available from commodity hardware. RENCI has recognized an opportunity to develop in-house expertise in multi-touch and gesture-recognition technology as a powerful tool for allowing scientists and researchers to collaboratively interact with large, high-resolution data sets.

Welcome!

August 1, 2008 (11:00) | Welcome | No comments

Welcome to the Multi-Touch blog for the Renaissance Computing Institute (RENCI) Visualization and Collaborative Environments Group. Our group explores how technology can make scientific problem solving, long-distance collaboration, and even artistic expression more meaningful. Through our development of hardware and software for a multi-touch interface, we hope to empower researchers and scientists with new tools for collaboration and interaction with data in their work.

RENCI brings together a diverse group of professionals from academia and private industry and the Multi-Touch Team is no exception. Here are the core participants in this program:

  • Ray Idaszak, Director, Collaborative Environments
  • Xunlei Wu, Ph.D., Senior Visualization Researcher for the Duke RENCI Center
  • Jason Coposky, Senior Visualization Engineer
  • Warren Ginn, Senior Research Industrial Designer

For more information about RENCI, please visit www.renci.org.
