RENCI Vis Group Multi-Touch Blog

Developing multitouch hardware and applications for research and experimentation.


UNC-Charlotte Engagement Site Takes Delivery of Multi-Touch Table

July 27, 2009 (11:53) | UNC-Charlotte Multi-Touch Table | No comments


RENCI's Jason Coposky experiments with the Urban Growth Model as UNC-C Assistant Professor Zachary Wartell, Ph.D. and Ph.D. student Dong Jeong look on.

Jason Coposky and Warren Ginn from RENCI Europa delivered UNC-Charlotte’s Multi-Touch Table to the Charlotte Visualization Center last week. Dubbed the Urbanization Explorer Touch Table, the device’s first role will be to display the Urban Growth Model, developed by the Center for Applied Geographic Information Science (CAGIS) and UNC-Charlotte’s Urban Institute. The application draws on historical growth patterns in the region to forecast how much development is expected to take place. Using satellite imagery of the 24-county region around Mecklenburg from four time periods (1976, 1985, 1996, and 2006), the Urban Growth Model tracks the advance of impervious surfaces, a key indicator of development, across the area since 1976 and estimates the extent of urbanization through 2030. With interfaces developed by collaborators at the Charlotte Visualization Center, multiple users will be able to select areas of interest, zoom, pan, and navigate the colorful, large-format maps using only their fingertips and on-screen digital tools.


Frame assembly of the table without the touch surface installed. Note the single Prosilica camera in the middle and the 4 illuminators that supplement the edge illumination (needed because of the camera's wide-angle lens). This camera will eventually be replaced by 4 smaller Firefly cameras, which will improve touch performance and accuracy.

First introduced at North Carolina State University’s Institute for Emerging Issues annual forum this past February, this multi-touch table represents the next leap in performance in touch tracking. Unlike the original table, which used Direct Illumination (DI), this table employs Diffused Surface Illumination (DSI). By using a sheet of Cyro Acrylite EndLighten with polished edges and LED Edge-View Ribbon Flex from Environmental Lights, we’ve been able to distribute the IR illumination more evenly.


An infrared LED strip from Environmental Lights is applied to the inside perimeter of the frame, where the polished EndLighten acrylic sheet will be installed.

For the projection surface we’re using a thin (3mm) sheet of Acrylite RP 7D513 rear-projection acrylic. This works out well since the thin sheet protects the more expensive EndLighten material, and the projection surface has a nice feel to the touch.


Completed UNC-C table in the Europa lab.


Setting up the table at the Charlotte Vis Center.

Experiments with EndLighten and IR LED Edge-View Ribbon

March 20, 2009 (17:40) | Multi-Touch | No comments

EndLighten with Edge-View

Jason is behind a 60" x 80" sheet of 3/8" EndLighten acrylic with a strip of IR LED Edge-View Ribbon Flex taped onto the perimeter.

We’ve been experimenting with alternatives to Direct Illumination (DI) because of the challenges we’ve had achieving uniform illumination. We’ve turned our attention to Diffused Surface Illumination (DSI), which uses a diffusing acrylic (in this case Cyro Acrylite EndLighten) to distribute the IR evenly across the surface. For the IR, we’re experimenting with LED Edge-View Ribbon Flex from Environmental Lights. We’re using the InfraRed 850nm version, and it works great with the 3/8″ thick EndLighten since the strip is only .31″ wide. The 5-meter strip has an adhesive backing, so we can lay it into our frame pretty easily.

You can see the touches way out in the middle of the sheet (the image came off of one of our Prosilica cameras with an IR filter). Looks pretty promising…

As for the projection surface, we’ll try out some Rosco grey screen material… We found that if the emulsion is on the camera side, the touches are much more distinct, but that separates the touches from the image by 3/8″… Unfortunately, placing the film on the user side diminishes the touches quite a bit…

More experimentation…

Video from Emerging Issues Forum 2009

February 12, 2009 (11:04) | Multi-Touch | No comments

RENCI Multi-Touch Table at Emerging Issues Forum

February 9, 2009 (15:19) | Multi-Touch | No comments

The RENCI Multi-Touch Table was featured at the NC Emerging Issues Forum at the Raleigh Convention Center Feb. 9-10, 2009.


Jeff Michael, Director of RENCI at UNC-Charlotte's Engagement Center, discusses RENCI's Multi-Touch Table with Former NC Governor James B. Hunt, Jr. as Thomas Butkiewicz demonstrates UNC-Charlotte's Urban Growth Model.


As RENCI's Ray Idaszak and Jason Coposky look on, RENCI Interim Director Alan Blatecky shows off the RENCI Multi-Touch Table to NCSU Provost and Executive Vice Chancellor Larry A. Nielsen.


Thomas Butkiewicz demonstrates UNC-Charlotte's Urban Growth Model.

Latest multi-touch video

February 4, 2009 (14:31) | Multi-Touch | 2 comments

Here’s the latest video of running several different applications developed by RENCI for our Multi-Touch Table. You can view this video in HD by going to YouTube.

RENCI @ SC08 Video

December 12, 2008 (12:23) | Multi-Touch | 1 comment

This is our exhibit at Supercomputing 08 in Austin, TX on Nov 17-20 where we introduced our Multi-Touch Table.

RENCI Multi-Touch Debut at SC08

November 17, 2008 (19:43) | Multi-Touch | 4 comments

RENCI unveils its Multi-Touch Table at Supercomputing 08 this week in Austin, TX…

Our SC08 Booth 2633.

Showin' off the table...

As far as we can tell, ours is one of only 4 multi-touch tables at the show (the others are from the University of Amsterdam, EVL, Western Scientific and, of course, Microsoft). We’ve had some good comments… One challenge has been the IR coming off the sodium lights in the conference center… But we had a few bulbs in the ceiling removed and it was much better… I think there will be a shroud or tent in our future…

An iBiblio visualization...

Touchin' is fun... (This one's an electron structure of uranium, by the way).

More to come…

RENCI Multi-Touch Table Development - Part 2

November 12, 2008 (17:00) | Multi-Touch | 4 comments


The touch detection for our direct illumination (DI) wall was originally based on the GC660 GigE camera from Prosilica. At VGA resolution (659 x 493), the GC660 can capture images up to 120 frames per second, but while the Duke wall used 8 of these to cover the image produced by 6 HD projectors, the table only had one camera to cover 2 HD projectors. The result was poor touch fidelity despite the high frame rate…

Prosilica GC Series camera.

We decided to upgrade to the Prosilica GE1650, which offered 1600 × 1200 resolution, but at only 30 fps. Because of the aspect ratio of the two HD projectors combined into a single image (approx. 1:1), we only capture 1200 × 1200. At 30 frames per second, the response was acceptable.
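To see why the single VGA camera struggled, here's a rough sampling-density comparison (a back-of-the-envelope sketch; the 46″ × 42″ surface size comes from Part 1):

```python
# Rough touch-sampling density across the table's 46" x 42" surface.
# Surface dimensions come from Part 1; camera resolutions from this post.

surface_w_in, surface_h_in = 46.0, 42.0

# Original GC660: VGA (659 x 493) stretched across the whole surface.
# The worst-case axis determines effective touch resolution.
gc660_density = min(659 / surface_w_in, 493 / surface_h_in)

# GE1650: 1600 x 1200 sensor, cropped to 1200 x 1200 for the ~1:1 image.
ge1650_density = min(1200 / surface_w_in, 1200 / surface_h_in)

print(f"GC660:  ~{gc660_density:.0f} touch samples per inch")
print(f"GE1650: ~{ge1650_density:.0f} touch samples per inch")
```

Roughly 12 samples per inch from the VGA camera versus 26 from the GE1650 crop, which is consistent with the fidelity improvement described above.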

Prosilica GE1650 camera.

The added resolution moved us from a 1/3″ CCD (fairly standard) to a 1″ CCD, which limited our choices of wide-angle lenses. After some experimentation, we ended up using a Pentax 6.5mm F1.8 lens fitted with an IR filter. This was a real challenge since this lens had a fixed focus…

IR Illumination

Because of the reduced aperture and exposure settings for the camera, we used a large number of Lorex VQ-2120 Night Vision illuminators. There are a total of 14 illuminators in the table…

Lorex VQ-2120 Night Vision Illuminator.

Early Development Table made out of scrap 80/20...

Early build of the Table...

RENCI Multi-Touch Table Development - Part 1

November 12, 2008 (16:24) | Multi-Touch | No comments


The RENCI Multi-Touch Table is a portable multi-user, multi-touch device for user interface and visualization research.  Through an architecture of commodity-level components and custom software, this high-resolution interactive display provides an effective means for collaborators to directly interact with their data and associated external applications and peripherals. Future work will improve the table’s usability and expand the use of multi-touch technologies through more sophisticated gesture recognition, native integration with operating systems, and a continually increasing base of touch-enabled applications and APIs.

This development builds on our experience with the Duke Multi-Touch Wall at the Duke RENCI Center. However, the table had several different requirements, the most important of which was portability. This table is designed to be broken down and moved from room to room (through a 32″ doorway), or transported for use in a different building.

Specifications of Completed Table

  • 62″ diagonal work surface (42″ x 46″), 40″ tall
  • 2X HD resolution rear-projection display (1920 x 2160 pixels)
  • 1200 x 1200 touch resolution from a single GigE IR camera at 30 frames/second
  • System driven by a 3.2GHz quad-core Xeon processor
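For a quick sense of scale, here's a back-of-the-envelope calculation from the specs above (assuming the 1920-pixel axis spans the 42″ dimension), showing that the camera samples touches at a bit over half the display's pixel density:

```python
# Back-of-the-envelope densities from the spec list above.
# Assumption: the 1920-pixel axis spans the 42" dimension of the surface.

display_ppi_w = 1920 / 42  # display pixels per inch, width
display_ppi_h = 2160 / 46  # display pixels per inch, height
touch_spi = 1200 / 46      # touch samples per inch along the long axis

print(f"display: ~{display_ppi_w:.0f} x {display_ppi_h:.0f} ppi")
print(f"touch:   ~{touch_spi:.0f} samples per inch")
```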

The Build

Like the Duke wall, this table uses Direct Illumination (DI) for touch detection…

Direct Illumination (DI)

The physical setup was developed and designed using SolidWorks and constructed using 80/20 framing, which allowed us to adjust the component positions. As we did with the Duke wall, we used the frustums of the two HD projectors to determine the basic form factor. However, in this case, we also use mirrors in the base pan to compress the total envelope of the unit.

Early SolidWorks model of the Multi-Touch Table.

The touch surface screen measures 46″ x 42″ (approx. 62-1/2″ diagonal). This seemed to be a good size for 4-6 users to gather around the table comfortably. Because it had to be moveable, we also had to account for the height of the wheels and the ground clearance required to clear door thresholds. The final height ended up being about 41″, which is a little high for some shorter users, so we have 6″ risers that can be used if necessary. Future units might sacrifice image size to bring this height down, but for this one, we wanted to take advantage of the 2 HD Epson projectors.

Early SolidWorks model of the Multi-Touch Table.

On to Part 2…

Duke Multi-Touch Wall Development - System

November 11, 2008 (19:01) | Duke Multi-Touch Wall | 2 comments

The Duke Multi-Touch Wall uses 8 cameras to detect touches across the large 13.5-foot wide x 5-foot tall screen. These cameras are connected via network cables to a Camera Control Node where a separate instance of Touchlib is running for each camera. The Camera Control Node handles the image processing and blob tracking for each camera separately. These touch events are all then routed to another “composition” process which eliminates duplicate touches due to overlapping camera boundaries and assigns unique touch IDs.

This process then sends the stream of unique touches to the gesture engine (running on the Windows or Linux display machine), which interprets them as gesture events. These gesture events are finally sent to the client application to manipulate windows, data, etc.
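A minimal sketch of what that composition step might look like (hypothetical Python — Touchlib's actual interfaces differ, and the thresholds and coordinate handling here are illustrative assumptions; duplicates from overlapping cameras are merged by a simple distance test):

```python
# Hypothetical sketch of the "composition" step: merge blobs reported by
# overlapping cameras into unique touches, each with its own touch ID.
# Assumption: per-camera calibration has already mapped all blob
# coordinates into one shared screen-space coordinate system.

from math import hypot

MERGE_RADIUS = 10.0  # screen-space distance below which two blobs from
                     # different cameras count as the same physical touch

def compose(per_camera_blobs, next_id):
    """per_camera_blobs: one (x, y) blob list per camera.
    Returns (touches, next_id) where touches maps touch ID -> (x, y)."""
    unique = []  # representative (x, y) points for distinct touches
    for blobs in per_camera_blobs:
        for (x, y) in blobs:
            for i, (ux, uy) in enumerate(unique):
                if hypot(x - ux, y - uy) < MERGE_RADIUS:
                    # Duplicate seen by an overlapping camera: average it in.
                    unique[i] = ((ux + x) / 2, (uy + y) / 2)
                    break
            else:
                unique.append((x, y))
    touches = {}
    for point in unique:
        touches[next_id] = point
        next_id += 1
    return touches, next_id

# Two cameras report the same finger near their shared boundary:
touches, _ = compose([[(100.0, 50.0)], [(103.0, 52.0)]], next_id=0)
```

Note that stable touch IDs across frames would additionally require matching each frame's merged touches against the previous frame's; this sketch only shows the per-frame deduplication.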

The Duke Multi-Touch Wall System.
