Why are touch screens touch “screens”?

It’s getting to be that time of year. No, not pumpkin everything – although it is that time of year – no, I’m talking about the run-up to CES. This is the time of year when companies start to drop tantalizing tidbits of information about what we might see. It’s probably not the best marketing strategy, since this is also the time of year when consumer electronics companies are rolling out products for the holiday season. I have dedicated myself to wading through tons of news feeds to find technology elements that I think have a place in advertising. Thanks to Jesse Emspak, I think I’ve come across another one – using sound to enable anything to respond to touch.

Touch screens are so ubiquitous that physical keyboards are becoming a thing of the past, at least for mobile devices. Now imagine if the capability of touch spread from the display to the entire device, allowing control by gently pressing on any part of the phone, or even making any household item into a touch-sensitive interface with your computer. Makoto Ono, Buntarou Shizuki, and Jiro Tanaka of the University of Tsukuba in Japan demonstrated an acoustic touch system at the ACM Symposium on User Interface Software and Technology (UIST) in St. Andrews, Scotland. Their innovation lets objects link with computers by taking advantage of how objects respond to sound.

Anything solid vibrates in a specific way when it’s struck by another object or by sound waves; this characteristic is called resonance. For example, when you tap on a crystal glass, it vibrates at a certain frequency, producing a ring. If you hit it with sound waves — for example, the ambient background noise in a room — it vibrates in response, most strongly at its resonant frequencies. Grip the glass while it rings, and the sound stops. Ono and his team built on this phenomenon by first attaching a microphone and a tiny speaker to a simple object. They linked the speaker to a sound generator and produced tones — largely inaudible — of 20,000 to 40,000 cycles per second (20–40 kHz). The sound waves from the tones hit the object and set it vibrating while it sat in its untouched, resting state. The computer analyzed the object’s resonance and used that as a reference point.
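For the technically curious, here is a minimal sketch of that reference step in Python. It assumes a sound card that can play and record at 96 kHz (needed to cover 20–40 kHz) and leans on off-the-shelf libraries (numpy, scipy, sounddevice); the helper names are mine, not the researchers’, so treat it as an illustration of the idea rather than their actual code.

```python
# Sketch of the "reference fingerprint" step: sweep an inaudible tone
# through the object and record how it resonates while untouched.
import numpy as np
from scipy.signal import chirp
import sounddevice as sd

FS = 96_000          # sample rate, Hz; must exceed 2 x 40 kHz (Nyquist)
DURATION = 0.5       # seconds per sweep

def make_sweep():
    """Linear frequency sweep from 20 kHz to 40 kHz -- largely inaudible."""
    t = np.linspace(0, DURATION, int(FS * DURATION), endpoint=False)
    return chirp(t, f0=20_000, f1=40_000, t1=DURATION, method="linear")

def capture_spectrum():
    """Play the sweep through the attached speaker while recording the
    object's vibration with the microphone, then return the normalized
    magnitude spectrum of the response."""
    recording = sd.playrec(make_sweep(), samplerate=FS, channels=1)
    sd.wait()                                   # block until playback/record done
    spectrum = np.abs(np.fft.rfft(recording[:, 0]))
    return spectrum / (np.linalg.norm(spectrum) + 1e-12)

# With the object untouched and still, this spectrum is the reference point.
reference = capture_spectrum()
```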

Next, the researchers touched the object while it was flooded with sound waves, changing the pattern of resonance. The computer picked that up, “learning” which pattern each touch produced. Touching a small statue on the side, for example, produced a different pattern than touching it on the top. Once the computer learned the patterns, the object could be used as an interface. In their video they show that even a Duplo brick — the large, kids’ version of a Lego — can serve as a simple control for a music program: touching the studs on top of the brick controlled the volume, skipped to another song, or fast-forwarded and rewound.
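That “learning” is ordinary supervised classification: collect labeled resonance spectra for each touch state, then train a classifier on them. The sketch below uses scikit-learn’s SVC as one plausible stand-in (I’m not claiming it is exactly what the team used) and reuses the hypothetical capture_spectrum() helper from the previous sketch; the touch labels and sample counts are invented.

```python
# Sketch of the training step: gather spectra per touch state, fit a
# classifier, then map predictions to commands at run time.
from sklearn.svm import SVC

def collect(label, n=20):
    """Record n spectra while the object is held in one touch state."""
    return [(capture_spectrum(), label) for _ in range(n)]

samples = collect("untouched") + collect("top_stud") + collect("side")
X = [spec for spec, _ in samples]        # feature vectors: magnitude spectra
y = [label for _, label in samples]      # touch-state labels

clf = SVC(kernel="linear").fit(X, y)

# Each new spectrum maps to a touch state, and the touch state to a
# command -- e.g. "top_stud" -> volume up in the Duplo demo.
command = clf.predict([capture_spectrum()])[0]
```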

Taking the idea a step further, say to a phone, is not a big leap. In the lab, Ono and his colleagues showed that a phone could distinguish the subtle changes in resonance that come from holding it in the left hand versus the right. If a mic and speaker were built into a phone case, the phone could be controlled by gripping it in a certain way — perhaps squeezing to make a phone call. The phone could learn whether you are getting ready to take a picture or send a text, and what kind of pressure means it’s just sitting in your pocket — no more accidental “butt dialing.”
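To see how that might look in practice, here is a speculative extension of the same sketch: a loop that polls the resonance pattern and acts only when the classifier is confident, so pocket pressure is ignored rather than misread as a command. The grip labels, threshold, polling interval, and squeeze-to-call mapping are all my own assumptions, not anything the researchers built.

```python
# Speculative grip-sensing loop, building on collect() and
# capture_spectrum() from the earlier sketches.
import time
from sklearn.svm import SVC

grips = ["left_hand", "right_hand", "squeeze", "pocket"]
samples = [s for g in grips for s in collect(g)]
X_grip = [spec for spec, _ in samples]
y_grip = [label for _, label in samples]

grip_clf = SVC(kernel="linear", probability=True).fit(X_grip, y_grip)

def monitor(threshold=0.8, interval=0.25):
    """Poll the phone's resonance and act only on confident detections."""
    while True:
        probs = grip_clf.predict_proba([capture_spectrum()])[0]
        i = probs.argmax()
        if probs[i] >= threshold:
            grip = grip_clf.classes_[i]
            if grip == "squeeze":
                print("placing a call")     # squeeze-to-dial
            elif grip == "pocket":
                pass                        # deliberately do nothing
        time.sleep(interval)
```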