Monday, July 29, 2013

No keyboard, no touchscreen either

A new device lets you control your computer with hand motions. But its software is inconsistent and frustrating, says David Pogue.
By now, we all know what the future will be like; movies and TV shows have described it in detail. We know about the flying cars (thank you, “Blade Runner”), holograms (thank you, “Star Trek”) and robot butlers (thank you, “Jetsons”).
So when will we really get those technologies? Probably on the 11th of “Don’t hold your breath.”
There is, however, one exception. As of this week, you can buy your own little piece of “Minority Report” and “Iron Man”: controlling your computer by making hand motions in the air.
The Internet has been buzzing about the much-delayed Leap Motion Controller ($80 or Rs 4,726) since its first public demonstrations over a year ago. 
Imagine controlling on-screen objects just by reaching into empty space, just like Tom Cruise! Imagine gesture recognition just like Microsoft’s Kinect game controller, but on a much smaller, more precise scale! Imagine the future, plugged into a USB jack on the Mac or Windows PC you own today!
The Leap Motion sensor is beautiful, tiny and self-contained. It is a sleek, glass-and-aluminum slab (1.2 by 3 by 0.5 inches), with nonskid rubber on the bottom. A single USB cable (both a long one and a short one come in the box) stretches away to your computer; a light comes on when it’s working.
If you have a desktop computer, you put the sensor between your screen and keyboard. If it’s a laptop, you park it on the desk just in front of the keyboard. Soon, Leap says, you’ll be able to buy a PC from HP or Asus that has the sensor built right in.
You download the Leap software, and presto: a somewhat buggy tutorial instructs you to insert your hands into the space – an invisible two-foot cube – that’s monitored by the Leap’s cameras and infrared sensors.
This device is like the Kinect in that it recognises body parts in space. But not only is the Leap far smaller and less expensive, it’s also far more precise. According to the company, it can track the positions of all 10 of your fingers simultaneously, with spatial accuracy of a hundredth of a millimeter – 200 times the precision of the Kinect.
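For readers who want to poke at that tracking themselves, the Leap SDK exposes it through a simple frame-based API. Below is a minimal sketch using the Python bindings that shipped with the 2013 SDK; the Leap module comes bundled with the SDK rather than from PyPI, and the attribute names are as documented there, so treat this as an illustration rather than a definitive recipe.

import Leap  # ships with the Leap Motion SDK

class FingerPrinter(Leap.Listener):
    def on_frame(self, controller):
        # Each frame carries every pointable the sensor currently sees.
        frame = controller.frame()
        for finger in frame.fingers:
            tip = finger.tip_position  # a Leap.Vector, in millimeters
            print("finger %d at (%.1f, %.1f, %.1f)" % (finger.id, tip.x, tip.y, tip.z))

listener = FingerPrinter()
controller = Leap.Controller()
controller.add_listener(listener)  # on_frame now fires for every tracking frame
# ... keep the process alive; call controller.remove_listener(listener) on exit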
And remember, the Leap adds gesture recognition not to your TV, but to your computer. A machine that can run millions of different programs for all different purposes. Games, sure, but also office work. Creative work. Communication. Entertainment. Surely this little wonder is a very big deal.
Unfortunately, it’s not. The Leap’s hardware may be simple, attractive and coherent – but its software is scattershot, inconsistent and frustrating.
The first crushing disappointment is that no software recognises your hand motions unless it’s been specially written, or adapted, for use by the Leap.
There are 75 such apps already on the Leap’s app store, Airspace; some are free, some cost a few dollars. Not all work on both Mac and Windows.
Most are games. In the best of them, you control the action in 3-D space, just as with the Kinect but without having to stand up. For example, Boom Ball ($5 or Rs 295) is the classic Breakout game, where you try to knock out bricks by bouncing a ball against them – but your paddle is attached to your finger in vertical space.
In Disney’s clever Sugar Rush ($2 or Rs 118), a spinoff from the “Wreck-It Ralph” movie, you bake yourself a racing car shaped like a wedge of cake, and then steer it by holding both sides of an invisible steering wheel. When you play Dropchord ($3 or Rs 177), you hold two index fingers out in space; you’re defining a line between them that you use to slice dots and avoid X’s. Cut the Rope is here, too (free).
There are some interesting music-making programs, which makes sense, since hand motions are generally associated with playing instruments. Air Harp ($1 or Rs 59) is just what it sounds like. Chordion Conductor is a sweet-sounding arpeggiator (generates music from chords you select).
A few simple educational apps are available, like Molecules (rotate molecules on the screen; free), Cyber Science 3D (pull apart a skull; free) and Frog Dissection (you guessed it; $4 or Rs 236).
There are some stunningly obvious problems with all of this. First, the motions aren’t consistent from game to game. In some apps, “clicking the mouse” involves jabbing toward something on the screen; in others, you have to point to the object and remain perfectly still while a cursor “fills up.” As on the Kinect, that waiting period is presumably to avoid accidental selections. But keeping your finger absolutely motionless in space is darn hard.
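That “fill up” mechanic is a classic dwell-to-click scheme, and writing it down makes clear why it feels so fragile. Here is a toy sketch in Python; the radius and timing thresholds are invented for illustration, not taken from any Leap app.

import math
import time

DWELL_RADIUS = 8.0   # millimeters of drift tolerated while “holding still” (invented)
DWELL_TIME = 1.0     # seconds the finger must stay put to register a click (invented)

class DwellClicker:
    def __init__(self):
        self.anchor = None   # (x, y) where the current hold began
        self.start = 0.0

    def update(self, x, y, now=None):
        """Feed fingertip samples; returns True once per completed dwell."""
        now = time.time() if now is None else now
        if self.anchor is None or math.hypot(x - self.anchor[0], y - self.anchor[1]) > DWELL_RADIUS:
            # The finger drifted outside the tolerance circle: restart the timer.
            self.anchor, self.start = (x, y), now
            return False
        if now - self.start >= DWELL_TIME:
            self.anchor = None   # fire once, then require a fresh hold
            return True
        return False

Every tremor beyond DWELL_RADIUS resets the clock, which is exactly why holding a fingertip “absolutely motionless” in free space is so punishing.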
In some apps, you go back to the previous screen with a quick finger-swipe to the left. In others, you waggle your hand horizontally.
There’s just no rhyme or reason. Learning the gestures in one app gives you zero applicable skills for the next one.
The finger-tracking itself works inconsistently, too. In some apps, the computer seems to know precisely where your fingers are; in others, you feel ignored or misunderstood. The result is mushy frustration, like trying to play the guitar wearing boxing gloves.
The second huge problem: Half these apps would work much better with mouse and keyboard, or at least a touch screen. In Neko’s Puddle game ($5 or Rs 295), you conduct dripping liquids along various surfaces by tilting them, but the arrow keys would do just as well.
The third huge problem has to do with the app Touchless. It’s the one app that goes beyond gimmicky demoware and games, the one that comes closest to the promise of “Minority Report.” It’s an app that lets you operate the cursor and scroll windows right in your operating system (OS X or Windows).
The thing is, when your finger is up in the air, how is the software supposed to know when you’ve “clicked” something? There’s no glass to tap, as on a touch screen, and no button to click, as on a mouse.
Leap’s solution: there’s an arbitrary invisible wall in space over the sensor. You can’t see it, you can’t feel it – but when your finger crosses that line, you’ve just clicked.
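In code, that invisible wall is little more than a threshold on one axis. A sketch of the idea follows; the plane position is an invented constant, and it assumes the Leap convention that the z axis points toward the user, so pushing toward the screen means z decreasing.

CLICK_PLANE_Z = 0.0   # invented plane position, in millimeters from the sensor origin

class PlaneClicker:
    def __init__(self):
        self.behind_plane = True   # is the fingertip still on the user’s side?

    def update(self, tip_z):
        """Feed fingertip z samples; returns True exactly when the plane is pierced."""
        clicked = self.behind_plane and tip_z < CLICK_PLANE_Z
        self.behind_plane = tip_z >= CLICK_PLANE_Z
        return clicked

Since nothing marks where the plane sits, every “click” is a guess about geometry you cannot see – which goes a long way toward explaining the imprecision.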
Painting programs like Corel Painter Freestyle (free) require the same technique.
As you’d guess, this method is a disaster: fussy, frustrating, imprecise. The motion gesture you’ll probably use at this point is throwing the stupid thing out the window.
In short, the Leap Motion Controller is a solution in search of a problem. It desperately needs a killer app, some program that couldn’t exist without the Leap’s special talents: tracking the three-dimensional motions of 10 individual fingertips with incredible precision. Most apps don’t even avail themselves of that 10-finger tracking; they detect only a single finger or hand.
It’s very exciting that the Leap controller has attained excellent hardware design, high precision, excellent speed and low price – today. Unfortunately, the software that will justify its existence still lies in the future.

Friday, July 26, 2013

Berkeley researchers develop flexible touch screen

Imagine living in a world where paper-thin computer screens cover walls instead of paint. They would respond to your very presence in a room, attuned to and controllable by your every movement — even your facial expressions.
Engineers at UC Berkeley and Lawrence Berkeley National Laboratory have taken the latest step toward that by developing a thin, pliable plastic material called “e-skin” that is interactive like a touch screen but more flexible than the rigid glass or silicon currently used in smartphones and computers.
The team of researchers, led by UC Berkeley professor of electrical engineering and computer sciences Ali Javey, created a working stamp-sized prototype that responds to touch by emitting light. The work was published in the academic journal Nature Materials on Sunday.
E-skin is not a synthetic replacement for human skin, despite what the name may suggest, Javey said. E-skin will primarily be used in electronics and aims only to mimic human skin in computer displays by responding similarly to touch, temperature and light.
“From the engineering point of view, human skin is an interface that provides us with information,” Javey said.
When pressure is applied to e-skin, its surface lights up with organic light-emitting diodes, an advanced form of LEDs. The intensity of the light varies with the intensity of pressure exerted on the skin, said Chuan Wang, a professor at Michigan State University who co-authored the publication and worked on the research as a UC Berkeley postgraduate.
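The behaviour Wang describes – brighter light under harder presses – amounts to a monotone mapping from pressure to drive level. A toy model of that relationship is sketched below; the thresholds are invented for illustration and have nothing to do with the actual device.

def oled_level(pressure_kpa, p_min=5.0, p_max=60.0):
    """Map a pressure reading to a 0-255 OLED drive level (toy model)."""
    if pressure_kpa <= p_min:
        return 0   # below the turn-on threshold, the pixel stays dark
    fraction = min(1.0, (pressure_kpa - p_min) / (p_max - p_min))
    return int(round(255 * fraction))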
According to Wang, prior to using OLEDs, the only way to measure the pressure being applied to a surface was to measure electrical currents, which required extensive equipment. Integrating light sensors into e-skin has simplified the process of detecting pressure by eliminating the need for electrical boards or a computer.
The researchers are now working to have e-skin respond to light and temperature as well.
But there are still significant challenges to overcome before the use of plastic as a platform for electrical systems can advance.
“Existing equipment is designed to only process rigid substrates, like silicon,” Javey said. “For plastic substrates, we have to develop new equipment in order to build complex electrical structures.”
Researchers believe e-skin could be used in large-scale applications in the future. It may ultimately be possible to cheaply print e-skin on regular printers using metallic ink and thin plastic.
Dae-Hyeong Kim, a professor of bioengineering at Seoul National University who does research on materials similar to e-skin, said Javey’s material is significant because it reacts by emitting light – a more natural way of interacting with humans than traditional electronic sensors.
“The recent work from Professor Javey’s group has a great meaning in that it realized the interactive mode of artificial electronic skin,” he said.

Wednesday, July 24, 2013

GP attractions only a touch screen away

Thousands of pamphlets are now at your fingertips, ready to be flipped through with a single touch of a screen, as the Grande Prairie Region Tourism Association officially unveiled its new information kiosk at the Grande Prairie Airport.
The kiosk, which took six weeks to design and assemble, was presented at the airport Wednesday for travellers to use while mapping out their vacations in the area.

“We want to be a one-stop location for finding out all the current events, where they are, what’s happening, photos of them, locator maps to work out how to get to them. So this is one ideal spot for us to be promoting all those things that happen around the region,” said Ainsley Lamontagne, executive director of the Grande Prairie Region Tourism Association.
“There’s also an interactive map so you can tap on shopping centres, restaurants, museums, the art gallery and it gives you directions from the airport. It maps it out for you, how far away it is, how hard it is to get to and it’s actually attached to Google Maps.”
The kiosk, which looks like a larger version of an iPad, is a laptop-shaped touch screen on a stand with various categories such as hotels, restaurants and upcoming events to search through for planning out the perfect vacation while staying in the city. Lamontagne said anyone who has any experience using an iPhone should be able to use the machine.
“There’s not a whole lot of different icons to choose from… you can always come out of any page directly onto what you’re looking for,” she said. “It’s not cluttered, it’s very simple: The hotels, the attractions, the maps, the events and nominating a Service Superstar.”
All the information available on the kiosk is updated from the Grande Prairie Region Tourism Association’s location at Centre 2000, via its website and WiFi.
“We’re developing surveys to go onto the kiosk so… if you provide information about how long your stay was in Grande Prairie, we can use that to clarify what our visitors are doing, then you are entered into competitions and so we’re developing those and we’re really quite excited. It’s going to be an even more useful tool the more we get to know it,” added Lamontagne.
The kiosk was set up at Centre 2000 for Travel Alberta staff to become familiar with, and to see what changes would need to be made, before it was officially launched this week.
Something else to be proud of, besides the fact that this is the first kiosk of its kind to be put to use in the Grande Prairie area, is that the machine was built and designed by a local company, Broca Media. Funding and development for the project was also handled through Destination Marketing Properties in Grande Prairie. Broca Media is also working on another information touch screen kiosk for Evergreen Park.
“The one out at Evergreen Park is specifically built around Evergreen and its area,” said Karna Germsheid, owner of the company.
“So out there, our users are going to be using it to find a little bit more information that they didn’t realize was there – like a lot of people go out to the TEC centre and don’t realize that there’s the bar and the casino to eat at, and things like that.”
As well as promoting the different activity bases at the park, the kiosk will also help visitors find their way around and keep track of different events in and around the area.
Currently, the Evergreen Park kiosk is nearing completion.
“We’re just at that point of finalizing the final content. Everything has been built to this point so the project is almost complete, we just need to bring the kiosk and the touch screen together,” she said.
The kiosk at Evergreen Park, Germsheid said, should be up and running at the location within a week to 10 days.

Monday, July 22, 2013

Touchscreen Devices Will Soon Be Able to Identify Fingerprints

A new touchscreen display, which is capable of identifying fingerprints, is closer to reality.
Not only will the display redefine online security, but it could also revolutionise the way in which humans and computers interact in the public sphere.
Current touchscreens emit light but are not able to sense it, which makes it impossible to identify fingerprints unless a supplemental sensor is added.
Researcher Christian Holz of the Hasso Plattner Institute in Germany told New Scientist that these touchscreens cannot scan fingerprints, and that fingerprint sensors are not able to show images.

Thursday, July 18, 2013

Apple Touchscreen Dashboard For Cars Gets Patent Treatment

An Apple touchscreen dashboard for cars may be on the way thanks to a new patent filed by the tech giant.
The patent would turn the manual controls in a user’s car into touchscreen-capable options via an included display.
The patent filing arrives less than one month after Ford announced that it would start transitioning back to physical buttons, which lead to less driver distraction and confusion compared to touchscreen-based vehicle operating systems.
The new Apple patent for the dashboard touch screen is the continuation of a patent the tech firm filed in 2011.
At Apple’s developer conference this year company executives talked about bringing iOS to vehicles. The company at the time of its announcement didn’t give much insight into the platform it was creating for cars, vans, and other modes of transportation.
Apple could use AirPlay to beam content from a user’s phone to their vehicle’s display.
“iOS in the Car” is expected to make a 2014 debut and will work with iPhone 5 and likely iPhone 5S smartphones. Users will be able to make phone calls, send and receive messages, access their music, and get directions right from the dashboard.
Apple will also include Siri for hands-free navigation and information retrieval. A program known as “Siri Eyes Free” is currently available in GM’s Chevy Spark and Sonic via the Chevrolet MyLink system. Other manufacturers currently working to bring the hands-free Siri platform to market include Toyota, Honda, Audi, Mercedes, BMW, Land Rover and Jaguar.
I personally hope they don’t try to pull some weird kind of monthly payment switch that only serves to lock customers into more Apple-supported contracts.
Are you ready for Apple to offer a touchscreen dashboard for cars?

Wednesday, July 17, 2013

Tobii’s eye-tracking proves that the best kind of touchscreen is no touchscreen at all

Even if you’ve never used Tobii’s new eye tracking computer, it still feels like you have.
Seconds into using the device, the whole experience comes together: Glance at an onscreen object, click, and it opens. The entire process is so surprisingly fluid that you barely realize you’re not using a mouse.
Carl Korobkin, Tobii’s business development vice president, says that this experience exposes one of the fundamental realities about how we interact with devices today: It’s all really inefficient.
“With smartphones now, you’re touching the screen, but you’re already touching the screen with your eyes. Why reach out and touch something if I’ve already looked at it? You really don’t need that mechanical process anymore,” he said earlier today.
It’s tough to argue with that logic. Like touch input and voice, eye-tracking breaks down the abstraction of interfaces between you and your devices. Why use a mouse – or even a touchscreen – when you can just look at what you want to interact with? Why type your search queries when you can say them?
(Tobii, however, isn’t ditching the idea of touch entirely: Its laptop prototype features a pressure-sensitive touchpad developed by Synaptics, which replaces the kind of clickable touchpads found in current laptops.)
While eye-tracking has been around for years, its underlying technology is finally getting small enough that manufacturers can implement it in smaller devices – including laptops, tablets, and, yes, smartphones. Two years ago, none of this would have been possible, but by next year, it’ll be everywhere.
And the possibilities are exciting. Imagine if Amazon created a version of its Kindle app that followed your vision, highlighting words as you read along, or even a horror game designed to make monsters pop up on the screen based on where you’re looking. In a more subtle example, imagine if you could simply dismiss dialog boxes with a look instead of a click.
“Our goal is to make the entire computing process 10 percent faster and better — and that’s huge. Our number one application is everything,” Korobkin said.
A good example of how designers are implementing eye-tracking comes from Samsung, which developed a feature called Smart Stay alongside the Galaxy S III. Using what Korobkin calls a “rudimentary” form of eye-tracking, Smart Stay monitors the user’s gaze to determine when to dim the device’s screen.
Korobkin, though, wants to take that idea further. “We know very accurately if you’re looking at the screen, so it’s the best screen management you can have,” he said.
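The logic behind a feature like Smart Stay is simple enough to sketch: keep the screen bright as long as a gaze sample has landed on it recently. A minimal version follows; the timeout is an invented value, and gaze_on_screen stands in for whatever the eye tracker actually reports.

import time

DIM_AFTER = 10.0   # seconds without a gaze on the screen before dimming (invented)

class SmartStayMonitor:
    def __init__(self):
        self.last_gaze = time.time()

    def on_sample(self, gaze_on_screen):
        """Feed tracker samples; returns True while the screen should stay bright."""
        now = time.time()
        if gaze_on_screen:
            self.last_gaze = now
        return (now - self.last_gaze) < DIM_AFTER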
While Tobii’s eye-tracking laptop is still a prototype, Korobkin says that computer manufacturers are already looking for ways to add the technology to their devices. What’s more, Korobkin argues that the rise of eye-tracking is going to make current touchscreen-equipped laptops look really crude in comparison.
“Microsoft is trying to bridge the gap, but it turns out that eye-tracking really is the ideal form of touch,” Korobkin said.
Like touchscreens before it, eye-tracking is in many senses an inevitable step toward a future where the division between humans and computers is blurred into nonexistence. For most of computing history, we’ve been reading our devices. Now our devices are finally reading us.

Tuesday, July 16, 2013

U.S. driver safety group sues Ford over touch-screen systems

(Reuters) – Ford Motor Co has been hit with a proposed class action claiming it neglected to fix defects in vehicle touch-screen control systems that create safety hazards for drivers.
The lawsuit was filed Monday in the U.S. District Court for the Central District of California by the Center for Defensive Driving, a non-profit driver safety group. The plaintiffs are bringing the lawsuit on behalf of customers who purchased or leased Ford vehicles equipped with a MyFord Touch system, as well as variations like MyLincoln Touch and MyMercury Touch.
According to the lawsuit, customers have complained that the system freezes up, malfunctions, blacks out and fails to connect with mobile devices. The complaint said system flaws have created “significant safety risks” for drivers, diverting their attention from the road when the product malfunctions and failing to contact 911 during emergencies as designed.
Ford launched MyFord Touch in vehicles in 2010. The system was designed to centralize audio, navigation, climate, mobile-device, entertainment and safety controls through LCD interfaces powered by Microsoft’s Sync operating system. The MyFord Touch system can be controlled via a touch-screen panel, voice commands or by buttons on the steering wheel.
Since its launch, however, MyFord Touch and other so-called infotainment systems in Ford vehicles have been an “unmitigated disaster,” plaintiffs said in the lawsuit.
A representative for Ford declined to comment, citing the pending litigation.
Ford, the second-largest U.S. automaker, reported 400 problems with its MyFord Touch system for every 1,000 vehicles in November 2012. The company previously said it aims to lower that number to 360 by August.
Automakers have struggled to create easy-to-use and effective touch-screen systems that integrate entertainment and navigation systems.
Ford has faced public criticism over its systems from some customers. At least two websites, including syncsucks.com, have been set up to chronicle customers’ problems with the systems, the complaint said. Customers also have lodged complaints with the National Highway Traffic Safety Administration over the system, according to the lawsuit.
“In theory, MyFord Touch is a brilliant idea and worth the premium that Ford charged its customers for the system,” plaintiffs’ lawyer Steve Berman of Hagens Berman Sobol Shapiro said in a statement. “In reality, the system is fundamentally flawed, failing to reliably provide functionality, amounting to an inconvenience at best, and a serious safety issue at worst.”
Ford said in a June press release that Sync and MyFord Touch were sold on nearly 80 percent of 2013 Ford vehicles, up from 68 percent in 2012.
Ford has issued several updates to address issues with the system, but the complaint said that the updates failed to address plaintiffs’ problems.
The company said in June that it is also planning to add buttons and knobs to MyFord Touch systems in future vehicles.
Plaintiffs are seeking a court order compelling Ford to recall or replace the systems, as well as damages.

Monday, July 15, 2013

SHARP to Introduce PN-K322B Touchscreen 4K Ultra HD LED Monitor

Accurate Onscreen Handwritten Text Input and Multi-Touch Operation on a 4K Ultra HD Professional Display

MAHWAH, N.J., July 15, 2013 /PRNewswire/ – Sharp Imaging and Information Company of America (SIICA), a division of Sharp Electronics Corporation, will introduce a new 32-inch-class (31.5″ diagonal) LCD LED monitor, the PN-K322B. The thinnest in its class, this professional-use monitor features a Sharp-developed high-sensitivity, high-precision touchscreen and delivers 4K Ultra HD resolution (3,840 x 2,160 pixels) – four times the pixel resolution of Full HD.
As previewed at InfoComm and CE Week earlier this summer, the PN-K322B is Sharp’s latest 4K Ultra HD monitor. Its high-precision touchscreen allows accurate onscreen handwriting of fine text and lines, with writing performed via a dedicated touch pen with a pen-tip width of just 2 mm. The display also supports up to 10-point multi-touch operation.
“4K ultra high definition monitors offer a clear, near lifelike picture, and because of this, we are seeing a growing demand for these displays in many industries, including graphic and video content creation and editing. The touchscreen feature takes these displays one step further, widening their applications to these markets and others,” said Mike Marusic, Senior Vice President, Business Solutions Group. “We are looking to expand this cutting-edge technology into additional sectors that could benefit from its clarity and interactivity, such as financial services, retail and museum/art exhibitions.”
Built with Sharp’s IGZO technology and an edge-lit LED backlight, the PN-K322B boasts a slender profile with a thickness of just 36 mm. A stand, included with the monitor, allows it to slide easily between two angles depending on the application: vertical for viewing or low-angle for onscreen writing and touchscreen operation.
The PN-K322B features a palm cancellation function that prioritizes pen input even when the user’s hand is resting on the touchscreen. Input connectors on the PN-K322B are compatible with the latest DisplayPort™ and HDMI® interface specifications, enabling the monitor to display ultra high definition content delivered from a PC via a single-cable connection.
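Palm cancellation of this kind generally comes down to filtering touch contacts by size and priority whenever the pen is active. Sharp has not published its algorithm, so the sketch below is purely illustrative; the field names and the area threshold are invented.

PALM_AREA_MM2 = 400.0   # contacts larger than this are treated as a resting palm (invented)

def filter_contacts(contacts, pen_active):
    """contacts: a list of dicts like {'area': 120.0, 'is_pen': False}.
    While the pen is down, drop palm-sized contacts so only the pen draws."""
    if not pen_active:
        return contacts
    return [c for c in contacts if c['is_pen'] or c['area'] < PALM_AREA_MM2]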