TEXTS FOR ABSTRACTING IN COMPUTER TECHNOLOGY

Text 6. UNIVERSITY OF DELAWARE RESEARCHERS DEVELOP REVOLUTIONARY COMPUTER INTERFACE TECHNOLOGY; FINGERWORKS SYSTEM USES HAND MOTIONS

International Online Conference on Computer Science. For more on FingerWorks, see the web site at http://www.fingerworks.com.

University of Delaware researchers have developed a revolutionary computer interface technology that promises to put the bite on the traditional mouse and mechanical keyboard.

"This is not just a little step in improving the mouse, this is the first step in a new way of communicating with the computer through gestures and the movements of your hands. This is, after all, one of the ways humans interact." John Elias, UD professor of electrical and computer engineering, said.

Elias and Wayne Westerman, UD visiting assistant professor of electrical and computer engineering, have been working on the new interface for about five years and are now marketing their iGesture product through a company called FingerWorks.

 


The project started as a doctoral thesis by Westerman, who was then a UD graduate student working with Elias.

The FingerWorks name fits because the technology uses a touch pad and a range of finger motions to communicate commands and keys to the computer. To open a file, you rotate your hand as if opening a jar; to zoom or de-zoom, you expand or contract your hand.

Elias said the communication power of their system is "thousands of times greater" than that of a mouse, which uses just a single moving point as the main input. Using this new technology, two human hands provide 10 points of contact, with a wide range of motion for each, thus providing thousands of different patterns, each of which can mean something different to the computer.
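To make the combinatorics concrete, the sketch below (Python, with invented names and thresholds, not FingerWorks' actual algorithm) shows how even one simple statistic of the tracked contacts, their spread around a centroid, already separates expanding, contracting and translating hand motions:

```python
from dataclasses import dataclass

@dataclass
class Contact:
    x: float  # pad coordinates, arbitrary units
    y: float

def classify_gesture(before: list[Contact], after: list[Contact]) -> str:
    """Classify a gesture from how a set of tracked contacts moved."""
    if len(before) != len(after) or not before:
        return "none"

    def spread(pts: list[Contact]) -> float:
        # Average distance of the contacts from their centroid.
        cx = sum(p.x for p in pts) / len(pts)
        cy = sum(p.y for p in pts) / len(pts)
        return sum(((p.x - cx) ** 2 + (p.y - cy) ** 2) ** 0.5 for p in pts) / len(pts)

    s0, s1 = spread(before), spread(after)
    if s1 > 1.2 * s0:
        return "zoom-in"    # hand expands, as in the zoom gesture above
    if s1 < 0.8 * s0:
        return "zoom-out"   # hand contracts
    return "translate"      # contacts moved together
```

With five fingers per hand, the same machinery can condition on contact count, rotation, and per-finger trajectories, which is where the "thousands of patterns" come from.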

While much about the computer has changed over the last three decades (greater power, faster speeds, more memory), what has not changed is the user interface.

"For what it was invented for, the mouse does a good job," Elias said. "People accept the mouse and the mechanical keyboard because that's the way it is. But there are limitations in terms of information flow. There is so much power in the computer, and so much power in the human, but the present situation results in a communications bottleneck between the two."

Elias and Westerman have a better idea. "I believe we are on the verge of changing the way people interact with computers," Elias said. "Imagine trying to communicate with another human being using just a mouse and a keyboard. It works, but it is slow and tedious."

Elias said he could envision in the next 10 years "a very complex gestural language between man and machine."

The system is a multi-touch, zero force technology, Elias said, meaning the gestures and movements use all the fingers in a light and subtle manner.

Because of that, the system has a second major advantage over the mouse and mechanical keyboard: it can greatly reduce stress injuries such as tendonitis and carpal tunnel syndrome attributed to traditional computer work.

The company markets both stand-alone touch pads and touch pads built into nonmechanical keyboards. In the keyboards, the keys overlap the touch pad so the operator does not have to move his hands when switching between typing and using the mouse. Rather, everything can be done in a smoother flow of hand motions.

Elias explained the touch pad acts like a video camera, recording the objects touching its surface. An embedded microprocessor then applies an algorithmic process to convert those touches into commands understood by the computer.
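As a rough sketch of that pipeline, assuming the pad delivers each scan as a 2D array of touch intensities (the array values, threshold and helper names below are invented for illustration, since the real firmware is not public):

```python
import numpy as np
from scipy import ndimage

def extract_contacts(frame: np.ndarray, threshold: float = 0.5):
    """Return (row, col) centroids of finger-sized blobs in one pad frame."""
    mask = frame > threshold                  # pixels where something touches
    labels, n = ndimage.label(mask)           # connected-component "blobs"
    return ndimage.center_of_mass(frame, labels, list(range(1, n + 1)))

# One fake 6x8 intensity frame with two touches:
frame = np.zeros((6, 8))
frame[1:3, 1:3] = 0.9   # finger 1
frame[3:5, 5:7] = 0.8   # finger 2
print(extract_contacts(frame))  # two centroids -> a two-finger gesture
```

The extracted centroids, tracked frame to frame, are what a gesture classifier like the earlier sketch would consume.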

"To observers watching somebody use multi-touch, it looks a little like magic," Elias said, illustrating his point on a computer in Evans Hall. "People see lots of things happening on the computer screen but very little hand motion is observed."

He said the system has been designed so the gestures used make sense for the operation being performed. For instance, you cut text with a pinch and paste it with a flick.

Eventually, he said, the computer password could be a gesture known only to the user.
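A minimal sketch of that idea, with hypothetical gesture names and an exact-replay matching rule chosen only for illustration:

```python
# The stored secret is an ordered sequence of gesture names; a login
# attempt must replay it exactly. All names here are hypothetical.
ENROLLED = ["rotate-open", "pinch", "three-finger-swipe"]

def check_gesture_password(observed: list[str]) -> bool:
    """Accept only an exact replay of the enrolled gesture sequence."""
    return observed == ENROLLED

print(check_gesture_password(["rotate-open", "pinch", "three-finger-swipe"]))  # True
print(check_gesture_password(["pinch", "rotate-open"]))                        # False
```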

Elias said people often think that speech recognition systems will become the ultimate user interface. "Voice commands are good for many things but terrible for other things," Elias said, adding he believes there are inherent problems with a speech-only interface.

"If you want to test this claim, you can do so with a perfect speech recognition system-another human being," Elias said. "Put somebody in front of your computer and try to do your work by issuing voice commands to him. You'll quickly find that many common tasks are difficult to do using speech, even though your 'computer interface' understands you perfectly."

Using hand and finger motion to input commands is, for many tasks, much more effective than trying to explain what you want to do in words, he said.

The system is being used at several workstations in Evans Hall, and the reaction is largely favorable. It is something of a challenge for some workers, Elias said, because it is like learning a new language.

Susan Foster, UD vice president of information technologies, said she is impressed with the interface and plans to adopt it for use at several computer sites around campus.

"The device is the result of new thinking about the 'bandwidth' that constrains the physical interaction between operator and computer," Foster said. "It capitalizes on human gestures, which are easy to understand and execute. Once learned, like other motor skills, they are readily retained. The assistive qualities of the device also make it quite useful for those with limitations on upper extremity use."

The plug-and-play device, which requires no special software, should be of particular interest to programmers, graphic designers and editors, Foster said, and she is recommending they consider making use of a new technology that was "born and bred at UD and under continuing development here."


Text 7. NEWER DESIGN OF CLOSE-UP COMPUTER MONITORS INCREASES EASE OF USE

COLUMBUS, OHIO

Eyeglasses with built-in computer monitors could soon be a reasonable alternative to reading text from a traditional computer screen, according to new research from Ohio State University.

Participants in a recent study rated the comfort and performance of these so-called near-eye displays as comparable to that of traditional computer monitors. Near-eye displays are like eyeglasses with a monitor built into the lenses.

"The problems with near-eye devices range from motion sickness to the device's weight to poor image resolution," said James Sheedy, a study co-author and an associate professor of optometry at Ohio State University.

"But the design of such devices is improving, and the subjects in our study found the function and usefulness of the near-eye display similar to that of a regular computer screen."

The research appears in a recent issue of the journal Optometry and Vision Science. Sheedy, who is also the director of the computer vision clinic at Ohio State, conducted the study with Neil Bergstrom, the vice president of business development at Iridigm Display Corporation in San Francisco.

At the time of the study, Bergstrom was the chief technology officer of InViso Corporation, a now-defunct startup company specializing in microdisplays. InViso provided support for the study, and Sheedy served as a consultant to InViso during the study. InViso was acquired by the company Three-Five Systems, Inc. in spring 2002.

The researchers asked 22 subjects to participate in a reading experiment and a separate movement experiment.

The subjects used a total of five different displays to complete the tasks in the reading experiment: a hand-held monocular vision display with an attached cover for the non-viewing eye; a binocular vision display with a holder that wrapped around the subject's head; hard copy with printed text; a flat panel computer screen; and a screen on a hand-held computer.

The participants were asked to perform four trials each of three different reading tasks. The first had subjects read four separate paragraphs of about 325 words each and answer three to four multiple-choice questions at the end of each passage. The second task involved counting the occurrences of an assigned letter in a paragraph of nonsense words. In the final reading task, subjects were instructed to find three out of four occurrences of an assigned three-letter word on a spreadsheet filled with various three-letter words.


The researchers measured how long it took each subject to complete each reading task using the respective visual display. After using each type of display, the subjects were asked if they had experienced any of the following symptoms, and to what degree: headache, eyestrain, sore or irritated eyes, blurry vision, dizziness, nausea, disorientation, neck ache or backache.

Results showed that the performance of the monocular vision display was comparable to the performance of the flat panel screen and hard copy text.

"To our surprise, the reading tasks were completed faster with the monocular display than with the binocular display," said Sheedy. "This may have had to do with how each display fit the user, or the design of the respective device."

However, the subjects did complain about eyestrain more with the monocular display than with any of the other displays.

Performance speeds with the binocular display were about 5 to 7 percent slower than for the other displays, the researchers found. Sheedy suspects that this slower speed may have something to do with how the image is aligned along the line of sight in each lens.
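For illustration, this kind of comparison reduces to mean completion times per display; the numbers below are invented solely to show the computation, not the study's data:

```python
from statistics import mean

# Hypothetical per-trial completion times in seconds.
times = {
    "flat panel": [117, 121, 119],
    "hard copy":  [118, 120, 119],
    "monocular":  [118, 122, 120],
    "binocular":  [126, 128, 127],
}
baseline = mean(times["flat panel"])
for display, trials in times.items():
    slowdown = (mean(trials) / baseline - 1) * 100
    print(f"{display}: mean {mean(trials):.1f} s ({slowdown:+.1f}% vs flat panel)")
```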

The image displayed by a near-eye device may appear to be much farther from the viewer (up to 6 feet away) than an image on a typical computer monitor. The seemingly greater distance makes for easier viewing, Sheedy said. But the image in a near-eye display appears about the size of an average computer screen.

"Traditional displays are constrained by their physical size and are usually about 20 to 30 inches from the eyes," he said.

The second experiment assessed the risk for motion-related symptoms while wearing the binocular head display.

Seated subjects were asked to rotate their heads several times to the left and right, and again up and down. They performed the same task while standing.

"Motion-related symptoms were a large problem in previous studies," Sheedy said. "Participants didn't have much problem with motion sickness

in this study, probably due in part to the nature of the tasks they were asked to perform.

"Most of the previous studies on near-eye displays used video movement or virtual reality tasks that created movement on the virtual display. These kinds of tasks are more likely to cause queasiness."

 


Nor were the participants in the current study fully immersed in the image. That is, neither the monocular nor binocular displays blocked the user's peripheral vision, so he could focus on the image in front of him and still see his surroundings.

"Being able to see the real environment while wearing a display gives the user a visual reference that can help lessen confusion when the eye sees the image move," Sheedy said. "For most of the common uses, the user wouldn't want to be fully immersed in the virtual environment."

Although several prototypes exist, near-eye displays have yet to become common. It's only a matter of time before they do, Sheedy said. He doesn't see such devices becoming a commodity in the office, but he does predict that they'll find a place in business and industry.



 