Feature Story
Microsoft Claims New Computer Vision System Outperforms Humans
Tuesday, 17 February 2015

A team of Microsoft researchers has published a paper noting that, for the first time, its computer vision system based on deep convolutional neural networks has eclipsed human ability to classify objects in the ImageNet dataset. Humans are estimated to classify ImageNet images with an error rate of 5.1 percent; Microsoft's team said its deep-learning-based system achieved an error rate of only 4.94 percent.

The researchers investigated neural networks from two aspects, both driven by rectifiers. Rectified activation units (rectifiers) are essential to state-of-the-art neural networks.
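As an illustrative sketch (using the standard textbook definitions, not code from the paper), a plain rectifier and a parametric variant with a learned negative slope look like this:

```python
def relu(x):
    # Standard rectified linear unit: passes positives, zeroes negatives.
    return x if x > 0 else 0.0

def prelu(x, a=0.25):
    # Parametric rectifier: negative inputs are scaled by a slope `a`
    # that is learned during training rather than fixed at zero.
    return x if x > 0 else a * x

print(relu(-2.0), relu(1.5))    # 0.0 1.5
print(prelu(-2.0), prelu(1.5))  # -0.5 1.5
```

The only difference between the two is how negative inputs are treated; that small change is the kind of design question the paper investigates.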

However, the authors also emphasize that computer vision still cannot match human vision in general: their system struggles in cases that require contextual understanding or high-level knowledge of a scene.

 
Students Develop Ultrasonic Glove that Allows Blind People to Sense Distance
Wednesday, 11 February 2015

A prototype glove that alerts blind people to the proximity of objects has won first prize in the inaugural Entrepreneurship and Business Competition run by Nottingham University Business School (NUBS) at The University of Nottingham and Sainsbury Management Fellows (SMF). The SenSei Glove uses ultrasonic sensor technology to provide vibration cues about the distance to objects.

The SenSei team set out to create a product that helps blind and partially sighted people sense the distance of objects without extensive training. The prototype features a battery-operated ultrasonic sensor on the back of the glove that emits different levels of sound depending on how close the wearer is to an object. A good analogy is the sound of a car's parking sensor.
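A parking-sensor-style mapping from a range reading to a feedback level could be sketched as follows (a hypothetical illustration; the function name, range, and number of levels are assumptions, not the SenSei team's design):

```python
def feedback_level(distance_cm, max_range_cm=200, levels=4):
    """Return 0 (silent) up to `levels` (most urgent) for a distance reading."""
    if distance_cm >= max_range_cm:
        return 0  # nothing within range, no feedback
    # Divide the range into equal bands; closer bands map to higher urgency.
    band = max_range_cm / levels
    return levels - int(distance_cm // band)

print(feedback_level(180))  # far object -> low urgency
print(feedback_level(20))   # near object -> high urgency
```

The discrete levels would then drive the strength or cadence of the glove's vibration or sound cues.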

The SenSei team is now working on the ergonomics, with a view to making the ultrasonic sensor smaller and lighter so that the glove is comfortable to wear. The team hopes to win a business development grant to fund ongoing development.

 
Apple Patents Eye-Tracking User Interface
Thursday, 22 January 2015

Apple has been granted a patent for an advanced gaze-tracking graphical user interface that could one day be implemented in Macs, iPhones, iPads or even a future version of the Apple TV.

The following description of the patent was submitted:

"The eye tracking system may allow a user of the graphical user interface (GUI) to navigate or interact with various elements in the GUI, such as a word processor, game, web browser, or any other suitable interactive application, simply by gazing or looking at a particular point on a display of the GUI. For example, the user may gaze at a button on the menu bar of a word processing application, causing the eye tracking system to render a cursor over the button. In certain configurations, the user of the eye tracking system may be able to select the button using any suitable input device external to the display device that is presenting the GUI, such as a track pad or mouse. In other configurations, the user of the eye tracking system may be able to select the button using an input device built-in to the display device presenting the GUI itself, such as a capacitive or resistive touch screen. In yet other configurations, the user of the eye tracking system may be able to select the button using facial gestures or voice input.

In certain configurations, the eye tracking system may persistently render the movable indicator wherever the user looks in the GUI. This rendering of the movable indicator may be accurate to the degree that the movable indicator becomes a stabilized retinal image with respect to the user's eyes. As such, the movable indicator may fade with respect to the user's perception of the GUI. In other words, the movable indicator may no longer be visible to the user. In such situations, it is desirable to restore the user's perception of the movable indicator to counteract this fading effect. Accordingly, the eye tracking system may automatically alter the position, appearance, or both of the movable indicator so that it is no longer a stabilized retinal image and can be perceived by the user."
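The fading described above is the perceptual fading of a retinally stabilized image; the patent's counter-measure can be sketched as a periodic nudge of the indicator (a hypothetical illustration, not Apple's implementation; the frame threshold and jitter size are assumptions):

```python
import random

STABLE_FRAMES_LIMIT = 90  # assumed threshold, roughly 1.5 s at 60 Hz

def update_indicator(pos, stable_frames, jitter_px=2):
    """Return (new_pos, new_stable_frames) for one rendering frame.

    Once the gaze-locked indicator has stayed stabilized for too many
    frames, nudge its position slightly so it is no longer a stabilized
    retinal image and becomes visible to the user again.
    """
    if stable_frames >= STABLE_FRAMES_LIMIT:
        dx = random.choice([-jitter_px, jitter_px])
        dy = random.choice([-jitter_px, jitter_px])
        return (pos[0] + dx, pos[1] + dy), 0  # perturbed; counter resets
    return pos, stable_frames + 1
```

Altering the indicator's appearance (size, contrast) instead of its position would serve the same purpose, as the quoted passage notes.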

 
Google Halts Sales of Google Glass Eyewear
Monday, 19 January 2015

Google insists it is still committed to launching the smart glasses as a consumer product, but it will stop producing Glass in its present form and has announced the closure of its Google[x] labs' Glass Explorer Program, which was a kind of "open beta" for developers. Google further stated: "Glass at Work has been growing and we're seeing incredible developments with Glass in the workplace. As we look to the road ahead, we realize that we've outgrown the lab and so we're officially "graduating" from Google[x] to be our own team here at Google. We're thrilled to be moving even more from concept to reality".

As part of this transition, Google is closing the Explorer Program. January 19 will be the last day to get the Glass Explorer Edition. Future versions of Glass are already planned and the related work will be carried out by a different Google division.

 
Eye Tracking Software Allows Users to Build LEGO Sets
Monday, 12 January 2015

Startup company The Eye Tribe has announced that it is launching the world's first Android eye-tracking software development kit at the 2015 International CES in Las Vegas. With this SDK, The Eye Tribe is enabling OEMs and developers on the Android platform to create new experiences and products for the consumer market.

During the show, The Eye Tribe demonstrated its Eye Tracking SDK for Android, a first step toward integrating robust, affordable eye tracking into mobile devices. In addition, the company introduced its new, highly accurate TV Tracker and showcased a project with the LEGO Group that allows users to build their favorite LEGO® sets with eye tracking.

The Eye Tribe is working with the LEGO Group to develop a unique learning tool that provides users with a completely new experience for engaging with their favorite LEGO® sets.

 
Innovative Display Technology Adapts To Optical Prescription
Tuesday, 29 July 2014

Researchers at MIT, Microsoft, and the University of California, Berkeley are developing technology that can adjust an image on a display so that it appears sharp without corrective lenses. The technology uses algorithms to alter the image based on a person's glasses prescription, together with a light filter set in front of the display. The algorithm adjusts the light from each individual pixel so that, when fed through a tiny hole in the plastic filter, the rays reach the retina in a way that re-creates a sharp image. The idea, the researchers say, is to anticipate how the viewer's eyes will distort whatever is onscreen, something glasses or contacts typically correct, and pre-distort the image so that what the viewer sees appears clear.
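The core idea, anticipating the eye's blur and applying its inverse beforehand, can be illustrated with a toy 1-D inverse filter (a hypothetical sketch, not the researchers' algorithm, which also relies on the pinhole filter to control the light field):

```python
import numpy as np

def circular_blur(signal, kernel):
    # Blur via multiplication in the frequency domain (circular convolution).
    # Here `kernel` is a crude stand-in for the eye's defocus blur.
    K = np.fft.fft(kernel, len(signal))
    return np.real(np.fft.ifft(np.fft.fft(signal) * K))

def prefilter(signal, kernel, eps=1e-12):
    # Inverse filter: divide by the kernel's spectrum (eps regularizes
    # against near-zero frequency components).
    K = np.fft.fft(kernel, len(signal))
    return np.real(np.fft.ifft(np.fft.fft(signal) / (K + eps)))

image = np.array([0., 0., 1., 1., 0., 0., 0., 0.])
kernel = np.array([0.5, 0.25, 0.25])  # assumed toy blur kernel

precorrected = prefilter(image, kernel)     # what the display would show
perceived = circular_blur(precorrected, kernel)  # what the eye then sees
print(np.allclose(perceived, image))  # True: the blur undoes the pre-correction
```

In practice the eye's blur kernel is derived from the viewer's prescription, and the pinhole filter is what lets the display steer pre-corrected rays to the retina.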

In addition to making it easier for people with simple vision problems to use all kinds of displays without glasses, the technique may help those with more serious vision problems caused by physical defects that can't be corrected with glasses or contacts, researchers say. This includes spherical aberration, which causes different parts of the lens to refract light differently.

While similar methods have been tried before, the new approach produces a sharper, higher-contrast image. The technology can be tuned for different viewers, but it cannot yet serve several viewers with different vision needs at the same time.

For more information, download the published paper at: http://web.media.mit.edu/~gordonw/VisionCorrectingDisplay/SIG2014-VisionCorrectingDisplay.pdf

 