Feature Story

Eyewear Startup Introduces Prescription Glasses With "Unlimited" Colors, Looks and Styles
Thursday, 21 May 2015

U.S. eyewear start-up Eye Change has recently introduced a new and unique eyewear system that allows you to invest in just one pair of glasses and have the option of "unlimited" colors, looks and styles. Simply change the 'rims and trims' anytime you want for an entirely new look. An inlaid magnet and tongue-and-groove system ensures a secure attachment for both the front piece, aka the 'rim', and the temples, aka the 'trims'. The company's motto is: "Eye Change...to a new frame of mine!"

Eye Change has partnered with Vision Craft, an Essilor partner lab in Michigan, US. According to Eye Change's website, the company's timeline is as follows: "Manufacture a minimum of 10 more mainframes to choose from within the next 2 months with the goal being 15-20 different mainframes by summer’s end. Surely, with 20 different styles and shapes, there will be a flattering frame for most faces." They also mention: "Beginning early in 2016, we will be opening our first Kiosks in some of the higher end Malls for a ‘more hands’ on experience. This allows you to try on and really experience the many different looks that Eye Change will be offering."

The company is also listed on Kickstarter.

 
Emotion Detecting Eyewear Patent Awarded to Microsoft
Friday, 01 May 2015

Microsoft has recently been awarded a patent, filed in late 2012, for emotion-detecting eyewear. The patent describes it as "a see-through, head mounted display and sensing devices cooperating with the display detect audible and visual behaviors of a subject in a field of view of the device". A processing device in communication with the display and the sensors monitors the subject's audible and visual behaviors by receiving data from the sensors. Emotional states are computed based on these behaviors, and feedback indicating the computed emotional states of the subject is provided to the wearer. During interactions, the device recognizes emotional states in subjects by comparing detected sensor input against a database of human/primate gestures/expressions, posture, and speech. Feedback is provided to the wearer after interpretation of the sensor input.

The technology allows virtual imagery to be mixed with a real-world physical environment in a display. Such systems typically include processing units which provide the imagery under the control of one or more applications.

The patent further outlines how sensors, including depth cameras and a microphone mounted on the nose bridge, pick up visual and audio information from a subject. Cues such as subtle variations in speech rhythm and amplitude, choice of words, type and speed of gestures, eye focus and body posture are processed. All of this data is sent to Microsoft's databases, and an emotional analysis is then fed back to the wearer through the eyewear.
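
As a rough illustration of the recognition loop described above (sense cues, compare them against a stored database of expressions and behaviors, feed a result back to the wearer), here is a minimal Python sketch. The cue names, the reference database and the nearest-neighbour matching are illustrative assumptions, not details of Microsoft's actual implementation.

# Hypothetical sketch of the loop described in the patent text: sensor cues are
# compared against a database of labelled reference patterns and the closest
# match is fed back to the wearer. All names, values and the nearest-neighbour
# matching are illustrative assumptions.
from dataclasses import dataclass
import math

@dataclass
class CueVector:
    speech_rate: float       # normalised variation in speech rhythm
    speech_amplitude: float  # normalised variation in loudness
    gesture_speed: float     # normalised speed of detected gestures
    gaze_stability: float    # 0 = wandering gaze, 1 = fixed eye focus

# A toy "database" of reference cue patterns with emotional labels.
REFERENCE_DB = {
    "calm":     CueVector(0.3, 0.2, 0.2, 0.8),
    "agitated": CueVector(0.9, 0.8, 0.9, 0.3),
    "bored":    CueVector(0.2, 0.1, 0.1, 0.2),
}

def _distance(a: CueVector, b: CueVector) -> float:
    """Euclidean distance between two cue vectors."""
    return math.sqrt((a.speech_rate - b.speech_rate) ** 2
                     + (a.speech_amplitude - b.speech_amplitude) ** 2
                     + (a.gesture_speed - b.gesture_speed) ** 2
                     + (a.gaze_stability - b.gaze_stability) ** 2)

def estimate_emotion(observed: CueVector) -> str:
    """Return the label of the closest reference pattern."""
    return min(REFERENCE_DB, key=lambda label: _distance(observed, REFERENCE_DB[label]))

if __name__ == "__main__":
    # Cues picked up by the depth cameras and nose-bridge microphone (simulated here).
    subject = CueVector(speech_rate=0.85, speech_amplitude=0.7,
                        gesture_speed=0.8, gaze_stability=0.4)
    print("Feedback to wearer:", estimate_emotion(subject))  # -> agitated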

 
Smartphone Eye Exam Service Launches in New York
Monday, 27 April 2015

A service called Blink was launched in New York last week that brings an eye exam to a patient's home or office for US$75, administered by a technician who uses a trio of handheld devices in place of the bulky autorefractor, lensmeter, and phoropter. The technician sends the results to an optometrist, who writes a prescription if necessary and e-mails it to the patient.

Blink is the product of EyeNetra, a startup that has been working for several years on smartphone-connected eye-exam tools. The idea behind Blink is to make eye exams more convenient (and in some cases more affordable) by redesigning the expensive, typically immovable equipment that usually crowds a doctor’s or optometrist’s office into much cheaper, smaller tools that rely on a smartphone to do some of the work.

 
Microsoft Claims New Computer Vision System Outperforms Humans
Tuesday, 17 February 2015

A team of Microsoft researchers has recently published a paper noting that their computer vision system, based on deep convolutional neural networks, had for the first time eclipsed the ability of people to classify objects defined in the ImageNet dataset. Humans are estimated to classify images in the ImageNet dataset with an error rate of 5.1 percent, while Microsoft's team said its deep-learning-based system achieved an error rate of only 4.94 percent.

The researchers investigated neural networks from two aspects, both driven by rectifiers: a parametric form of the rectified activation unit and an initialization method suited to networks built from such units. Rectified activation units (rectifiers) are essential for state-of-the-art neural networks.
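
To make the terminology concrete, the sketch below compares a plain rectifier (ReLU) with a parametric variant (PReLU), in which negative inputs are scaled by a small coefficient that is learned during training. The NumPy implementation and the coefficient value are illustrative assumptions for demonstration, not code from the paper.

# Minimal NumPy sketch comparing a plain rectifier (ReLU) with a parametric
# rectifier (PReLU). In PReLU the negative slope is a learnable coefficient;
# the fixed value used here is an arbitrary assumption for demonstration.
import numpy as np

def relu(x):
    """Standard rectified activation unit: max(0, x)."""
    return np.maximum(0.0, x)

def prelu(x, a=0.25):
    """Parametric rectifier: x for x > 0, a * x otherwise (a is learned in practice)."""
    return np.where(x > 0, x, a * x)

if __name__ == "__main__":
    x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
    print("ReLU :", relu(x))   # [0.  0.  0.  0.5 2. ]
    print("PReLU:", prelu(x))  # [-0.5  -0.125  0.  0.5  2. ]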

However, the authors also emphasize that computer vision still cannot match human vision in general, noting that the computing system has difficulty understanding objects, and with tasks where contextual understanding or high-level knowledge of a scene is required.

 
Students Develop Ultrasonic Glove that Allows Blind People to Sense Distance
Wednesday, 11 February 2015

A prototype of a glove that alerts blind people to the proximity of objects has won first prize in the inaugural Entrepreneurship and Business Competition run by Nottingham University Business School (NUBS) at The University of Nottingham and Sainsbury Management Fellows (SMF). The SenSei Glove uses ultrasonic sensor technology to provide vibration cues on the distance from objects.

The SenSei team set out to create a product that can help blind and partially sighted people sense the distance of objects without having to undergo a lot of training. The SenSei prototype glove features an ultrasonic sensor on the back of the glove. The battery-operated ultrasonic sensor emits different levels of sound depending on how physically close the wearer is to an object. A good analogy is the sound of a car's parking sensor.
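
To illustrate the parking-sensor analogy, the sketch below maps an ultrasonic range reading to a feedback intensity that grows as an object gets closer. The maximum range, the linear mapping and the function names are assumptions for demonstration, not details of the SenSei prototype.

# Illustrative mapping (not the SenSei team's code) from an ultrasonic range
# reading to a feedback intensity, in the spirit of a car's parking sensor:
# the closer the object, the stronger the cue. Range and mapping are assumptions.
def feedback_intensity(distance_cm: float, max_range_cm: float = 200.0) -> float:
    """Return an intensity between 0.0 (nothing in range) and 1.0 (very close)."""
    if distance_cm >= max_range_cm:
        return 0.0
    if distance_cm <= 0:
        return 1.0
    # Intensity rises linearly as the object gets closer.
    return 1.0 - (distance_cm / max_range_cm)

if __name__ == "__main__":
    for d in (250, 150, 75, 20, 5):
        print(f"{d:>3} cm -> intensity {feedback_intensity(d):.2f}")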

The SenSei team is now working on the ergonomics with a view to making the ultrasonic sensor smaller and lighter so that the glove is very comfortable to wear. The team hopes to gain a business development grant to help with the ongoing development.

 
Apple Patents Eye-Tracking User Interface
Thursday, 22 January 2015

Apple has been granted a patent for an advanced gaze-tracking graphical user interface that could one day see implementation in Macs, iPhones, iPads or even a future version of the Apple TV.

The patent filing includes the following description:

"The eye tracking system may allow a user of the graphical user interface (GUI) to navigate or interact with various elements in the GUI, such as a word processor, game, web browser, or any other suitable interactive application, simply by gazing or looking at a particular point on a display of the GUI. For example, the user may gaze at a button on the menu bar of a word processing application, causing the eye tracking system to render a cursor over the button. In certain configurations, the user of the eye tracking system may be able to select the button using any suitable input device external to the display device that is presenting the GUI, such as a track pad or mouse. In other configurations, the user of the eye tracking system may be able to select the button using an input device built-in to the display device presenting the GUI itself, such as a capacitive or resistive touch screen. In yet other configurations, the user of the eye tracking system may be able to select the button using facial gestures or voice input.

In certain configurations, the eye tracking system may persistently render the movable indicator wherever the user looks in the GUI. This rendering of the movable indicator may be accurate to the degree that the movable indicator becomes a stabilized retinal image with respect to the user's eyes. As such, the movable indicator may fade with respect to the user's perception of the GUI. In other words, the movable indicator may no longer be visible to the user. In such situations, it is desirable to restore the user's perception of the movable indicator to counteract this fading effect. Accordingly, the eye tracking system may automatically alter the position, appearance, or both of the movable indicator so that it is no longer a stabilized retinal image and can be perceived by the user."
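
As a rough illustration of the anti-fading behaviour described in the last paragraph of the filing, the Python sketch below renders an indicator at the gaze point and nudges it after the gaze has been stable for a fixed number of frames, so that it does not remain a stabilized retinal image. The class, thresholds and nudge size are hypothetical and are not taken from Apple's patent.

# Hypothetical sketch (not Apple's implementation) of the anti-fading idea:
# when the movable indicator has tracked a stable gaze point for long enough to
# risk becoming a stabilized retinal image, it is offset slightly so the user
# can perceive it again. The frame threshold and nudge size are assumptions.
import random

class GazeCursor:
    def __init__(self, fade_after_frames=120, nudge_px=3):
        self.fade_after_frames = fade_after_frames  # frames of stable gaze before nudging
        self.nudge_px = nudge_px                    # how far to displace the indicator
        self.stable_frames = 0
        self.position = (0, 0)

    def update(self, gaze_point):
        """Render the indicator at the gaze point, nudging it if it would fade."""
        if gaze_point == self.position:
            self.stable_frames += 1
        else:
            self.stable_frames = 0
            self.position = gaze_point

        if self.stable_frames >= self.fade_after_frames:
            # Break the stabilized retinal image by offsetting the indicator.
            self.stable_frames = 0
            dx = random.choice((-self.nudge_px, self.nudge_px))
            dy = random.choice((-self.nudge_px, self.nudge_px))
            return (self.position[0] + dx, self.position[1] + dy)
        return self.position

if __name__ == "__main__":
    cursor = GazeCursor(fade_after_frames=3)
    for frame in range(6):
        # The indicator is displaced once the gaze has been stable for 3 frames.
        print(frame, cursor.update((400, 300)))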

 