New Visual Software Shows Animal-eye View of the World
Monday, 17 August 2015

New camera technology that reveals the world through the eyes of animals has been developed by University of Exeter researchers in the UK. The details are published in the journal Methods in Ecology and Evolution.

The software, which converts digital photos to animal vision, can be used to analyse colours and patterns and is particularly useful for the study of animal and plant signalling, camouflage and animal predation, but could also prove useful for anyone wanting to measure colours accurately and objectively.

The software has already been used by the Sensory Ecology group in a wide range of studies, such as colour change in green shore crabs, tracking human female face colour changes through the ovulation cycle, and determining the aspects of camouflage that protect nightjar clutches from being spotted by potential predators.

Until now, there has been no user-friendly software that enables researchers to calibrate their images, incorporate multiple layers (visible and UV channels), convert them to animal colour spaces, and measure images easily. Instead, researchers have needed to do much of this manually, including the sometimes complex programming and calculations involved. This freely available, open-source software now offers a user-friendly solution.

Colour vision varies substantially across the animal kingdom, and can even vary within a given species. Most humans and Old World monkeys have eyes sensitive to three colours (red, green and blue), one more than most other mammals, which are sensitive only to blue and yellow. It is impossible for humans to imagine seeing the world in more than three primary colours, yet this is common in most birds, reptiles, amphibians and many insects, which see in four or more. Many of them can also see into the ultraviolet range, a world completely invisible to us without the use of full-spectrum cameras. So scientists studying these species need to measure UV to understand how these animals view the world.

Using a camera converted to full spectrum sensitivity, one photograph taken through a visible-pass filter can be combined by the software with a second taken through an ultraviolet-pass filter. The software can then generate functions to show the image through an animal's eyes. The researchers have provided specific data on camera settings for commonly studied animals, such as humans, blue tits, peafowl, honey bees, ferrets and some fish.
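For readers curious about the mechanics, the minimal sketch below illustrates the general idea in Python: a visible-pass and a UV-pass photograph are stacked into a four-channel image and mapped to an animal's receptor responses with a pre-fitted linear transform. It is not the authors' software, and the channel assignments and matrix values are purely illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not the authors' software): combine a visible-pass and a
# UV-pass photograph and map the camera channels to an animal's cone-catch
# values with a hypothetical, pre-fitted linear transform. Channel order and
# the matrix values are illustrative assumptions, not published calibration data.

def to_animal_colour_space(visible_rgb, uv_rgb, mapping):
    """Stack visible R, G, B with a UV channel and apply a linear cone-catch map.

    visible_rgb, uv_rgb : (H, W, 3) linearised, exposure-equalised images
    mapping             : (n_receptors, 4) matrix fitted elsewhere from spectral data
    """
    uv = uv_rgb[..., 2]                      # assume the UV signal sits in the blue channel
    stacked = np.dstack([visible_rgb, uv])   # (H, W, 4): R, G, B, UV
    flat = stacked.reshape(-1, 4)
    cone_catches = flat @ mapping.T          # (H*W, n_receptors)
    return cone_catches.reshape(*stacked.shape[:2], mapping.shape[0])

# A made-up 4-receptor mapping for a tetrachromatic, bird-like viewer.
example_mapping = np.array([
    [0.02, 0.05, 0.10, 0.83],   # UV-sensitive cone
    [0.05, 0.15, 0.70, 0.10],   # short-wave cone
    [0.10, 0.75, 0.12, 0.03],   # medium-wave cone
    [0.80, 0.15, 0.04, 0.01],   # long-wave cone
])

visible = np.random.rand(4, 4, 3)            # stand-ins for real calibrated photos
uv = np.random.rand(4, 4, 3)
bird_view = to_animal_colour_space(visible, uv, example_mapping)
print(bird_view.shape)                       # (4, 4, 4): one layer per receptor type
```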

Flowers often look particularly striking in UV because they are signalling to attract pollinators that can see in UV, such as bees. UV is also often important for birds, reptiles and insects in their colourful sexual displays to attract mates.

The software is available at: http://www.jolyon.co.uk/myresearch/image-analysis/image-analysis-tools/

 
New Bionic Lens Could Provide "Super" Vision
Friday, 29 May 2015

Employing state-of-the-art materials and production techniques, Canadian company Ocumetics Technology Corporation has recently announced the development of one of the world's most advanced intraocular lenses: one capable of restoring quality vision at all distances, without glasses, contact lenses or corneal refractive procedures, and without the vision problems that have plagued current accommodative and multifocal intraocular lens designs.

The Ocumetics Bionic Lens could one day transform the eyecare world, as the custom lenses can reportedly provide vision three times better than 20/20. Instead of needing eyeglasses or contacts, a person with the surgically implanted lenses would see better regardless of how poor their vision was beforehand.

"This is vision enhancements that the world has never seen before," said Dr. Garth Webb, optometrist and CEO of Ocumetics Technology. "If you can just barely see the clock at 10 feet, when you get the Bionic Lens you can see the clock at 30 feet away."

The surgery takes around eight minutes, is similar to cataract surgery, and would immediately improve vision. Dr. Webb presented the lens to 14 top ophthalmologists in San Diego the day before an annual gathering of the American Society of Cataract and Refractive Surgery.

 
Eyewear Startup Introduces Prescription Glasses With "Unlimited" Colors, Looks and Styles
Thursday, 21 May 2015

U.S. eyewear start-up company Eye Change has recently introduced a new and unique eyewear system that allows you to invest in just one pair of glasses and have the option of "unlimited" colors, looks and styles: simply change the 'rims and trims' anytime you want for an entirely new look. An inlaid magnet and tongue-and-groove design ensures a secure attachment for both the front piece, aka the 'rim', and the temples, aka the 'trims'. The company's motto is: "Eye Change...to a new frame of mine!"

Eye Change has partnered with Vision Craft, an Essilor partner lab in Michigan, US. According to Eye Change's website, their expected timeline is: "Manufacture a minimum of 10 more mainframes to choose from within the next 2 months with the goal being 15-20 different mainframes by summer’s end. Surely, with 20 different styles and shapes, there will be a flattering frame for most faces." They also mention: "Beginning early in 2016, we will be opening our first Kiosks in some of the higher end Malls for a ‘more hands’ on experience. This allows you to try on and really experience the many different looks that Eye Change will be offering."

The company is also listed on Kickstarter.

 
Emotion Detecting Eyewear Patent Awarded to Microsoft
Friday, 01 May 2015

Microsoft has recently been awarded a patent, filed in late 2012, for emotion-detecting eyewear. The patent describes it as "a see-through, head mounted display and sensing devices cooperating with the display detect audible and visual behaviors of a subject in a field of view of the device". A processing device communicating with the display and the sensors monitors the audible and visual behaviors of the subject by receiving data from the sensors. Emotional states are computed based on these behaviors, and feedback is provided to the wearer indicating the computed emotional states of the subject. During interactions, the device recognizes emotional states in subjects by comparing detected sensor input against a database of human/primate gestures/expressions, posture and speech. Feedback is provided to the wearer after interpretation of the sensor input.

The technology allows virtual imagery to be mixed with a real-world physical environment in a display. Such systems typically include processing units which provide the imagery under the control of one or more applications.

The patent further outlines how sensors, including depth cameras and a microphone mounted on the nose bridge, pick up visual and audio information from a subject. Details such as subtle variations in speech rhythm and amplitude, choice of words, type and speed of gestures, eye focus and body posture are processed. All of this data is sent to Microsoft's databases, and an emotional analysis is then fed back to the wearer through the eyewear.
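To make the described pipeline concrete, here is a minimal sketch of that loop in Python: sensor readings are compared against a reference database of expression/posture/speech signatures, an emotional state is chosen, and feedback is returned to the wearer. Every name and value below is an illustrative assumption; the patent does not disclose Microsoft's actual implementation.

```python
from dataclasses import dataclass

# Minimal sketch of the processing loop described in the patent text above.
# All class and function names are illustrative assumptions, not Microsoft APIs.

@dataclass
class SensorSample:
    speech_rhythm: float      # e.g. variation in syllable timing
    speech_amplitude: float
    gesture_speed: float
    eye_contact: float        # 0..1 fraction of time looking at the wearer
    posture_openness: float   # 0..1 heuristic score

# Hypothetical reference database of gesture/expression/posture/speech signatures.
EMOTION_SIGNATURES = {
    "calm":     SensorSample(0.2, 0.3, 0.2, 0.7, 0.8),
    "agitated": SensorSample(0.8, 0.9, 0.9, 0.3, 0.3),
    "bored":    SensorSample(0.3, 0.2, 0.1, 0.2, 0.4),
}

def classify_emotion(sample: SensorSample) -> str:
    """Nearest-signature match, standing in for the patent's database comparison."""
    def distance(a: SensorSample, b: SensorSample) -> float:
        return sum((getattr(a, f) - getattr(b, f)) ** 2 for f in a.__dataclass_fields__)
    return min(EMOTION_SIGNATURES, key=lambda k: distance(sample, EMOTION_SIGNATURES[k]))

def feedback_to_wearer(sample: SensorSample) -> str:
    """Compute an emotional state and return the message shown on the display."""
    return f"Subject appears {classify_emotion(sample)}"

print(feedback_to_wearer(SensorSample(0.75, 0.85, 0.8, 0.35, 0.4)))  # Subject appears agitated
```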

 
Smartphone Eye Exam Service Launches in New York
Monday, 27 April 2015

A service called Blink was launched in New York last week that brings an eye exam to a patient's home or office for US$75, administered by a technician who uses a trio of handheld devices that take the place of the bulky autorefractor, lensmeter and phoropter. The technician sends the results to an optometrist, who will write a prescription if necessary and e-mail it to the patient.

Blink is the product of EyeNetra, a startup that has been working for several years on smartphone-connected eye-exam tools. The idea behind Blink is to make eye exams more convenient (and in some cases more affordable) by redesigning the expensive, typically immovable equipment that usually crowds a doctor’s or optometrist’s office as much cheaper, smaller tools that rely on a smartphone to do some of the work.

 
Microsoft Claims New Computer Vision System Outperforms Humans
Tuesday, 17 February 2015

A team of Microsoft researchers has recently published a paper in which they note that their computer vision system, based on deep convolutional neural networks, has for the first time eclipsed the ability of people to classify objects defined in the ImageNet dataset. Humans are estimated to classify images in the ImageNet dataset with an error rate of 5.1 percent, while Microsoft's team said its deep-learning-based system achieved an error rate of only 4.94 percent.

The researchers investigated neural networks from two aspects, both driven by rectifiers: the form of the rectified activation unit itself, and how to initialise very deep rectifier networks. Rectified activation units (rectifiers) are essential components of state-of-the-art neural networks.
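For context, a rectified activation unit is a very simple function; the generic Python sketch below (not code from the Microsoft paper) shows the standard rectifier alongside a parametric variant that scales negative inputs by a small learnable coefficient.

```python
import numpy as np

# Generic illustration of rectified activation units (not code from the paper).

def relu(x):
    """Standard rectifier: pass positive inputs through, zero out negatives."""
    return np.maximum(0.0, x)

def prelu(x, a=0.25):
    """Parametric rectifier: negatives are scaled by a learnable coefficient `a`."""
    return np.where(x > 0, x, a * x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(relu(x))    # [0.  0.  0.  1.5]
print(prelu(x))   # [-0.5  -0.125  0.  1.5]
```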

However, the authors also emphasize that computer vision still cannot match human vision in general, noting that the computing system struggles with understanding some objects, and with cases where contextual understanding or high-level knowledge of a scene is required.

 