Wearables Improving Life

Did you know that almost 45 million people in the world are blind? And that 36 million have disabling hearing loss? It’s the little things that make a difference, and that is why technology might be able to improve the everyday lives of these and other challenged people.

Bionic ears might sound like they are from the future, but a hearing implant company called Med-El has a wide range of implant solutions for those with hearing loss or partial deafness. The cochlear implant system is a medical option for individuals with severe to profound sensorineural hearing loss, for whom hearing aids provide little or no benefit. The implant’s staggering ability to create a sense of sound is down to a flexible electrode array that is nestled inside the cochlea during surgery. These wires allow the conventional auditory pathway to be sidestepped. Sounds are picked up by an external microphone, hooked over the ear, and turned into a digital “score” of electronic stimulation patterns by a processor. This information is then transmitted wirelessly across the scalp, together with a dose of energy, where it is picked up by a coil under the skin and passed to the implant, which converts the digital score into electrical pulses. These are sent to the electrodes within the cochlea, where they artificially trigger electrical impulses in the auditory nerve fibers, bypassing the role of the hair cells. But while each hair cell stimulates only a few of these fibers, the electrical pulses of a cochlear implant trigger much larger areas.
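To make that signal chain a bit more concrete, here is a minimal sketch of how a sound processor could turn a microphone signal into per-electrode stimulation levels using a filterbank and envelope detection. It is a generic, simplified strategy with an assumed sample rate, band edges and electrode count, not Med-El’s actual processing.

```python
# Minimal sketch of a sound-processor-like pipeline: split audio into
# frequency bands and use each band's envelope as a stimulation level.
# Sample rate, band edges and electrode count are illustrative assumptions.
import numpy as np
from scipy.signal import butter, sosfilt, hilbert

FS = 16000            # assumed microphone sample rate (Hz)
N_ELECTRODES = 12     # assumed number of electrode channels

def band_edges(n, lo=200.0, hi=7000.0):
    """Log-spaced band edges covering roughly the speech frequency range."""
    return np.logspace(np.log10(lo), np.log10(hi), n + 1)

def stimulation_pattern(audio):
    """Return an (N_ELECTRODES, n_samples) array of envelope levels in [0, 1]."""
    edges = band_edges(N_ELECTRODES)
    levels = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        sos = butter(4, [lo, hi], btype="band", fs=FS, output="sos")
        band = sosfilt(sos, audio)
        env = np.abs(hilbert(band))          # envelope of this frequency band
        levels.append(np.clip(env / (np.max(env) + 1e-9), 0.0, 1.0))
    return np.array(levels)

# Example: a 440 Hz tone mostly drives the low-frequency (apical) channels.
t = np.arange(0, 0.1, 1 / FS)
pattern = stimulation_pattern(np.sin(2 * np.pi * 440 * t))
print(pattern.shape)   # (12, 1600)
```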

Another solution, a student project called Enable Talk that won first prize at the Microsoft Imagine Cup, consists of two sensor gloves and a mobile device and aims to translate sign language into speech. The idea is still a concept. The inspiration for the gloves came from watching fellow college students who were deaf struggle to communicate with other students, which often left them excluded from activities. In the glove, a total of 15 flex sensors in the fingers measure the degree of bending, while a compass, accelerometer, and gyroscope determine the motion of the glove through space. The sensor data are processed by a microcontroller on the glove and then sent via Bluetooth to a mobile device, which translates the positions of the hand and fingers into text once the pattern is recognized.
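The translation step boils down to matching a vector of sensor readings against stored sign templates. The sketch below illustrates that idea with a simple nearest-neighbor match; the templates, sensor ordering and distance threshold are invented for illustration and are not Enable Talk’s actual data format.

```python
# Illustrative nearest-neighbor matching of glove sensor readings to signs.
# Templates and thresholds are made up for the example.
import numpy as np

# 15 flex-sensor bend values (normalized 0..1) followed by 3 accelerometer axes.
TEMPLATES = {
    "hello":     np.array([0.1] * 15 + [0.0, 0.0, 1.0]),
    "thank you": np.array([0.8] * 15 + [0.0, 1.0, 0.0]),
}

def classify(reading, max_distance=1.5):
    """Return the sign whose template is nearest to the reading, or None."""
    best_sign, best_dist = None, float("inf")
    for sign, template in TEMPLATES.items():
        dist = np.linalg.norm(reading - template)
        if dist < best_dist:
            best_sign, best_dist = sign, dist
    return best_sign if best_dist <= max_distance else None

# A reading with mostly straight fingers and the palm facing up maps to "hello".
sample = np.array([0.12] * 15 + [0.05, -0.02, 0.98])
print(classify(sample))  # hello
```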

For those with visual impairments there are also some smart solutions. For example, OrCam, a company founded in 2010, developed a smart camera mounted on the frames of a pair of eyeglasses that “sees” text, recognizes objects and “whispers” in the wearer’s ear. The OrCam device enables the wearer to read books or newspapers and even identifies which product or item the wearer is pointing at. OrCam automatically understands whether the wearer wants to read, find an item or recognize a product; all it takes is a simple, intuitive pointing gesture. The base unit is smaller than the average glasses case and fits in most pockets or clips to a belt.
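The read-aloud workflow (capture an image of text, run OCR, speak the result) can be sketched with off-the-shelf open-source tools, as below. This is only an illustration of the concept, not OrCam’s software, and the image path is a placeholder.

```python
# Rough sketch of the read-aloud idea using open-source OCR and text-to-speech.
from PIL import Image
import pytesseract      # wrapper around the Tesseract OCR engine
import pyttsx3          # offline text-to-speech

def read_aloud(image_path):
    """Extract text from an image and speak it; return the recognized text."""
    text = pytesseract.image_to_string(Image.open(image_path))
    if text.strip():
        engine = pyttsx3.init()
        engine.say(text)
        engine.runAndWait()
    return text

# read_aloud("newspaper_page.jpg")  # placeholder file name
```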

Big brands like Samsung also offer three new accessories for users of the Galaxy Core Advance phone that are intended to assist disabled and visually impaired users. The ultrasonic cover lets users detect obstacles and navigate unfamiliar places by sending an alert through vibration or text-to-speech feedback. Held in front of the user, the cover can enhance a visually impaired user’s awareness of their surroundings by sensing the presence of a person or object up to 2 meters away. The optical scan stand positions the device to focus on printed materials, automatically activating the optical scan application, which recognizes text in an image and reads it aloud to the user. The stand is particularly useful when a consumer is alone, as it initiates the application to read text as soon as paper is sensed on the stand. The voice label helps users distinguish objects by letting them make notes and tag voice labels easily on the go.
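The ultrasonic cover’s alert logic can be pictured roughly as follows: take a range reading and warn the user when something is within about 2 meters. The read_distance_m, vibrate and speak hooks are hypothetical placeholders for the hardware, not Samsung’s accessory API.

```python
# Hedged sketch of an obstacle alert: warn via vibration or speech when a
# range reading falls under ~2 m. All hardware hooks are hypothetical.
ALERT_RANGE_M = 2.0

def check_for_obstacle(read_distance_m, vibrate, speak, use_speech=False):
    """Poll the range sensor once; alert and return True if an obstacle is near."""
    distance = read_distance_m()
    if distance is not None and distance <= ALERT_RANGE_M:
        if use_speech:
            speak(f"Obstacle about {distance:.1f} meters ahead")
        else:
            # Vibrate harder as the obstacle gets closer.
            vibrate(intensity=max(0.2, 1.0 - distance / ALERT_RANGE_M))
        return True
    return False
```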

Ducere Technologies was founded in 2011 by two friends and tinkerers. They graduated from MIT together and came up with a product they called LECHAL, designed for everyone to use. However, LECHAL’s footwear can also be used by people who are visually challenged. Its interactive haptic-based navigation system guides users toward their destination through simple vibrations in the footwear. A smartphone app pairs with the footwear via Bluetooth; the user interacts with the app to set a destination, the phone’s GPS provides location data, and the directions are conveyed to the user via haptics in the footwear. Lechal also has fitness features such as counting steps and tracking calories. Through the app the user can set goals, create custom workout sessions and much more, and can also tag locations, set destinations, and start, stop or pause navigation by executing simple foot gestures. The product pack comes with a pair of footwear (insoles or shoes), a charger, and an application that can be downloaded on an Android, iOS or Windows smartphone.
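The “vibrate the correct foot” idea amounts to comparing the compass bearing toward the destination with the user’s current heading. The sketch below uses the standard bearing formula; the turn threshold and the insole interface are assumptions, not Lechal’s firmware.

```python
# Decide which insole to buzz based on the bearing to the destination.
# Bearing math is standard; threshold and insole choice are illustrative.
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(x, y)) + 360) % 360

def which_insole(heading_deg, lat, lon, dest_lat, dest_lon, threshold=20):
    """Return 'left', 'right', or 'straight' depending on the needed turn."""
    error = (bearing_deg(lat, lon, dest_lat, dest_lon) - heading_deg + 540) % 360 - 180
    if error > threshold:
        return "right"
    if error < -threshold:
        return "left"
    return "straight"

# Facing due north while the destination lies to the east: buzz the right foot.
print(which_insole(0, 52.5200, 13.4050, 52.5200, 13.4150))  # right
```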

There is also a smart development for early diagnosis and prevention by Ramesh Raskar, who heads the Camera Culture group at MIT. Together with his team he developed Eyemitra, a mobile retinal imaging solution that brings retinal exams into the realm of routine care by lowering the cost of the imaging device to a tenth of its current cost and integrating the device with image analysis software and predictive analytics. Diabetic retinopathy (DR) is the leading cause of adult blindness, and with Eyemitra the Camera Culture group expects not only to improve the quality of eye care and reduce cases of blindness due to DR, but also to bring routine diagnostic retinal examinations to developing countries where a standard of eye care for diabetics might not even exist today.

Neuroengineering company Emotiv is especially known for its wireless EEG systems. Its headset, EPOC, a personal interface for human-computer interaction, is a high-resolution, multi-channel, wireless neuroheadset. The EPOC uses a set of 14 sensors plus 2 references to tune into electric signals produced by the brain and detect the user’s thoughts, feelings and expressions in real time. The device’s Cognitiv suite reads and interprets a user’s conscious thoughts and intent, so users can manipulate virtual objects with only the power of their thoughts. This could be especially interesting for controlling an electric wheelchair or a mind-controlled keyboard, or for playing hands-free games.
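A generic way to picture how brain signals might be mapped to a command is to compare band power in a short EEG window, as sketched below. This is purely illustrative; the sampling rate, thresholds and commands are assumptions, and it is not Emotiv’s Cognitiv detection or its SDK.

```python
# Illustrative mapping from one EEG channel to a wheelchair command via
# band-power ratio. Rates, bands and thresholds are assumptions.
import numpy as np

FS = 128  # assumed EEG sampling rate in Hz

def band_power(signal, lo_hz, hi_hz):
    """Average spectral power of one channel within a frequency band."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / FS)
    mask = (freqs >= lo_hz) & (freqs < hi_hz)
    return spectrum[mask].mean()

def to_command(window, push_threshold=2.0):
    """Map a one-second, single-channel window to 'forward' or 'stop'."""
    beta = band_power(window, 13, 30)    # higher beta ~ focused intent
    alpha = band_power(window, 8, 13)    # higher alpha ~ relaxed state
    return "forward" if beta / (alpha + 1e-9) > push_threshold else "stop"

# Example: a synthetic beta-dominated window produces "forward".
t = np.arange(FS) / FS
window = np.sin(2 * np.pi * 20 * t) + 0.2 * np.sin(2 * np.pi * 10 * t)
print(to_command(window))  # forward
```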

Personal Bionics empowers people to do the things they want with the BiOM® T2 System, which provides natural Bionic Propulsion, emulating muscle function about the healthy human ankle during the load-bearing stance phase of walking. The company is transforming the prosthetics industry with Bionic Propulsion ankle technology that emulates missing muscle and tendon function. The BiOM offers programmable stiffness modulation and power assist to emulate lost muscle function, reducing gait deviations and resulting in lower metabolic energy cost, faster walking speeds, and lower joint stresses throughout the musculoskeletal system.
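The combination of stiffness modulation and a powered push-off can be sketched as a simple impedance controller with an added torque term, as below. The gains, gait sub-phases and torque values are illustrative assumptions, not BiOM control firmware.

```python
# Sketch of an impedance-style ankle controller: spring-damper torque whose
# stiffness depends on the stance sub-phase, plus extra push-off torque.
# All numbers are illustrative assumptions.
def ankle_torque(phase, angle_rad, angular_velocity, neutral_rad=0.0):
    """Return a commanded ankle torque (N*m) for the current stance sub-phase."""
    gains = {                      # (stiffness N*m/rad, damping N*m*s/rad)
        "controlled_plantarflexion": (90.0, 3.0),
        "controlled_dorsiflexion":   (220.0, 5.0),
        "powered_plantarflexion":    (220.0, 5.0),
    }
    k, b = gains[phase]
    torque = -k * (angle_rad - neutral_rad) - b * angular_velocity
    if phase == "powered_plantarflexion":
        torque += 40.0             # extra push-off torque emulating calf muscle work
    return torque

# Late stance, ankle dorsiflexed 0.15 rad and still rotating forward:
print(round(ankle_torque("powered_plantarflexion", 0.15, 0.5), 1))
```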

These are by no means all the solutions out there improving people’s lives, but they are probably some of the most intriguing examples. Let us know if we missed something important!
