Protecting and Securing the Treasure Trove of Wearable Data

Recently WT | Wearable Technologies sponsored a Wearable Data Hackathon in Munich, Germany. The main topics of consideration were: how do software engineers use wearable data efficiently, and how do we protect it from cybercrime? In 2011, a researcher demonstrated how dangerous attacks on wearable healthcare devices can be by manipulating an insulin pump so that it could deliver a lethal dose of insulin into the user’s body. So, after nearly four years of advancement, where does the wearable industry stand in securing data from cybercrime?

Boston Consulting Group has predicted that by 2020 consumers’ personal data will be valued at €1 trillion. Yet what types of wearable data are useful to hackers? They go well beyond basic personal information such as name, gender, weight, height, home location, marital status, and ID numbers. The diverse array of sensors makes it possible to collect data on a person’s blood pressure, cholesterol, weight-loss goals, current location, body temperature, speed, and muscle response time. As isolated numbers, this data is little more than noise. However, when fed into AI or affective computing algorithms, the ways it could be used to attack a person’s mental and physical health are frightening.

This is a topic of ethics. Should consumers learn how to manage their data? Should companies give transparent options about who the consumer can share their data with? I think yes! These are two simple steps toward protecting data, and both begin with awareness. However, simply encouraging consumers to read the terms and conditions before agreeing to use a product doesn’t mean they will do it. Nonetheless, reading this fine print will only become more valuable. Even after Snowden’s warning about the NSA’s secret mass data collection, society doesn’t seem to care that its data is being mined for research and statistics. If you are not breaking the law, then what do you have to hide?

Another simple solution is in the hands of user-interface designers. If companies provided user-friendly ways within the app to inform customers of all the ways their data could be used, people would begin to trust the technology. Even so, companies should accept that some of their consumers will not consent to sending their data to third parties. Service providers need to define a clear boundary between the data they collect and the control users retain over it. Users want to be in control of their lives, time, money and data; supplying them with easy-to-manage controls and visuals is therefore necessary.
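To make this concrete, here is a minimal sketch of what such an in-app consent model might look like. The class name, data categories, and defaults are illustrative assumptions, not any particular vendor’s API:

```python
# Hypothetical sketch of per-user consent settings for sharing wearable data.
from dataclasses import dataclass, field

@dataclass
class ConsentSettings:
    share_with_third_parties: bool = False                 # opt-in, never opt-out by default
    allowed_categories: set = field(default_factory=set)   # e.g. {"steps", "heart_rate"}

    def may_share(self, category: str) -> bool:
        """Data is shared only if the user opted in to both sharing and this category."""
        return self.share_with_third_parties and category in self.allowed_categories

settings = ConsentSettings()
settings.share_with_third_parties = True
settings.allowed_categories.add("steps")   # user agrees to share step counts only

print(settings.may_share("steps"))         # True
print(settings.may_share("heart_rate"))    # False: never sent to third parties
```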

Companies have asked their employees to use fitness trackers so they can compile statistics on their employees’ average health, often for insurance risk assessment. Collecting health data from an entire company’s workforce creates a gold mine for data hackers. These companies should put security measures in place to ensure their employees’ physical and mental health data is secure. Politicians should also put laws in place defining what is ethical and unethical research with individual physiological data. Ideally, they can do this without imposing a high budget.

Data security is even more important when considering the effects on healthcare wearables. A simple solution begins with a strong, diverse passphrase. Yes, I said it correctly: passphrase. This means that instead of a one- or two-word ‘password’, people should use a phrase containing many words, symbols and numbers. This alone would reduce the number of security breaches caused by easy-to-guess passwords. Users should also keep their smartphone applications up to date so they benefit from the newest protections.
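To illustrate why length matters, here is a rough sketch of a brute-force entropy estimate. It is deliberately simplistic (it ignores dictionary words and predictable patterns), and the example secrets are made up:

```python
import math
import string

def estimate_entropy_bits(secret: str) -> float:
    """Naive estimate: character-pool size raised to the length, expressed in bits."""
    pool = 0
    if any(c.islower() for c in secret):
        pool += 26
    if any(c.isupper() for c in secret):
        pool += 26
    if any(c.isdigit() for c in secret):
        pool += 10
    if any(c in string.punctuation + " " for c in secret):
        pool += 33
    return len(secret) * math.log2(pool) if pool else 0.0

print(estimate_entropy_bits("Sunset7!"))                         # short password: roughly 53 bits
print(estimate_entropy_bits("3 cold pugs sprint past Munich!"))  # passphrase: roughly 200 bits
```

Every additional word multiplies the number of guesses an attacker must make, which is why a memorable phrase beats a short, cryptic password.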

Like mobile phones, wearables can be attacked through eavesdropping and proxy attacks on their wireless communication. A solution would be to establish an encrypted link at the sensor itself. However, the required cryptography can demand too much processing time for many of the sensors currently on the market.
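For illustration, here is a minimal sketch of authenticated encryption for a sensor payload, using AES-GCM from Python’s cryptography package. The key handling and the payload format are assumptions; on a real sensor the key would be provisioned or negotiated during pairing, and a constrained device might need a lighter-weight cipher:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Illustrative only: in practice the key is provisioned securely, not generated ad hoc.
key = AESGCM.generate_key(bit_length=128)
aesgcm = AESGCM(key)

def encrypt_reading(reading: bytes) -> bytes:
    nonce = os.urandom(12)                  # a fresh nonce for every message
    return nonce + aesgcm.encrypt(nonce, reading, None)

def decrypt_reading(blob: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return aesgcm.decrypt(nonce, ciphertext, None)

packet = encrypt_reading(b'{"heart_rate": 72, "temp_c": 36.6}')
print(decrypt_reading(packet))              # original reading, confidential and authenticated in transit
```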

Another set of recommendations comes from the European Parliament with regard to the processing of personal data. First, any new application in the field of wearables or IoT should pass a Privacy Impact Assessment (PIA) based on the Privacy and Data Protection Impact Assessment Framework for RFID Applications. Second, raw data should be deleted once it has been processed into aggregated data. Third, the principles of Privacy by Design and Privacy by Default should be applied to the product at the earliest stages. Lastly, the user should be informed immediately once the device recognizes that something suspicious is happening.
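As a concrete illustration of the second point, here is a minimal sketch of reducing raw readings to an aggregate and discarding the raw values afterwards. The reading format and field names are hypothetical:

```python
from statistics import mean

def aggregate_and_discard(raw_heart_rates: list) -> dict:
    """Keep only the aggregate summary; the raw readings are dropped once processed."""
    summary = {
        "count": len(raw_heart_rates),
        "mean_bpm": mean(raw_heart_rates),
        "max_bpm": max(raw_heart_rates),
    }
    raw_heart_rates.clear()    # raw data is not retained after aggregation
    return summary

readings = [68, 72, 75, 71, 90]
print(aggregate_and_discard(readings))   # {'count': 5, 'mean_bpm': 75.2, 'max_bpm': 90}
print(readings)                          # [] -- raw values discarded
```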

Gemalto is a long-time partner of WT | Wearable Technologies and an expert in digital security. They recently spoke at our 16th WT | Conference in San Francisco, and will speak again at the WT | Hong Kong conference on October 12, 2015. Gemalto works with financial services, governments, mobile operators, transportation, identity and access, and the Internet of Things. They also have prototypes of a wrist-worn credit card called Optelio.

Before I end this article, I would like to stress that there is a difference between committing a crime and committing a cybercrime, just as there is between terrorism and cyberterrorism. Even so, all of these acts are against the norms of society. People working in the wearable field need to consider the societal consequences of hacking raw physiological data and put protection solutions in place before releasing a product to market. The current laws on digital protection should be updated and rigorously enforced within the tech industry. Having trained people in place at every corner of security and digital protection is a key factor in maintaining a guarded market. Perhaps security algorithms are the link to ending the everlasting war between good and evil knowledge.
