Privacy by Design in the IoE

Europe has insisted on the requirement of PRIVACY BY DESIGN for wearable devices, meaning that the factory configuration must ship with the maximum protection for the user already activated, while the tools to reduce or disable that protection are left to the user's own choice.

This is exactly the opposite of the situation that exists today, in which wearables are built with the least protection for the user.

For this reason, the Working Party established under Article 29 of Directive 95/46/EC issued a document entitled Opinion 8/2014 on the Recent Developments on the Internet of Things, dated 16 September 2014.

The opinion covers privacy aspects of three main domains of the IoT:

  1. Wearable Computing
  2. Quantified Self
  3. Domotics (home automation).

For us the main concern is QUANTIFIED SELF, described in the opinion as follows: “things are designed to be regularly carried by individuals who want to record information about their own habits and lifestyles. For example, an individual may want to wear a sleep tracker every night to obtain an extensive view of sleep patterns. Other devices focus on tracking movements, such as activity counters which continuously measure and report quantitative indicators related to the individual’s physical activities, like burned calories or walked distances, among others.
Some objects further measure weight, pulse and other health indicators. By observing trends and changes in behaviour over time, the collected data can be analysed to infer qualitative health-related information including assessments on the quality and effects of the physical activity based on predefined thresholds and the likely presence of disease symptoms, to a certain extent.
Quantified Self sensors are often required to be worn in specific conditions to extract relevant information. For example, an accelerometer placed at the belt of a data subject, with the appropriate algorithms, could measure the abdomen moves (raw data), extract information about the breathing rhythm (aggregated data and extracted information) and display the level of stress of the data subject (displayable data). On some devices, only this latter information is reported to the user but the device manufacturer or the service provider may have access to much more data that can be analysed at a later stage.
Quantified Self is challenging with regard to the types of data collected that are health-related, hence potentially sensitive, as well as to the extensive collection of such data.”
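To make the distinction between the three data levels in the accelerometer example concrete, here is a minimal Python sketch. The function names, the crude zero-crossing heuristic and the thresholds are illustrative assumptions for this section, not anything taken from the Opinion: only the final label needs to reach the user's screen, while the raw samples and the intermediate rate are the data the manufacturer or service provider may also hold.

```python
import math

def breathing_rate(raw_samples, sample_rate_hz=50):
    """Aggregated data: estimate breaths per minute from abdomen movement."""
    mean = sum(raw_samples) / len(raw_samples)
    centred = [s - mean for s in raw_samples]
    # One breathing cycle produces roughly two zero-crossings of the centred signal.
    crossings = sum(1 for a, b in zip(centred, centred[1:]) if a * b < 0)
    seconds = len(raw_samples) / sample_rate_hz
    return (crossings / 2) / seconds * 60

def stress_level(rate_bpm):
    """Displayable data: map the breathing rate to the label the user sees."""
    if rate_bpm > 20:
        return "high"
    return "moderate" if rate_bpm > 14 else "low"

# Synthetic raw data: one minute of abdomen movement at ~0.3 Hz (about 18 breaths/min).
raw = [0.1 * math.sin(2 * math.pi * 0.3 * t / 50) for t in range(50 * 60)]
print(stress_level(breathing_rate(raw)))   # the only value shown to the user
```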

In fact, since this movement (see http://developer.android.com/wear/index.html) focuses on motivating users to remain healthy, it has many connections with the e-health ecosystem. Yet recent investigations have challenged the actual accuracy of the measurements and of the inferences made from them.

The challenges are mainly:

  • Lack of control and information asymmetry
  • Quality of the user's consent
  • Secondary use of the data
  • Intrusive bringing out of behaviour patterns and profiling
  • Limitations on anonymity, e.g. due to the MAC address broadcast by the devices (see the sketch after this list)
  • Security risks: security vs. efficiency
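On the anonymity point, the Wi-Fi or Bluetooth MAC address a wearable broadcasts can act as a persistent identifier that passive scanners use to re-identify the wearer. One mitigation is MAC randomisation, which is signalled by the "locally administered" bit of the first octet of the address. The helper below is a hypothetical illustration of that check, not code from the Opinion or any vendor API:

```python
def is_locally_administered(mac: str) -> bool:
    """True if the MAC is locally administered (typically a randomised address)."""
    first_octet = int(mac.split(":")[0], 16)
    return bool(first_octet & 0x02)   # second-least-significant bit of octet 1

print(is_locally_administered("3c:5a:b4:12:34:56"))  # False: stable, trackable address
print(is_locally_administered("da:a1:19:12:34:56"))  # True: randomised address
```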

The opinion specifies the applicable laws: Directive 95/46/EC, as well as specific provisions of Directive 2002/58/EC as amended by Directive 2009/136/EC. This framework applies where the conditions of its applicability are met, as set forth in Article 4 of Directive 95/46/EC. The Working Party has provided extensive guidance on the interpretation of the provisions of Article 4, namely in its Opinion 8/2010 on applicable law. IoT stakeholders qualifying as data controllers (whether alone or jointly with others) under EU law must comply with the different obligations that weigh on them in application of Directive 95/46/EC and the relevant provisions of Directive 2002/58/EC, if applicable. Consent (Article 7(a)) is the first legal basis that should be principally relied on in the context of the IoT, whether by device manufacturers, social or data platforms, device lenders or third-party developers.

What is relevant for the Quantified Self is the role of IoT data platforms: due to a lack of standardisation and interoperability, the Internet of Things is sometimes seen as an “Intranet of Things” in which every manufacturer has defined its own set of interfaces and data formats. Data is then hosted in walled environments, which effectively prevents users from transferring (or even combining) their data from one device to another.
Yet, smartphones and tablets have become the natural gateways of data collected through many IoT devices to the internet. As a result, manufacturers have progressively developed platforms that aim to host the data collected through such different devices, in order to centralise and simplify their management.
Such platforms may also qualify as data controllers under EU data protection law, when the development of such services actually implies that they collect the users’ personal data for their own purposes.
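As an illustration of what portability out of such walled environments could look like, the sketch below defines a vendor-neutral record for a single measurement. The field names are assumptions made for this example and do not correspond to any existing standard or platform API:

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class ActivitySample:
    device_id: str      # pseudonymous identifier, not the hardware MAC
    metric: str         # e.g. "steps", "heart_rate", "sleep_minutes"
    value: float
    unit: str
    recorded_at: str    # ISO 8601 timestamp, UTC

sample = ActivitySample(
    device_id="wearable-7f3a",
    metric="steps",
    value=8421,
    unit="count",
    recorded_at=datetime(2014, 9, 16, tzinfo=timezone.utc).isoformat(),
)
print(json.dumps(asdict(sample)))   # a record portable across platforms
```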

This brings up again our concern about the difference between SMALL and BIG DATA and the requirements that FOG streaming and processing of SMALL DATA should meet:

  1. The purpose limitation principle: data can only be collected for specified, explicit and legitimate purposes. Any further processing incompatible with these original purposes would be illicit under EU law.
  2. The data minimisation principle: the data collected on the data subject should be strictly necessary for the specific purpose previously determined by the data controller.
  3. The retention principle: data should not be kept for longer than is necessary for the purposes for which they were collected or further processed.
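A minimal sketch of how a fog node handling SMALL DATA could honour these three principles follows. The class, the single declared purpose and the 24-hour retention window are illustrative assumptions, not requirements stated in the Opinion: raw samples stay on the fog node, only the aggregate for the declared purpose is forwarded, and expired raw data is deleted.

```python
import time

RETENTION_SECONDS = 24 * 3600          # keep raw samples at most one day (assumed window)
PURPOSE = "daily_step_total"           # the single declared purpose

class FogNode:
    def __init__(self):
        self.raw = []                  # (timestamp, step_count) samples

    def ingest(self, timestamp, steps):
        """Data minimisation: collect only what the declared purpose needs."""
        self.raw.append((timestamp, steps))

    def purge_expired(self, now):
        """Retention: drop raw data older than the retention window."""
        self.raw = [(t, s) for t, s in self.raw if now - t < RETENTION_SECONDS]

    def forward_aggregate(self, now):
        """Purpose limitation: only the aggregate for PURPOSE leaves the node."""
        self.purge_expired(now)
        return {"purpose": PURPOSE, "total_steps": sum(s for _, s in self.raw)}

node = FogNode()
now = time.time()
node.ingest(now - 2 * 3600, 1200)      # within the window: kept
node.ingest(now - 30 * 3600, 900)      # older than 24 h: purged before forwarding
print(node.forward_aggregate(now))
```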

That is one of the main reasons why we need to establish clearly to whom the SMALL DATA belongs, where it is going to be processed, and how we can set up a secure environment for the FOG.