Abstract—In this article we present the experimental results of an innovation-driven collaborative project in the eHealth domain that aims to create a product for the remote monitoring of elderly people in their own homes or inside care homes and hospitals. In particular, we focus on how we solved the challenges of integrating a number of components into an architecture for an overall IoT platform that (i) remotely monitors elderly people through heterogeneous wearable and positioning technologies; (ii) processes the produced data to generate alarms; (iii) delivers, through an IoT broker, data streams with fine-grained access control; and (iv) displays data to their owners and authorised contacts/doctors through simple web-based applications. These integration results, driven by concrete requirements, are about to be applied in 13 different trials across Europe. We illustrate lessons learnt in the implementation of an interoperable IoT data broker and a set of pragmatic choices made in the integration of newly designed and existing services (components in the aforementioned architecture). The contents of this article, rather than illustrating a scientific breakthrough, advance the state of the art by providing a reference guide for a scalable, API-based solution design that lowers IoT interoperability barriers and addresses security and privacy concerns when dealing with sensitive data.
The UNCAP Architecture: write applications once, update sensing data-sources anytime

The UNCAP architecture was also designed to address the concrete need for solutions that are capable of interfacing with many different data sources, that can be easily and continuously improved to use the best sensing technologies available, and that can adopt and integrate different services without extensive reprogramming. For the IoT part this means ensuring IoT interoperability: if we could enrich the datasets IoT applications rely on, in any domain, without having to code directly against heterogeneous data sources' interfaces, this would spark a lot of innovation [2]. With these objectives in mind we designed the UNCAP architecture to reflect the three canonical macro building blocks (sensing, data processing, visualisation/application) that can be found in any IoT solution. What made our experience different was the fact that, rather than targeting small proof-of-concept prototypes, we had to come up with a design that gave particular attention to hiding the heterogeneity of IoT devices, supporting scalability and addressing the privacy and security concerns that are paramount in sensitive health-related applications.
The "interoperable sensing" building block consists of a set of functionalities deployed between gateway and the cloud that provide de-facto an loT data brokerage service that hides sensing heterogeneity to the application developers (see left squared section in the picture, adding the cloud based box ofthe loT Data Broker). At this point in time the gateway only provides sync functionality that guarantee some ofthe UNCAP application features (i.e. visualization of sensed data, alerts through SMS due to emergency) can work offline. A planned enhancement here is to use cloud technologies and distribute functionality on containers running between the gateway and the cloud. This enables a higher flexibility when scaling up and addressing runtime performance issues caused by inadequacy of connectivity and / or computational capabilities of the gateway hardware. From such an "interoperable sensing block" loT data emerge for further (real-time) processing, storage or consumption through a second macro "cloud bound" building block (illustrated with PROCESSING in the picture). This second building block includes means for managing access control to data-streams, for stream processing of events and associated alert generation as well as permanent data storage in a secure backend database compliant with EU privacy laws. The third macro-building block (APPLICAnONS) consist of a Web-Interface for the visualisation of data and alerts generated. This is a placeholder for a variety of different applications that "understand" the structure of the produced loT data (more on that later). We now delve deeper in these three major building blocks to analyse at a higher level of granularity the various components of the architecture that has been deployed, using the proposed structure: the requirements the presented architectural block fulfils the description of what it does and the research challenges that stem from ensuring long-term viability for such components, setting an evolution path for the whole solution.
The UNCAP interoperable data collection

One of the main requirements elicited in the first phase of the project was to account for intermittent connectivity. While there is some limited delay tolerance when accessing information from a website, when it comes to users wanting feedback from a device measuring their bio-parameters, the tolerance to latency is almost non-existent. In IoT, due to the resource-constrained nature of devices and the lack of visual cues into potential failures, a solution relying entirely on an online cloud component is often a cause of distress and therefore of limited adoption by end-users. The gateway-bound set of components was therefore designed to work in isolation (i.e. even without connectivity) and provide an immediate response to the user for every action they proactively take to interact with their devices. In the implementation this requirement was addressed with a limited, gateway-bound set of functionality that, with the help of a local cache and of a synchronisation module, guarantees (1) responsive interactions and (2) no loss of monitored data, even in case of disconnections happening while targeted users are on the move. Research challenges in this area relate to the ability to have lightweight gateway-bound functionality installable on resource-limited hardware, as this makes our design valid also in the case where even a Raspberry Pi can act as a gateway for low-cost deployments [6]. Extending the scope of research further, beyond miniaturisation comes the issue of local/global cloud separation of functionality, namely the ability to deploy self-contained services and migrate them to and from the cloud according to the load they generate, accounting for how this impacts the underlying infrastructure resources. In terms of IoT challenges, we had to address the problem of interoperability towards heterogeneous data sources at many different levels. For the purpose of this article, it is worth focusing on two of the major aspects that hinder interoperability in IoT: one at the communication/protocol level, the other at the data-structure level. On the radio capability front, we found that within the scope of the trial BLE and WiFi connectivity would suffice to ensure the gateway gathers IoT data from all the data sources used in our extensive eHealth application scenarios. In terms of protocols, to address the low-power requirement of IoT devices publishing sensed data we used the MQTT protocol, implementing a lightweight publish-subscribe interface at gateway level. Beyond such obvious choices for hardware, wireless and communication protocols, solving the interoperability problem was a little more complex when considering data structures for IoT data. For this purpose, the pragmatic solution was to use an IoT Data Broker accessed through REST APIs that isolate the applications and the sensing devices from the internals of the broker platform, which internally would handle the W3C-standard data structure shown in Fig. 2.
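As an illustration of the cache-and-sync pattern described above, here is a minimal gateway-side sketch. The topic name, broker address and in-memory queue are assumptions for the example (a real deployment would persist the cache to disk), and the callback style assumes paho-mqtt 1.x:

```python
# Sketch of the gateway-bound cache-and-sync pattern: readings are
# queued locally when offline and flushed when connectivity returns.
# Topic name and broker host are illustrative, not the UNCAP values.
import json
import queue

import paho.mqtt.client as mqtt  # lightweight MQTT publish-subscribe

local_cache = queue.Queue()  # RAM here for brevity; disk-backed in practice
client = mqtt.Client()       # paho-mqtt 1.x callback API assumed


def publish_reading(reading: dict) -> None:
    """Try to publish; on failure, cache so no monitored data is lost."""
    payload = json.dumps(reading)
    info = client.publish("uncap/gateway-1/measurements", payload, qos=1)
    if info.rc != mqtt.MQTT_ERR_SUCCESS:  # offline: keep for later sync
        local_cache.put(payload)


def on_connect(client, userdata, flags, rc):
    """Synchronisation module: drain the cache once the link is back."""
    while not local_cache.empty():
        client.publish("uncap/gateway-1/measurements",
                       local_cache.get(), qos=1)


client.on_connect = on_connect
client.connect("broker.example.org", 1883)  # placeholder broker address
client.loop_start()
```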
This can be seen as the bare, general-purpose skeleton to which one has to add the appropriate flesh to ensure the final outcome fulfils the requirements in terms of what data needs to be extracted by the IoT Data Broker and for what purposes. In order to address the need to support different sensing hardware manufacturers, a set of guidelines for certification [12] has been released to help in the creation of suitable drivers for interfacing any sensor with the IoT Data Broker. At this point the requirement was to tailor this general-purpose IoT Data Broker to fit concrete deployment needs. This process, keeping the "southbound" interaction APIs fixed, was needed to specialise what streams and channels should look like in the structure highlighted in Fig. 2 to fit the "northbound" applications. This is usually to be agreed between software developers mastering the platform features, the sensing hardware providers formatting data according to agreed standards (i.e. Open mHealth [7] in our case) and solution designers who are knowledgeable of what data value-add applications need. For the alpha release of the UNCAP software, streams and channels were tailored to handle three different types of streams: one for Measurements, one for Positions and one for Alarms. This would enable the APPLICATIONS to handle indoor positions of monitored patients, bio-parameter readings as well as alarms generated in case of emergencies. Beyond this outer shell structure (i.e. pictured as the leftmost trolley in Fig. 3), the actual data with details of the measurement was included as stringified JSON in the payload of the message, to enable quick and data-structure-agnostic routing of IoT messages. Clearly, in this configuration the value-add would be kept outside the IoT Data Broker, and all the messages were flooding the communication medium regardless of their usefulness, due to the blindness of the broker to the contents of the messages.
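The following is an illustrative alpha-release message along these lines: the broker routes on the outer envelope only, while the measurement travels as an opaque stringified JSON payload. The field names and the body-temperature reading are hypothetical, loosely modelled on Open mHealth schemas:

```python
# Illustrative alpha-release message: routing uses only the outer
# envelope; the Open mHealth-style body is a stringified JSON payload.
# Field names are hypothetical, for illustration only.
import json

measurement = {  # loosely following an Open mHealth measurement schema
    "body_temperature": {"value": 37.2, "unit": "C"},
    "effective_time_frame": {"date_time": "2017-05-04T10:15:00Z"},
}

envelope = {
    "stream": "Measurements",            # Measurements / Positions / Alarms
    "channel": "patient-42",             # per-patient channel
    "payload": json.dumps(measurement),  # stringified: broker stays blind
}

print(json.dumps(envelope, indent=2))
```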
To address this problem and achieve an optimised, more efficient handling of messages, it soon became clear that the data structure of messages and the IoT Data Broker had to be modified to enable the PROCESSING component of the architecture to play a more valuable role in terms of message manipulation (i.e. filtering, aggregation, etc.). As will be illustrated later, this also enabled more efficient storage as well as a more powerful role for the Complex Event Processing (CEP) engine. This modification would make the broker more capable, as it would now handle proper JSON structures rather than strings. This proved to be a better way of proceeding, as it would strengthen the processing capabilities of the IoT component of the architecture rather than delegating everything to the APPLICATIONS block. The ability to process JSON data requires knowledge of how the data is structured. As already mentioned, given the health-related application domain, the chosen standard was Open mHealth [7], but the broker can easily be enhanced to parse different data structures on a per-need basis (as was done manually to handle position data). Research challenges here relate to the ability of the processing platform to acquire the knowledge necessary for handling different standards. At design phase we envisaged addressing this problem through the aid of semantic interoperability features [8], but closer interaction with technology providers revealed that this is an objective still hard to achieve, given that it presumes all data producers (i.e. sensing technology providers) structure their data according to well-defined ontologies. Business pressures and very short delivery cycles cause technology providers to rush solutions out to identified market needs as quickly as possible. Such "market-ready" product manufacturers lack the foresight and the competences, let alone the time and willingness given the limited returns on this investment, to structure data according to this or that ontology. An automated engine that translates between similar entries across different ontologies [9], while valid from a scientific standpoint, is therefore not yet a compelling way to address the IoT interoperability problem. Furthermore, the jury is still out on whether we will keep on interworking different standards/ontologies or whether, as looks more likely, market-driven de-facto standards will make this problem obsolete.
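As a concrete illustration of what the content-aware broker enables, the sketch below parses a now-structured payload and applies a simple CEP-style threshold rule that either emits an Alarm message or filters the message out. The threshold, field names and rule are assumptions made for this example, not rules from the UNCAP deployment:

```python
# Sketch of a CEP-style rule made possible once payloads are proper
# JSON objects rather than strings: filter uninteresting messages and
# raise an Alarm when a threshold rule fires. Names are illustrative.
FEVER_THRESHOLD_C = 38.0  # assumed alert rule, not from the article


def process_message(envelope: dict):
    """Inspect the structured payload; return an Alarm message or None."""
    body = envelope["payload"]  # now a JSON object, not a string
    temp = body.get("body_temperature", {})
    if temp.get("unit") == "C" and temp.get("value", 0.0) >= FEVER_THRESHOLD_C:
        return {
            "stream": "Alarms",
            "channel": envelope["channel"],
            "payload": {"reason": "fever", "value": temp["value"]},
        }
    return None  # filtered out: no needless flooding of the medium


alarm = process_message({
    "stream": "Measurements",
    "channel": "patient-42",
    "payload": {"body_temperature": {"value": 38.6, "unit": "C"}},
})
print(alarm)  # -> an Alarms message for channel "patient-42"
```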