Tim Coote, the CTO at Anthropos, kindly made time to discuss remote monitoring and connected care for elderly people in their homes. We explored the challenges of connected care technology in domestic settings and where he sees the market and the technology heading in the future.

To set the scene, can you briefly explain the technology area and solutions you primarily work with?

Anthropos is building a consumer-facing IoT platform. At the TOGAF/IAF logical level it is similar to systems like BG’s Hive: in-home networks of sensors, actuators and computer(s) routed to a cloud-hosted application set. The in-home components are COTS hardware based on non-proprietary standards. The data and control planes are implemented over consumer broadband. Processing can be moved between in-home and in-cloud compute nodes; however, the main function of the in-home compute nodes is to isolate the in-home network technologies from the internet.
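To make the shape of that architecture concrete, here is a minimal sketch (not Anthropos’s actual implementation) of the isolation role played by the in-home compute node: it speaks the local sensor protocol on one side and plain HTTPS to a cloud-hosted application over consumer broadband on the other, so the sensor network is never directly exposed to the internet. The `CLOUD_INGEST_URL` and `read_local_sensors()` below are invented placeholders.

```python
# Illustrative gateway loop: local sensor network on one side, HTTPS to the cloud
# on the other. The endpoint and the sensor-reading stub are hypothetical.
import json
import time
import urllib.request

CLOUD_INGEST_URL = "https://cloud.example.com/ingest"  # placeholder, not a real endpoint

def read_local_sensors():
    """Poll the in-home sensor network (Zigbee/Z-Wave/etc.); stubbed here."""
    return [{"sensor_id": "kitchen-motion", "value": 1, "ts": time.time()}]

def forward_batch(readings):
    """Push a batch of readings to the cloud over consumer broadband."""
    body = json.dumps({"readings": readings}).encode("utf-8")
    req = urllib.request.Request(CLOUD_INGEST_URL, data=body,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.status == 200

if __name__ == "__main__":
    while True:
        batch = read_local_sensors()
        try:
            forward_batch(batch)
        except OSError:
            pass  # a real system would buffer and retry rather than drop data
        time.sleep(60)  # a modest upload cadence keeps broadband usage low
```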

The solution focuses on helping with the environment and health of elderly and frail individuals, providing information that helps concerned individuals (family, friends, and carers) get a view of, and a degree of comfort about, the day-to-day living and longer-term health changes of their loved ones.

What are the main challenges you face when implementing a successful solution? I know we have spoken about data and the economics before.

From an IT/IS point of view, it’s “just” a very large-scale distributed environment, with low-capability computers and networks. But there are a few twists that require specific attention, including:

  • The end points are unattended and unobserved

This means that the system needs to be more self-aware than typical systems, where most application-level incidents are identified by humans and logged through the helpdesk. If a node or a link starts to struggle, you need to know so that you can intervene without losing data. It’s not good to find that you’re missing analytical or event data many months after you needed it.
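A minimal sketch of that kind of self-monitoring, assuming a simple last-seen heartbeat table; the threshold and the alert action are illustrative choices, not Anthropos specifics.

```python
# Track when each node last reported and flag the silent ones, because nobody
# in the home will call the helpdesk about a quiet sensor.
import time

SILENCE_THRESHOLD_S = 15 * 60  # flag a node after 15 minutes of silence (assumed value)

last_seen = {}  # node_id -> timestamp of the most recent message

def record_heartbeat(node_id, ts=None):
    """Call this whenever any message arrives from a node."""
    last_seen[node_id] = ts if ts is not None else time.time()

def find_silent_nodes(now=None):
    """Return the nodes that have not reported within the threshold."""
    now = now if now is not None else time.time()
    return [node for node, ts in last_seen.items() if now - ts > SILENCE_THRESHOLD_S]

def sweep_and_alert():
    for node in find_silent_nodes():
        # In production this would raise an operations ticket automatically,
        # so someone can intervene before data is lost.
        print(f"ALERT: no data from {node} for over {SILENCE_THRESHOLD_S // 60} minutes")
```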

  • There’s no IT department in the house

This requires a relentless focus on making the system simple for untrained individuals to set up and to adjust when the environment changes. If the in-home networks were disturbed every time the furniture was moved, forcing an engineer to visit, the service would be very inconvenient and too expensive.

  • Sensor and actuator technology, and the communications networks, change rapidly as new technologies become economically viable

This is particularly challenging. The rate of change means that you must be able to exploit others’ products: it is both impractical and too much of a strategic drag to build such devices yourself. But that creates testing complexity. And, because the manufacturers of the devices differentiate themselves on features, the products tend not to focus on how they are managed (compared with, say, the computer-accessible control planes on datacentre-grade servers). This can lead to unexpected bugs that first become visible in the production environment, because manufacturers do release new versions of products with new behaviours under the same SKU. So the first time a new device is incorporated into the system can be a surprise.

  • Scale: challenges and the economics of solutions change as we scale up

Aside from the usual architectural challenges of increasing complexity with scale, the economics shift too. For example, the cost of operations is a function of the variation in the configuration of individual compute nodes. At smaller scale (<1M nodes), the problem and the solution are still being explored, so rapid changes in the software are more easily handled by centralising the compute locus in the cloud. However, as we learn, it becomes more attractive to use the in-home compute nodes themselves to reduce latency, improve responsiveness and reduce cloud costs.
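As a rough illustration of that placement trade-off (the placement flag and the summarise() step are assumptions for illustration, not the production design), the same reduction step can be run in the cloud while the product is still changing quickly, or pushed to the in-home node to cut latency and cloud traffic.

```python
# The same summarisation step, run either in the cloud or on the in-home node,
# chosen by configuration rather than by rewriting the pipeline.
from statistics import mean

def summarise(readings):
    """Reduce raw high-frequency samples to the small summary the application needs."""
    return {"count": len(readings), "mean": mean(r["value"] for r in readings)}

def handle_readings(readings, placement="cloud"):
    if placement == "edge":
        # Summarise locally and ship only the summary upstream.
        return {"payload": summarise(readings), "kind": "summary"}
    # Early-stage default: ship raw data and let the cloud do the work,
    # because cloud code is cheap to change across a small fleet.
    return {"payload": readings, "kind": "raw"}

# Example: the same call, two placements.
samples = [{"value": v} for v in (3, 5, 4)]
print(handle_readings(samples, placement="cloud")["kind"])  # raw
print(handle_readings(samples, placement="edge")["kind"])   # summary
```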

What market related issues do you see?

We are disrupting a market, which is always a challenge. I’ve mentioned the issues on the supply side; I expect these will take some time to bed in as more people become aware of them. There is a risk that some early incumbents under-estimate the security approaches that are required, although those coming from an enterprise IT background tend to over-estimate the security requirements.

On the demand side, using Geoffrey Moore’s “Crossing the Chasm” model, we’re currently in the situation where the “Innovators”, who prefer to be on the bleeding edge, like being able to influence new things, and enjoy understanding what can go wrong, are starting to give way to the “Early Adopters”, who expect to get direct value from a solution but who are often naïve buyers. IoT systems can get out of control in many ways, and this is much underplayed by many early entrants, who specialise in demos that emphasise the ‘happy path’ while ignoring the inherent challenges. This early-stage pattern is common to many disruptive technologies: it drives the Gartner Hype Cycle, but that doesn’t make it any less challenging.

As the existing markets have very limited technology, there’s an even larger education gap, comparable in some ways to explaining the benefits of e-commerce to consumers before the Internet. And, although there are potentially large productivity improvements, the existing industry runs on very tight margins, which makes it difficult for customers even to think about the possibility of improving the situation with some new purchases.

What is your vision of where this technology will be in the future?

One story behind our solution is that we are building a platform for running experiments, by which we mean enabling research-led investigation. The platform enables the rapid measurement of the effects of interventions at scale, with a simple understanding of error bars, so that confidence intervals can be narrowed and small effects identified. There will be a need for a better understanding of how statistical modelling works, but it looks like Covid is helping here: we are all becoming used to graphs that include error bands rather than just the best-guess line, and, I think, we’re becoming familiar with re-evaluating forecasts in the light of new information. I know from my previous work that any time new data are made available, new insights follow closely behind, and then new value is created. Examples include elder-cohort-focused clinical trials, insights from day-to-day use, and personalised insights such as the identification of sub-groups that react differently to interventions.
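As a toy illustration of why scale matters for that kind of experiment (the effect size and spread below are invented for this example, not real findings), the width of a 95% confidence interval on a measured intervention effect shrinks with the square root of the cohort size.

```python
# The same measured effect becomes distinguishable from noise as the cohort grows.
from math import sqrt
from statistics import NormalDist

def ci_95(effect, sd, n):
    """Approximate 95% confidence interval for a mean effect measured on n subjects."""
    half_width = NormalDist().inv_cdf(0.975) * sd / sqrt(n)
    return effect - half_width, effect + half_width

for n in (30, 300, 3000, 30000):
    lo, hi = ci_95(effect=0.2, sd=1.0, n=n)
    print(f"n={n:>6}: effect 0.20, 95% CI ({lo:+.3f}, {hi:+.3f})")
# With n=30 the interval straddles zero, so the effect is invisible;
# from n=300 onwards it is clearly separated from zero and keeps narrowing.
```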

I believe that our approach will enable a couple of very important population-scale changes:

  • Feedback on what is going on in the lives of elderly and frail people, which will help them to keep up interventions that are helping (e.g. daily feedback on improvements in mobility from a specific exercise can be very motivating), or information that healthcare professionals can use to spot problems earlier.
  • Sufficient statistical power, potentially from very large subject cohorts with high-frequency signals, to identify small effects that only affect sub-populations, and to apply signal processing techniques to measure normal and abnormal conditions. Current techniques rely on very expensive measurement: high-quality equipment, specialist technical knowledge to use it, and special locations to house it. That leads to very low-frequency measurement in atypical environments, which, in turn, increases the measurement errors. Overall, the visibility of effects through the data will dramatically improve (the rough sample-size sketch below illustrates the scale involved).
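A back-of-the-envelope sketch of the statistical-power point, using a standard two-arm normal approximation with invented effect sizes (not an Anthropos calculation): the required cohort grows roughly with the inverse square of the effect size, which is why small or sub-group effects need very large, cheaply instrumented cohorts.

```python
# Approximate subjects per arm needed to detect a standardised effect (Cohen's d)
# at 5% significance with 80% power, using the usual two-sample formula.
from math import ceil
from statistics import NormalDist

def subjects_per_arm(effect_size, alpha=0.05, power=0.8):
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

for d in (0.5, 0.2, 0.05):
    print(f"standardised effect {d}: ~{subjects_per_arm(d)} subjects per arm")
# Halving the effect size roughly quadruples the required cohort, so effects that
# only show up in a sub-group quickly exceed what lab-grade measurement can afford.
```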