by Simon Dobson and Kieran Delaney

Sensor networks are the key enabling technology for building systems that adapt autonomously to their environment, without direct human intervention. Most sensor networks operate in free air, but a collaboration in Ireland between the School of Computer Science and Informatics at UCD Dublin and the Centre for Adaptive Wireless Systems at Cork IT is starting to explore the tools and techniques needed to build 'augmented materials', which combine sensing, actuation and processing into the fabric of built objects.

Embedding sensing into a physical substrate has several attractions. Each sensor package can sense local variables such as the stress on the material, its orientation in space, its proximity to other materials and so on. Combine these sensors into a network and we can construct a global view of the material and its relationships to the real world. Add processing and we have the potential to build materials that 'know themselves', in some sense, and which can react in ways far more sophisticated than are possible with simpler, 'smart' materials.
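As an illustration only, the kind of aggregation involved might be sketched in C roughly as follows; every name and type here is hypothetical, invented for this example rather than taken from any existing platform. Each element reports a small record of local readings, and a simple fold over the network yields a material-wide summary.

/* Hypothetical sketch: fold per-element readings into a material-wide view. */
#include <stddef.h>

typedef struct {
    float stress;         /* local mechanical stress on the substrate      */
    float orientation[3]; /* local orientation estimate                    */
    int   neighbours;     /* elements currently in radio contact           */
} element_reading;

typedef struct {
    float max_stress;     /* worst-case stress anywhere in the material    */
    float mean_neighbours;/* average connectivity across the material      */
} material_view;

material_view summarise(const element_reading *r, size_t n)
{
    material_view v = {0.0f, 0.0f};
    for (size_t i = 0; i < n; i++) {
        if (r[i].stress > v.max_stress)
            v.max_stress = r[i].stress;
        v.mean_neighbours += (float)r[i].neighbours;
    }
    if (n > 0)
        v.mean_neighbours /= (float)n;
    return v;
}

The point of the sketch is simply that the global view is computed from many small local observations, none of which is individually authoritative.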

If this all sounds a bit abstract, imagine a person with a broken leg who is wearing a plaster cast. For a physiotherapist, the challenge is to get the patient to take enough exercise to stimulate healing of the break, while at the same time stopping them from attempting too much and risking further damage. As the physiotherapy programme changes over time, in conjunction with ongoing assessment of the injury, the exercise required of the patient and the optimal rigidity and support required of the cast will also change. If we embed sensing and processing into the cast, the network can sense the load being placed on the cast as the person walks around. The cast can then compare this load with a downloaded therapy programme and react, for example by glowing green when things are fine but flashing red lights if the person is overdoing their exercise. It is even possible to build materials with variable rigidity, so that the cast adapts the support it provides over the course of treatment.
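Purely as an illustrative sketch, with names and thresholds invented for this example rather than taken from the project's software, the decision the cast makes on each reading could look something like this in C:

/* Illustrative only: compare sensed load against a downloaded programme. */
typedef struct {
    float min_load;   /* exercise below this does little to stimulate healing */
    float max_load;   /* load above this risks further damage                 */
} therapy_programme;

typedef enum { LED_OFF, LED_GREEN, LED_RED_FLASH } led_state;

led_state assess_load(float sensed_load, therapy_programme p)
{
    if (sensed_load > p.max_load)
        return LED_RED_FLASH;   /* overdoing it: warn the patient   */
    if (sensed_load >= p.min_load)
        return LED_GREEN;       /* within the prescribed range      */
    return LED_OFF;             /* too little load to matter        */
}

In practice the limits would come from the downloaded therapy programme and the result would drive the cast's indicators, but the comparison itself is as simple as the sketch suggests.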

The individual elements of an augmented material can be based around any 'mote' technology. We are using the platform developed by Ireland's Tyndall National Institute, made available via that institute's National Access Programme. While current mote systems are too large for use in practice, Tyndall's 2.5cm-on-a-side motes (see figure) are being reduced to a 1cm form and beyond, making them realistic for embedded use.

A 2.5cm-on-a-side 'mote', developed by Ireland's Tyndall National Institute.

Since a single object might contain hundreds of elements, the elements themselves need to be largely self-configuring, for example by making connections to neighbouring elements as the material cures. Changes to this network may come from node failures, but may also come from physically significant events such as cutting, which manifests itself as a (rather structured) partitioning of the network. Dealing with these changes in a tractable way means developing a programming model that operates at the level of the complete material rather than at the level of the individual elements within it, and that can handle failures and sensing errors in those elements. This is not easy to do in traditional languages, and we are investigating techniques pioneered in high-performance computing (skeletons and categorical data types) as a possible basis for building self-managing applications on top of the underlying unreliable sensors and communications.
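A minimal sketch of the skeleton idea, again with entirely hypothetical names, might look like the following in C: the programmer supplies a per-element sensing function and an associative combiner, and the skeleton folds them over the whole material while silently discarding readings from failed or implausible elements.

/* Hypothetical sketch of a material-level 'skeleton': a fault-tolerant fold. */
#include <stddef.h>
#include <math.h>

typedef struct { float value; int valid; } reading;

typedef reading (*sense_fn)(size_t element_id);           /* per-element map  */
typedef float   (*combine_fn)(float acc, float value);    /* associative fold */

float material_fold(size_t n_elements, sense_fn sense,
                    combine_fn combine, float identity)
{
    float acc = identity;
    for (size_t i = 0; i < n_elements; i++) {
        reading r = sense(i);
        if (!r.valid || isnan(r.value))
            continue;            /* tolerate failed or garbled elements */
        acc = combine(acc, r.value);
    }
    return acc;
}

The attraction is that the application is written once against the material as a whole; the skeleton, not the application, absorbs element failure and network repartitioning.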

Augmented materials are in many ways the ideal co-design challenge. The properties of the material determine directly what can be sensed and processed, while software provides capabilities to enhance and complement the material's underlying physics. A physical phenomenon, such as placing one augmented object on top of another, gives rise to individual sensor readings reflecting pressure and orientation, the establishment of new wireless communication links, and so on. These in turn give rise to a semantic inference that software can use to drive high-level responses based on the intention inferred from performing this particular action with these particular objects.
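A sketch of that inference step, once more with invented names, might look like this in C: several low-level observations from the sensor network are fused into a single higher-level event that application software can respond to.

/* Hypothetical sketch: fuse low-level events into a semantic inference. */
#include <stdbool.h>

typedef struct {
    bool pressure_increased;  /* load sensors report a sustained rise          */
    bool orientation_stable;  /* the object itself has not been moved          */
    bool new_radio_link;      /* a previously unseen element can now be heard  */
} low_level_events;

typedef enum { NOTHING, OBJECT_PLACED_ON_TOP } inferred_event;

inferred_event infer(low_level_events e)
{
    if (e.pressure_increased && e.orientation_stable && e.new_radio_link)
        return OBJECT_PLACED_ON_TOP;  /* another augmented object stacked on us */
    return NOTHING;
}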

While this work is still at a very early stage, we hope that the programme will lead to useful advances in embedded sensor networks, to new forms of microsensing and actuation associated with explicit software control, to improved autonomic control of communications and routing, and to generally useful programming models for sensor-rich multi-agent networks and environments.

Links:
http://www.ucd.ie/csi
http://www.aws.cit.ie
http://www.tyndall.ie/research/mai-group/25cube_mai.html

Please contact:
Simon Dobson, University College Dublin / IUA, Ireland
E-mail: simon.dobson@ucd.ie
