
ReStructure: A Wireless Sensor Network for Monitoring Temporary Earth Retaining Systems

Ross Wilkins, Cogent Labs, Coventry University, UK (ross.wilkins@coventry.ac.uk)
Elena Gaura, Cogent Labs, Coventry University, UK (e.gaura@coventry.ac.uk)
Michael Allen, Cogent Labs, Coventry University, UK (michael.allen@coventry.ac.uk)
John Kemp, Cogent Labs, Coventry University, UK (aa9384@coventry.ac.uk)
James Brusey, Cogent Labs, Coventry University, UK (j.brusey@coventry.ac.uk)
Andrew J. Whittle, Department of Civil and Environmental Engineering, MIT, MA, USA (ajwhittl@mit.edu)

ABSTRACT

Temporary earth retaining structures help prevent collapse during construction excavation. To ensure that these structures are operating within design specifications, load forces on supports must be monitored. Current approaches are expensive, often manual, and difficult to integrate into predictive models. We developed a wireless strain gauge suitable for harsh construction environments, along with a data collection back-end that could be integrated into on-line predictive models. This system has been used to monitor an underground train station construction site in Singapore for 5 months. Key challenges are to ensure valid measurements, reliable wireless communication, sufficiently long battery life, and protection against weather and incidental damage. This paper describes the system design, experiences with its deployment, and lessons for future developments.

Categories and Subject Descriptors
C.3 [Special-Purpose and Application-Based Systems]: Real-time and embedded systems

Keywords
Wireless Sensing; Structural Health Monitoring; Deployment

1. INTRODUCTION

Excavation operations require the use of Temporary Earth Retaining Structures (TERS), whose designs are based on model-based analysis carried out before the excavation begins. Once construction begins, structural parameters of the TERS are monitored through field measurements (typically strain, pore pressure, and wall deflection). Measurements are taken to compare against the expected values given by the model, and also to determine if the design specification has been exceeded. These measurements, however, are expensive to obtain, and the process of gathering the data is not always automated. Furthermore, it is not general practice to incorporate field measurements into models during construction to compare actual versus expected performance. If this could be enabled, knowledge about the construction process could be improved, potentially reducing risk and costs in the future.

In applying wireless sensing to TERS monitoring, we want to increase measurement density and value to the overall construction (e.g., enabling data to feed into real-time models, not just inform static alarm thresholds). Our initial aim is to create a proof-of-concept Wireless Sensor Network (WSN) that is deployed directly onto support struts to acquire strain data (a proxy for structural load). Achieving a feasible prototype requires valid measurements, reliable wireless communication, and packaging for protection against the weather. Analysis of the archive of data created through this pilot will give insight into the feasibility of moving toward intelligent WSNs for TERS monitoring.

The project has been active since January 2014. In the first 12 months, a small prototype network was built and characterised, and in the last 6 months we deployed our network in stages over a TERS on a live excavation. A unique and challenging feature of instrumenting an active construction site is that sensor node deployment continues for the duration of the construction; thus the environment is constantly changing and the network expanding over time. In this paper, we focus on the design of the WSN and its performance in the pilot deployments, along with the lessons learned over the course of the project.

The paper is structured as follows: Section 2 provides an overview of our system. Section 3 describes a Mass Rapid Transit (MRT) construction case study and provides results of the system evaluation and deployment experiences. Section 4 describes related work, and Section 5 concludes the paper.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from Permissions@acm.org.
RealWSN'15, November 1, 2015, Seoul, South Korea.
Copyright is held by the owner/author(s). Publication rights licensed to ACM.
ACM 978-1-4503-3840-0/15/11 ...$15.00.
DOI: http://dx.doi.org/10.1145/2820990.2820999

Figure 1: Architecture of the strain monitoring sensor network.

2. SYSTEM OVERVIEW

Figure 1 shows an overview of the end-to-end system architecture. Data flows from the sensor nodes to a sink, where it is transmitted to a remote server via 3G and made available to user applications. To reduce risk and development time, we opted to use off-the-shelf hardware and software wherever possible.

2.1 Sensor node

Our sensor node combines the Zolertia Z1 platform with a custom strain gauge board (see Figure 2). The Z1 is based around an MSP430 CPU and a CC2420 radio. Our custom board provides input for one resistive strain gauge (strut loading is assumed to be axial), whose readings are acquired using a Wheatstone bridge combined with a low-power 16-bit ADC (TI ADS1115). Measurement resolution is <1 με, with a measurement range of ±2500 με. External temperature/humidity sensing is provided by a Sensirion SHT15. Each sensor node is packaged in an IP65 aluminium enclosure (115×65.5×50 mm, 370 g) with holes for an external radio antenna (4.4 dBi Antenova Titanis), a gland for the strain gauge wiring, and a waterproof breathable membrane for the SHT15. Magnetic feet on the enclosures allow the nodes to be placed over the strain gauges.

Our software was developed on top of the Contiki WSN OS. Contiki provides a network stack with a low-power MAC (ContikiMAC) and a multi-hop tree formation/data collection protocol (Contiki Collect) that we used almost entirely out of the box. The only change necessary was to reduce the size of the recent-message buffer kept by each node; this reduced the time needed to verify network formation after a node restart.

Figure 2: Sensor node hardware.

Drivers were written for our strain board, and at the application level we developed a traditional sense-and-send system that acquires, stores to flash, and transmits network information (time, RSSI, beacon interval, sequence number, neighbours) together with strain and temperature/humidity data at five-minute intervals (92 bytes of payload in total). Storing samples to flash ensures a full dataset is available after construction is completed, regardless of network conditions during the deployment. This is important for post-deployment analysis of the data with techniques such as edge mining [1].
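The paper fixes the payload at 92 bytes and names its field groups, but not the exact layout. As an illustration only, the sketch below shows one plausible way such a fixed-size record could be packed; all field names and widths are our assumptions, not the deployed format.

```python
import struct

# Hypothetical layout for the 92-byte sample record: field names and widths
# are illustrative assumptions, padded to the fixed payload size the paper
# reports. The same bytes would go to flash and to the radio.
HEADER_FMT = "<IHhHB"    # timestamp, seq_no, rssi, beacon_interval, n_neighbours
NEIGHBOUR_FMT = "<Hb"    # neighbour node id, neighbour RSSI
SENSOR_FMT = "<hhH"      # strain (ADC counts), temperature, humidity
PAYLOAD_LEN = 92

def pack_sample(ts, seq, rssi, beacon, strain, temp, humid, neighbours):
    """Serialise one five-minute sample into a fixed-size record."""
    body = struct.pack(HEADER_FMT, ts, seq, rssi, beacon, len(neighbours))
    body += b"".join(struct.pack(NEIGHBOUR_FMT, nid, nrssi)
                     for nid, nrssi in neighbours)
    body += struct.pack(SENSOR_FMT, strain, temp, humid)
    return body.ljust(PAYLOAD_LEN, b"\x00")   # pad to the fixed payload size
```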
Micro-benchmarking was used to estimate the lifetime of the sensor nodes given a particular battery capacity and sampling rate. The individual operations of a sensing cycle (5-minute interval) were measured and aggregated to determine a baseline for node lifetime: 270 days of operation are expected with two 7.8 Ah 1.5 V C cells.
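As an illustration of the micro-benchmarking arithmetic, the sketch below aggregates per-operation charge over one cycle and divides battery capacity by the resulting mean current. The per-operation figures are placeholders chosen only to land near the reported estimate; they are not the authors' measured values.

```python
# Micro-benchmark-style lifetime estimate: sum the charge used by each
# operation in one 5-minute cycle, then divide battery capacity by the
# mean current. All (current mA, duration s) pairs are placeholders.
CYCLE_S = 300                        # one sensing cycle: 5 minutes

operations = {
    "sample_adc":  (3.0, 1.0),       # bridge excitation + ADS1115 conversion
    "write_flash": (4.0, 0.2),       # log the 92-byte record
    "tx_radio":    (18.0, 0.5),      # CC2420 transmission + MAC overhead
    "baseline":    (1.15, CYCLE_S),  # duty-cycled radio + MCU sleep, whole cycle
}

charge_mAs = sum(i * t for i, t in operations.values())
avg_mA = charge_mAs / CYCLE_S
capacity_mAh = 7800                  # two 7.8 Ah 1.5 V C cells in series:
                                     # voltage doubles, capacity in mAh does not
days = capacity_mAh / avg_mA / 24
print(f"average draw {avg_mA:.2f} mA -> {days:.0f} days")  # ~272 days
```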
2.2 Gateway

Our Gateway was built using a Raspberry Pi model A+ combined with a TelosB node and a USB 3G modem (both with external antennas). The Gateway is housed inside an IP65 mild steel enclosure and mounted on a pole. Due to deployment constraints, we chose to power the Gateway using a 12 V 100 Ah battery. During normal operation, WSN data is collected by the TelosB, aggregated at the Raspberry Pi, and transmitted hourly via 3G to a remote server.

From prior experience with deploying WSNs, we focused on ways to i) lower power consumption, ii) improve fault tolerance to minimise on-site maintenance, and iii) make on-site maintenance as easy as possible. To reduce power, we disabled HDMI on the Raspberry Pi and built custom circuitry to allow the power to the TelosB and 3G modem to be controlled through GPIOs. To provide fault tolerance between the TelosB and the Raspberry Pi, we implemented a simple handshake so that the Raspberry Pi can periodically check and restart the TelosB if necessary. For 3G transmissions, if an hourly update fails, it is rolled into the following hour's transmission, and data is always archived locally in case of extended 3G outages. To make on-site debugging and deployment easier, the Gateway hosts a USB Wi-Fi dongle. When the dongle is connected, the Gateway becomes a wireless access point (using hostap), hosting a web page that shows real-time updates of the data.
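The paper describes the handshake and the GPIO-switched power rail only at a high level. The following is a minimal sketch of that watchdog pattern, assuming the TelosB is reachable over a serial port and its supply is switched by one GPIO; the pin number, serial device, and ping exchange are all hypothetical.

```python
import time
import serial            # pyserial
import RPi.GPIO as GPIO  # Raspberry Pi GPIO access

TELOSB_POWER_PIN = 17        # hypothetical GPIO driving the custom power circuitry
SERIAL_DEV = "/dev/ttyUSB0"  # hypothetical serial device for the TelosB

GPIO.setmode(GPIO.BCM)
GPIO.setup(TELOSB_POWER_PIN, GPIO.OUT, initial=GPIO.HIGH)

def telosb_alive(timeout=5.0):
    """Handshake: send a ping and expect any reply within the timeout."""
    try:
        with serial.Serial(SERIAL_DEV, 115200, timeout=timeout) as port:
            port.write(b"PING\n")
            return port.readline() != b""
    except serial.SerialException:
        return False

def power_cycle():
    """Restart the TelosB by toggling its switched power rail."""
    GPIO.output(TELOSB_POWER_PIN, GPIO.LOW)
    time.sleep(2)
    GPIO.output(TELOSB_POWER_PIN, GPIO.HIGH)

while True:                  # periodic liveness check, e.g. once a minute
    if not telosb_alive():
        power_cycle()
    time.sleep(60)
```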
Our measures reduced the average current draw of the Gateway from over 400 mA to 128 mA (at 12 V). Based on micro-benchmarking tests, we estimate a 100 Ah battery will provide one month's operation (100 Ah at an average of 128 mA is roughly 780 hours, or about 32 days).

3. CASE STUDY: MRT EXCAVATION SITE

From February to July 2015, we deployed our system on the TERS of an active MRT excavation site in Singapore. The main goals of our deployment were to: i) validate the communication performance of our system in situ, ii) assess the measurement validity, and iii) collect an archive of data for offline analysis. The deployment environment was particularly challenging for three reasons:

Climate - Singapore has a tropical climate with frequent and severe rainstorms. Temperatures can rise to 35 ℃ and 99% RH above ground, and above 40 ℃ inside the excavation. There are large daily cycles where temperature can change by 25 ℃ and relative humidity by 60%.

Obstructions - Radio communication is challenging due to the construction environment, which is mainly a combination of metal and concrete materials that can reflect and attenuate radio signals. The construction site itself is very dusty and there is lots of machinery and movement that could damage equipment.

Dynamic environment - A construction site is constantly changing: over days and months, material is being excavated from the bottom of the trench and new struts and supports are being placed. Therefore, the communication environment can change as struts are deployed, providing new surfaces that can reflect and interfere with signals.

In our deployments, the network performance of the system was evaluated using two metrics: i) Packet Delivery Ratio (PDR), calculated as packets received divided by the expected number of packets, and ii) network stability, which we define as the ratio of the number of packets a sensor node sends with beacon interval = 3600 to the total number of packets sent. The beacon interval is a per-node parameter used in the Contiki Collect stack; it gets shorter when the network is reorganising, to allow quick building of neighbour tables. We also measured the Received Signal Strength Indication (RSSI) of packets arriving at the Gateway. Nodes communicated on 802.15.4 channel 26 in an attempt to avoid Wi-Fi interference, and the maximum number of retries for each packet sent by Contiki Collect was set at 16. The rest of this section evaluates the two deployments undertaken at the MRT site.
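As a concrete rendering of these two metrics, the sketch below computes PDR and stability from a log of received packets. The record fields mirror the network information each node reports; the names are ours, not from the paper.

```python
from dataclasses import dataclass

@dataclass
class Packet:
    node_id: int
    seq_no: int
    beacon_interval: int            # seconds; 3600 when the collect tree is stable

SAMPLES_PER_DAY = 24 * 60 // 5      # one packet per 5-minute sampling cycle

def pdr(received: list[Packet], expected: int = SAMPLES_PER_DAY) -> float:
    """Packet Delivery Ratio: packets received / expected number of packets."""
    return 100.0 * len(received) / expected

def stability(received: list[Packet]) -> float:
    """Share of logged packets sent with the beacon interval at its 3600 s maximum."""
    stable = sum(1 for p in received if p.beacon_interval == 3600)
    return 100.0 * stable / len(received)
```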

                 D1-L2          D2-L1          D2-L2
Node             53      57     51      55     50      52
Days             70      70     82      82     47      47
PDR (%)          83.9    83.9   90.9    86.3   94.3    94.1
Avg. RSSI (dBm)  -75.1   -82.8  -83.9   -84.0  -82.7   -82.2
1-hop (%)        98.3    51.1   83.9    98.6   71.7    79.7
Beacon max (%)   99.2    98.0   98.1    99.0   96.8    98.1

Table 1: Network performance for the two deployments.

3.1 Deployment 1: Two Node Deployment

Two sensor nodes were deployed on a level 2 strut at the excavation, 10 metres below ground level. The gateway was deployed at ground level near the excavation, 12 metres away from the nodes. The strain gauges were welded to the strut next to the contractor's vibrating wire system, as shown in Figure 3, while the strut had zero load on it. This gave us reference measurements to compare against. Deployment 1 lasted for 70 days, during which time both nodes sent data, but one node's data stream began to drift away from realistic values when compared against reference contractor data. We suspect this was due to water ingress from a storm event, but it was not possible to retrieve either of the nodes to debug, for safety reasons, since the construction had progressed significantly by this point. After this period the Gateway was moved to another part of the construction site for deployment 2; the nodes continue to log data and will be collected at the end of the construction.

Figure 4 shows the daily PDR for the deployment, with an average PDR of 83.9%. For 50 days, the PDR was 98% on average. The drop in PDR for the other 20 days can be traced to human error. In the first instance, the external antenna for the TelosB became dislodged from the enclosure during construction works right next to it. In the second case, the battery was not changed in a timely manner because there was no capability on the Gateway to remotely report the battery voltage; this caused a power outage at the Gateway for several hours. Finally, a faulty battery was installed on site and could not be changed over the weekend, leading to several days of data loss.

Figure 4: First deployment daily PDR (daily yield per node for nodes 53 and 57, mid-February to mid-April; annotated with the day before a battery change, the period the server was covered in plastic, and the loss of data after a battery change).

Table 1 summarises the deployment; for both nodes the beacon interval was at 3600 for 98 to 99% of the time, indicating a stable network. Node 53 sent directly to the sink 98% of the time, whilst node 57 sent through node 53 roughly as often as it sent directly to the sink.

3.2 Deployment 2: Four Node Deployment

Two sensor nodes were initially deployed on a level 1 strut, and two more on a level 2 strut as it was installed several weeks later. This demonstrated the network growing as the construction evolved. The gateway was situated at ground level, 10 m away from the level 1 strut.

The level 1 deployment has been in place for 82 days, with level 2 in place for 47 days; Table 1 shows the results for both levels. The PDR ranges from 86% to 94%, and, as expected, the nodes on level 1 transmit directly to the sink over 95% of the time. The nodes on the second level still manage to reach the sink directly between 72% and 80% of the time. The network can be considered stable, with the maximum beacon interval being reported 97% of the time.

3.3 Deployment experiences

The deployments allowed us to test the integrity of the network and the data collected, and to better understand how a network could be scaled up alongside continuing construction. We found that the Gateway lifetime was in line with our predictions, and the battery is being replaced every 30 days.

Figure 3: Example deployment on a strut next to vibrating wire gauges.

Having the Gateway as a wireless access point was helpful when deploying sensors below ground, as a laptop could be taken into the excavation to calibrate the node and get immediate connectivity feedback. It is important that functionality be added to the Gateway to detect and report low battery levels and tampering (e.g., via a tilt sensor).

In general, we found that communication links were much shorter than expected. In ground-level tests of radio connectivity, a maximum transmission range of 60 m was observed. However, hop lengths observed during deployment were closer to 10 m. The most likely cause of this problem is the 2.4 GHz communication frequency used.
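The frequency penalty, and the appeal of the sub-1 GHz radios mentioned in Section 5, can be quantified with a free-space path-loss estimate. The quick check below uses the free-space model only, so the cluttered site will see higher absolute losses, but the roughly 9 dB frequency-dependent gap carries over.

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss: 20*log10(4*pi*d*f/c) in dB."""
    c = 3.0e8
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# Loss over a 60 m ground-level link at the two candidate frequencies
loss_24 = fspl_db(60, 2.4e9)     # ~75.6 dB
loss_868 = fspl_db(60, 868e6)    # ~66.8 dB
print(f"2.4 GHz: {loss_24:.1f} dB, 868 MHz: {loss_868:.1f} dB, "
      f"difference: {loss_24 - loss_868:.1f} dB")   # ~8.8 dB in favour of 868 MHz
```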
As part of the deployment, our sensor nodes required the Wheatstone bridge to be manually calibrated (or balanced); this was an extremely inconvenient, time-consuming operation that would benefit from being automated [2].
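Jang et al.'s strain interface [2] performs this balancing in hardware. A software approximation is to record a baseline reading while the strut is unloaded and subtract it before converting to microstrain, using the standard small-strain quarter-bridge relation Vout/Vexc ≈ GF·ε/4. The gauge factor, excitation voltage, and ADC range below are generic assumptions, not values from the paper; the real board evidently conditions the signal further to reach the <1 με resolution quoted in Section 2.1.

```python
GAUGE_FACTOR = 2.0      # typical for a metal-foil resistive gauge (assumed)
ADC_FULL_SCALE = 32768  # ADS1115 is a 16-bit signed converter
V_REF = 2.048           # one of the ADS1115 programmable full-scale ranges (V)
V_EXCITATION = 2.5      # assumed bridge excitation voltage

def counts_to_volts(counts: int) -> float:
    return counts * V_REF / ADC_FULL_SCALE

def microstrain(counts: int, baseline_counts: int) -> float:
    """Offset-corrected quarter-bridge output to microstrain.

    For small strains, Vout/Vexc ~= GF * strain / 4, so
    strain = 4 * Vout / (GF * Vexc).
    """
    v_out = counts_to_volts(counts - baseline_counts)
    return 4.0 * v_out / (GAUGE_FACTOR * V_EXCITATION) * 1e6

# Software zeroing: capture the baseline while the strut carries no load
# (as was the case when the gauges were welded on), then subtract it.
baseline = 123          # hypothetical ADC reading at zero load
print(f"{microstrain(140, baseline):.0f} microstrain")
```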
The relatively slow speed of construction limited our deployment speed, and the inability to access nodes after they had been deployed meant that we could not re-use nodes or debug them directly. This presents two significant challenges in relation to learning from the deployment: i) the development iteration cycles become much longer than is standard in research, and ii) hardware failures cannot be quickly diagnosed and resolved.

4. RELATED WORK

Our work has much in common with other WSNs deployed for structural health monitoring, though those systems tend to focus on sub-second sampling of acceleration and strain on dynamic structures such as bridges. The Illinois Structural Health Monitoring Project (ISHMP) yielded a structural health monitoring platform based around the Imote2. Jang et al. [2] describe the hardware and the deployment of two single-hop networks of 33 and 37 nodes. Their dynamic resistive strain interface provides auto-calibration and zeroing of the Wheatstone bridge. Kurata et al. [4] deployed 28 Narada nodes on the New Carquinez suspension bridge for a period of two years. Kim et al. [3] carried out a three-month deployment of a 64-node MicaZ multi-hop network on the Golden Gate Bridge, sampling vibration/acceleration at 1 kHz. These projects highlight the difficulties of deploying on physical structures; Kim et al. [3], for example, found that radio reception range was around 30 m, decreasing to around 15 m where line of sight was blocked, matching our deployment experience.

5. CONCLUSIONS

This paper has described our experiences with building a WSN for strain monitoring and its deployment on a TERS for an MRT excavation site in Singapore. Future work will investigate scaling up deployment sizes to larger numbers of nodes, as well as the improvement that can be gained through using sub-1 GHz radios, the integration of measurements into real-time models, and the use of in-network intelligence to increase the lifetime of individual nodes in the network.

6. ACKNOWLEDGEMENTS

This research was supported by the National Research Foundation Singapore through the Singapore MIT Alliance for Research and Technology's Center for Environmental Sensing and Modeling research program. Fieldwork has been enabled through a collaborative agreement between SMART and Singapore's Land Transport Authority. We'd also like to thank the reviewers for their valuable feedback.

7. REFERENCES

[1] E. I. Gaura, J. Brusey, M. Allen, R. Wilkins, D. Goldsmith, and R. Rednic. Edge mining the internet of things. IEEE Sensors Journal, 13(10):3816-3825, Oct 2013.
[2] S. Jang, H. Jo, S. Cho, K. Mechitov, J. A. Rice, S. Sim, H. Jung, C.-B. Yun, B. F. Spencer Jr., and G. Agha. Structural health monitoring of a cable-stayed bridge using smart sensor technology: deployment and evaluation. Smart Structures and Systems, 6(5-6):439-459, 2010.
[3] S. Kim, S. Pakzad, D. Culler, J. Demmel, G. Fenves, S. Glaser, and M. Turon. Health monitoring of civil infrastructures using wireless sensor networks. In 2007 6th International Symposium on Information Processing in Sensor Networks, pages 254-263.
[4] M. Kurata, J. Kim, J. P. Lynch, G. W. van der Linden, H. Sedarat, E. Thometz, P. Hipley, and L.-H. Sheng. Internet-enabled wireless structural monitoring systems: development and permanent deployment at the New Carquinez Suspension Bridge. Journal of Structural Engineering, 139:1688-1702, October 2013.

