We have developed an on-line sensor calibration scheme that employs an additional single source as the external stimulus that creates differential sensor readings used for calibration. The key idea of our approach is to use an actuator to produce differential simultaneous excitation of all sensors over a number of time frames while the environment in which the sensors are deployed is relatively inactive. The sensor calibration functions are derived in such a way that all sensors (or a group of sensors) agree on the effect of the actuator in the most consistent way. More specifically, we utilize the maximum likelihood principle and a nonlinear optimization solver to derive calibration functions of arbitrary complexity and accuracy. The approach has the following notable properties: i) it is maximally localized, in that each sensor needs to communicate with only one other sensor in order to be calibrated; ii) the number of time steps required for calibration is very low. Therefore, the approach is both communication and time efficient. We present two variants of the approach: i) one in which only two neighboring sensors have to communicate in order to conduct calibration; ii) one that utilizes an integer linear programming (ILP) formulation to provably minimize the number of packets that must be sent for calibration. We evaluate the techniques using traces from light sensors recorded by in-field deployed sensors, and statistical evaluations are conducted to obtain confidence intervals that support all results.
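As a concrete illustration of the pairwise variant, the following is a minimal sketch assuming affine calibration functions and Gaussian measurement noise, under which the maximum likelihood fit reduces to nonlinear least squares. The data, function names, and the SciPy solver are illustrative choices, not the implementation used in the paper.

```python
# Minimal sketch (not the paper's implementation): pairwise on-line calibration.
# Assumptions: affine calibration c(r) = gain * r + offset, Gaussian noise, and
# two neighboring sensors observing the same actuator-driven stimulus over a
# small number of time frames.
import numpy as np
from scipy.optimize import least_squares

def calibrate_pair(raw_a, raw_b):
    """Fit calibration parameters for sensor B so that its calibrated readings
    agree with sensor A's over the shared actuator excitation."""
    raw_a = np.asarray(raw_a, dtype=float)
    raw_b = np.asarray(raw_b, dtype=float)

    def residuals(params):
        gain, offset = params
        # Disagreement between sensor A (reference) and calibrated sensor B.
        return raw_a - (gain * raw_b + offset)

    fit = least_squares(residuals, x0=[1.0, 0.0])  # start from identity mapping
    return fit.x  # (gain, offset)

# Example: sensor B reads roughly half of A's value plus a small bias.
frames_a = [10.2, 20.1, 30.3, 39.8, 50.1]
frames_b = [5.4, 10.3, 15.2, 19.9, 25.3]
gain, offset = calibrate_pair(frames_a, frames_b)
print(f"calibrated reading = {gain:.2f} * raw + {offset:.2f}")
```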
We propose a sensor calibration approach based on constructing statistical error models that capture the characteristics of the measurement errors. The approach is generic in the sense that it can be applied to arbitrary sensor modalities. The error models can be constructed either off-line or on-line and are derived using nonparametric kernel density estimation techniques. Models constructed with various kernel smoothing functions are compared and contrasted using statistical evaluation methods. Based on the selected error model, we propose four alternatives for making the transition from the error model to the calibration model, which is represented by piece-wise polynomials. In addition, statistical validation and evaluation methods, such as resubstitution, are used to establish confidence intervals for both the error model and the calibration model. Traces of acoustic signal-based distance measurements recorded by in-field deployed sensors are used as our demonstrative example. Furthermore, we discuss the broad range of applications of the error models and provide a tangible example of how adopting the statistical error model as the optimization objective impacts the accuracy of the location discovery task in wireless ad-hoc sensor networks.
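The sketch below illustrates one possible realization of the two modeling steps, assuming a Gaussian kernel for the error density and a cubic smoothing spline as the piece-wise polynomial calibration model. The paired measurements are hypothetical, and the paper's four error-to-calibration transition alternatives are not reproduced here.

```python
# Minimal sketch: nonparametric error model plus piece-wise polynomial calibration.
# Assumptions: Gaussian kernel (Scott's-rule bandwidth) and hypothetical paired
# acoustic distance measurements with ground-truth distances in meters.
import numpy as np
from scipy.stats import gaussian_kde
from scipy.interpolate import UnivariateSpline

measured = np.array([1.1, 2.3, 3.0, 4.4, 5.2, 6.1, 7.5, 8.2, 9.4, 10.1])
true_dist = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0])
errors = measured - true_dist

# 1) Error model: kernel density estimate of the measurement error.
error_model = gaussian_kde(errors)
print("estimated density at zero error:", error_model(0.0)[0])

# 2) Calibration model: smoothing spline (piece-wise cubic polynomial) mapping
#    raw measurements to corrected distances.
calibration = UnivariateSpline(measured, true_dist, k=3, s=0.1)
print("corrected reading for a 5.2 m raw measurement:", float(calibration(5.2)))
```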
Wireless sensor networks have emerged as a key enabling technology for the next scientific, technological, engineering, and economic revolution. Since digital rights management is of crucial importance for sensor networks, there is an urgent need for the development of intellectual property protection (IPP) techniques. We have developed the first system of watermarking techniques to embed cryptographically encoded authorship signatures into data and information acquired by wireless embedded sensor networks. The key idea is to impose additional constraints during data acquisition or sensor data processing. The constraints correspond to the encrypted signature and are selected in such a way that they provide favorable tradeoffs between accuracy and the strength of the proof of authorship. The techniques for watermarking raw sensor data include modifying a sensor's location and orientation, its time management discipline (e.g., the frequency and phase of intervals between consecutive data captures), and its resolution. The second set of techniques embeds the signature during data processing. There are at least three degrees of freedom that can be exploited: error minimization procedures, physical-world model building, and the solving of computationally intractable problems. We have developed several watermarking techniques that leverage the error minimization degree of freedom and have demonstrated their effectiveness for watermarking location discovery information.
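The following is a minimal sketch, under strong simplifying assumptions, of how signature bits could be folded into an error minimization step of location discovery. It is not the authors' procedure; the anchor positions, range values, and bit-dependent weighting scheme are purely illustrative of the idea of biasing the optimizer toward a signature-dependent solution among near-equivalent optima.

```python
# Minimal sketch (not the authors' procedure): embedding signature bits as extra
# constraints in the error minimization step of 2-D trilateration.
# Assumptions: three anchors with known positions, noisy range measurements, and
# a toy signature whose bits slightly re-weight the residuals so the selected
# solution encodes the watermark.
import numpy as np
from scipy.optimize import least_squares

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])  # known positions
ranges = np.array([7.1, 7.2, 7.0])                          # noisy distances
signature_bits = [1, 0, 1]                                  # toy encrypted signature

def residuals(pos):
    d = np.linalg.norm(anchors - pos, axis=1) - ranges
    # Watermark: bias residual weights according to the signature bits.
    weights = np.array([1.0 + 0.01 * b for b in signature_bits])
    return weights * d

estimate = least_squares(residuals, x0=[5.0, 5.0]).x
print("watermarked location estimate:", estimate)
```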