KEYWORDS: Signal attenuation, Telecommunications, Visualization, Pattern recognition, Signal analysis, Detection and tracking algorithms, Image segmentation, Logic, Signal analyzers, Networks
The data-entropy quality-budget developed by the authors is used as an alternative to the conventional power budget. The traditional power budget approach cannot provide a full analysis of a system with different noise types, nor a measure of signal quality. The quality-budget addresses this issue by applying its dimensionless 'bit measure' to integrate the analysis of all types of losses. A data-entropy visualisation is produced for each set of points in a reference and test signal. This data-entropy signal is a measure of signal disorder and reflects the power loss and the types of signal degradation experienced by the test signal. To analyse the differences between two signals, an algorithm known as phase-coherent data-scatter (PCDS) is used to assess levels of attenuation, dispersion, jitter, etc. Practical analysis of telecommunications signals using the new multiple-centroid (MC) PCDS is presented here for the first time. MC-PCDS is then used to analyse differences between sets of data-entropy signals and digital signals. The theory behind MC data-scatter is discussed and its advantages for the quantification of signal degradations are assessed. Finally, brief consideration is given to the use of pattern recognition algorithms to measure optical signal-degrading factors.
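As a rough illustration of the 'bit measure' idea (a sketch under simple assumptions, not the authors' quality-budget implementation), the disorder of a sampled signal can be expressed in bits via the Shannon entropy of its amplitude histogram; the function and signals below are illustrative only.

    # Illustrative sketch (not the authors' implementation): a dimensionless
    # "bit measure" for a digitised signal, from the Shannon entropy of its
    # amplitude histogram. Comparing the value for a reference and a test
    # signal gives one way to express degradation in bits.
    import numpy as np

    def signal_entropy_bits(samples, n_bins=256):
        """Shannon entropy (in bits) of the amplitude distribution of a signal."""
        hist, _ = np.histogram(samples, bins=n_bins, density=False)
        p = hist / hist.sum()
        p = p[p > 0]                      # ignore empty bins
        return -np.sum(p * np.log2(p))

    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 4096)
    reference = np.sign(np.sin(2 * np.pi * 50 * t))          # clean digital signal
    test = reference + 0.2 * rng.standard_normal(t.size)     # noisy/attenuated copy

    print(signal_entropy_bits(reference))  # low disorder
    print(signal_entropy_bits(test))       # higher disorder: entropy rises with noise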
A novel method for the measurement of ultra-high absorbance liquids has been devised and details are given of a new ultra-absorbance instrument (UAI) developed specifically for these thin liquid film measurements. The instrument, constructed specifically for monitoring and measuring sunscreen products, has been tested using locally produced sunscreen products. This new approach has been made possible by the development of very accurate liquid micro-dispensers, and details are given of the novel procedure used to carry out these measurements. A detailed description of the apparatus construction is given, with photographs of the apparatus. The work described is largely based on research and quality control measurements of Parasol suncare products. Results on the reproducibility of measurements taken with the UAI for a commercial range of factor 20 sunscreen liquid are given and have been used to validate the performance of the instrument. It is believed that the absorbance values measured here are perhaps the highest ever reported. In addition, the photostability of this product has been monitored in ageing tests. Finally, some studies have been done on two other commercially available factor 20 products; these show that the other products are significantly worse with regard to both ageing stability and burn protection.
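For context on why a thin liquid film makes ultra-high absorbance measurable (a textbook Beer-Lambert sketch with hypothetical values, not data from the UAI): absorbance scales linearly with path length, so a micro-dispensed micrometre-scale film brings an otherwise opaque liquid back into a usable transmission range.

    # Minimal Beer-Lambert sketch (values illustrative, not from the paper):
    # absorbance A = epsilon * c * l scales with path length l, so a thin film
    # with a micrometre path makes a strongly absorbing liquid measurable.
    def absorbance(epsilon_l_per_mol_cm, conc_mol_per_l, path_cm):
        return epsilon_l_per_mol_cm * conc_mol_per_l * path_cm

    def transmission(A):
        return 10 ** (-A)

    # Hypothetical strongly absorbing liquid
    eps, c = 1.0e4, 0.5                    # L mol^-1 cm^-1, mol L^-1
    for path_um in (1000.0, 10.0, 1.0):    # 1 mm cuvette vs. thin films
        A = absorbance(eps, c, path_um * 1e-4)
        print(f"path {path_um:7.1f} um  A = {A:8.1f}  T = {transmission(A):.3e}")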
The development of optical engineering, photonics, optical telecommunications and networking courses in the Carlow Institute of Technology is briefly outlined in its national and local historical context. The experience of running various pioneering technician and degree courses in Carlow using assessment procedures designed to test specified learning outcomes is described. A critical review of the use of these educational methods for optical engineering is then made, based on one author's postgraduate experience of studying at both of the Glasgow universities. A comparative study is presented of the Scottish and Irish experience, made from the point of view of best practice in educational methodology as it applies specifically to teaching the high-level skills required for engineering design in optical engineering programmes. Details on technology teacher training are presented and some discussion is given of relevant educational initiatives for this area. Possibly the first ever quantitative taxonomic analysis of the 2003-4 examination papers from a leading Institute of Technology in Ireland is undertaken to provide an insight into the present practice of the lecturers and educational managers running this programme. This analysis reveals the coordinated teamwork involved in the course implementation and identifies the various roles taken by individual courses in the context of balancing the whole educational programme appropriately. Critical observations on some of the technician, technologist and degree programmes should enable their delivery to be improved. The statistical analysis of results should also deliver improvements in student retention rates. The paper ends with observations on some useful lessons to be drawn from this wide-ranging review of world, Scottish and Irish experience.
The rapidly emerging field of MEMS (micro-electromechanical systems) has recently seen a proliferation of microscale devices and processes. Indeed, microsystems and nanotechnology have, from their origins in the integrated circuits industry, now become an extensive field of research encompassing everything from biosensors with near real-time diagnostics to power MEMS for portable devices, offering vastly improved performance-to-power-consumption ratios over their macroscale counterparts. The paper uses relevant contemporary issues arising from conceptual limitations in this burgeoning field to illustrate and highlight some critical analysis of the key educational issues involved in teaching in this vital area. The paper considers a number of political-strategic issues arising for Ireland directly out of the nano-biotechnology revolution. It also highlights a number of relevant concerns that can be addressed by educational initiatives. Theoretical and philosophical concepts regarding changes in thinking surrounding recent developments are also explored, with some specific discussion of primary-science research issues and of second- and third-level education. The paper ends with an attempt to identify the major opportunities for Ireland and highlights the changes that science and technology will bring for an Ireland that must, ready or not, face a new reality that is no respecter of any country's past successes.
A study was conducted to determine the alcohol concentration, refractive index and surface tension of binary solutions from multianalyser tensiotrace data. Characteristic vector analysis of multivariate response data has been successfully applied to a variety of optical tensiotraces to explore the quantitative capabilities of the multianalyser tensiograph. Singular value decomposition was used to determine the key vector, i.e. the characteristics of the required signal as it affects the data. This vector is then optimised using the known, established data to estimate the values of the unknown parameters. Through characteristic vector analysis the paper explores the relationship that exists between tensiotrace features and the physical properties of a liquid. The paper also indicates the possibility of future work on identifying wines. A second study was conducted in which five wine samples were run on the multianalyser and their tensiotraces acquired. This preliminary study demonstrates that wine archiving and fingerprinting is possible.
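A minimal sketch of the characteristic-vector idea, assuming synthetic tensiotrace-like data (the variable names, data and calibration step are illustrative, not the paper's processing): the first right singular vector of the mean-centred trace matrix serves as the key vector, and its scores are calibrated against the known property.

    import numpy as np

    rng = np.random.default_rng(1)
    n_traces, n_points = 20, 500
    known_property = rng.uniform(0.0, 1.0, n_traces)          # e.g. alcohol concentration

    # Synthetic tensiotraces: a base shape plus a property-dependent component + noise
    base = np.sin(np.linspace(0, np.pi, n_points))
    component = np.exp(-np.linspace(0, 5, n_points))
    traces = (base[None, :]
              + known_property[:, None] * component[None, :]
              + 0.01 * rng.standard_normal((n_traces, n_points)))

    # Characteristic vectors = right singular vectors of the mean-centred data
    centred = traces - traces.mean(axis=0)
    U, S, Vt = np.linalg.svd(centred, full_matrices=False)
    scores = centred @ Vt[0]                                   # projection on key vector

    # Calibrate the key-vector scores against the known property (least squares)
    slope, intercept = np.polyfit(scores, known_property, 1)
    predicted = slope * scores + intercept
    print("max calibration error:", np.abs(predicted - known_property).max())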
Darwin's seminal work 'Origin of Species' immediately attracted the 19th-century scholar John Tyndall. Darwin's book was, and is, a hypothetical and metaphysical treatise, but it has great explanatory power. The cryptically named 'X Club' (nine members, including Tyndall) was formed to defend Darwin's outrageous ideas. Tyndall's responsibilities within the X Club were to support Darwin's theory through experimental studies in solar physics and chemistry. The research was, of course, directed at understanding the physical basis of life on earth. These studies founded modern meteorological science, nephelometry and bacteriology (pace Pasteur). This essay details some of the historical background of Tyndall's work in natural philosophy, allowing the value of Tyndall's work to be assessed more objectively. It also evaluates the respective contributions of Darwin and Tyndall to the founding of this different way of looking at the world. The work of Tyndall at the 1868 Norwich 'British Association for the Advancement of Science' (BAAS) meeting and at the later, internationally explosive, 1874 Belfast BAAS meeting is examined in the light of his research. Some amplification of Tyndall's works, both philosophically and historically, is attempted.
This study into the SPLITS (Sampling Partitioning (Liquid) Irregularities in Turbid Solutions) effect is part of the Aqua-STEW (Water Quality Surveillance Techniques for Early Warning by Tensiographic Sensors) 5th Framework European project. The paper reports new experimental measurements of this effect using the fibre drop analyser. Capillary hydrodynamic fractionation has previously been observed, but the drop analyser technique provides improved experimental characterisation of the effect. A description of the optical tensiograph system used is given, with details of the five strategies designed to minimize this SPLITS effect. It has been observed that the sampling of the solution by a stepper-pump to the instrument drophead leads to irregular concentrations in the delivery tubing and thence in the drophead. The measured tensiotrace variations in successive microlitre samples delivered from the capillary have provided an insightful experimental approach for the study of this important effect. It has been found that the best approach for the minimization of these unwanted concentration irregularities is high-speed aspiration. The paper ends with a discussion of the general relevance of this SPLITS effect to other chemical techniques that must sample turbid solutions and analyses the specific issues posed for on-line water monitoring systems.
The importance of sensitive monitoring of changes in Raman spectra, in particular for microelectronic applications, is discussed here. We explore the practicality of using a data-scattering method to analyse Raman spectra, and of establishing the dependence of the observed changes in the spectral function characteristics on data-scatter parameters such as scatter closeness and scatter radii, using the "Trace Miner" software. In addition to the analysis performed on model data, analysis of experimental Raman data is also discussed. The sensitivity of the approach is assessed.
KEYWORDS: Signal attenuation, Signal to noise ratio, Telecommunications, Oscilloscopes, Interference (communication), Information theory, Signal analysis, Quantization, Optoelectronics, Data acquisition
This paper introduces for the first time a numerical example of the data-entropy 'quality-budget' method. The paper builds on an earlier theoretical investigation into the application of this information theory approach to opto-electronic system engineering. Currently the most widely used way of analysing such a system is the power budget. This established method cannot, however, integrate noise of different generic types, nor can it provide a measure of signal quality. The data-entropy budget, first introduced by McMillan and Reidel, is on the other hand able to handle diverse forms of noise. This is achieved by applying the dimensionless 'bit measure' in a quality-budget to integrate the analysis of all types of losses. This new approach therefore facilitates the assessment of both signal quality and power issues in a unified way. The software implementation of data-entropy has been utilised for testing on a fiber optic network. The results of various new quantitative data-entropy measures on the digital system are given and their utility discussed. A new data mining technique known as data-scatter, also introduced by McMillan and Reidel, provides a useful visualisation of the relationships between data sets and is discussed. The paper ends by giving some perspective on future work in which the data-entropy technique, providing the objective difference measure on the signals, and the data-scatter technique, providing qualitative information on the signals, are integrated for optical communication applications.
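A minimal sketch of how such a quality-budget could be tabulated, assuming hypothetical stage names and bit losses (this is not the paper's numerical example): each degradation source is expressed as a loss in bits, and the dimensionless losses are simply summed, something a dB-based power budget cannot do across qualitatively different noise types.

    # Illustrative data-entropy quality-budget: stage names and bit losses are
    # hypothetical, chosen only to show the bookkeeping.
    source_bits = 12.0                       # information content at the transmitter
    losses_bits = {
        "fibre attenuation": 1.2,
        "chromatic dispersion": 0.8,
        "receiver thermal noise": 0.9,
        "timing jitter": 0.4,
    }
    delivered_bits = source_bits - sum(losses_bits.values())
    for name, loss in losses_bits.items():
        print(f"{name:25s} -{loss:.1f} bits")
    print(f"{'delivered information':25s} {delivered_bits:.1f} bits")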
An extensive experimental study into the relationships between tensiotrace features and the surface tension of alcohols and bifunctional liquids has produced a series of empirical relationships. These 'inside the rainbow' studies of pendant drops are known as optical tensiography. The empirical relationships discovered will enable, for a restricted range of liquids, the experimental measurement of surface tension without the correction factors that have been used since the development of the drop volume/weight method over a century ago. This approach offers potentially important applications in surface science, and it is also suggested how these new relationships will be tested using theoretical models developed by the authors in ongoing work. This paper provides the first experimental investigation into the commencement of the tensiotrace, the position at which optical coupling begins, which reveals new measurement possibilities.
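For context on the correction factors mentioned above (the textbook drop-weight relation, not the new empirical relationships of this paper), Tate's law with the Harkins-Brown correction is sketched below with hypothetical numbers.

    # Textbook drop-weight relation: Tate's law m*g = 2*pi*r*gamma holds only
    # with an empirical Harkins-Brown correction factor f. Values below are
    # hypothetical (a roughly water-like drop from a 1.5 mm radius tip).
    import math

    def surface_tension_drop_weight(drop_mass_kg, tip_radius_m, f_correction):
        """gamma = m*g / (2*pi*r*f); f is typically ~0.6-0.9 depending on r/V**(1/3)."""
        g = 9.81
        return drop_mass_kg * g / (2 * math.pi * tip_radius_m * f_correction)

    print(surface_tension_drop_weight(50e-6, 1.5e-3, 0.72), "N/m")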
For the first time the term data diffraction is introduced, with examples drawn from the algorithm known as phase coherent data-scatter (PCDS), which produces identifiable visual patterns for different types of signal degradation in optical telecommunications. The main signal degradation factors that affect the performance of optical fibers include attenuation, rise-times and dispersion. The theory behind data-scatter is introduced, including comprehensive explanations of its theoretical conceptual components such as centroids, the exchange operation, coherence, closeness and projection radius. The various issues of assessing the quality of digital signals are outlined using a simulation study. The authors have extended the functionality of data-scatter for the study of optical telecommunications issues, and this approach shows considerable promise. The utility of the data-entropy based 'quality budget' method for optoelectronic system engineering is revisited using an information theory based approach for optical telecommunications. Proposals for the implementation of pattern recognition algorithms to analyse the repeatable patterns within data-scatter are discussed. The paper concludes with brief considerations of the advantages of linking the new data-scatter and data-entropy approaches for performance quantification and assessment in digital fiber systems.
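A loose sketch of the centroid idea only, using synthetic signals (the exchange, coherence and projection-radius operations of the published PCDS algorithm are not reproduced here): even a simple centroid-subtracted scatter plot separates pure attenuation, which keeps the points on a straight line, from jitter, which spreads them.

    import numpy as np

    def centred_scatter(reference, test):
        """Return centroid-subtracted (x, y) pairs for the two signals."""
        return reference - reference.mean(), test - test.mean()

    def off_line_spread(reference, test):
        """RMS residual from the best straight line through the scatter pattern."""
        x, y = centred_scatter(reference, test)
        slope = (x @ y) / (x @ x)
        return np.sqrt(np.mean((y - slope * x) ** 2))

    t = np.linspace(0, 1, 1000)
    reference = np.sin(2 * np.pi * 5 * t)
    attenuated = 0.7 * reference                              # pure attenuation
    rng = np.random.default_rng(2)
    jittered = np.sin(2 * np.pi * 5 * (t + 0.01 * rng.standard_normal(t.size)))

    print("attenuation off-line spread:", off_line_spread(reference, attenuated))  # ~0
    print("jitter off-line spread:     ", off_line_spread(reference, jittered))    # clearly > 0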
The adsorption properties of polymers are of great importance for implant studies. A better understanding of these properties can lead to improved implant materials. In this study the surface energy of different polymers was derived from contact angle measurements taken using profile analysis tensiometry (PAT) of sessile drops of water. The contact angles were measured for advancing and receding water drops on polished polymer surfaces, and also on polymer surfaces modified by adsorbing protein to the surface prior to analysis of the sessile drop. The protein used was bovine serum albumin (BSA) and the surfaces were poly-methylmethacrylate (PMMA), poly-ether-ether-ketone (PEEK) and stainless steel. The polymer surfaces were also studied using atomic force microscopy (AFM). Images of the surfaces were taken in different states: rough, smooth and with albumin adsorbed. To make identification of the protein on the surface easier, anti-albumin antibodies tagged with 30 nm gold nanoparticles were adsorbed to the albumin on the surfaces; the gold nanoparticles made the imaging, and hence the identification of the protein, more straightforward. The results from this work show the differing hydrophobicities of polymer surfaces under different conditions and demonstrate a new nanotechnological method of protein identification.
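One standard relation used with such contact-angle data is sketched below (the Young-Dupre equation, a textbook formula rather than the specific PAT analysis of the paper); the contact angles shown are hypothetical, for illustration only.

    # Young-Dupre: W_adh = gamma_lv * (1 + cos(theta)) gives the work of adhesion
    # of water on a surface from the measured contact angle.
    import math

    GAMMA_WATER = 72.8e-3   # surface tension of water at ~20 C, N/m

    def work_of_adhesion(theta_deg, gamma_lv=GAMMA_WATER):
        return gamma_lv * (1.0 + math.cos(math.radians(theta_deg)))

    # Hypothetical advancing contact angles (degrees)
    for surface, theta in [("PMMA", 75.0), ("PEEK", 85.0), ("PMMA + BSA", 60.0)]:
        print(f"{surface:12s} theta={theta:5.1f} deg  W_adh={work_of_adhesion(theta)*1e3:.1f} mN/m")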
The paper investigates from the perspective of computer science the phase coherence theory (PCT) and phase coherent data-scatter (PCD-S). These techniques were originally developed for the area of optical tensiographic data mining and analysis but have a more general application in data mining. These developments have recently been augmented with the engineering of a software toolkit called TraceMiner. Although the toolkit was originally devised for tensiography, it was developed to perform as a generic data mining and analysis application with PCT, PCD-S and a range of other data mining algorithms implemented. To date the toolkit has been utilised in its main application area, tensiography, but has also been applied to UV-visible spectroscopy. This work presents a critical investigation of the general utility of PCT, PCD-S and the toolkit for data mining and analysis. A new application of PCT and the TraceMiner software toolkit to Raman spectroscopy is presented, with discussion of the relevant measures and the information provided by the toolkit. This provides more insight into the generic potential of the techniques for data mining. The analysis performed on theoretical Raman data is augmented with a study of experimental Raman data. Raman spectroscopy is used for composition and fault-detection analysis of semiconductor surfaces. Finally, the utility of the PCT technique in comparison with traditional Raman spectroscopy methods is considered, together with some more general applications in the field of imaging and machine vision.
KEYWORDS: Data mining, Data modeling, Visualization, Data acquisition, Data processing, Software development, Mining, Statistical analysis, Mathematical modeling, Raman spectroscopy
Phase Coherent Data-scatter (PCD-S) was originally developed for the area of tensiographic data mining and analysis. This development has been augmented with the engineering of a software toolkit called TraceMiner, which integrates the technique with additional data mining and statistical tools for general use. This paper presents, for the first time, a theoretical treatment of data-scatter as a generic data mining tool, covering the data set descriptions, data transformations, measurands and data model visualisations possible with data-scatter. Data-diffraction resulting from data-scatter is also presented here for the first time. The use of the two approaches in a Hough technique to analyse the resulting data-diffraction patterns is discussed briefly in the context of applications of this new data-scatter approach.
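A generic Hough-line accumulator is sketched below to indicate the kind of analysis proposed for data-diffraction patterns; it is a textbook implementation run on synthetic points, not the TraceMiner code.

    import numpy as np

    def hough_lines(points, n_theta=180, n_rho=200):
        """Vote in (theta, rho) space; peaks correspond to collinear point runs."""
        thetas = np.linspace(0, np.pi, n_theta, endpoint=False)
        max_rho = np.hypot(points[:, 0], points[:, 1]).max()
        rho_bins = np.linspace(-max_rho, max_rho, n_rho)
        acc = np.zeros((n_theta, n_rho), dtype=int)
        for x, y in points:
            rhos = x * np.cos(thetas) + y * np.sin(thetas)
            idx = np.digitize(rhos, rho_bins) - 1
            acc[np.arange(n_theta), np.clip(idx, 0, n_rho - 1)] += 1
        return acc, thetas, rho_bins

    # Points lying (noisily) on a line should produce one dominant accumulator peak
    rng = np.random.default_rng(3)
    xs = np.linspace(0, 10, 50)
    pts = np.column_stack([xs, 2.0 * xs + 1.0 + 0.05 * rng.standard_normal(xs.size)])
    acc, thetas, rhos = hough_lines(pts)
    i, j = np.unravel_index(acc.argmax(), acc.shape)
    print("peak votes:", acc[i, j], "theta(rad):", round(thetas[i], 3))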
Scattered colorimetry, i.e., multi-angle and multi-wavelength absorption spectroscopy performed in the visible spectral range, was used to map three kinds of liquids: extra virgin olive oils, frying oils, and detergents in water. By multivariate processing of the spectral data, the liquids could be classified according to their intrinsic characteristics: geographic area of extra virgin olive oils, degradation of frying oils, and surfactant types and mixtures in water.
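A schematic of the multivariate processing step, assuming synthetic spectra and the common PCA-plus-LDA route (the actual preprocessing and classifier used in the study may differ):

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(4)
    n_per_class, n_channels = 30, 120                  # channels = angles x wavelengths
    class_means = rng.normal(0, 1, (3, n_channels))    # 3 liquid classes, synthetic
    X = np.vstack([m + 0.3 * rng.standard_normal((n_per_class, n_channels)) for m in class_means])
    y = np.repeat(np.arange(3), n_per_class)

    # Project onto principal components, then classify with a simple LDA model
    scores = PCA(n_components=5).fit_transform(X)
    clf = LinearDiscriminantAnalysis().fit(scores, y)
    print("training accuracy:", clf.score(scores, y))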
The paper critically assesses and illustrates the use of the data entropy budget method in both product and systems engineering, based on the experience of developing an optoelectronic instrument known as the tensiograph. The design of such a system, involving optoelectronic, electronic, thermal, mechanical, chemical and data-processing noise components, presents a difficult engineering problem arising from the complex of noise-spectrum contributions. This project perhaps provides an important case study for optical engineers because it was developed over a period of 15 years. The design history, recorded in the data entropy-time graph, clearly shows the step-wise improvements achieved by the various engineering efforts. The present 11-bit information content of the instrument, with an impressive signal-to-noise ratio exceeding 1000:1, was developed from a prototype with less than 3-bit resolution. The paper concludes with an assessment of the relevance of this method to optical engineering, in which a diverse number of technologies are frequently integrated in products and systems. Finally, the role of data entropy methods in third-level education is briefly considered, with very clear lessons drawn from the concrete example offered by this case study.
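As a quick consistency check on the quoted figures (a standard resolution relation, not a result from the paper), an n-bit information content corresponds to an amplitude signal-to-noise ratio of roughly 2**n:

    import math
    print(math.log2(1000))   # ~9.97 bits for SNR 1000:1
    print(2 ** 11)           # 11 bits corresponds to SNR ~2048:1, i.e. "exceeding 1000:1"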
This paper analyses the evolution of courses in the Institute of Technology Carlow (formerly the Regional Technical College) in physics, optics, photonics, instrumentation and optoelectronics from 1979. Notably, these course developments culminated in the first specialist optoelectronic Honours degree in the UK or Ireland, which ran with outstanding success for over a decade under the umbrella of the University of Essex. In the last year, perhaps the first specialist degree in Optical Engineering in Europe has been launched. All these developments in Irish optics education have been achieved against unresponsive national and institutional policies, framed by a severe reluctance to accept the need for technical manpower outside of the established disciplines. Other associated optical course innovations, the 'Computer Networking and Optical Communications' National Certificate/Diploma and subsequently the 'Networking' degree course, have developed in Carlow exclusively on the basis of the photonics diploma material and contain a large element of optical fiber telecommunications/photonics material. The exciting Carlow odyssey is, however, just beginning, as can be seen from the preparation of a further new degree in Optical Engineering Design. This course, it is hoped, will comprehensively address the issues of training manpower for optical device and system engineering.
The work which we report on here makes use of a new (patented) technique for measuring the tensile and viscosity properties of any liquid. One modality uses a laser-derived beam of light directed into a drop as it builds up on a drop-head, grows and eventually falls off through gravity. The light is reflected through the drop, and a trace is built up of its intensity over time. The trace has been found to have very good discrimination potential for various classes of liquid. Other sensing modalities can be used: multiple simultaneous optical and near-infrared wavelengths, ultraviolet, and ultrasound. In the studies reported on here, we use the ultrasound modality. Further background on this new technology for the fingerprinting of liquid content and composition can be found in McMillan et al. (1992, 1998, 2000).
The multianalyzer is a powerful amplitude-modulated fiber optic sensor which is perhaps quite typical of many sensor innovations in that it is a technology looking for an application. Consequently, a series of collaborations with the fruit juice, brewing, distilling, biotechnology and polymer industries was made with the objective of identifying potential applications of the multianalyzer. An assessment of these interactions is made for each of the industrial fields explored by giving, for each, just one positive result from the work. The results are then critically assessed. While these studies have illustrated the universal nature of the technology, in every case lessons of a general nature have been drawn. This experience in particular underlined the difficulty of gaining acceptance for a fiber-based technology in industrial process monitoring, against the backdrop of the conservative practice of industry with long-established instrumentation. The hard-won experience of this product development has shown the vital importance of technologists understanding the difference between the marketing concepts of features, benefits and advantages. Three categories of conclusions are drawn: the technical, the commercial, and finally conclusions drawn from generalizations of the project by the Kingston partners based on their own independent experience in sensor development involving industrial and medical collaborations.
A new multianalyser tensiograph approach to concentration measurements of pure protein solutions has been devised. Much work has been done over a long period using tensiometers and other surface analysis methods on proteins, enzymes and complex surface active molecules.
A preliminary investigation into the use of a multiwavelength fiber drop analyzer (FDA) for the measurement of viscosity, spectral absorbance, and refractive index is made with a view to obtaining conservative estimates of the instrumental capability of the FDA for these measurands. Some important new insights into drop vibrations are made from studies on the fiber drop traces (FDTs) of mechanically excited damped vibrations in drops of a set volume. A brief description of the feasibility measurements on the first application of the FDA to the diagnosis of disease in synovial fluid is given. Strong experimental evidence is reported for the existence of the surface-guided wave (SGW) peak of the fiber drop trace, and some new insights into the nature of the FDT are suggested based on a comparative study of the FDTs from a multiple-wavelength and a single-wavelength FDA. The earlier reported dependence of drop period on applied electric field is critically re-examined, a new interpretation of this effect is suggested, and a clarifying experimental study is given. Finally, a brief review of the projected capabilities of the FDA based on the work reported here is provided.
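A minimal sketch of how a damping constant might be extracted from such a mechanically excited drop vibration, assuming a synthetic exponentially damped sinusoid (this is not the FDA data or the paper's analysis; the link between damping and viscosity is the physical quantity of interest):

    import numpy as np
    from scipy.optimize import curve_fit

    def damped_sine(t, amp, gamma, freq, phase, offset):
        return amp * np.exp(-gamma * t) * np.sin(2 * np.pi * freq * t + phase) + offset

    t = np.linspace(0, 0.5, 2000)
    rng = np.random.default_rng(5)
    trace = damped_sine(t, 1.0, 8.0, 40.0, 0.3, 0.1) + 0.02 * rng.standard_normal(t.size)

    # Initial frequency guess from the FFT peak, then least-squares refinement
    freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])
    f0 = freqs[np.argmax(np.abs(np.fft.rfft(trace - trace.mean())))]
    popt, _ = curve_fit(damped_sine, t, trace, p0=[1.0, 5.0, f0, 0.0, 0.0])
    print("fitted damping constant gamma ~", round(popt[1], 2), "s^-1")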
A preliminary study has been made into the temporal and spectral characteristics of internally illuminated liquid drops using the Fiber Drop Analyzer (FDA), with the objective of using this instrument for the diagnosis of disease in synovial fluid based on measurements of viscosity, absorbance and other parameters. Two approaches to the measurement of viscosity are identified and described. This study describes for the first time the operation of a multiwavelength FDA. Spectral absorbance measurements of liquids containing 10 ppb rhodamine B are made and the sensitivity of the FDA is compared with standard spectrophotometric techniques. The variation in returned signal as a function of drop growth phase obtained from three water-based solutions is qualitatively investigated and the understanding of the measurement potential of the instrument system is discussed.