This PDF file contains the front matter associated with SPIE Proceedings Volume 8350, including the Title Page, Copyright information, Table of Contents, Introduction, and Conference Committee listing.
Traditional safety evaluation of urban rail transit operations has been limited to individual stations or lines rather than addressing the safety of the network as a whole. Considering the characteristics of urban rail transit network operation under the new situation, an urban rail transit network model was established on the basis of complex network theory, and a formalized description of the model was given. On this basis, a safety evaluation index system for urban rail transit network operation was constructed from the perspectives of passenger traffic, environment, and other aspects; it comprises hidden-hazard indexes, accident indexes, and a safety-economics index, and is intended to support overall safety evaluation.
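As a minimal sketch of the kind of complex-network model described above (the five-station topology here is hypothetical), the following computes one common network-level indicator, the average shortest-path length in hops:

```python
from collections import deque

def average_shortest_path(adj):
    """Mean shortest-path length (in hops) over all station pairs, via BFS."""
    nodes = list(adj)
    total, pairs = 0, 0
    for src in nodes:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        for dst in nodes:
            if dst != src and dst in dist:
                total += dist[dst]
                pairs += 1
    return total / pairs

# Hypothetical 5-station network: two lines crossing at transfer station C.
network = {
    "A": ["B"], "B": ["A", "C"], "C": ["B", "D", "E"],
    "D": ["C"], "E": ["C"],
}
```

Such topological indicators are the raw material from which network-level safety indexes can be built.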
To address the contradictions and difficulties of current Internet information access, this study proposes and implements a personalized Internet information recommendation system based on data mining and recommender-system techniques. The system is divided into an offline part and an online part. The offline part extracts transaction patterns from web-server log files using association-rule mining; the online part uses the mined rules to deliver intelligent personalized recommendation services. The system was tested experimentally, and the results confirmed its feasibility and validity.
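A minimal sketch of the offline association-rule mining step, under the assumption that the server log has already been grouped into per-user page-visit sessions (the session data and thresholds below are illustrative):

```python
from itertools import combinations

def mine_rules(sessions, min_support=0.4, min_confidence=0.6):
    """One-antecedent/one-consequent association rules from visit sessions.
    Returns (antecedent, consequent, confidence) triples."""
    n = len(sessions)
    counts, pair_counts = {}, {}
    for s in sessions:
        items = set(s)
        for a in items:
            counts[a] = counts.get(a, 0) + 1
        for a, b in combinations(sorted(items), 2):
            pair_counts[(a, b)] = pair_counts.get((a, b), 0) + 1
    rules = []
    for (a, b), c in pair_counts.items():
        if c / n < min_support:           # prune infrequent page pairs
            continue
        for x, y in ((a, b), (b, a)):     # both rule directions
            conf = c / counts[x]
            if conf >= min_confidence:
                rules.append((x, y, conf))
    return rules

# Hypothetical sessions extracted from a server log.
sessions = [["home", "news"], ["home", "news", "sport"],
            ["home", "sport"], ["news", "sport"]]
```

Online, a rule (x, y, conf) would trigger a recommendation of page y when the current session contains x.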
A multi-attribute decision model for book-procurement supplier selection is presented, based on several indicators of the bidding suppliers and on the joint decision-making of an evaluation committee. For each evaluator, an ideal solution and a negative ideal solution are defined, and the relative closeness of each supplier is computed per evaluator. The ideal and negative ideal solutions of the evaluation committee are then defined from the group closeness matrix, from which the final supplier evaluation results are calculated by the decision-making group. The model is illustrated with experimental data.
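The per-evaluator closeness computation described above follows the standard TOPSIS pattern; a minimal single-evaluator sketch (the supplier scores and weights are hypothetical, and all criteria are treated as benefit criteria):

```python
import math

def topsis(matrix, weights):
    """Relative closeness of each alternative (row) to the ideal solution."""
    cols = list(zip(*matrix))
    norms = [math.sqrt(sum(v * v for v in col)) for col in cols]
    r = [[v / n for v, n in zip(row, norms)] for row in matrix]   # normalize
    v = [[w * x for w, x in zip(weights, row)] for row in r]      # weight
    ideal = [max(col) for col in zip(*v)]
    neg = [min(col) for col in zip(*v)]
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)   # distance to ideal solution
        d_neg = math.dist(row, neg)     # distance to negative ideal solution
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Hypothetical scores of three suppliers on quality, delivery, service.
suppliers = [[7, 9, 9], [8, 7, 8], [9, 6, 8]]
scores = topsis(suppliers, weights=[0.5, 0.3, 0.2])
```

In the group setting, the per-evaluator score vectors form the closeness matrix to which TOPSIS is applied once more.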
The scheduling problem in distributed systems is known to be NP-complete, and methods based on heuristic or metaheuristic search have been proposed to obtain optimal and suboptimal solutions; task scheduling is a key factor in the performance of distributed systems. In this paper, an efficient method based on a memetic algorithm is developed to solve the distributed-system scheduling problem. To balance load efficiently, Artificial Bee Colony (ABC) is applied as the local search within the proposed memetic algorithm. The proposed method is compared with an existing memetic-based approach that uses Learning Automata as its local search, and the results demonstrate that the proposed method outperforms it in terms of communication cost.
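A minimal sketch of a memetic scheduler minimizing communication cost. For brevity, the paper's ABC local search is replaced here by a simple hill-climbing step, and the task/traffic data are hypothetical:

```python
import random

def comm_cost(assign, comm):
    """Total traffic between tasks placed on different nodes."""
    return sum(c for (i, j), c in comm.items() if assign[i] != assign[j])

def local_search(assign, comm, n_nodes):
    """Greedy reassignment of each task (stand-in for the ABC local search)."""
    best = list(assign)
    for i in range(len(assign)):
        for node in range(n_nodes):
            trial = list(best)
            trial[i] = node
            if comm_cost(trial, comm) < comm_cost(best, comm):
                best = trial
    return best

def memetic_schedule(n_tasks, n_nodes, comm, pop=10, gens=30, seed=0):
    rng = random.Random(seed)
    population = [[rng.randrange(n_nodes) for _ in range(n_tasks)]
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda a: comm_cost(a, comm))
        parents = population[: pop // 2]
        children = []
        for p in parents:
            child = list(p)
            child[rng.randrange(n_tasks)] = rng.randrange(n_nodes)  # mutation
            children.append(local_search(child, comm, n_nodes))     # memetic step
        population = parents + children
    return min(population, key=lambda a: comm_cost(a, comm))

# Hypothetical: 4 tasks, 2 nodes; tasks 0-1 and 2-3 communicate heavily.
traffic = {(0, 1): 10, (2, 3): 10, (1, 2): 1}
best = memetic_schedule(4, 2, traffic)
```

The memetic structure (genetic operators plus a local refinement of each child) is what distinguishes the approach from a plain genetic algorithm.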
With the development of peer-to-peer (P2P) technology, file sharing has become the hottest, fastest-growing application on the Internet. Although the various protocols can each be used on their own, our research shows that, given a proper model, most of the seemingly different protocols can be classified into the same framework. In this paper, we propose an improved Chord algorithm based on a binary tree for P2P networks. Extensive simulations of the proposed protocol show that the improved Chord reduces the average lookup path length without increasing the complexity of node joins and departures.
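For reference, the baseline Chord lookup that the paper improves on can be sketched as follows, using ideal finger tables over a small identifier circle (the ring contents are illustrative):

```python
def successor(ring, key, m):
    """First node at or after key, clockwise on the 2**m identifier circle."""
    space = 2 ** m
    return min(ring, key=lambda n: (n - key) % space)

def lookup(ring, start, key, m):
    """Greedy Chord lookup with ideal finger tables; returns nodes visited."""
    space = 2 ** m
    target = successor(ring, key, m)
    path, node = [start], start
    while node != target:
        # ideal fingers of `node`: successors of node + 2**i
        fingers = {successor(ring, (node + 2 ** i) % space, m) for i in range(m)}
        # furthest finger that does not pass the key (closest preceding node)
        ahead = [f for f in fingers
                 if 0 < (f - node) % space <= (key - node) % space]
        node = max(ahead, key=lambda f: (f - node) % space) if ahead else target
        path.append(node)
    return path
```

Path lengths measured over many random lookups on such a ring give the O(log N) baseline that the binary-tree variant shortens.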
The optical flow algorithm proposed by Horn and Schunck (OFCE-HS) in 1981 was the first, and remains one of the best-performing, techniques for motion estimation. Several researchers have attempted to implement OFCE-HS in real-time hardware; the architecture proposed by Martin et al., which uses integer arithmetic for all calculations, is one such attempt. That architecture, however, has a significant drawback: it requires two dividers, which decrease the speed of the system, increase resource usage, and introduce truncation errors in the least significant bits. To overcome this problem, a new hardware architecture for OFCE-HS is presented in this paper. By combining integer and fractional arithmetic, it is possible to reduce the number of dividers and improve performance. The goal of this work is a hardware architecture for OFCE-HS that increases system speed, lowers resource utilization, and achieves good precision and accuracy compared with previous works.
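The per-pixel Horn-Schunck update that such hardware implements can be sketched in software as follows; the division here is exactly the operation whose hardware cost (two dividers in the integer architecture) motivates the paper:

```python
def hs_update(Ix, Iy, It, u_avg, v_avg, alpha2=1.0):
    """One Horn-Schunck iteration at a single pixel: Ix, Iy, It are the image
    derivatives and (u_avg, v_avg) the neighbourhood-averaged flow.  The
    division by alpha2 + Ix**2 + Iy**2 is the costly hardware operation."""
    num = Ix * u_avg + Iy * v_avg + It
    den = alpha2 + Ix * Ix + Iy * Iy
    return u_avg - Ix * num / den, v_avg - Iy * num / den
```

In a fixed-point design, `num / den` is the term that must be re-expressed with shifts and multiplies to avoid a full divider.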
Knowledge sharing is carried out through various knowledge-sharing forums, which normally requires multiple logins through multiple browser instances. Here, a single multi-forum knowledge-sharing concept is introduced that requires only one login session, allowing the user to connect to multiple forums and view the data in a single browser window. A few optimization techniques based on a cloud computing architecture are also introduced to speed up access time.
Nowadays, the relational model faces the challenge of being applied to massively distributed databases and cloud databases, where it cannot easily be scaled out. The main reason is the lack of a proper data distribution unit and a uniform data distribution model. In this paper, a new data distribution model is proposed in which data multitrees, as semantic clusters of data, are taken as the distribution units. Schema multitrees and data multitrees are defined, and a method of designing the schema graph is proposed that ensures the data graph is a data multitree; three theorems prove the correctness of the method. Since relational databases can be viewed as data multitrees, semantically related data can easily be split apart or brought together with multitree operations, improving the scalability of the relational model. In addition, this data distribution model is transparent to programmers.
A mobile ad hoc network (MANET) is characterized by mobile nodes, multihop wireless connectivity, an infrastructure-less environment, and dynamic topology. A recent trend in ad hoc network routing is the reactive on-demand philosophy, in which routes are established only when required. Stable routing is a major concern in ad hoc routing, and security and power efficiency are the major concerns in this field. This paper is an effort to use security to achieve more reliable routing. The ad hoc environment is accessible to both legitimate network users and malicious attackers; the proposed scheme is intended to incorporate security aspects into existing protocols. The study will help make protocols more robust against attacks and so achieve stable routing.
In this paper, we propose a parallel chaos-based encryption scheme that takes advantage of dual-core processors. The chaos-based cryptosystem is generated combinatorially from the logistic map and the Fibonacci sequence, the latter being employed to convert the real-valued output of the logistic map into integer data. The parallel algorithm is designed with a master/slave communication model using the Message Passing Interface (MPI). Experimental results show that the chaotic cryptosystem possesses good statistical properties and that the parallel algorithm outperforms the serial version of the algorithm. It is suitable for encrypting and decrypting large volumes of sensitive data or multimedia.
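A minimal serial sketch of a logistic-map stream cipher. The paper's Fibonacci-based integer conversion and MPI parallelization are not reproduced; the plain scale-to-byte quantization used here is an illustrative assumption:

```python
def keystream(x0, n, r=3.99):
    """Byte keystream from the logistic map x <- r * x * (1 - x).
    (The paper converts chaotic values to integers via the Fibonacci
    sequence; a simple scale-to-byte quantization stands in here.)"""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)
    return out

def xor_crypt(data, x0):
    """Encrypt/decrypt by XOR with the chaotic keystream (self-inverse)."""
    return bytes(b ^ k for b, k in zip(data, keystream(x0, len(data))))

ciphertext = xor_crypt(b"sensitive data", 0.3141)
```

In the parallel version, the master would partition the data and each slave would generate its segment's keystream independently.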
Nowadays, the web has enabled an explosive growth of information sharing (there are currently over 4 billion pages covering most areas of human endeavor), so the web faces the new challenge of information overload. The challenge now before us is not only to help people locate relevant information precisely, but also to access and aggregate a variety of information from different sources automatically. Current web documents are in human-oriented formats: they are suitable for presentation, but machines cannot understand their meaning. To address this issue, Berners-Lee proposed the concept of the semantic web. With semantic web technology, web information can be understood and processed by machines, which provides new possibilities for automatic web information processing. A main problem of semantic web information retrieval is that, when the retrieval system lacks sufficient knowledge, it returns to the user a large number of meaningless results drawn from the huge amount of information available. In this paper, we present the architecture of an information retrieval system based on the semantic web. In addition, our system employs an inference engine to check whether a query should be posed to the keyword-based search engine or to the semantic search engine.
Localization techniques in wireless sensor networks fall into two groups: anchor-based and anchor-free. In anchor-based methods, the anchor nodes first distribute their local information through the network, from which the average distance between two nodes, i.e. the average length of a hop, is identified. Non-anchor nodes know the shortest path, in hops, to each anchor; knowing the average hop length, they determine their distance to each anchor and use these estimates to compute their location. In the clustered variant, the network nodes are first clustered: each anchor is a cluster head, and the cluster members localize themselves using information derived from their cluster head, the process starting with the nodes located in the region shared by two clusters. Although clustering improves the scalability of anchor-based algorithms, their precision and efficiency still depend on the number of anchor nodes, and requiring anchors under all conditions limits the applicability of these methods in wireless sensor networks.
Among the algorithms that need no anchors, the first invented a new method of building a local graph of the network that is applicable to computing the relative positions of nodes: each node first builds a graph centered on itself, then the general graph of the network is assembled, and each node adjusts its coordinates by an algorithm. Because of limitations in the trigonometric method used in this algorithm, the computed coordinates are unreliable and run into difficulties in many cases. Other anchor-free algorithms therefore try to use methods other than trigonometry for localization.
For instance, methods based on graph drawing or on mass-spring algorithms can be cited; these kinds of algorithms take much time and use a lot of energy. In order to improve the quality of the results and prevent error propagation, we define a secondary parameter called the computed location accuracy. This parameter indicates the accuracy of a computed location and takes a value between zero and one.
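The anchor-based scheme described in the first paragraph can be sketched as a DV-Hop-style distance estimate; the toy topology and anchor positions below are hypothetical:

```python
import math
from collections import deque

def hop_counts(adj, anchors):
    """BFS hop distance from every node to each anchor."""
    hops = {}
    for a in anchors:
        dist = {a: 0}
        q = deque([a])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        hops[a] = dist
    return hops

def dv_hop_distances(adj, anchor_pos):
    """Estimated node-to-anchor distances: hop count times average hop length,
    where the average hop length comes from known anchor-to-anchor distances."""
    anchors = list(anchor_pos)
    hops = hop_counts(adj, anchors)
    total_d = total_h = 0.0
    for i, a in enumerate(anchors):
        for b in anchors[i + 1:]:
            total_d += math.dist(anchor_pos[a], anchor_pos[b])
            total_h += hops[a][b]
    hop_len = total_d / total_h
    dists = {n: {a: hops[a][n] * hop_len for a in anchors} for n in adj}
    return dists, hop_len

# Toy line network: anchors A and B at the ends, two unknown nodes between.
adj = {"A": ["n1"], "n1": ["A", "n2"], "n2": ["n1", "B"], "B": ["n2"]}
dists, hop_len = dv_hop_distances(adj, {"A": (0.0, 0.0), "B": (3.0, 0.0)})
```

The final position would be obtained by trilateration from the per-anchor distance estimates; the accuracy parameter in [0, 1] could then weight how much a neighbor's computed position is trusted.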
The purpose of this paper is to classify the sole patterns of a 3D shoe model comprising scattered point-cloud data. Sole patterns can be divided into five categories based on the texture of each pattern. The point-cloud data is sliced into a number of layers, and the unordered data points in each layer are projected onto a viewing plane to obtain a 2D shoeprint, from which individual texture elements are segmented by region growing. Each segmented texture element is then classified as either a non-closed or a closed curve, by detecting whether there are point-cloud data in each external unit of the region and by looking for the points nearest to the region. Finally, the texture element is assigned to one of the five categories by analyzing its geometrical characteristics.
Previous fault analyses of RSA with the left-to-right algorithm were based on modifying the public modulus N, which is difficult to carry out in practice. In search of a more practical attack, and exploiting the fact that a microprocessor's multiplier is easily affected by voltage, faults can be injected into the multiplier during RSA signing by adjusting the supply voltage. This paper proposes a new fault analysis of RSA signatures based on multiplier errors, improving the feasibility of the attack, and extends the attack to RSA with the fixed-window algorithm. Finally, the complexity of the algorithm is analyzed, and its extensibility and feasibility are demonstrated in theory and by simulation experiments. The experimental results show that the new fault-analysis algorithm is more practical to carry out.
Software development effort is one of the most important metrics in software engineering. Since the accuracy of effort estimates affects the project manager's plans, numerous studies have sought to increase estimation accuracy in this field. Almost all previous publications in this area treat several project features as independent variables and development effort as the dependent one. The Constructive Cost Model (COCOMO) is the best-known algorithmic model for estimating software development effort. Although many researchers have tried to improve the performance of COCOMO using non-algorithmic methods, all of them have estimated development effort regardless of the project type. In this paper, the effect of considering the project type in estimation was investigated by means of neural networks. The obtained results were compared with the original COCOMO and with a type-agnostic neural network, and the comparisons showed that the software project type can affect the accuracy of estimates significantly.
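For reference, the Basic COCOMO model mentioned above, with Boehm's published coefficients; the spread of estimates across project types for the same project size illustrates why type matters:

```python
# Basic COCOMO (Boehm, 1981): effort in person-months = a * KLOC ** b,
# with coefficients depending on the project type.
COEFFS = {
    "organic":       (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (3.6, 1.20),
}

def cocomo_effort(kloc, project_type):
    a, b = COEFFS[project_type]
    return a * kloc ** b

# The same 50 KLOC project gets noticeably different estimates per type.
estimates = {t: round(cocomo_effort(50.0, t), 1) for t in COEFFS}
```

A type-aware estimator, whether algorithmic or neural, effectively learns a separate effort curve per project type rather than one global fit.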
Detecting and tracking space objects in video sequences is a challenging task of wide interest. In this paper, a comprehensive framework for detecting and tracking space objects is presented. Unlike the traditional linear structure of tracking after detection, this framework also allows detection after tracking. Moreover, the combination of the level-set and frame-subtraction algorithms in the tracking subsystem makes it possible to detect and track a space object throughout an entire video sequence. Experimental results on 15 videos generated by STK show robust tracking against both star and Earth backgrounds.
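The frame-subtraction component of such a tracking subsystem can be sketched as follows (grayscale frames as 2-D lists; the level-set component is not reproduced, and the frames are illustrative):

```python
def frame_diff(prev, curr, thresh=20):
    """Binary motion mask via frame subtraction on grayscale frames."""
    return [[1 if abs(a - b) > thresh else 0 for a, b in zip(rp, rc)]
            for rp, rc in zip(prev, curr)]

prev = [[10, 10, 10], [10, 10, 10]]
curr = [[10, 200, 10], [10, 10, 10]]  # a bright object appears at (0, 1)
mask = frame_diff(prev, curr)
```

The mask localizes candidate moving objects, which a level-set contour can then refine and track.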
To gain a competitive edge over one another, organizations resort to business intelligence: information available to the enterprise for making strategic decisions. The data warehouse, as the repository of the data, provides the back end for achieving business intelligence. The design of the data warehouse is therefore the key to extracting and obtaining the relevant information on which strategic decisions are based. The initial design focus was on conceptual models, but object-oriented multidimensional modelling has since emerged as the foundation of data warehouse design. Several proposals for object-oriented multidimensional modelling have been put forth, each incorporating some features but not all. This paper consolidates all the previously introduced features together with newly introduced ones, proposing a new model of the features to be incorporated when designing a data warehouse.
In order to improve the accuracy of the single-axial rotation INS (SRINS), the idea of level damping from the platform INS is introduced into the system, and the principle of the damping is presented. On the basis of an analysis of both inner and outer level damping, a mixed level damping is put forward. The results show that introducing the damping network into the system eliminates both the Schuler oscillation and the Foucault oscillation and greatly enhances the precision of the SRINS. At the same time, the mixed level damping not only reduces the effect of vehicle maneuvering on system precision but also avoids the limitation of requiring an accurate reference velocity.
The paper deals with structural topology optimization under a fuzzy constraint. The optimal topology of the structure is defined as a material distribution problem, with the weight of the structure as the objective, and multifrequency dynamic loading is considered. The optimal topology design has to eliminate the danger of resonance vibration; the uncertainty of the loading is expressed with the help of a fuzzy loading, and a special fuzzy constraint is created from the exciting frequencies. The presented approach is applicable in mechanical and civil engineering, and an example demonstrates the presented theory.
This paper explores how caching can improve the performance of web systems. It designs multi-layer caching strategies based on Seam and constructs a web caching system on four levels. This strategy improves web-system scalability and reduces the load on the system.
This paper deals with the performance of standard 32 kb/s ADPCM, measured by signal-to-noise ratio (SNR). The new contribution is a mathematical derivation of the SNR of asynchronous tandem ADPCM systems, given in Section 5. A further contribution is the study of this performance for QAM modem signals with different constellations. A computer simulation program was developed, and a number of simulation tests were carried out using a 9.6 kb/s QAM modem signal with four constellation types: rectangular, and the (5,11), (4,12), and (8,8) circular constellations. The results for asynchronous tandem ADPCM systems show that performance degrades as the number of ADPCM stages increases; the results also show that the circular constellations perform better than the rectangular one.
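The SNR measure used throughout can be sketched as follows, with a toy illustration (hypothetical additive noise standing in for ADPCM quantization) of why SNR falls as tandem stages accumulate:

```python
import math
import random

def snr_db(original, processed):
    """Signal-to-noise ratio in dB of a processed signal vs. the original."""
    sig = sum(x * x for x in original)
    noise = sum((x - y) ** 2 for x, y in zip(original, processed))
    return 10 * math.log10(sig / noise)

# Each tandem stage adds independent noise (Gaussian here, standing in for
# ADPCM quantization error), so the SNR after two stages is lower.
rng = random.Random(1)
signal = [math.sin(0.1 * n) for n in range(1000)]
stage1 = [x + rng.gauss(0, 0.01) for x in signal]
stage2 = [x + rng.gauss(0, 0.01) for x in stage1]
```

With independent per-stage errors the noise powers add, which is the qualitative trend the tandem derivation quantifies.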
The concentrated wind energy turbine is a new type of wind-driven generator set that utilizes thin wind energy after concentrating it. To handle the problems in the control system of the concentrated wind energy turbine, this article introduces a wind power testing platform based on dSPACE hardware-in-the-loop simulation, and the wind power control principle is researched and analyzed on this testing platform. Experimental results show that the testing platform can test not only the whole running process but also the fault-protection function.
Based on a standard ontology definition metamodel of Object Role Modeling (ORM), i.e. ORM-ODM, an abstract syntax of ORM 2.0 is presented, specified by means of a version of Extended BNF.
ORM (Object Role Modeling) has been used as an ontology modeling language for modeling domain ontologies. In order to publish domain ontologies modeled in ORM on the Semantic Web, ORM models need to be translated into OWL 2, the latest standard Web Ontology Language. Several equivalent transformation methods for ORM models are considered, and a series of mapping rules is presented.
Multiobjective linear programming (MOLP) is one of the most important models for decision-making experts, and it becomes more interesting when the parameters are represented by fuzzy numbers. In this paper, we present an approximate algorithm for solving fuzzy multiobjective linear programming (FMOLP) problems in which the coefficients of the objective functions and constraints are fuzzy. The algorithm converts the fuzzy coefficients into fixed (crisp) coefficients and solves the resulting MOLP problem by the maximin method. A detailed description and analysis of the algorithm are supplied, and an illustrative example is presented.
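The maximin step applied after defuzzification can be sketched on a discrete candidate set (the objectives and candidates below are illustrative; the paper solves a continuous MOLP, not an enumeration):

```python
def normalize(values):
    """Scale objective values to [0, 1] (benefit direction assumed)."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def maximin_choice(candidates, objectives):
    """Maximin selection: normalize each objective over the candidates and
    pick the candidate whose worst normalized objective is largest."""
    table = [normalize([f(c) for c in candidates]) for f in objectives]
    worst = [min(col) for col in zip(*table)]
    idx = max(range(len(candidates)), key=worst.__getitem__)
    return candidates[idx], worst[idx]

# Hypothetical: balance two conflicting linear objectives over a grid of x.
xs = [0.0, 0.25, 0.5, 0.75, 1.0]
f1 = lambda x: x          # maximize x
f2 = lambda x: 1.0 - x    # maximize 1 - x
best, score = maximin_choice(xs, [f1, f2])
```

The maximin compromise picks the point where neither objective is sacrificed disproportionately.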
This study explores the application of Particle Swarm Optimization (PSO) to the optimization of a cross-flow plate-fin heat exchanger, with minimization of the total annual cost as the target. Seven design parameters, namely heat-exchanger length on the hot and cold sides, fin height, fin frequency, fin thickness, fin-strip length, and the number of hot-side layers, are selected as optimization variables. A case study from the literature demonstrates the effectiveness of the proposed algorithm in achieving more accurate results.
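A minimal global-best PSO sketch. The sphere function stands in for the paper's total-annual-cost model, and the seven box-bounded variables mirror the seven design parameters in number only:

```python
import random

def pso(cost, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize cost(x) over box bounds with a standard global-best PSO."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [list(p) for p in pos]
    pcost = [cost(p) for p in pos]
    g = min(range(n_particles), key=pcost.__getitem__)
    gbest, gcost = list(pbest[g]), pcost[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            c = cost(pos[i])
            if c < pcost[i]:
                pbest[i], pcost[i] = list(pos[i]), c
                if c < gcost:
                    gbest, gcost = list(pos[i]), c
    return gbest, gcost

# Toy stand-in for the annual-cost function: a 7-variable sphere function.
best, val = pso(lambda x: sum(v * v for v in x), [(-1.0, 1.0)] * 7)
```

In the real problem, `cost` would evaluate the heat-exchanger annual-cost model at the seven design parameters, with bounds set by manufacturing constraints.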
The paper describes a possible design of a computational model of a tracked vehicle's track and a basic testing procedure for simulating the dynamic loading of the track; the proposed approach leads to an improvement of the vehicle's course stability. The computational model is built for the MSC ADAMS/AVT simulation system and consists of two basic parts: a geometric part and a contact-computation part. The aim of the simulation is to determine how changes of specific track design parameters influence the examined qualities of the track link and the course stability of the vehicle. The work quantifies the influence of changes in track preloading on the required torque of the driving sprocket. Further research possibilities and potential are also presented.
New technologies set the stage for mobile learning. In this paper, we explore a mobile teaching-learning
pattern and its advantages. We then model courses with Atom and the Atom Publishing Protocol. Based on the
pattern and modeling, we implement a mobile learning client with Apple technologies, enabling
anytime, anywhere learning. Finally, we discuss the application of our system.
Hanavan's fifteen-rigid-body human model was simplified into a six-rigid-body model, and a
six-degree-of-freedom Kane dynamic model was then set up. Using the body and muscle parameters, the
six-degree-of-freedom Kane formulation and its constraint conditions were applied to obtain the tumble
state of an elderly person whose feet stop suddenly. The initial rotation speed of each part of the body can be
calculated. The tumble movement of the elderly person was then simulated and the impact force from the ground surface was
obtained.
Node mobility and limited energy are the two main factors affecting link stability in mobile ad hoc networks. This paper proposes an improved routing protocol, PLS-AOMDV (Prediction of Link Stability-AOMDV), based on the AOMDV multi-path routing protocol. It periodically predicts link stability, taking both node mobility and energy consumption into consideration, so as to choose the most stable link for transmitting data. Simulation results show that PLS-AOMDV significantly increases the packet delivery rate and the lifetime of the network.
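How a link-stability score might combine predicted link lifetime with residual energy can be sketched like this; the function, its fields, and its weighting are hypothetical illustrations of the idea, not the PLS-AOMDV metric itself:

```python
import math

def link_stability(pos_a, vel_a, pos_b, vel_b, radio_range, energy_a, energy_b, alpha=0.5):
    """Hypothetical link-stability score combining predicted link lifetime
    (from relative node motion) with residual energy, as PLS-AOMDV weighs both."""
    # Relative position and velocity of node B with respect to node A.
    px, py = pos_b[0] - pos_a[0], pos_b[1] - pos_a[1]
    vx, vy = vel_b[0] - vel_a[0], vel_b[1] - vel_a[1]
    v2 = vx * vx + vy * vy
    if v2 == 0:
        lifetime = float("inf")  # no relative motion: link never breaks
    else:
        # Solve |p + v*t| = radio_range for the time t at which B leaves range.
        b = px * vx + py * vy
        c = px * px + py * py - radio_range ** 2
        disc = b * b - v2 * c
        lifetime = max((-b + math.sqrt(disc)) / v2, 0.0) if disc >= 0 else 0.0
    energy = min(energy_a, energy_b)                       # bottleneck residual energy
    life_term = 1.0 if math.isinf(lifetime) else lifetime / (1.0 + lifetime)
    return alpha * life_term + (1 - alpha) * energy        # both terms in [0, 1]

# A slow, well-charged pair scores higher than a fast, depleted one near range edge.
stable = link_stability((0, 0), (0, 0), (50, 0), (1, 0), 250, 0.9, 0.8)
fragile = link_stability((0, 0), (0, 0), (240, 0), (20, 0), 250, 0.2, 0.1)
```

A route's score would then be the minimum link score along the path, with the highest-scoring path chosen for data transmission.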
The rapid development of parallel computer systems has made parallel operating environments mature and
widely used in scientific computing and many research fields, so parallel databases have attracted growing
attention and become an important area of database research. Based on the characteristics of network-based
parallel workstation clusters and current trends in parallel computer systems, this paper analyzes the data skew
problem of parallel databases in such environments and proposes a balanced data distribution scheme that adapts
to dynamic data.
During real-time three-dimensional scene simulation, popular modeling software and real-time
rendering platforms are often incompatible. The common solution is to create the three-dimensional scene model with
modeling software and then convert it into a format supported by the rendering platform. Taking a digital campus scene
simulation as an example, this paper analyzes and solves problems such as surface loss, texture distortion and loss, and
model flicker that occur during conversion from 3ds Max to MultiGen Creator. Besides, it proposes an optimization
strategy for the converted model. The results show that this strategy solves the various problems arising in
conversion and can speed up the rendering of the model.
Based on the theory of collaborative self-directed study and the strengths of modern educational
technology, this study explores the application of websites to collaborative self-directed college English
learning. It introduces the characteristics and functions of a website developed to assist college
English teaching in China. It also points out the problems currently existing among teachers and
students, and puts forward some suggestions and strategies for improving the application of
the website.
Nowadays, practically used programs are often so complex and large-scale that they are not as easy to analyze and
debug as one might expect, and it is quite difficult to diagnose attacks and find vulnerabilities in such programs.
Dynamic program slicing has therefore become a popular and effective method for program comprehension and
debugging, since it greatly reduces the analysis scope and drops useless data that do not influence the final result.
Moreover, most existing dynamic slicing tools operate at the source-code level, but source code is often not available
in practice, so systems are needed that help users understand binary programs. In this paper, we present an approach
to diagnosing attacks using dynamic backward program slicing of binary executables, and provide a dynamic binary
slicing tool named DBS to analyze binary executables precisely and efficiently. It computes the set of instructions that
may have affected, or been affected by, a slicing criterion set at a certain location in the binary execution stream. The
tool can also organize the slicing results clearly and hierarchically using function call graphs and control flow graphs.
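The backward pass at the heart of such a slicer can be illustrated on a toy instruction trace; the def/use encoding below is a deliberate simplification, not the DBS tool's actual x86 analysis:

```python
def backward_slice(trace, criterion_index, criterion_vars):
    """Backward dynamic slice over a recorded execution trace.

    Each trace entry is (instr_id, defs, uses): the locations the instruction
    writes and reads.  This is an illustrative sketch, not the DBS tool itself.
    """
    live = set(criterion_vars)          # locations still relevant to the criterion
    slice_ids = []
    # Walk the trace backwards from the slicing criterion.
    for instr_id, defs, uses in reversed(trace[:criterion_index + 1]):
        if live & set(defs):            # this instruction defines a relevant value
            slice_ids.append(instr_id)
            live -= set(defs)           # the definition satisfies those uses...
            live |= set(uses)           # ...but its own inputs become relevant
    return list(reversed(slice_ids))

# Trace entries: (id, defs, uses).  Only instructions feeding eax at the end matter.
trace = [
    (0, ["ebx"], []),          # mov ebx, 1
    (1, ["ecx"], []),          # mov ecx, 7   (dead w.r.t. the criterion)
    (2, ["eax"], ["ebx"]),     # mov eax, ebx
    (3, ["eax"], ["eax"]),     # add eax, eax
]
result = backward_slice(trace, 3, ["eax"])
```

Instruction 1 is excluded because `ecx` never flows into `eax`; a real tool would additionally resolve memory locations and group the surviving instructions by function for the call-graph view.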
The growing proliferation of computer viruses has become a lethal threat and a research focus of network
information security. New viruses keep emerging, the number of viruses is growing, and virus classification is
increasingly complex. Virus naming cannot be unified because agencies capture samples at different times. Although
each agency maintains its own virus database, communication between them is lacking, virus information is
incomplete, or only a small number of samples are recorded. This paper introduces the current state of virus database
construction at home and abroad, analyzes how to standardize and completely describe virus characteristics, and then
presents a computer virus database design scheme offering information integrity, storage security, and manageability.
Accurate point matching is a crucial and challenging step in feature-based image registration, especially for
images with a monotonous background. In this paper, we propose a robust point matching algorithm for image
registration which integrates a cyclic string matching method with two decision criteria, i.e., the stability and accuracy
of the transformation error. In this algorithm, a filtering strategy is designed to eliminate dubious matches and obtain
exactly matched point sets. The performance of the proposed algorithm is evaluated by registering two typical image
pairs containing repetitive patterns. Compared with Random Sample Consensus (RANSAC) and Graph Transformation
Matching (GTM), the proposed algorithm obtains the highest precision and stability.
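One simple way to realize the "eliminate dubious matches by transformation error" idea is an iterative least-squares fit that drops the worst match; this sketch uses a 2D similarity transform and is only a stand-in for the paper's cyclic-string-matching algorithm:

```python
def fit_similarity(pairs):
    """Least-squares 2D similarity transform (x,y) -> (a*x - b*y + tx, b*x + a*y + ty),
    computed in closed form after centering both point sets."""
    n = len(pairs)
    mx = sum(p[0][0] for p in pairs) / n; my = sum(p[0][1] for p in pairs) / n
    mu = sum(p[1][0] for p in pairs) / n; mv = sum(p[1][1] for p in pairs) / n
    sxx = sxu = sxv = 0.0
    for (x, y), (u, v) in pairs:
        cx, cy, cu, cv = x - mx, y - my, u - mu, v - mv
        sxx += cx * cx + cy * cy
        sxu += cx * cu + cy * cv
        sxv += cx * cv - cy * cu
    a, b = sxu / sxx, sxv / sxx
    return a, b, mu - (a * mx - b * my), mv - (b * mx + a * my)

def residual(pair, t):
    """Transformation error of one correspondence under transform t."""
    (x, y), (u, v) = pair
    a, b, tx, ty = t
    du, dv = a * x - b * y + tx - u, b * x + a * y + ty - v
    return (du * du + dv * dv) ** 0.5

def filter_matches(pairs, tol=1.0, max_iter=10):
    """Iteratively drop the worst match until all residuals fall below tol."""
    pairs = list(pairs)
    for _ in range(max_iter):
        t = fit_similarity(pairs)
        worst = max(pairs, key=lambda p: residual(p, t))
        if residual(worst, t) <= tol:
            break
        pairs.remove(worst)
    return pairs

# Three true correspondences under a pure translation (+10, +5), plus one outlier.
matches = [((0, 0), (10, 5)), ((4, 0), (14, 5)), ((0, 3), (10, 8)), ((2, 2), (40, 40))]
kept = filter_matches(matches)
```

The stability criterion in the paper corresponds to watching how the fitted transform changes as matches are removed; here only the accuracy (residual) side is shown.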
To overcome the piecewise-constant artifacts of Markov Random Fields (MRF) with pairwise neighborhoods and the computational burden of traditional learning, this paper proposes a clustering-based learning method over a natural image database that requires no filters. With this method, we obtain the distribution law of the blocks extracted from natural images, and we build the image prior model according to the learned law. An application to image restoration illustrates its effectiveness through a comparison between the high-order MRF prior model and the pairwise MRF prior model.
Forming a seamless, smooth panoramic view from several images is a hot topic in computer vision, image processing, and computer graphics. According to the application conditions of the system, especially limited system resources, concrete integration technology solutions and an implementation algorithm are put forward, based on the M operator and the wavelet transform, together with feasible design considerations for the operation. At the same time, the stitching effect, the boundary conditions of the joins, and the basic factors of the classic algorithms are weighed as a compromise for the application field.
Large amounts of entity data are continuously published on web pages, so extracting these entities automatically for
further application is very significant. Rule-based entity extraction yields promising results, but it is
labor-intensive and hard to scale. This paper proposes a web entity extraction method based on entity attribute
classification, which avoids manual annotation of samples. First, web pages are segmented into blocks by the
Vision-based Page Segmentation (VIPS) algorithm, and a binary LibSVM classifier is trained to retrieve the candidate
blocks that contain entity content. Second, the candidate blocks are partitioned into candidate items, LibSVM
classifiers annotate the attributes of the items, and the annotation results are aggregated into an entity. Results show
that the proposed method performs well in extracting agricultural supply and demand entities from web pages.
To solve the problem of calibrated fiber-grating measurement of pressure, temperature, dip angle, and other important parameters, hollow terahertz fiber grating sensors with a high-strength dielectric-coated metallic structure offer a satisfactory solution. Preliminary theoretical analysis, simulation, and test results show that polyethylene, whose absorption is small in the terahertz band, is an ideal membrane material for terahertz hollow fiber. Engraving a phase-shifted fiber grating in a dielectric-coated metallic hollow fiber constitutes a kind of calibrated fiber grating sensor, and a differential structure can be used to overcome environmental influences. Coherent detection in the dielectric-coated metallic hollow fiber obtains high gain, and the frequency of the phase-shifted fiber grating is detected by an optical heterodyne method; the usable frequency range is 1012 kHz and the frequency resolution 1 kHz.
This paper proposes a simple, fast segmentation method for sports scene images. Much previous work has sought ways to reduce differing shades within smooth areas; here a novel pretreatment method is proposed to eliminate them. An internal filling mechanism is used to turn pixels enclosed by pixels of interest into pixels of interest. Tests on several sports scene images have confirmed the method.
In the information age of knowledge explosion and rapid development, how to quickly feed back useful
information that a user is interested in is the problem addressed in this article. Based on data mining, this paper
combines an association rule model and a classification model to recommend to target users the e-books that
neighboring users are interested in. The key technologies of e-book recommendation, the system's implementation
algorithms, and the implementation process are introduced, and experiments prove that the system can
help users quickly find the required e-books.
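A toy neighbor-based recommender conveys the idea of recommending e-books that similar users read; the Jaccard scoring here is an assumed simplification of the paper's association-rule and classification combination:

```python
def recommend(target, histories, top_n=2):
    """Recommend e-books from the reading histories of neighboring users.

    Neighborhood = Jaccard overlap of reading histories; candidate books are
    scored by the summed similarity of the neighbors who read them.
    """
    target_books = histories[target]
    scores = {}
    for user, books in histories.items():
        if user == target:
            continue
        overlap = len(target_books & books)
        if not overlap:
            continue                                    # not a neighbor at all
        sim = overlap / len(target_books | books)       # Jaccard similarity
        for book in books - target_books:               # only books unseen by target
            scores[book] = scores.get(book, 0.0) + sim
    ranked = sorted(scores.items(), key=lambda kv: (-kv[1], kv[0]))
    return [book for book, _ in ranked[:top_n]]

# Hypothetical reading histories: u4 shares nothing with u1, so contributes nothing.
histories = {
    "u1": {"python", "ml", "stats"},
    "u2": {"python", "ml", "deep-learning"},
    "u3": {"python", "stats", "databases"},
    "u4": {"cooking"},
}
picks = recommend("u1", histories)
```

An association-rule variant would instead mine frequent itemsets such as {python, ml} → {deep-learning} from the same histories and fire the matching rules for the target user.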
We present a general and effective projector calibration method using a ray-based generic model, which consists of
the rays projected from all pixel elements of the projector. To compute the parameters of the rays, we propose a
flexible 3D-coordinate calculation method for the projected calibration target. Since the ray-based generic model does
not rely on any assumption, our approach is applicable to arbitrary projection systems. The calibrated rays can be
applied to evaluate the projector's actual distortion model, reconstruct 3D points of the scene, and correct geometric
distortion of the projected image. Experiments are presented to verify the performance of the proposed technique.
Edge extraction is key in research on applying machine vision in the detection area. Classical edge detection algorithms have poor anti-noise performance; the traditional morphological edge detection algorithm has good anti-noise performance, but its edge localization is poor. In this paper, a multi-structure-element morphological edge detection algorithm is used for edge detection of bottle-mouth and bottle-body images, and it is compared with classical edge detection operators and the traditional morphological edge detection operator. The experimental results show that the gray-scale morphological edge detection algorithm is efficient, has strong anti-noise capability, and improves detection accuracy.
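The multi-structure-element idea, taking the morphological gradient under several directional structuring elements and fusing the responses, can be sketched in a few lines; the 3-pixel line elements and max-fusion below are illustrative choices, not the paper's exact operator:

```python
def morph(img, se, op):
    """Grayscale dilation (op=max) or erosion (op=min) with structuring-element
    offsets `se`; border pixels replicate the nearest valid neighbor."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            vals = [img[min(max(i + di, 0), h - 1)][min(max(j + dj, 0), w - 1)]
                    for di, dj in se]
            out[i][j] = op(vals)
    return out

def multi_se_edge(img):
    """Morphological gradient (dilation - erosion) under several structuring
    elements, fused by the per-pixel maximum so edges in every direction fire."""
    ses = [
        [(0, -1), (0, 0), (0, 1)],    # horizontal line
        [(-1, 0), (0, 0), (1, 0)],    # vertical line
        [(-1, -1), (0, 0), (1, 1)],   # 45-degree diagonal
        [(-1, 1), (0, 0), (1, -1)],   # 135-degree diagonal
    ]
    h, w = len(img), len(img[0])
    edge = [[0] * w for _ in range(h)]
    for se in ses:
        d, e = morph(img, se, max), morph(img, se, min)
        for i in range(h):
            for j in range(w):
                edge[i][j] = max(edge[i][j], d[i][j] - e[i][j])
    return edge

# A bright square on a dark background: the gradient fires on the boundary only.
img = [[255 if 1 <= i <= 3 and 1 <= j <= 3 else 0 for j in range(5)] for i in range(5)]
edges = multi_se_edge(img)
```

Averaging instead of max-fusing the per-element gradients is the usual way to trade edge strength for extra noise suppression.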
Co-design is a new trend in the social world which tries to capture different ideas in order to use the most
appropriate features for a system. In this paper, the co-design of two information system methodologies is
considered: rapid application development (RAD) and effective technical and human implementation of
computer-based systems (ETHICS). We consider the characteristics of these methodologies to see whether a
co-design or combination of them is possible for developing an information system. To this end, four aspects are
analyzed: social or technical approach, user participation and user involvement, job satisfaction, and overcoming
resistance to change. Finally, a case study using a quantitative method is analyzed to examine the possibility of
co-design along these factors. The paper concludes that RAD and ETHICS are suitable for co-design and offers
some suggestions for it.
Many feature descriptors are insensitive to geometric transformations such as rotation and scale
variation. However, most of them cannot effectively deal with blurred images, which is a key problem in many real
applications. In this paper, we propose a new feature descriptor that combines the SIFT descriptor with the combined
blur-, scale-, and rotation-invariant Legendre moment (CBRSL). The proposed method inherits the advantages of SIFT
and CBRSL, yielding invariance to scale, rotation, and blur degradation simultaneously. We also show how the new
descriptor better represents blur- and geometry-invariant features in image registration. The experimental results
validate the effectiveness of our method, which is superior to SIFT-based methods.
Laplacian-based matting methods are attracting much attention due to their elegant, high-quality closed-form
solution. In this paper, we develop an alternative Laplacian construction for the matting task using a local linear
learning model, and naturally derive its nonlinear extension by incorporating the Kernel Ridge Regression algorithm.
Our Laplacian matrix construction approaches are based on the assumption that the alpha matte of each pixel can be
reconstructed from its neighbors' alpha values in each of the overlapping windows. In this way, the induced Laplacians
better exploit the intrinsic neighborhood structure to constrain the propagation of foreground and background labels.
Experimental results demonstrate that the proposed approaches produce highly accurate matte values, and our
nonlinear method even outperforms other Laplacian-based matting methods on many test images.
Image processing algorithms and fuzzy logic are used to design a visual tracking controller for mobile robot
navigation. In this paper, a wheeled mobile robot is equipped with a camera for sensing its task space. The grabbed
environmental images are processed by image recognition to obtain the target's size and position, which are fed
through input membership functions into a fuzzy logic controller where fuzzy rules are used for inference. The
inference results are passed to the defuzzifier to obtain a physical control signal that controls the mobile robot's
movement. The outputs of the fuzzy logic controller are the velocity and direction of the mobile robot; the difference
between the two wheel velocities controls the robot's movement direction. The fuzzy logic controller outputs the
control commands that drive the mobile robot to a position 50 cm in front of the target location. The simulation results
verify that the proposed FLC is effective in navigating the mobile robot to track a moving target.
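The fuzzify-infer-defuzzify chain can be illustrated with a single-input steering rule base; the membership functions and rules below are toy assumptions, far smaller than the paper's full velocity-and-direction controller:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_steer(offset):
    """Map the target's horizontal image offset (-1 = far left .. +1 = far right)
    to a steering command via three rules and centroid defuzzification."""
    # Fuzzification: degree of membership in each input set.
    left  = tri(offset, -2.0, -1.0, 0.0)
    mid   = tri(offset, -1.0,  0.0, 1.0)
    right = tri(offset,  0.0,  1.0, 2.0)
    # Rules: target left -> steer left (-1); centered -> straight (0);
    # target right -> steer right (+1).  Defuzzify with the weighted centroid.
    num = left * -1.0 + mid * 0.0 + right * 1.0
    den = left + mid + right
    return num / den if den else 0.0

straight = fuzzy_steer(0.0)   # centered target: drive straight
veer = fuzzy_steer(0.5)       # target half-right: positive steering command
```

The steering command would then be turned into a wheel-velocity difference, with a second rule base on target size regulating forward speed toward the 50 cm standoff.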
Edge detection of sea-surface oil spill images is one of the key technologies for monitoring oil spills at sea. This
paper presents a new method to detect continuous and closed edges of oil slicks in infrared (IR) aerial images of the
sea. The method is composed of two stages: determination of edge points and edge linking. Non-maximal
suppression and a self-adaptive dynamic block threshold (SADBT) algorithm are applied to determine edge points.
Then an improved edge linking algorithm links discrete edge points into closed edge contours, according to a cost
function combining the Euclidean distance, intensity, and angle information of edge endpoints to improve the linking
decision. Using the proposed algorithm, we can obtain continuous and closed edges of oil slicks in IR aerial images,
thereby confirming the location and acreage of the oil spill. The experimental results show that the proposed method
improves the degree of automation of edge detection and effectively suppresses striping noise, intensity
inhomogeneity, and weak edge boundaries.
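A cost function of this shape, distance plus intensity plus angle terms, might look as follows; the weights and the endpoint encoding are assumptions for illustration, not the paper's tuned values:

```python
import math

def link_cost(end_a, end_b, w_dist=1.0, w_int=0.5, w_ang=0.5):
    """Cost of linking two edge endpoints, combining Euclidean distance,
    intensity difference, and direction mismatch.  Lower cost = better pair.
    Endpoint format (x, y, intensity, direction) is hypothetical."""
    (xa, ya, ia, ta), (xb, yb, ib, tb) = end_a, end_b
    dist = math.hypot(xb - xa, yb - ya)
    d_int = abs(ib - ia)                                  # intensity difference
    # Smallest signed angle between the two edge directions, folded to [0, pi].
    d_ang = abs((tb - ta + math.pi) % (2 * math.pi) - math.pi)
    return w_dist * dist + w_int * d_int + w_ang * d_ang

def best_link(endpoint, candidates):
    """Greedy edge linking: join an open endpoint to its cheapest candidate."""
    return min(candidates, key=lambda c: link_cost(endpoint, c))

# The nearby, aligned, similar-intensity candidate wins over the distant one.
tip = (0.0, 0.0, 120.0, 0.0)
near = (2.0, 0.0, 118.0, 0.1)
far = (30.0, 10.0, 60.0, 2.0)
choice = best_link(tip, [far, near])
```

Repeating this greedy step over all open endpoints, with a cutoff on the maximum acceptable cost, closes the contour gaps left after thresholding.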
An image may be partially blurred because of defocus, camera shake, or a moving object. In this paper, we introduce a
novel method to extract the blurred area automatically, consisting of two stages: coarse detection and fine extraction. In
the coarse detection, we propose a block-based blurred/sharp area detection algorithm which roughly divides the image
into blurred, sharp, and undefined blocks; both spatial gradient statistics and the frequency-domain power spectrum are
used as blur metrics. For the fine extraction, we introduce an improved lazy snapping which takes the blurred and sharp
blocks of the coarse detection as seeds and thus extracts the blurred area automatically. Experimental results prove the
efficiency of the proposed method.
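A block-based blurred/sharp/undefined labeling using only gradient statistics can be sketched like this; the thresholds are illustrative, and the paper additionally uses the frequency-domain power spectrum as a second metric:

```python
def classify_blocks(img, block=4, sharp_thr=400.0, blur_thr=100.0):
    """Label each block 'sharp', 'blurred', or 'undefined' by the variance of
    its horizontal gradients -- a stand-in for the paper's combined spatial
    gradient / power-spectrum metric.  Thresholds here are illustrative."""
    h, w = len(img), len(img[0])
    labels = {}
    for bi in range(0, h, block):
        for bj in range(0, w, block):
            # Horizontal first differences within the block.
            grads = [img[i][j + 1] - img[i][j]
                     for i in range(bi, min(bi + block, h))
                     for j in range(bj, min(bj + block, w) - 1)]
            mean = sum(grads) / len(grads)
            var = sum((g - mean) ** 2 for g in grads) / len(grads)
            labels[(bi, bj)] = ("sharp" if var >= sharp_thr
                                else "blurred" if var <= blur_thr
                                else "undefined")
    return labels

# Left half: strong alternating texture (sharp); right half: flat (blurred).
img = [[(255 if (i + j) % 2 else 0) if j < 4 else 128 for j in range(8)]
       for i in range(8)]
labels = classify_blocks(img)
```

The sharp and blurred blocks would then seed the foreground and background strokes of the lazy-snapping segmentation, with the undefined blocks left for it to resolve.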
Aiming at calibrating cameras on site, where the lighting conditions are hardly controlled and the quality of target
images declines as the angle between camera and target changes, an adaptive active target is designed and a camera
calibration approach based on it is proposed. The adaptive active target, in which LEDs are embedded, is flat and
provides active feature points, so the brightness of the feature points can be modified by adjusting the current, judged
against thresholds on image feature criteria. To extract image features accurately, the concept of subpixel-precise
thresholding is also proposed: the discrete representation of the digital image is converted to a continuous function by
bilinear interpolation, and sub-pixel contours are acquired as the intersection of the continuous function with an
appropriately selected threshold. From an analysis of the relationship between the image features and the brightness of
the target, the area ratio of convex hulls and the grey-value variance are adopted as the criteria. Experiments reveal
that the adaptive active target accommodates changing environmental illumination well, and that camera calibration
based on the adaptive active target achieves high accuracy and fits image targeting in various industrial sites.
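The 1-D core of subpixel-precise thresholding, locating where an interpolated intensity profile crosses the threshold, reduces to a linear interpolation between neighboring pixels; this is an illustrative reduction of the 2-D case, where the threshold plane intersects the bilinear surface:

```python
def subpixel_crossing(g0, g1, thr):
    """Fractional offset in [0, 1] between two adjacent pixels where the
    linearly interpolated intensity profile crosses the threshold `thr`.
    Assumes thr lies between g0 and g1, so a crossing exists."""
    return (thr - g0) / (g1 - g0)

# Intensity rises 40 -> 200 between neighboring pixels; threshold 120 is
# crossed exactly halfway between them.
t = subpixel_crossing(40.0, 200.0, 120.0)
```

Collecting such crossing points along every pixel edge and chaining them yields the sub-pixel contour used for the feature extraction.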
Detecting roadside curbs is a challenging research topic for stereo vision, as a curb is only 10-30 cm higher than the
road surface and the difference between curb and road surface is usually obscured by noise in the disparity map.
In this paper, a roadside curb detection algorithm integrating the advantages of stereo vision and mono vision is
proposed. First, rough results are detected from the disparity variation curve, and then sign filters are used to obtain
quite robust results. Finally, the curb lines are estimated using a weighted Hough transform. Experimental results show
that this algorithm can detect roadside curbs quickly and effectively.
In this paper, we developed a blowhole detection algorithm using texture analysis. We applied a Gabor filter to extract defect candidates and subsequently used texture information to classify defects and pseudo-defects. To increase performance, size filtering and an adaptive thresholding method were used. The proposed algorithm was tested on 343 images. The experimental results described in this paper show that the algorithm is effective and suitable for blowhole detection in steel slabs.
Speech recognition is becoming popular in current development on mobile devices, yet mobile devices have
limited computational power, memory size, and battery life, and speech recognition is in general a heavy process that
requires large amounts of sample data within each window. The Fast Fourier Transform (FFT) is the most popular
transform in speech recognition; moreover, FFT operates in the complex field with imaginary numbers. This paper
proposes an approach based on discrete orthonormal Tchebichef polynomials as a possible alternative to FFT: the
Discrete Tchebichef Transform (DTT) is utilized here instead. Preliminary experimental results show that speech
recognition using DTT yields a simpler and efficient transformation. The frequency formants obtained with FFT and
DTT have been compared, and the two produced relatively identical output in terms of basic vowel and consonant
recognition. DTT, whose coefficients are real numbers only, has the potential to provide simpler computation than
FFT for speech recognition.
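The real-valued character of DTT can be seen by constructing a discrete orthonormal polynomial basis and projecting a signal onto it; Gram-Schmidt on monomials is used here as a simple equivalent of the Tchebichef recurrence (up to sign), not the paper's implementation:

```python
def tchebichef_basis(N, order):
    """Discrete orthonormal polynomial basis on the points 0..N-1, built by
    Gram-Schmidt on monomials (matching the orthonormal Tchebichef basis up
    to sign).  A DTT of a length-N signal projects it onto these vectors."""
    basis = []
    for k in range(order + 1):
        v = [float(x) ** k for x in range(N)]    # start from the monomial x^k
        for u in basis:                          # orthogonalize against lower orders
            dot = sum(a * b for a, b in zip(v, u))
            v = [a - dot * b for a, b in zip(v, u)]
        norm = sum(a * a for a in v) ** 0.5
        basis.append([a / norm for a in v])
    return basis

def dtt(signal, basis):
    """Forward DTT: purely real coefficients, unlike the complex-valued FFT."""
    return [sum(a * b for a, b in zip(signal, u)) for u in basis]

N = 8
basis = tchebichef_basis(N, 3)
coeffs = dtt([1.0] * N, basis)   # a constant signal excites only the DC term
```

Every multiplication here is real, which is the computational advantage the abstract points to; a speech frame would replace the constant signal, and the coefficient magnitudes would stand in for the spectral envelope.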
An algorithm is proposed for traffic sign detection and identification based on color filtering, color segmentation and neural networks. Traffic signs in Thailand are classified into four color types: prohibitory signs (red or blue), general warning signs (yellow), and construction-area warning signs (amber). A color-filtering method is first used to detect traffic signs and classify them by type. Then color-segmentation methods adapted to each color type are used to extract inner features, e.g., arrows and bars. Finally, neural networks trained to recognize signs of each color type are used to identify any given traffic sign. Experiments show that the algorithm improves the accuracy of traffic-sign detection and recognition for the signs used in Thailand.
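The color-filtering stage can be sketched as a crude per-pixel rule; the RGB thresholds below are illustrative placeholders only, not the values used in the paper (which would be tuned, likely in a color space such as HSV):

```python
import numpy as np

# Illustrative mapping of the paper's four sign colors to sign types.
SIGN_TYPES = {
    "red":    "prohibitory",
    "blue":   "prohibitory",
    "yellow": "general warning",
    "amber":  "construction warning",
}

def dominant_color(rgb):
    """Classify an (h, w, 3) uint8 patch by hand-tuned RGB rules.

    Falls back to the first color when no mask fires; a real system
    would reject such patches as non-signs.
    """
    r, g, b = (rgb[..., i].astype(float) for i in range(3))
    masks = {
        "red":    (r > 150) & (g < 100) & (b < 100),
        "blue":   (b > 150) & (r < 100) & (g < 100),
        "yellow": (r > 150) & (g > 150) & (b < 100),
        "amber":  (r > 150) & (g > 100) & (g < 180) & (b < 80),
    }
    return max(masks, key=lambda k: masks[k].sum())

def classify_sign(rgb):
    """Map the detected dominant color to its sign type."""
    return SIGN_TYPES[dominant_color(rgb)]
```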
Support vector machine (SVM) is a machine-learning method based on statistical learning theory. It avoids several inherent disadvantages of neural networks, such as local minima during training, the difficulty of structure selection and slow convergence, and it offers strong nonlinear system identification and generalization ability on small samples. In this paper, we build a multi-class classifier based on a binary-tree model, with category rules chosen from experience. This method, however, can lead to sub-optimal classifiers. Our future work will focus on finding optimal category rules for the multi-class method; empirical risk analysis and cluster analysis will be considered further.
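A minimal sketch of the binary-tree multi-class idea follows. For self-containment it uses a nearest-centroid rule at each internal node in place of a trained SVM (the paper's node classifiers are SVMs), and splits the label set in half as a stand-in for the experience-based category rules:

```python
import numpy as np

class TreeNode:
    def __init__(self, labels):
        self.labels = labels
        self.left = self.right = None
        self.c_left = self.c_right = None

def build_tree(X, y, labels):
    """Recursively split the label set in two; each node separates the
    two groups (here with class-group centroids instead of an SVM)."""
    node = TreeNode(labels)
    if len(labels) == 1:
        return node
    mid = len(labels) // 2
    l_set, r_set = labels[:mid], labels[mid:]
    node.c_left = X[np.isin(y, l_set)].mean(axis=0)
    node.c_right = X[np.isin(y, r_set)].mean(axis=0)
    node.left = build_tree(X, y, l_set)
    node.right = build_tree(X, y, r_set)
    return node

def predict(node, x):
    """Walk the tree, taking the nearer side at each node."""
    while len(node.labels) > 1:
        d_l = np.linalg.norm(x - node.c_left)
        d_r = np.linalg.norm(x - node.c_right)
        node = node.left if d_l < d_r else node.right
    return node.labels[0]
```

The paper's observation that an experience-based split can be sub-optimal corresponds here to the arbitrary `labels[:mid]` grouping.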
In the last decade, a variety of robotic/intelligent wheelchairs have been proposed to meet the needs of an aging society. Their main research topics are autonomous functions, such as moving toward a goal while avoiding obstacles, and user-friendly interfaces. Although it is desirable for wheelchair users to go out alone, caregivers often accompany them. Therefore we have to consider not only autonomous functions and user interfaces but also how to reduce caregivers' load and support their activities in the communication aspect. From this point of view, we have proposed a robotic wheelchair that moves side by side with a caregiver, based on MATLAB processing. In this project we discuss a robotic wheelchair that follows a caregiver using a microcontroller, an ultrasonic sensor, a keypad and motor drivers. An image is captured using a camera interfaced with the DM6437 (DaVinci code processor). The captured images are processed with image-processing techniques, converted into voltage levels through a MAX232 level converter and passed serially to the microcontroller unit, while the ultrasonic sensor detects obstacles in front of the robot. A mode-selection switch provides automatic and manual control of the robot: in automatic mode the ultrasonic sensor is used to find obstacles, while in manual mode the keypad is used to operate the wheelchair. C code predefined in the microcontroller unit controls the robot connected to it. The robot's several motors are activated through the motor drivers, which are essentially switches that turn the motors on and off according to the control signals from the microcontroller unit.
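The two-mode logic described above can be sketched as follows; the obstacle threshold and keypad bindings are hypothetical, and the sketch is given in Python for readability rather than the microcontroller's C:

```python
# Hypothetical obstacle threshold (cm) and keypad bindings, for illustration.
OBSTACLE_CM = 30
KEYPAD = {"2": "forward", "8": "reverse", "4": "left", "6": "right", "5": "stop"}

def motor_command(mode, ultrasonic_cm=None, key=None):
    """Return the motor-driver command for the selected mode.

    Automatic mode: follow the caregiver unless the ultrasonic sensor
    reports an obstacle closer than the threshold.
    Manual mode: the keypad drives the chair directly; unknown keys stop it.
    """
    if mode == "auto":
        if ultrasonic_cm is not None and ultrasonic_cm < OBSTACLE_CM:
            return "stop"
        return "forward"
    if mode == "manual":
        return KEYPAD.get(key, "stop")
    raise ValueError("mode must be 'auto' or 'manual'")
```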
This paper describes a method of human guidance for the autonomous cruising of an indoor robot. A low-cost robot follows a person in a room and records the path for autonomous cruising using its monocular vision. A video-based object detection and tracking method is used to detect the target in the video received from the robot's camera. The validity of the human-guidance method is demonstrated by experiment.
2DLPP is an effective dimensionality-reduction method that extracts features directly from image matrices and can detect the intrinsic manifold structure of data by preserving the local information of the training data. We analyze the relation between 2DLPP and LPP and show that they are equivalent under certain conditions. Conventional 2DLPP works in the row direction of images; we propose an alternative 2DLPP that works in the column direction. By simultaneously considering the row and column directions, we develop two-directional 2DLPP, i.e. (2D)2LPP. The proposed method not only extracts features of lower dimension than 2DLPP, but also takes full advantage of the row and column structure information of images. Experimental results on two standard face databases demonstrate the effectiveness of the proposed method.
Detection and classification of defects on surface mount device printed circuit boards (SMD-PCB) is an important requirement in the electronics manufacturing process. This process, primarily performed by automatic optical inspection (AOI) systems, ensures the functionality and quality of manufactured products. In this paper, the pattern recognition algorithms proposed in the literature for the inspection of defects using AOI are reviewed. The review focuses on segmentation algorithms, choice of features and feature-extraction algorithms, as well as the types of classifier and their relative classification performance, and spans the period from 1990 to 2011. The results suggest that solder-joint defects are the most frequently investigated defect type, and that the trend is moving towards combining the results of more than one classifier to improve classification accuracy and robustness.
Phase-based image matching is effective for both iris and palmprint recognition tasks, so the approach should be useful for a multimodal biometric system with palmprint and iris recognition capabilities. This paper investigates the fusion of palmprint and iris biometrics at the image level. A new image-fusion algorithm named band-limited image product (BLIP), designed specifically for phase-based image matching, is proposed. Based on this, a new multi-biometric fusion scheme at the image level that combines BLIP with phase-based image matching is presented. The effective regions of the iris and palm images are first extracted, then fused into one small image using BLIP, and finally matched against the template using phase-based image matching to obtain a score. The experimental results show that the new scheme not only improves system accuracy, but also reduces the memory needed to store the template and the time consumed by matching.
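The phase-based matching step itself can be sketched as classic phase-only correlation (the BLIP fusion step is not reproduced here): the cross-power spectrum is normalized to unit magnitude so only phase remains, and a sharp inverse-FFT peak indicates a match, with the peak location giving the translation.

```python
import numpy as np

def phase_only_correlation(f, g):
    """Phase-based matching: keep only the phase of the cross-power
    spectrum, then inverse-transform; a sharp peak means a match."""
    F, G = np.fft.fft2(f), np.fft.fft2(g)
    cross = np.conj(F) * G
    cross /= np.abs(cross) + 1e-12   # discard magnitude, keep phase
    return np.real(np.fft.ifft2(cross))

def match(f, g):
    """Return (similarity score, translation of g relative to f)."""
    poc = phase_only_correlation(f, g)
    peak = np.unravel_index(np.argmax(poc), poc.shape)
    return poc[peak], peak
```

For an exact circular shift the peak height approaches 1.0; in practice the peak height serves as the matching score the abstract refers to.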
Genetic algorithms suffer from several difficult issues, such as premature convergence and the choice of control parameters. This paper combines several improvements into the simple genetic algorithm: an optimal-preservation (elitist) strategy, adaptively set crossover and mutation probabilities, and the idea of fitness scaling. The improved algorithm is implemented in a Matlab program, and the results demonstrate the correctness and practicability of the method.
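One common way to set the crossover and mutation probabilities adaptively is from each individual's fitness relative to the population, in the spirit of Srinivas and Patnaik; the constants and the exact interpolation below are illustrative, not the paper's values:

```python
def adaptive_rates(f, f_avg, f_max, pc_hi=0.9, pc_lo=0.6, pm_hi=0.1, pm_lo=0.01):
    """Adaptively set crossover/mutation probabilities from fitness.

    Above-average individuals get lower rates (protecting good solutions,
    which counters premature convergence), while below-average individuals
    keep the high rates to maintain diversity. Illustrative constants.
    """
    if f_max == f_avg:                 # degenerate population: force exploration
        return pc_hi, pm_hi
    if f >= f_avg:
        frac = (f_max - f) / (f_max - f_avg)   # 0 at the best, 1 at average
        return pc_lo + (pc_hi - pc_lo) * frac, pm_lo + (pm_hi - pm_lo) * frac
    return pc_hi, pm_hi
```

Elitism then simply copies the best individual of each generation unchanged into the next, which is the "optimal preservation" strategy named above.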
Salient-region detection and segmentation of an image are very important for many applications such as content-based image retrieval (CBIR), object recognition and content delivery. In practice, however, the input image may contain heavy noise. For a better result, a new method is proposed in this paper: first denoise the input image using a well-performing algorithm, and then detect and segment the salient regions of interest. We compare the new method with a current method; the experimental results demonstrate that our method performs well and achieves good salient-region detection and segmentation.
The usage of cell phones has increased enormously, and silence and security are now needed in many places. This can be achieved with a cell-phone jammer, which blocks all signals. This paper describes the design of an enhanced technique for jamming cell-phone signals. Our main objective is to concentrate on the specific frequency band that makes communication possible; by jamming this frequency we block the specific signals responsible for making a call. This method makes the jammer more precise and effective, precise enough to focus on a specific area and allow the programmer to define that area. A major advancement is that emergency services remain available, which is crucial in any calamity. These are intelligent devices, acting only after they receive a signal, and they consume less power than existing models. The technique has wide potential and can be adapted to many further uses.
Armies of NATO countries are involved in foreign operations, where there are great risks of injury during combat. Among other injuries, soldiers may suffer fracture or destruction of bones or parts of them. Research into biocompatible materials seeks solutions that successfully repair such harm.
This article deals with the evaluation methods typical of hydroxyapatite biocompatible coatings. This coating, with its poor mechanical properties, must be deposited on a base that is itself biocompatible.
To obtain perfect bonding with natural bone, the titanium implant surface must be covered by a bio-ceramic coating similar to human bone - a hydroxyapatite-like coating.
Edges are an important image feature and very useful in machine-vision applications. In view of the parallelism, logic operations and pipelining of Field Programmable Gate Arrays (FPGAs), this paper proposes an improved edge-detection algorithm based on the Canny operator for FPGA. A 3-way parallel median filter completes image preprocessing at high speed. The Second Harmonic of the Variable Parameters (SHOVP) calculates the gradient easily and flexibly. With the gradient direction quantized into 45-degree sectors, non-maximum suppression based on the quantized gradient direction can be completed using logic operations alone. The algorithm transforms complex data operations into multi-task operations and simplifies arithmetic into logic operations, improving computing feasibility, effectiveness and real-time performance on FPGA. The paper presents the results of the algorithm implemented on an FPGA.
Nowadays, personal identification and classification are both very important. To identify a person for security applications, physical or behavioral characteristics of individuals with high uniqueness can be analyzed, and biometrics has become the most common basis for personal identification; many types of biometric information are currently used. In this work the iris is considered because of its uniqueness and collectability. A recent problem for iris-recognition systems is the limited space available to store data in a variety of environments. This work proposes an iris-recognition system with a small feature vector, reducing the space complexity. In this approach, each iris is represented in the frequency domain and classified with a neural-network model. First, the Fast Fourier Transform (FFT) is used to compute the discrete Fourier coefficients of the iris data. Once the iris data has been transformed into a frequency-domain matrix, Singular Value Decomposition (SVD) reduces the complex matrix to a single vector. These vectors are the inputs to the neural networks in the classification step. The merit of our technique is that the feature vector is smaller than those of other techniques, with an acceptable level of accuracy compared with existing techniques.
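The FFT-then-SVD feature extraction can be sketched directly (the neural-network classifier is omitted): the 2-D spectrum of the iris matrix is reduced to its singular values, a short real vector whose length is only the smaller matrix dimension.

```python
import numpy as np

def iris_feature(iris_matrix):
    """Compact iris feature: 2-D DFT of the iris data, then keep only
    the singular values of the (complex) frequency-domain matrix."""
    spectrum = np.fft.fft2(iris_matrix)
    # Singular values are real and sorted in descending order; they form
    # the short feature vector fed to the classifier.
    return np.linalg.svd(spectrum, compute_uv=False)
```

For an h-by-w iris matrix the feature has only min(h, w) entries, which is the space saving the abstract claims.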
In wireless sensor networks, congestion occurs when the traffic rate is high, as happens when an event is detected in the network. Congestion causes packet loss, degrading network performance, so an effective congestion-control technique is needed. This paper focuses on congestion due to concurrent transmission. We propose an efficient protocol to detect and control congestion at the MAC layer. Congestion is detected by calculating a new metric called the congestion scale; when the congestion scale exceeds a threshold value, congestion is considered to have occurred and a congestion-notification signal is sent to all nodes. On receiving the notification, all nodes adjust their transmission rates to control congestion. We implement a Hop-by-Hop Rate Control Technique (HRCT) to control congestion while guaranteeing both high throughput and minimum delay. The technique is implemented in the NS-2 simulator, and simulation results demonstrate the effectiveness of the proposed algorithm.
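The detect-and-adjust loop can be sketched as follows; the congestion-scale formula and the backoff factor here are invented for illustration (the paper does not state its exact metric in this abstract):

```python
def congestion_scale(buffer_occupancy, buffer_size, arrival_rate, service_rate):
    """Hypothetical congestion-scale metric: an even mix of queue build-up
    and the arrival-to-service rate ratio."""
    return (0.5 * (buffer_occupancy / buffer_size)
            + 0.5 * (arrival_rate / max(service_rate, 1e-9)))

def adjust_rate(current_rate, scale, threshold=0.8, backoff=0.5):
    """Hop-by-hop rate control: when the scale crosses the threshold, the
    node is notified and multiplicatively backs off its sending rate."""
    if scale > threshold:
        return current_rate * backoff
    return current_rate
```

In the hop-by-hop scheme each node applies this adjustment on receiving a notification from its downstream neighbor, so the back-pressure propagates toward the sources.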
In today's environment, where all information is available in one click, data security is the main concern. Individual information that should remain hidden can be obtained with simple tricks. Medical information and income details need to be kept away from adversaries and so are stored in private tables. Some publicly released information contains zip code, sex and birth date; when this released information is linked with a private table, an adversary can recover confidential information about individuals or respondents, e.g. name and medical status. To protect respondents' identities, the concept of k-anonymity is used: each released record has at least (k-1) other records in the release whose values are indistinguishable over the fields that appear in external data. K-anonymity is achieved easily for a single sensitive attribute (e.g. name, salary or medical status), but it is quite difficult when multiple sensitive attributes are present. Generalization and suppression are used to achieve k-anonymity. This paper gives a formal introduction to k-anonymity and to techniques used with it, l-diversity and t-closeness, and covers the k-anonymity model, a comparative study of these concepts, and a newly proposed approach for multiple sensitive attributes.
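The k-anonymity test and one generalization step (suppressing trailing ZIP digits) can be sketched as follows; the attribute names are illustrative:

```python
from collections import Counter

def is_k_anonymous(rows, quasi_ids, k):
    """True if every combination of quasi-identifier values occurs
    at least k times in the released table."""
    counts = Counter(tuple(row[q] for q in quasi_ids) for row in rows)
    return all(c >= k for c in counts.values())

def generalize_zip(rows, digits=3):
    """One simple generalization step: suppress the trailing ZIP digits
    so that nearby records fall into the same equivalence class."""
    out = []
    for row in rows:
        r = dict(row)
        r["zip"] = r["zip"][:digits] + "*" * (len(r["zip"]) - digits)
        out.append(r)
    return out
```

A table that is not 2-anonymous over (zip, sex) can become so after generalization, which is exactly the trade of precision for privacy the abstract describes.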
Installation of a cross-linked polyethylene (XLPE) cable joint can introduce defects into the XLPE-silicone rubber interface, such as micro-cavities and micro-wires. These defects greatly decrease the interfacial breakdown strength and endanger the stability of the power system. However, the traditional method measures only the breakdown strength, which alone cannot provide detailed information for a clearer understanding of the dielectric performance and the tracking-failure mechanism. This paper investigated the effect of micro-cavities on tracking failure by analyzing the distribution characteristics of discharge light and carbonization. Interfaces with these defects were set up by pressing together a slice of XLPE and a slice of transparent silicone rubber. A 50 Hz AC voltage was applied across a pair of flat-round electrodes sandwiched at the interface, with an insulation distance of 5 mm, until tracking failure occurred. The evolution of both discharge light and carbonization at the interface, from discharge to failure, was recorded with a video recorder, and the channel width was then analyzed with an image-processing method. The results show that a micro-cavity at an XLPE-silicone rubber interface strengthens charge transport and easily leads to interfacial discharge and tracking failure. The distribution of discharge light and carbonization at the interface with a micro-wire confirms this.
On the basis of the traditional PID algorithm, this paper analyzes the improvement of the control algorithm of the PMAC (Programmable Multi-Axis Controller) with feedforward control, and presents a computational model of the control algorithm. Motor-debugging results are analyzed with the PMAC as the controller. The problems and the final graphical data that appear throughout the debugging process are analyzed in detail, showing that the PMAC-based servo-control system with feedforward control has better steady-state characteristics and dynamic performance. The control system is rebuilt with parameter-adaptive PID plus feedforward control for higher machining accuracy.
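The PID-plus-feedforward control law can be sketched in discrete form; the gains and sample time below are illustrative, not PMAC register values:

```python
class PIDFeedforward:
    """Discrete PID with velocity/acceleration feedforward, in the style
    of a PMAC servo loop (illustrative gains, not actual PMAC parameters)."""

    def __init__(self, kp, ki, kd, kvff=0.0, kaff=0.0, dt=0.001):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.kvff, self.kaff, self.dt = kvff, kaff, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured, vel_cmd=0.0, acc_cmd=0.0):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        # Feedforward acts on the commanded trajectory, not the error, so
        # drive is applied before any tracking error develops - the source
        # of the improved dynamic performance noted above.
        return (self.kp * err + self.ki * self.integral + self.kd * deriv
                + self.kvff * vel_cmd + self.kaff * acc_cmd)
```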
Key agreement protocols are designed to establish a session key between two or more entities over an insecure network; the session key is then used to assure confidentiality through encryption. Given the advantages of identity-based (ID-based) cryptography, many ID-based key agreement protocols have been proposed. However, most of them are based on the Weil pairing, which is computationally more expensive than the Tate pairing. In this paper, we propose a new ID-based key agreement protocol from the Tate pairing. Compared with previous protocols, the new protocol minimizes the computational cost without extra message exchanges. In addition, the proposed protocol provides known-key security, no key control, resistance to key-compromise impersonation and perfect forward secrecy.
Preprocessing #SAT instances can reduce their size considerably and decrease solving time. In this paper we investigate the use of hyper-binary resolution and equality reduction to preprocess #SAT instances. A preprocessing algorithm, PreprocessMC, is presented, which combines unit propagation, hyper-binary resolution and equality reduction. Experiments show that these techniques not only reduce the size of the #SAT formula, but also improve the ability of model counters to solve #SAT problems.
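Of the three techniques combined above, unit propagation is the simplest to sketch. Working on DIMACS-style clauses (lists of nonzero integers, negative meaning negated), it repeatedly assigns the literal of each unit clause and simplifies:

```python
def unit_propagate(clauses):
    """Repeatedly assign literals forced by unit clauses and simplify.

    Returns (assigned literal set, remaining clauses), or None if a
    conflict (complementary units or an emptied clause) is found.
    """
    clauses = [list(c) for c in clauses]
    assigned = set()
    while True:
        units = [c[0] for c in clauses if len(c) == 1 and c[0] not in assigned]
        if not units:
            return assigned, clauses
        for u in units:
            if -u in assigned:
                return None                  # complementary units: conflict
            assigned.add(u)
            new = []
            for c in clauses:
                if u in c:
                    continue                 # clause satisfied, drop it
                reduced = [l for l in c if l != -u]
                if not reduced:
                    return None              # clause emptied: conflict
                new.append(reduced)
            clauses = new
```

For #SAT (unlike plain SAT) the preprocessor must of course account for the eliminated variables when reporting the model count; that bookkeeping is omitted here.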
Based on the theory of adaptive modulation, a compressed format is introduced for voice and data transmission, and a novel adaptive dynamic capacity-allocation algorithm is presented. In the given transmission-system model, according to the channel state information (CSI) provided by channel estimation, the transmitter adaptively selects the modulation mode and shrinks the voice-symbol duration to improve data throughput. Simulation results show that the novel algorithm can effectively evaluate the percentage of data bits in one frame and improve data throughput.
For inconsistent decision tables, this paper puts forward a reverse-order data reduction algorithm. In contrast to traditional methods, this algorithm performs attribute-value reduction first and then attribute reduction, which avoids the negative influence of the inconsistent-individual set. The result of attribute-value reduction can be reused in the subsequent attribute reduction. The time complexity of the novel algorithm is [see manuscript].
C++ is an object-oriented programming language widely used in the software industry. The purpose of this paper is to discuss the concept and application of the copy constructor, a special constructor in C++. As background, constructors and destructors are introduced first. Several examples of copy constructors are presented to illustrate the concept and its use; shallow copy and deep copy are also presented. After discussing and analyzing all the examples, conclusions are drawn about how to define a copy constructor and how to use it properly.
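The paper's examples are in C++; as a language-neutral illustration of the same shallow-versus-deep distinction, Python's `copy` module behaves analogously to a default memberwise copy constructor versus a user-defined deep-copying one:

```python
import copy

class Matrix:
    """Object holding a nested (heap-like) resource."""
    def __init__(self, rows):
        self.rows = rows

a = Matrix([[1, 2], [3, 4]])

shallow = copy.copy(a)     # like the compiler-generated memberwise copy
deep = copy.deepcopy(a)    # like a user-defined deep copy constructor

a.rows[0][0] = 99
# The shallow copy shares the nested data with the original (so it sees
# the 99), while the deep copy owns an independent duplicate.
```

In C++ the same hazard motivates defining a copy constructor whenever a class owns a pointer: the default memberwise copy duplicates the pointer, not the pointed-to data.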
This paper first defines the elementary granule, granulation and related concepts; decision tables can then be converted to a granular graph and presented visually. Relevant results from graph theory are applied to the granular graph and its computation. The paper shows that applying granular graphs to describe data reduction is feasible and effective, and that the method has a simple and visual form. Compared with other analysis methods, its time complexity is decreased to O(n).
Semantic Web service discovery aims to find the best services among the many services that provide similar or identical functions, using semantic matching algorithms. This paper first proposes a new QoS model for describing Semantic Web services by adding a new significant QoS characteristic (the SU ratio). To raise search efficiency, we then give an improved algorithm for matching semantic Web services; the new algorithm focuses not only on the inclusion relations of ontology concepts in the taxonomic tree, as classic algorithms do, but also on binary relations. By comparing weighted semantic distances in the taxonomic tree, the similarity computation becomes more accurate and the precision and recall of the algorithm are enhanced. Finally, a preliminary case study demonstrates the approach's applicability and viability.
The concept of genetic algorithms and the current state of research are introduced in detail. The simple genetic algorithm and an improved algorithm are then described and applied to an instance of the TSP, where the advantage of genetic algorithms in solving NP-hard problems is clearly shown. In addition, starting from the partially matched crossover operator, the crossover method is extended so that crossover can be performed between random positions of two random individuals, unrestricted by chromosome position, to improve efficiency when solving the TSP. Finally, a nine-city TSP is solved using the improved genetic algorithm with the extended crossover method; the solution process is much more efficient and the optimal solution is found much faster.
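The baseline the paper extends, partially matched crossover (PMX) on permutation-coded tours, can be sketched as follows (standard PMX between two cut points, not the paper's extended variant):

```python
def pmx(parent1, parent2, i, j):
    """Partially matched crossover: copy parent1's segment [i:j), then
    place parent2's conflicting genes via the segment's mapping so the
    child remains a valid tour (a permutation)."""
    n = len(parent1)
    child = [None] * n
    child[i:j] = parent1[i:j]
    for k in range(i, j):
        gene = parent2[k]
        if gene in child[i:j]:
            continue                      # already placed by the segment
        pos = k
        while i <= pos < j:               # follow the mapping out of the segment
            pos = parent2.index(parent1[pos])
        child[pos] = gene
    for k in range(n):
        if child[k] is None:              # remaining genes come straight from parent2
            child[k] = parent2[k]
    return child
```

The extension described above amounts to drawing the cut points i and j at unrestricted random positions of two randomly chosen individuals, rather than at matched positions.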
With the rapid growth of motorization, public parking has become a distinct problem that influences urban development and people's quality of life. Unscientific urban-planning methods and the lagging of related studies make the parking problem more serious day by day. The purpose of this paper is to put forward several solutions to the public-parking problem from the standpoint of regulatory planning and related policy, including parking-management policies, policies tying parking construction to building construction, regional differentiation policy, parking industrial policy, self-parking policy and so on. These are significant for guiding construction in new urban areas and improving the parking situation.
Nowadays we encounter electronic devices everywhere, and the world has become entirely mobile with many new kinds of electronic equipment. The number of computing and telecommunications devices is increasing, and with it the focus on how to connect them to each other. The usual solution is to connect devices with cables, or to use infrared light for file transfer and synchronization, but infrared requires line of sight. To solve these problems, the new Wibree radio technology complements other local-connectivity technologies: it consumes only a fraction of the power of other radio technologies, enables smaller and less costly implementations, and is easy to integrate with Bluetooth solutions. Furthermore, it can be used for communication between several units, such as small radio LANs. This paper focuses on why this technology has received wide attention, along with its pros and cons with respect to other technologies.
This article describes the theoretical background of extenics and association rules, and combines extension theory with an association rule mining algorithm. A matter-element model of the database is constructed, reducing a complex database to an intuitive and simple one, making its expression clearer and reducing the computation required in the subsequent rule mining step. On the basis of the Apriori algorithm, significant association rules are mined from the extension-based database, using the association rule mining algorithm together with the correlation ideas of extension theory, and many valuable association rules are obtained. An example illustrates the effectiveness of this method.
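The Apriori step the abstract builds on can be sketched as a minimal frequent-itemset miner: count candidates of size k, keep those meeting the support threshold, and join survivors to form size-(k+1) candidates. The extension-theoretic matter-element reduction described above is not modeled here.

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Return frequent itemsets (as frozensets) mapped to their support counts."""
    transactions = [frozenset(t) for t in transactions]
    items = {i for t in transactions for i in t}
    current = {frozenset([i]) for i in items}   # candidate 1-itemsets
    frequent = {}
    k = 1
    while current:
        counts = {c: sum(1 for t in transactions if c <= t) for c in current}
        survivors = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(survivors)
        # join step: unions of survivors that form (k+1)-itemsets
        keys = list(survivors)
        current = {a | b for a, b in combinations(keys, 2) if len(a | b) == k + 1}
        k += 1
    return frequent
```

For example, over the transactions `{a,b}, {a,c}, {a,b,c}, {b,c}` with minimum support 2, the itemset `{a,b}` survives with support 2 while `{a,b,c}` (support 1) is pruned.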
As storage systems grow larger and more complex, traditional block-based file systems cannot satisfy such large workloads. More recent distributed file systems have adopted architectures based on object-based storage. This paper presents a framework for efficient storage management in a distributed storage system. On the object storage side, low-level storage tasks and data distribution must be managed; on the metadata server side, we manage how the metadata scales. Owing to their high space efficiency and fast query response, Bloom filters have been widely utilized in recent storage systems, so we also adopt a Bloom-filter-based approach to metadata management, and a semantic-based scheme is used to narrow the managed workload. In this paper, we leave aside data distribution on the object-based storage side.
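The Bloom filter property the paper relies on can be shown with a minimal standard implementation: k hash positions per key over an m-bit array, giving possible false positives but never false negatives, which is what makes it safe for a fast first-pass metadata lookup. The paper's exact parameters and hashing scheme are not given; this sketch derives the k positions from salted SHA-256 digests.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: k hash positions over an m-slot bit array.

    Membership tests may return false positives but never false negatives,
    which is why storage systems use them to skip expensive metadata lookups."""

    def __init__(self, m=1024, k=3):
        self.m, self.k = m, k
        self.bits = bytearray(m)      # one byte per bit, for clarity

    def _positions(self, item):
        # derive k positions from k salted digests of the key
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits[p] = 1

    def __contains__(self, item):
        return all(self.bits[p] for p in self._positions(item))
```

An added key is always found; a key that was never added is usually rejected, with a small false-positive probability that grows as the array fills.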
In the traditional data-model-driven design approach, the system analysis and design phases are often separated, so requirement information cannot be expressed explicitly. The approach also tends to lead developers toward process-oriented programming, leaving the code between modules or between layers disordered, which makes it hard to meet system scalability requirements. This paper proposes a software hierarchy based on a rich domain model following domain-driven design, named FHRDM; the WebWork + Spring + Hibernate (WSH) framework is then adopted. Domain-driven design aims to construct a domain model that meets both the demands of the field in which the software operates and the needs of software development. In this way, problems in Navigational Maritime System (NMS) development, such as large business volumes, difficult requirement elicitation, high development costs, and long development cycles, can be resolved successfully.
Using data on ICT from Chinese firms, we provide some insight into the link between ICT, productivity, and complementary investment. The results show that the contribution of ICT capital deepening rises when firms combine ICT use with complementary investments (human capital, innovation, and organizational change).
The Monte Carlo technique is used to evaluate the performance of four techniques for making decisions in the
presence of ambiguity. A modified probability approach (both weighted and unweighted) and weighted and unweighted
Dempster-Shafer are applied to compare the reliability of these methods in producing a correct single decision based on a
priori knowledge perturbed by expert or sensor inaccuracy. These methods are tested across multiple conditions which
differ in condition mass values and the relative accuracy of the expert or sensor. Probability and weighted probability
are demonstrated to work suitably, as expected, in cases where the bulk of the input (expert belief or sensor) data can be
assigned directly to a condition or in scenarios where the ambiguity is somewhat evenly distributed across conditions.
The Dempster-Shafer approach would outperform standard probability when significant likelihood is assigned to a
particular subset of conditions. Weighted Dempster-Shafer would also be expected to outperform standard and weighted
probability marginally when significant likelihood is assigned to a particular subset of conditions and input accuracy
varies significantly. However, it is demonstrated that by making minor changes to the probability algorithm, results
similar to those produced by Dempster-Shafer can be obtained. These results are considered in light of the
computational costs of Dempster-Shafer versus probability.
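The Dempster-Shafer combination step the study evaluates can be sketched directly: masses assigned to hypothesis sets are combined pairwise by intersection, conflicting (empty) intersections are discarded, and the remainder is renormalized. This is the standard unweighted rule; the study's weighting and modified-probability variants are not reproduced here.

```python
def combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    Masses are dicts mapping frozenset hypotheses to belief mass.
    Mass assigned to empty intersections is treated as conflict and
    the surviving masses are renormalised by 1 - conflict."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources fully disagree")
    return {s: m / (1.0 - conflict) for s, m in combined.items()}
```

For example, combining one source that splits 0.6/0.4 between {A} and {A,B} with a second that splits 0.5/0.5 between {A} and {B} discards 0.3 of conflicting mass and yields about 0.714 on {A} and 0.286 on {B}.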
The concept of an urban parking guidance information system is briefly introduced in this paper. The purpose of the system is set forth from different points of view, and its basic functions are analyzed. A framework for the system design and the features of its module design are then put forward, and the design framework is presented from both the management and technical levels. Finally, a parking guidance information system development approach is proposed in light of the current parking and traffic congestion situation in China's largest cities.
This paper systematically introduces the architecture of spreadsheets in Office Open XML (OOXML) and Open Document Format (ODF), as well as their similarities and differences. It compares not only the physical structure, logical structure, and the way some frequently used features are described, but also the descriptive capability of the two standards, using statistical data on their features. The comparison is beneficial for using and improving the spreadsheet formats of both OOXML and ODF in practical applications.
With the popularity of mobile devices and social network services, more and more people use their mobile phones to share information with others through social network services. This paper proposes a new location-based mobile information sharing service system, which allows geo-tagged information to be shared between users through a social network. The sharing service works in a personalized push mode, sending users only the location-based information of genuine interest to them. The implementation of a demonstration system, including the mobile information sharing service server and the mobile client, is presented in this paper.
The Mean-Shift algorithm is computationally cheap but easily falls into local maxima, while the heavy computation of particle filter tracking limits its real-time capacity. To address these problems, and to meet the needs of a real stereo vision measurement system, a tracking algorithm is proposed that combines Mean-Shift and particle filtering through an essentiality function. When the object is not occluded, Mean-Shift is used to track it; when the object is occluded, the particle filter takes over the subsequent tracking. The two algorithms alternate according to a defined threshold. The tracking algorithm has been applied in a real stereo vision measurement system, and experimental results indicate that it is highly efficient and therefore highly practicable.
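The switching logic can be sketched as a threshold rule on target similarity. The abstract does not define its essentiality function or threshold, so this sketch assumes the Bhattacharyya coefficient between normalized color histograms (the similarity measure Mean-Shift trackers conventionally use) and a hypothetical threshold of 0.7.

```python
import math

def bhattacharyya(p, q):
    """Similarity between two normalised histograms (1.0 = identical)."""
    return sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))

def choose_tracker(target_hist, candidate_hist, threshold=0.7):
    """Threshold-based alternation sketched from the paper: keep the cheap
    Mean-Shift tracker while the candidate region still resembles the
    target, and hand over to the particle filter once similarity drops,
    taken as the onset of occlusion. The Bhattacharyya coefficient as the
    'essentiality function' and the 0.7 threshold are assumptions; the
    paper does not publish its exact choices."""
    rho = bhattacharyya(target_hist, candidate_hist)
    return "mean_shift" if rho >= threshold else "particle_filter"
```

A matching histogram keeps Mean-Shift active; a fully dissimilar one (similarity 0, as under heavy occlusion) triggers the particle filter.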
The purpose of this paper is to set up a classification framework for online learning activities. Fifty-nine online learning activity cases were collected from seven disciplines. Open coding, axial coding, and selective coding were conducted according to Grounded Theory. After step-by-step validation, a classification framework consisting of six core categories (Argumentation, Resource Sharing, Collaboration, Inquiry, Evaluation, and Social Network) was established. Further study is needed to gain more insight into each category and to establish effective activity-based instruction models.
Our new idea makes classifying system technology requirements against value-oriented requirements easier and less ambiguous. The new concept is a platform for refining the value orientation of requirements in the iterative and incremental development of a real-time database system. The idea maintains, over the system's lifetime, a single platform that keeps real-time database requirements up to date and fully available, and it guarantees the reliability of the real-time database system. The value-orientation method is shown to better support the evolution of requirements and specifications. The new model defines the system structure better than existing iterative and incremental models, and the various relation attributes and the traceability of value requirements help to alleviate requirement changes.
The task of nonlinear dimensionality reduction is to find meaningful low-dimensional structures hidden in high-dimensional data. In this paper, an unsupervised algorithm for nonlinear dimensionality reduction called locally linear embedding based on local correlation (LC-LLE) is presented. The LC-LLE algorithm is motivated by the locally linear embedding (LLE) algorithm and by the correlation coefficient, which measures the correlation between two random vectors. A major advantage of LC-LLE is that it improves the dimensionality reduction process through more reasonable neighbor searching. Simulation studies demonstrate that LC-LLE gives better dimension reduction results than LLE, and experiments on face image data sets show the potential of LC-LLE in practical problems.
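The neighbor-search modification can be sketched as follows: instead of the Euclidean k-nearest neighbors of standard LLE, pick the k points whose Pearson correlation with the query point is highest. The weighting and embedding steps would then proceed as in standard LLE; how the paper mixes correlation with distance, if at all, is not specified, so this sketch uses correlation alone.

```python
import math

def correlation(x, y):
    """Pearson correlation coefficient between two equal-length vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def correlated_neighbors(data, i, k):
    """Indices of the k points most correlated with data[i]: the LC-LLE
    neighbour-search step, replacing the Euclidean k-NN of standard LLE."""
    scores = [(correlation(data[i], data[j]), j)
              for j in range(len(data)) if j != i]
    scores.sort(reverse=True)
    return [j for _, j in scores[:k]]
```

Note that a scaled copy of a point (e.g. `[2, 4, 6]` versus `[1, 2, 3]`) is its top correlated neighbor even though it is not the nearest in Euclidean distance, which is exactly the behavior that changes the neighborhood graph.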
The need for CMSs to create and edit e-commerce websites has increased with the growing importance of e-commerce. In this paper, the various features essential to an e-commerce CMS are explored. The aim of the paper was to find the best CMS solution for e-commerce, combining the best of both content management and store management. Accordingly, we conducted a study on three popular open source CMSs for e-commerce: VirtueMart from Joomla!, Ubercart from Drupal, and Magento. We took into account features such as hosting and installation, performance, support/community, content management, add-on modules, and functional features. We conclude with improvements that could be made in order to alleviate the observed problems.
The gear mechanism is the most widely used transmission mechanism; however, the traditional design method is complex and not accurate. Optimization design, applied to gear design, is an effective way to solve these problems. Among optimization software, MATLAB has obvious advantages in engineering and numerical calculation. Taking a single gear transmission as an example, the mathematical model of the gear transmission system is established on the basis of analyzing the objective function, selecting the design variables, and confirming the constraint conditions. The results show that optimization design through MATLAB is efficient, reliable, and simple.
Quality is a significant factor in the application and sustainable development of Open Online Course Resources. To
analyze course quality, 102 courses were randomly selected: 51 from the Open University's OpenLearn (OL) program in
the UK and 51 from the National Essential Online Courses (NEOC) program in China. These courses were evaluated by
applying the Online Course Perceived Quality Evaluation Model (OCPQEM) in order to compare NEOC and OL in
terms of Learning Resources and Learning Processes from the perspective of student users. Suggestions are proposed to
improve the learning resources, user interface, learning guides, and user interaction.
The general formulation of the assignment problem consists in the optimal allocation of a given set of tasks to a workforce. This problem is covered by existing literature for different domains such as distributed databases, distributed systems, transportation, packet radio networks, IT outsourcing, and teaching allocation. This paper presents a new version of the assignment problem: the allocation of academic tasks to staff members in departments with long-leave opportunities. It describes a workload allocation scheme and its algorithm for allocating an equitable number of tasks in academic departments where long leaves are necessary.
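The equitable-allocation idea can be sketched with a round-robin over the staff who are not on leave, which guarantees that task counts differ by at most one. The paper's actual scheme also accounts for task types and leave durations; this minimal version, with hypothetical staff names, only models the "skip staff on long leave" constraint.

```python
def allocate(tasks, staff, on_leave=()):
    """Equitable task allocation sketch: round-robin over available staff,
    so per-person task counts differ by at most one. Staff on long leave
    are excluded from the rotation."""
    available = [s for s in staff if s not in set(on_leave)]
    if not available:
        raise ValueError("no staff available for allocation")
    load = {s: [] for s in available}
    for i, task in enumerate(tasks):
        # cycle through available staff in order
        load[available[i % len(available)]].append(task)
    return load
```

With five tasks, three staff members, and one of them on leave, the two remaining staff receive three and two tasks respectively.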
With the popularization of e-finance in cities, its construction is transferring to the vast rural market and developing there rapidly. Developing a business processing network system suited to rural credit cooperative banks makes business processing convenient and has good application prospects. In this paper, we analyse the necessity of adopting a special-purpose distributed database in a credit cooperative bank system, give the corresponding distributed database system structure, and design the special-purpose database and its interface technology. The application in Tongbai Rural Credit Cooperatives has shown that the system offers better performance and higher efficiency.
It is well known that accurate prediction of fund trends is very important for obtaining high profits from the fund market. In this paper, least squares support vector regression (LSSVR) is adopted to predict fund trends, as LSSVR has stronger non-linear prediction ability than other prediction methods. The trading price of the fund "kexun" from 2007-3-1 to 2007-3-30 is used as our experimental data, and the trading price from 2007-3-26 to 2007-3-30 is used as the testing data. The forecasting results of a BP neural network and of least squares support vector regression are given. The experimental results show that the forecasting values of LSSVR are nearer to the actual values than those of the BP neural network.
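LSSVR replaces the inequality constraints of standard SVR with equality constraints, so training reduces to one linear system: with kernel matrix K, solve [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y] for the bias b and coefficients alpha. A minimal one-dimensional sketch with an RBF kernel, using plain Gaussian elimination, might look as follows; the kernel width and regularization value are illustrative, not the paper's settings.

```python
import math

def rbf(x, z, sigma=1.0):
    """Gaussian (RBF) kernel on scalars."""
    return math.exp(-(x - z) ** 2 / (2 * sigma ** 2))

def solve(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def lssvr_fit(xs, ys, gamma=100.0, sigma=1.0):
    """Fit LSSVR: solve [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    and return the resulting prediction function."""
    n = len(xs)
    A = [[0.0] + [1.0] * n]
    for i in range(n):
        row = [1.0] + [rbf(xs[i], xs[j], sigma) for j in range(n)]
        row[i + 1] += 1.0 / gamma          # ridge term I/gamma on the diagonal
        A.append(row)
    sol = solve(A, [0.0] + list(ys))
    b, alpha = sol[0], sol[1:]
    return lambda x: b + sum(a * rbf(x, xi, sigma) for a, xi in zip(alpha, xs))
```

With a large gamma the regressor nearly interpolates its training points, which is enough to check the solver end to end.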
Estimating the cost of a software project is one of the most important and crucial tasks in maintaining software reliability. Many cost estimation models have been reported so far, but most have significant drawbacks due to rapid changes in technology. For example, Source Lines Of Code (SLOC) can only be counted once software construction is complete, and the Function Point (FP) metric is deficient in handling object-oriented technology, as it was designed for procedural languages such as COBOL. Since object-oriented programming became a popular development practice, most software companies have adopted the Unified Modeling Language (UML). The objective of this research is to develop a new cost estimation model that applies the class diagram to software cost estimation.
The main target of ligand-based virtual screening approaches is to select and identify a subset of compounds from libraries or databases that are likely to possess a desired biological activity. The main challenge of such approaches is achieving high recall of active molecules, and to this end different Bayesian network models have been developed. In this study, we enhance the Bayesian Inference Network (BIN) using a subset of selected molecular features. In this approach, a few features representing minifingerprints (MFPs) were filtered from the molecular fingerprint features based on an analysis of the distributions of molecular descriptors and structural fragments in large compound data set collections. Simulated virtual screening experiments with MDL Drug Data Report (MDDR) data sets showed that the proposed method provides a simple way of enhancing the cost effectiveness of ligand-based virtual screening searches, especially for data sets of higher diversity.
One of the main disadvantages of OFDM is its high peak-to-average power ratio (PAPR). In this paper, an effective PAPR reduction scheme is proposed. The new scheme is based on PTS and consists of a Hadamard transform followed by the PTS technique to reduce the peak-to-average power ratio of the OFDM signal. Simulation results show that the proposed scheme obtains significant PAPR reduction compared with the ordinary PTS method.
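The quantity being reduced can be computed directly: take the IDFT of one OFDM symbol's subcarrier values (zero-padded here for simple oversampling) and compare peak to mean power in decibels. This sketch only measures PAPR; the Hadamard/PTS reduction itself is not reproduced.

```python
import cmath
import math

def papr_db(freq_symbols, oversample=4):
    """PAPR of one OFDM symbol in dB: inverse-DFT the subcarrier symbols
    (zero-padded by the oversampling factor) and compare the peak sample
    power against the mean sample power."""
    n = len(freq_symbols) * oversample
    padded = list(freq_symbols) + [0] * (n - len(freq_symbols))
    time = [sum(padded[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)) / n
            for t in range(n)]
    powers = [abs(x) ** 2 for x in time]
    return 10 * math.log10(max(powers) / (sum(powers) / n))
```

Two limiting cases are instructive: identical symbols on all N subcarriers add coherently into one peak, giving a PAPR of N (about 6 dB for N = 4), while a single active subcarrier yields a constant-envelope waveform with 0 dB PAPR.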
E-learning devices are widely used in daily life; however, mobile e-learning machines have not been as satisfactory as expected. Most mobile e-learning devices are static learning devices, unable to fulfill the requirements of collaboration, long standby time, and usage under strong sunlight. To meet these requirements, we developed a learning machine based on electronic paper. This paper discusses the software design of the device. The software is a knowledge navigation system based on an ontology. Research shows that combining a frame ontology with description logic can provide a uniform interface to user applications.
Private cloud, as one model of cloud computing, has its own special features compared with public cloud: it is designed to give an organization more control over its data through deployment in a private network. This paper compares public and private clouds; after this comparison, we choose Eucalyptus for a case study on the private cloud. Furthermore, we realize a prototype of SaaS based on Eucalyptus using WaveMaker, and we perform an experimental evaluation of this cloud platform's services.
Software testing is a costly and time-consuming process in software development. Therefore, software testing tools are often deployed to automate the process in order to reduce cost and improve efficiency. However, many of them are proprietary and expensive, so open source software testing tools can be an appealing alternative. In this paper, we survey the current state of open source software testing tools from three aspects: their availability for different programming platforms and types of testing activities, the maintenance of the tools, and license limitations. From the 152 tools surveyed, we found that open source software testing tools are not only widely available for popular programming platforms but also support a wide range of testing activities. Furthermore, more than half of the tools surveyed have been actively maintained and updated by the open source communities. Finally, these tools have very few licensing limitations on commercial use, customization, and redistribution.
With the expansion of business scale, small and medium-sized enterprises have begun to use information platforms to improve their management and competitiveness, and the server becomes the core factor restricting an enterprise's informatization construction. This paper puts forward a suitable soft load balancing design scheme for small and medium-sized enterprise web servers and proves it effective through experiment.
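The scheduling core of such a soft load balancer can be sketched as weighted round-robin, the default policy of reverse proxies like Nginx's upstream module: each backend receives requests in proportion to its configured weight. The paper's concrete scheme is not described in the abstract, so this is a generic sketch (the smooth-weighting variant used by Nginx is omitted, and the server names are hypothetical).

```python
from itertools import islice

def weighted_round_robin(servers):
    """Infinite generator of backend choices for a soft load balancer.

    `servers` is a list of (name, weight) pairs; a server with weight 3
    receives three requests per cycle."""
    while True:
        for name, weight in servers:
            for _ in range(weight):
                yield name
```

With backends weighted 2 and 1, six consecutive requests are dispatched in the pattern two-to-one, twice over.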
Hibernate is an open source object-relational mapping framework that provides lightweight object encapsulation of JDBC, letting Java programmers use object-oriented programming concepts to manipulate a database at will. The appearance of Ajax (asynchronous JavaScript and XML) opened the era of partial page refresh, so that developers can build web applications with stronger interaction. This paper illustrates in detail the concrete application of Ajax and Hibernate to the development of an e-shop, using them to divide the entire program code into relatively independent parts that cooperate with one another. In this way, the entire program is easier to maintain and extend.
Limited trust, cooperation, and communication have been identified as some of the issues that hinder collaboration among business partners. This is also true of the acceptance of an e-supply chain integrator among organizations in the same industry. On top of that, the huge number of components in the supply chain industry makes it impossible to include the entire supply chain in the integrator. Hence, this study proposes a method for identifying "trusted" collaborators for inclusion in an e-supply chain integrator. For the purpose of constructing and validating the method, the Malaysian construction industry is chosen as the case study due to its size and importance to the economy. This paper puts forward the background of the research, relevant literature leading to the formulation of trust value elements, data collection from the Malaysian construction supply chain, and a glimpse of the proposed method for trusted partner selection. Future work is also presented to highlight the next steps of this research.
Aiming at the problem of social communication network evolution, topology potential is first introduced to measure the local influence among nodes in networks. Second, from the perspective of topology potential distribution, a method of describing network evolution based on topology potential distribution is presented, which takes artificial intelligence with uncertainty as its basic theory and the local influence among nodes as its essential quantity. A social communication network is then constructed from the Enron email dataset, and the presented method is used to analyze the characteristics of its evolution. Several useful conclusions are obtained, implying that the method is effective and showing that topology potential distribution can describe the sociological characteristics of, and detect local changes in, a social communication network.
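The topology potential of a node is commonly defined with a Gaussian decay over hop distance, phi(v) = sum over other nodes u of m_u * exp(-(d(v, u) / sigma)^2), so that nearby nodes contribute most of a node's influence. The paper's exact parameter choices and node masses are not given; this sketch uses unit masses, BFS hop distances, and an illustrative sigma.

```python
import math
from collections import deque

def hop_distances(adj, source):
    """BFS shortest-path hop counts from source over an adjacency dict."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def topology_potential(adj, sigma=1.0, mass=1.0):
    """Topology potential of every node:
    phi(v) = sum_u mass * exp(-(d(v, u) / sigma)^2), u != v,
    a Gaussian-decay measure of local influence over the network."""
    return {v: sum(mass * math.exp(-(d / sigma) ** 2)
                   for u, d in hop_distances(adj, v).items() if u != v)
            for v in adj}
```

On a star graph the hub, being one hop from everyone, receives the highest potential, which matches the intuition that topology potential highlights locally influential nodes.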
To convert a first-order formula containing existential quantification into an equivalent clausal normal form, it is
necessary to introduce function variables and accordingly extend the space of first-order formulas. This paper proposes
an unfolding transformation in such an extended formula space and demonstrates how it can be employed to simplify
query-answering problems. The presented work provides a foundation for constructing a correct method for solving
query-answering problems that include unrestricted use of universal and existential quantifications.
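The function variables mentioned above play the role of Skolem functions. A standard textbook illustration (not an example taken from the paper):

```latex
% Eliminating an existential quantifier with a function variable f:
\forall x\, \exists y\, P(x, y)
\;\longmapsto\;
\forall x\, P\bigl(x, f(x)\bigr)
% After dropping the universal quantifier, this yields the single
% clause P(x, f(x)), which contains no existential quantifier.
```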
The situation calculus is an expressive tool for modeling dynamical systems in artificial intelligence: changes in a
dynamical world are represented naturally by its notions of action, situation, and fluent. A program can be viewed as
a discrete dynamical system, so it is possible to model programs with the situation calculus. To model programs
written in a small core programming language CL, the notion of fluent is extended to represent the values of
expressions. Together with functions that return the relevant objects from expressions, a basic action theory of CL
programming is constructed. Under such a theory, properties of a program, such as correctness and termination, can be
reasoned about.
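In Reiter-style basic action theories, each fluent is characterized by a successor-state axiom. A hypothetical example for an assignment action in a CL-like language (the fluent and action names are illustrative, not the paper's):

```latex
% Successor-state axiom for a hypothetical fluent val(x, n, s)
% ("variable x holds value n in situation s"):
\mathit{val}(x, n, \mathit{do}(a, s)) \equiv
  a = \mathit{assign}(x, n) \,\lor\,
  \bigl(\mathit{val}(x, n, s) \land
        \neg \exists m\, (a = \mathit{assign}(x, m))\bigr)
```

Reasoning about correctness then amounts to proving that a desired fluent holds in the situation reached by executing the program's actions from the initial situation.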
The User-Generated Service (UGS) concept allows end-users to create their own services as well as to share them and
manage their lifecycles. The current development of the Internet of Things (IoT) has brought new challenges to the
UGS area. Creating smart services in the IoT environment requires a dynamic social network that considers the
relationships between people and things. In this paper, we consider the know-how required to best organize exchanges
between users and things to enhance service composition. By surveying relevant aspects, including service composition
technology, social networks, and recommender systems, we present the first concept of our framework, which provides
recommendations for a dynamic social-network-based means of organizing UGSs in the IoT.
The Internet of Things (IoT) is not yet widespread, and many people know little about it. Yet because of its
remarkable properties, its appearance immediately aroused great interest. Focusing on applications of the IoT, this
paper uses the Analytic Hierarchy Process (AHP) to analyze and assess the prospects of the IoT in many fields.
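AHP derives priority weights from pairwise-comparison matrices on Saaty's 1-9 scale. A minimal sketch using the row geometric-mean approximation to the principal eigenvector, with a hypothetical comparison of three application fields (the matrix values are illustrative, not taken from the paper):

```python
import math

def ahp_weights(matrix):
    """Priority weights from a pairwise-comparison matrix via the
    row geometric-mean approximation to the principal eigenvector."""
    n = len(matrix)
    gm = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical 1-9 scale judgments over three IoT application fields;
# matrix[i][j] = importance of field i relative to field j.
m = [[1,     3,   5],
     [1 / 3, 1,   2],
     [1 / 5, 1 / 2, 1]]
w = ahp_weights(m)
# Weights are normalized to sum to 1; the first field dominates here.
```

A full AHP study would also compute a consistency ratio before trusting the weights; the sketch omits that step.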
Defect density is a measurement conducted at one of Malaysia's leading ICT companies. This paper discusses issues in
defect density measurement. Besides the defects counted, calculating defect density also requires the total size of
the product, that is, the system size. In general, defect density is the total number of defects found divided by the
size of the system measured; here, system size is measured in lines of code. Selected projects in the company were
identified, and GeroneSoft Code Counter Pro V1.32 was used as the tool to count the lines of code. To this end, the
paper presents the method used. The analyzed defect density data are presented in a control chart, because it shows
the capability of the process so that an achievable goal can be set.
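The calculation described above is straightforward; a minimal sketch with hypothetical project figures (not the company's actual data):

```python
def defect_density(defects, lines_of_code):
    """Defects per thousand lines of code (KLOC)."""
    if lines_of_code <= 0:
        raise ValueError("lines_of_code must be positive")
    return defects / (lines_of_code / 1000.0)

# Hypothetical (defects found, lines of code) per project.
projects = {"A": (42, 15000), "B": (7, 3500), "C": (90, 60000)}
densities = {name: defect_density(d, loc)
             for name, (d, loc) in projects.items()}
# e.g. project A: 42 defects / 15 KLOC = 2.8 defects per KLOC
```

Plotting these per-project densities with mean and control limits gives the control chart the paper uses to judge process capability.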
The test question library management system is an essential part of an online examination system. Its basic
requirement is to handle compound text containing information such as images and formulae and to create the
corresponding Word documents. After comparing the two current approaches to document generation, this paper presents
a design for a Word Automation mechanism based on OLE/COM technology, discusses the application of Word Automation in
detail, and finally provides the operating results of the system, which are of high reference value for improving the
efficiency of generating project documents and report forms.
Community governance is not only a core issue of social development worldwide but also an essential requirement of
social development. At present, residents' sense of community involvement remains weak and their participation
insufficient; these constraints have become a bottleneck for China's urban community development. From the perspective
of community participation, and taking District X of Wuhan as an example, this paper discusses the causes of low
participation in terms of historical, psychological, institutional, and other factors, in the hope of benefiting
domestic community development.
Active replication plays an important role in fault-tolerant applications. However, its vulnerability in multithreaded
environments is its main drawback in the real world. This paper proposes three active replication algorithms that
support multithreading in a distributed object computing environment. Experiments show that the algorithms can
overcome these pitfalls and effectively increase system performance.
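The paper's three algorithms are not reproduced here, but the core difficulty they address can be illustrated: with multiple threads, replicas may apply the same operations in different orders and diverge. A minimal sketch that keeps replicas deterministic by enforcing a total order with a condition variable (the sequencing scheme is illustrative, not one of the paper's algorithms):

```python
import threading

class Replica:
    """Applies operations in a fixed total order even when they are
    delivered by concurrent threads, so all replicas stay identical."""
    def __init__(self):
        self.state = 0
        self.next_seq = 0
        self.cond = threading.Condition()

    def apply(self, seq, op):
        with self.cond:
            while seq != self.next_seq:   # hold back out-of-order deliveries
                self.cond.wait()
            self.state = op(self.state)
            self.next_seq += 1
            self.cond.notify_all()

# The same sequenced operations delivered to two replicas by six threads.
ops = [(0, lambda s: s + 5), (1, lambda s: s * 3), (2, lambda s: s - 2)]
replicas = [Replica(), Replica()]
threads = [threading.Thread(target=r.apply, args=(seq, op))
           for r in replicas for seq, op in ops]
for t in threads:
    t.start()
for t in threads:
    t.join()
# Both replicas compute ((0 + 5) * 3) - 2 = 13 regardless of scheduling.
```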
The bisimulation relation is one of the most important equivalence relations in process calculi for judging whether
two processes are equivalent. In practical applications, when bisimulation equivalence does not hold between two
processes, we need to know to what extent one process can simulate the other. In this paper, we define a
bisimilarity metric between two finite-state processes based on bisimulation game semantics. We also propose an
algorithm to calculate the bisimilarity metric, and a tool has been developed to implement this algorithm.
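The paper's game-based metric is not reproduced here, but the equivalence it refines can be sketched with naive partition refinement over a finite labelled transition system (the states, actions, and `bisimilar` helper below are illustrative, not the paper's algorithm):

```python
def bisimilar(states, trans, p, q):
    """Naive partition refinement: split blocks until stable, then check
    whether p and q share a block. trans maps (state, action) -> set of
    successor states."""
    actions = {a for (_, a) in trans}
    partition = [set(states)]
    changed = True
    while changed:
        changed = False
        new_partition = []
        for block in partition:
            # Signature: for each action, which current blocks are reachable.
            def sig(s):
                return frozenset(
                    (a, i)
                    for a in actions
                    for t in trans.get((s, a), set())
                    for i, b in enumerate(partition) if t in b)
            groups = {}
            for s in block:
                groups.setdefault(sig(s), set()).add(s)
            if len(groups) > 1:
                changed = True
            new_partition.extend(groups.values())
        partition = new_partition
    return any(p in b and q in b for b in partition)

# Two toy processes: q0 can reach a state that performs 'b', p0 cannot.
states = {'p0', 'p1', 'q0', 'q1', 'q2'}
trans = {('p0', 'a'): {'p1'},
         ('q0', 'a'): {'q1'},
         ('q1', 'b'): {'q2'}}
```

A bisimilarity metric then quantifies *how far* `p0` and `q0` are from satisfying this check, rather than returning a bare yes/no.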
Debugging, albeit useful for software development, is also a double-edged sword, since it can be exploited by
malicious attackers. This paper analyzes the prevailing debuggers and classifies them into four categories based on
their debugging mechanisms. As a countermeasure, we then list 13 typical anti-debugging strategies adopted on
Windows. These methods intercept specific execution points that expose the diagnostic behavior of debuggers.
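As a cross-platform illustration of the idea (the paper's 13 strategies target Windows APIs such as IsDebuggerPresent, not CPython):

```python
import sys

def being_traced():
    """A minimal anti-debugging probe for CPython: debuggers such as pdb
    work by installing a trace function, so a non-None sys.gettrace()
    suggests the process is being traced. This is only an analogue of
    the Windows-level checks surveyed in the paper."""
    return sys.gettrace() is not None

print("debugger suspected" if being_traced() else "no tracer detected")
```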
Campus network security is of growing importance. Designing an effective defense against hacker attacks, viruses,
data theft, and internal threats is the focus of this paper. The paper compares firewalls and IDS, integrates them
into the design of a campus network security model, and details the specific implementation principles.