KEYWORDS: Photovoltaics, Data storage, Solar cells, Databases, Data mining, Solar energy, Neural networks, Evolutionary algorithms, Manufacturing, Data processing
Photovoltaics is a method of generating electrical power by converting solar radiation into direct-current electricity using semiconductors that exhibit the photovoltaic effect. Photovoltaic power generation employs solar panels composed of a number of solar cells containing a photovoltaic material. Due to the growing demand for renewable energy sources, the manufacturing of solar cells and photovoltaic arrays has advanced considerably in recent years. Solar photovoltaics is growing rapidly, albeit from a small base, having reached a total global capacity of 40,000 MW at the end of 2010, with more than 100 countries now using it. Driven by advances in technology and increases in manufacturing scale and sophistication, the cost of photovoltaics has declined steadily since the first solar cells were manufactured. Net metering and financial incentives, such as preferential feed-in tariffs for solar-generated electricity, have supported solar photovoltaic installations in many countries. However, the power generated by solar photovoltaics is strongly affected by weather and other natural factors. Accurate prediction of photovoltaic energy is therefore important for intelligent dispatch across the entire power system, both to reduce energy dissipation and to maintain the security of the power grid. In this paper, we propose a big-data system, the Solar Photovoltaic Power Forecasting System (SPPFS), to calculate and predict output power according to real-time conditions. The system uses a distributed mixed database to speed up the collection, storage, and analysis of meteorological data. To improve the accuracy of power prediction, a neural network algorithm is integrated into SPPFS. Extensive experiments show that the framework achieves high forecast accuracy (an error rate below 15%) and low computing latency by deploying the mixed distributed database architecture for solar-generated electricity.
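The abstract does not specify the SPPFS network architecture, so the following is only a minimal sketch of the general idea — a small feedforward neural network mapping weather features (irradiance, temperature, cloud cover) to photovoltaic power output, trained on synthetic data. All variable names and the data-generating model are illustrative assumptions, not the paper's actual system.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data: power roughly proportional to irradiance,
# reduced by cloud cover, with a mild temperature penalty plus noise.
# This stands in for the meteorological data SPPFS would collect.
X = rng.uniform(0, 1, size=(500, 3))          # [irradiance, temp, cloud]
y = X[:, 0] * (1 - 0.5 * X[:, 2]) - 0.1 * X[:, 1] + rng.normal(0, 0.02, 500)

# One hidden layer with tanh activation, trained by plain gradient descent.
W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.05

for _ in range(2000):
    h = np.tanh(X @ W1 + b1)                  # forward pass
    pred = (h @ W2 + b2).ravel()
    err = pred - y
    # Backpropagation of the mean-squared-error gradient
    g2 = h.T @ err[:, None] / len(X)
    gb2 = err.mean()
    dh = (err[:, None] @ W2.T) * (1 - h**2)
    g1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)
    W2 -= lr * g2; b2 -= lr * gb2
    W1 -= lr * g1; b1 -= lr * gb1

h = np.tanh(X @ W1 + b1)
pred = (h @ W2 + b2).ravel()
mae = np.abs(pred - y).mean()
print(f"training MAE: {mae:.3f}")
```

In a real forecasting pipeline the inputs would be live meteorological measurements from the distributed database rather than synthetic samples, and the error metric would be evaluated on held-out data.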
KEYWORDS: Sensor fusion, Data storage, Performance modeling, Analytical research, Data backup, Data processing, Data centers, System integration, Optoelectronics, Computer science
Recent research on deduplication technology has focused on saving disk space and shrinking the chunk index table, rather than on performance analysis when the technology is integrated into a storage system. In this paper, we present a classification of deduplication technologies for storage, compare the resulting categories, and develop an architecture and process flow for an online storage service. We then build an analytical model that describes and calculates storage performance, including the deduplication ratio, system throughput, and energy savings. Results obtained from the model are encouraging, indicating that it provides accurate performance estimates.
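The abstract names the deduplication ratio and system throughput as modeled quantities but does not give its formulas, so the following is only a toy sketch of how those two quantities relate under a simplifying assumption (index and chunking overhead ignored). Function names and numbers are illustrative, not the paper's notation.

```python
def dedup_ratio(logical_bytes, stored_bytes):
    """Deduplication ratio = logical data size / physically stored size."""
    return logical_bytes / stored_bytes

def effective_throughput(raw_throughput_mb_s, ratio):
    """With ratio r, each physically written byte represents r logical
    bytes, so logical write throughput scales by r (overhead ignored)."""
    return raw_throughput_mb_s * ratio

# 10 GB of logical data stored as 2.5 GB after deduplication
r = dedup_ratio(10_000, 2_500)
print(r)                               # 4.0
print(effective_throughput(100.0, r))  # 400.0
```

A full analytical model would also charge the cost of chunking, fingerprinting, and index lookups against the raw throughput, which is exactly the integration overhead the paper says prior work neglected.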
KEYWORDS: Optical storage, Data storage, Computer networks, Computing systems, Data storage servers, Embedded systems, Optical networks, Network architectures, Computer architecture, System on a chip
In this paper, we present the architecture and implementation of a virtual network computing (VNC) optical storage virtualization scheme called VOSV. Its task is to manage the mapping of virtual optical storage to physical optical storage, a technique known as optical storage virtualization. The design of VOSV targets the optical storage resources of clients and servers that exhibit high read-sharing patterns. VOSV combines several schemes, including a two-level cache mechanism, a module embedded in the VNC server, and the iSCSI protocol, to improve performance. Results measured on the prototype are encouraging, indicating that VOSV provides high I/O performance.
Because Toshiba withdrew from the format competition, there is now only one blue-laser disc standard, Blu-ray Disc (BD), which satisfies the demands of high-definition video programs. However, almost all of the relevant patents are held by large companies such as Sony and Philips, so substantial licensing fees must be paid whenever products use BD. Next-Generation Versatile Disc (NVD), our own high-density optical disc storage system, proposes a new data format and error-correction code with independent intellectual property rights and high cost performance; it offers higher coding efficiency than DVD and a 12 GB capacity, which meets the demands of playing high-definition video programs. In this paper, we develop a new channel-encoding process based on Low-Density Parity-Check (LDPC) codes and an application scheme in which Q-matrix-based LDPC encoding is applied in NVD's channel decoder. Exploiting the portability of an embedded SOPC system, we implemented all of the decoding modules on an FPGA and tested them in the NVD experimental environment. Although LDPC codes can conflict with the run-length-limited (RLL) modulation codes frequently used in optical storage systems, the proposed system provides a suitable solution, and at the same time it overcomes the instability and inextensibility of NVD's former decoding system, which was implemented directly in hardware.
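The abstract does not disclose NVD's Q-matrix construction, so the following is only a sketch of the operation at the heart of any LDPC decoder: the syndrome check, which multiplies a received word by the parity-check matrix H over GF(2). The small (7,4) matrix below is purely illustrative; a real LDPC H is large and sparse.

```python
import numpy as np

# Tiny (7,4) Hamming-style parity-check matrix (dense for clarity).
# Columns 0-3 are data bits, columns 4-6 are parity bits.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def syndrome(word):
    """Compute H @ c mod 2; an all-zero syndrome means a valid codeword."""
    return H @ word % 2

codeword = np.array([1, 0, 1, 1, 0, 1, 0])   # parity bits satisfy H
print(syndrome(codeword))                     # [0 0 0]

received = codeword.copy()
received[2] ^= 1                              # simulate a single bit flip
print(syndrome(received))                     # nonzero -> error detected
```

An iterative LDPC decoder (e.g. belief propagation) repeats this check each round, passing messages between bit and check nodes until the syndrome is zero or an iteration limit is hit; that message-passing loop is what the FPGA modules described above would implement.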