1. INTRODUCTION

Scientific and environmental data are often stored in array-oriented data formats such as netCDF. NetCDF (Network Common Data Form) [1] is a collection of machine-independent data formats and software libraries that facilitates the creation, access, and exchange of array-oriented scientific data, and it is a community standard for scientific data exchange. It is commonly used to share and store scientific data, including many types of computed data. In-situ observation data, also referred to as point measurement data, are another important and frequently used type of information. Typically, these are profile data comprising observations of a few parameters at a point with known coordinates, and they can be kept in ODV [2] or similar spreadsheet formats. While numerous websites offer tools for accessing and visualizing point measurement data [3, 4] and netCDF data [5, 6], far fewer compile and offer tools for using both kinds of data in combination. Therefore, creating web tools that integrate point observation data and netCDF data, for instance for the Black Sea region, is an important task.

2. METHOD AND TECHNIQUES

To provide visualization of and access to data in netCDF format, the method described in [7] was used. The netCDF variables are converted to single-band georeferenced TIFF (GeoTIFF) files for the corresponding time and depth and then stored in a structured file archive. Using OpenLayers 7, these data can then be added to the map as a layer. PostgreSQL is used to store the observation point data; its structure is described in [8]. An additional point layer built from JSON data is used to access and display these observations. OpenLayers 7 can visualize GeoTIFF data: a file is added as a GeoTIFF map layer, and OpenLayers also gives the user the possibility to inspect the value at a selected point; in practice, it shows the value under the mouse cursor. To visualize profile data, the Plotly.js library is used, which plots data supplied in JSON format. A Python script selects the data corresponding to the selected station from the database, converts the result to JSON format, and sends it to the UI.
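To illustrate the gridded-data path, the following is a minimal Python sketch (not the authors' exact script) of the netCDF-to-GeoTIFF conversion described above. It assumes a file with (time, depth, lat, lon) dimensions, a variable named "temperature", and the xarray and rioxarray libraries; the real variable names, grid layout, and file-naming convention of the archive may differ.

# Minimal sketch: convert each (time, depth) slice of a netCDF variable into a
# semantically named single-band GeoTIFF, using xarray and rioxarray.
# File name, variable name, and dimension names are illustrative assumptions.
import xarray as xr
import rioxarray  # noqa: F401  (registers the .rio accessor on DataArrays)

ds = xr.open_dataset("black_sea.nc")      # hypothetical input file
var = ds["temperature"]                   # hypothetical variable name

for t in var["time"].values:
    for d in var["depth"].values:
        slice2d = var.sel(time=t, depth=d)
        slice2d = slice2d.rio.set_spatial_dims(x_dim="lon", y_dim="lat")
        slice2d = slice2d.rio.write_crs("EPSG:4326")   # georeference in WGS84
        # e.g. temperature_2023-01-01_10m.tif, stored in the structured archive
        fname = f"temperature_{str(t)[:10]}_{int(d)}m.tif"
        slice2d.rio.to_raster(fname)

The profile-data path can be sketched in the same spirit: a small server-side Python function that selects the measurements of a chosen station from the PostgreSQL database and returns them as JSON for the Plotly plot in the UI. The table and column names below are assumptions, not the actual database schema described in [8].

# Minimal sketch of the station-profile request handler: select the profile of
# one station from PostgreSQL and return it as JSON for the Plotly plot.
# The table "profiles" and its columns are assumptions, not the real schema.
import json
import psycopg2

def station_profile_json(station_id):
    conn = psycopg2.connect(dbname="blacksea", user="reader")  # assumed connection
    with conn, conn.cursor() as cur:
        cur.execute(
            "SELECT depth, value FROM profiles "
            "WHERE station_id = %s ORDER BY depth",
            (station_id,),
        )
        rows = cur.fetchall()
    # JSON consumed by the client-side Plotly trace (parameter value vs. depth)
    return json.dumps({"x": [v for _, v in rows],
                       "y": [d for d, _ in rows]})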
3. THE SYSTEM STRUCTURE

The system was developed on the basis of a client-server architecture for the Black Sea region. The general scheme of the system is shown in Figure 1; a similar structure was used in [9]. The server part includes an oceanographic database built on the PostgreSQL DBMS, which contains Black Sea hydrological and hydrochemical cruise data. Data access modules provide access to the database and convert query results to JSON format. The detailed scheme of the netCDF data access and visualization module is presented in Figure 2. A Python script is used to process the netCDF files and create correspondingly, semantically named files in GeoTIFF format. These files are then used as sources for the GeoTIFF layer in OpenLayers on the interactive map. The JavaScript libraries Plotly [10], OpenLayers [11], and jQuery [12] were used to develop the user interface. The JSON format is used for data exchange. The user interface allows:

The UI examples are shown in Figure 3 and Figure 4. Figure 3 presents data selection by a rectangular region; the results are shown as a set of interactive points on the map, each of which corresponds to an oceanographic station. Figure 4 shows the overlay layer, which presents a netCDF parameter together with the selected oceanographic stations. This method was used in developing a data access and analysis system for carbon fluxes. It allows different types of information to be combined and provides complex visualization and analysis: it brings together hydrological and climatic data, forecast data, and in-situ measurement data. Climatic and forecast data are stored in netCDF archives, while measurement data are kept in a PostgreSQL database.

4. CONCLUSIONS

Combining various oceanographic data sets, verifying forecast data, and analyzing and comparing non-homogeneous data kept in different formats, with different structures and different spatial and temporal resolutions, are important tasks for researchers. Systems such as the one described here therefore provide the ability to work with many data types and offer easy-to-use tools.

ACKNOWLEDGEMENTS

The work was developed within the framework of the MHI RAS state task on theme No. FNNN-2023-0001, "Ensuring climate and bio-geochemical monitoring of carbon fluxes in the Black Sea based on long-term observation data and numerical simulation results" ("Carbon"), and theme No. FNNN-2024-0012, "Analysis, diagnosis and real-time forecast of the state of hydrophysical and hydrochemical fields of marine water areas based on mathematical modeling using data from remote and in situ methods of measurements" ("Operational Oceanology").

REFERENCES

SeaDataNet, https://cdi.seadatanet.org/search
Zhuk, E., "Using GIS technology for visualizing marine environmental data in netCDF format," 9th International Conference on Remote Sensing and Geoinformation of the Environment (RSCy2023), Ayia Napa, Proc. SPIE 12786, 127861W (2023). https://doi.org/10.1117/12.2681785

Zhuk, E., "GIS tool developments for online data access systems," in Proceedings of SPIE, 9th International Conference on Remote Sensing and Geoinformation of the Environment, 1278623 (2023). https://doi.org/10.1117/12.2683109

Basykina, A. Yu., Zhuk, E. V., Khaliulin, A. Kh., "The Geo-Information System Application for Display of the Tsunami Type Long Wave Propagation Modeling Results in the Black Sea Coastal Area," Physical Oceanography, (3), 74-81 (2017). https://doi.org/10.22449/0233-7584-2017-3-74-81

Plotly, https://plotly.com/

OpenLayers, https://openlayers.org/

jQuery, https://jquery.com/