Paper
19 November 2003
Architecture for the benchmarking of watermarking algorithms
Abstract
Watermarking software is difficult to evaluate because its desired features must be assessed in a multidimensional space. Furthermore, the required characteristics depend strongly on the scenario in which the watermarking will be deployed. While several benchmarking systems have been proposed that include attacks as well as perceptual and statistical evaluations, none has become an established reference. Given the difficulty of this benchmarking problem, we propose a web-based, open-source suite of tools that would allow the watermarking research community to carry out fair and reproducible benchmarking tests. This paper describes the required basic architecture. A benchmarking session is parameterized with several options relevant to media, embedding, decoding, attacks, etc. The session is divided into tests (each of which may encompass several runs), and the results are collected and assembled into multidimensional curves.
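The session structure the abstract describes can be sketched as follows. This is an illustrative shape only, not the paper's actual API (the proposed suite is web-based, and its keywords suggest a Java implementation): a session is parameterized by an embedder, a decoder, and a set of attacks; each test repeats several runs per attack strength, and the collected (strength, success-rate) points form one curve per attack. All names here are hypothetical.

```python
# Hypothetical sketch of a watermark-benchmarking session: parameterized by
# embedder, decoder, and attacks; tests encompass several runs; results are
# collected as points on per-attack curves.
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple

@dataclass
class Session:
    embed: Callable[[str, str], str]                  # (media, message) -> marked media
    decode: Callable[[str], str]                      # marked media -> recovered message
    attacks: Dict[str, Callable[[str, float], str]]   # name -> (media, strength) -> attacked media
    results: Dict[str, List[Tuple[float, float]]] = field(default_factory=dict)

    def run_test(self, media, message, attack_name, strengths, runs=3):
        """One test: several runs per attack strength; returns the curve."""
        attack = self.attacks[attack_name]
        curve = []
        for s in strengths:
            ok = 0
            for _ in range(runs):
                marked = self.embed(media, message)
                recovered = self.decode(attack(marked, s))
                ok += (recovered == message)
            curve.append((s, ok / runs))              # (attack strength, success rate)
        self.results[attack_name] = curve
        return curve

# Toy components standing in for real watermarking code.
embed = lambda m, msg: m + "|" + msg
decode = lambda m: m.split("|")[-1] if "|" in m else ""
attacks = {"crop": lambda m, s: m[: max(1, int(len(m) * (1 - s)))]}

session = Session(embed, decode, attacks)
curve = session.run_test("image-data", "42", "crop", [0.0, 0.9])
print(curve)  # -> [(0.0, 1.0), (0.9, 0.0)]
```

Collecting results per attack in this shape makes the "multidimensional curves" of the abstract a straightforward aggregation step: one axis per varied parameter, one curve per attack, averaged over runs.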
© (2003) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
David Dargent, Edward J. Delp III, Jana Dittmann, and Benoit M. M. Macq "Architecture for the benchmarking of watermarking algorithms", Proc. SPIE 5203, Applications of Digital Image Processing XXVI, (19 November 2003); https://doi.org/10.1117/12.512553
KEYWORDS
Digital watermarking
Computer programming
Java
Operating systems
Computer programming languages
Algorithm development
Databases