Current Issue: January - March | Volume: 2012 | Issue Number: 1 | Articles: 5
We present a simple combinatorial construction for mapping biometric vectors to short strings, called passwords. A verifier has to decide whether a given vector can be considered a corrupted version of an original biometric vector whose password is known. Evaluations of the compression factor and the false rejection/acceptance rates are derived, and an illustration of a possible implementation of the verification algorithm for DNA data is presented....
For the evaluation of the biometric performance of biometric components and systems, the availability of independent databases and, desirably, independent evaluators is important. Databases of significant size and independent testing institutions together provide the precondition for fair and unbiased benchmarking. In order to demonstrate the generalization capabilities of the system under test, it is essential that algorithm developers do not have access to the testing database, thus minimizing the risk of tuned algorithms. In this paper, we describe the GUC100 multiscanner fingerprint database, which has been created for independent and in-house (semipublic) performance and interoperability testing of third-party algorithms. The GUC100 was collected using six different fingerprint scanners (TST, L-1, Cross Match, Precise Biometrics, Lumidigm, and Sagem). Over several months, fingerprint images of all 10 fingers from 100 subjects were acquired on all 6 scanners. In total, GUC100 contains almost 72,000 fingerprint images. The GUC100 database enables us to evaluate various performance and interoperability settings by taking into account different influencing factors such as fingerprint scanner and image quality. The GUC100 data set is freely available to other researchers and practitioners, provided that they conduct their testing on the premises of Gjøvik University College in Norway, or alternatively submit their algorithms (in compiled form) to be run on GUC100 by researchers in Gjøvik. We applied one public and one commercial fingerprint verification algorithm to GUC100, and the reported results indicate that GUC100 is a challenging database....
An efficient fragile image watermarking technique for pixel-level tamper detection and resistance is proposed. It uses the five most significant bits of each pixel to generate watermark bits and embeds them in the three least significant bits. The proposed technique uses a logistic map and takes advantage of its sensitivity to small changes in the initial condition. At the same time, it incorporates the confusion/diffusion and hashing techniques used in many cryptographic systems to resist tampering at the pixel level as well as at the block level. This paper also presents two new approaches, called the nonaggressive and aggressive tamper detection algorithms. Simulations show that the proposed technique can provide more than 99.39% tamper detection capability, with less than 2.31% false-positive and less than 0.61% false-negative detection responses....
Digital fingerprinting of multimedia content involves the generation of a fingerprint, the embedding operation, and the realization of traceability from redistributed content. To protect the buyer's rights, the asymmetric property of the transaction between a buyer and a seller must be achieved using a cryptographic protocol. In conventional schemes, the implementation of a watermarking algorithm within the cryptographic protocol is not discussed in depth. In this paper, we propose a method for implementing the spread spectrum watermarking technique in a fingerprinting protocol based on a homomorphic encryption scheme. We first develop a rounding operation that converts real values into integers, together with its compensation, and then explore the trade-off between robustness and communication overhead. Experimental results show that our system can realize Cox's spread spectrum watermarking method within an asymmetric fingerprinting protocol....
This paper presents an overview of the reversible watermarking techniques that have appeared in the literature during approximately the last five years. In addition, a general classification of the algorithms, based on their characteristics and their embedding domain, is given in order to provide a structured presentation that is easily accessible to the interested reader. Algorithms are grouped into categories and discussed, supplying the main information regarding their embedding and decoding procedures. Basic considerations on the achieved results are given as well....