The use of correlation methods in pattern recognition is a well-known technique for detecting the absence, the presence, and even the spatial or temporal position of a signal embedded in a complex background. When working with images, blurring, rotation, scaling, and noise often produce false alarms in the correlation plane, so simple thresholding algorithms may select the wrong correlation peak. A human user, however, can often identify the correct peak easily by taking into account the shape and surroundings of those local maxima that could represent it. Sometimes less obvious factors also influence the user's decision when discriminating between correct and false peaks. These factors have to be identified and translated into quantities that the computer can evaluate. For example, in some applications (e.g., stress analysis in experimental mechanics) the peak's location can be predicted to lie within a certain area of the correlation plane, so this prediction can serve as an additional input. In many cases, however, it is not easy to define mathematical relations between these input variables that yield a single quantity for distinguishing correct from false peaks. These considerations motivate the introduction of fuzzy logic on the correlation plane to decide which of the local maxima corresponds to the correct correlation peak. Fuzzy logic simplifies the way in which input quantities, and the rules connecting them, are defined. In this way the discrimination capability of different correlation methods can be greatly improved.
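The idea of scoring candidate correlation peaks with fuzzy rules can be sketched as follows. This is a minimal illustration only, not the authors' actual system: the membership-function shapes, the three inputs (peak height, peak sharpness, distance to a predicted location), and the rule weights are all assumptions chosen for the example.

```python
# Hedged sketch: fuzzy scoring of candidate correlation peaks.
# All names, membership shapes, and rule combinations below are
# illustrative assumptions, not a published implementation.

def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_peak_score(height, sharpness, dist_to_expected):
    """Fuse three inputs into one plausibility score in [0, 1].

    height, sharpness: normalized so that ~1.0 means a strong, sharp peak.
    dist_to_expected:  distance (in pixels) from the predicted location,
                       usable when the application allows such a prediction.
    """
    # Degree of membership in the linguistic terms (assumed shapes).
    high  = tri(height,           0.4,  1.0, 1.6)   # "peak is high"
    sharp = tri(sharpness,        0.3,  1.0, 1.7)   # "peak is sharp"
    near  = tri(dist_to_expected, -20.0, 0.0, 20.0) # "peak is where predicted"

    # Rules: AND as min, OR as max (common Mamdani-style choices).
    r1 = min(high, sharp)   # a correct peak is high AND sharp
    r2 = min(high, near)    # ...or high AND near the predicted position
    r3 = 0.5 * near         # location prediction alone lends weaker support
    return max(r1, r2, r3)

def pick_peak(candidates):
    """candidates: list of (x, y, height, sharpness, dist) tuples.
    Returns the candidate whose fuzzy score is largest."""
    return max(candidates, key=lambda c: fuzzy_peak_score(c[2], c[3], c[4]))
```

Here a broad but blurred false alarm far from the predicted location loses to a moderately high, sharp maximum close to it, which is exactly the kind of judgment a simple height threshold cannot make.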