As in many detection problems, the performance of landmine detection algorithms is judged in terms of detection and false alarm rates. For landmine detection, it is often the case that detectors satisfy one requirement at the cost of poor performance on the other. It is widely accepted that a single sensor cannot simultaneously achieve both high detection rates and low false alarm rates, since every sensor has its advantages and disadvantages when dealing with the large variety of landmines, from large metal-cased mines to small plastic-cased mines. Thus, in this paper we consider two types of sensors, EMI and GPR. In its most common instantiation, time-domain EMI is essentially a metal detector and thus detects mines with high metal content as well as metal debris in the environment. More advanced EMI systems have begun to show potential for discriminating such debris from mines. GPR is also used for landmine detection since it can detect and identify low-metallic subsurface anomalies. In our previous work, we have shown that a Bayesian detection approach can be applied to EMI data and provides promising results. In this paper, we present results indicating that statistical signal processing techniques applied to GPR data can also yield performance improvements. The theoretical results are verified with data collected by a developmental mine-detection system, which consists of co-located metal detectors and GPR sensors. Thus, in addition to discussing individual sensor data processing, we also present results of fusing the EMI and GPR data collected with this detection system.
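To make the fusion idea concrete, the following is a minimal sketch of Bayesian likelihood-ratio fusion of two sensor statistics. It assumes (hypothetically, for illustration only) that each sensor reports a scalar detection statistic modeled as Gaussian under the mine and clutter hypotheses, and that the two statistics are conditionally independent given the true state, so the joint log-likelihood ratio is the sum of the per-sensor ratios. The function names and all numerical parameters here are illustrative assumptions, not taken from the paper or its data.

```python
import math


def gaussian_llr(x, mu0, sigma0, mu1, sigma1):
    """Log-likelihood ratio log p(x | mine) / p(x | clutter)
    under Gaussian models for the two hypotheses."""
    def logpdf(v, mu, s):
        return -0.5 * math.log(2.0 * math.pi * s * s) - (v - mu) ** 2 / (2.0 * s * s)
    return logpdf(x, mu1, sigma1) - logpdf(x, mu0, sigma0)


def fused_decision(emi_stat, gpr_stat, threshold=0.0):
    """Declare 'mine' when the summed per-sensor LLRs exceed the threshold.

    Under conditional independence of the EMI and GPR statistics given
    the true state, the joint LLR factors into a sum of per-sensor LLRs.
    The model parameters below are placeholder values for illustration.
    """
    llr = (gaussian_llr(emi_stat, mu0=0.0, sigma0=1.0, mu1=2.0, sigma1=1.0)
           + gaussian_llr(gpr_stat, mu0=0.0, sigma0=1.0, mu1=1.5, sigma1=1.0))
    return llr > threshold
```

The threshold trades detection rate against false alarm rate: lowering it raises both, which is the operating-point trade-off the single-sensor discussion above describes; fusing the two statistics shifts the whole receiver operating characteristic rather than just sliding along it.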