This article presents an elementary change detection algorithm designed using a synchronous model of computation (MoC), aiming at efficient implementations on parallel architectures. The change detection method is based on a 2D first-order autoregressive (2D-AR(1)) recursion that predicts one-lag changes over bitemporal signals, followed by highly parallelized spatial filtering for neighborhood training and an estimated quantile function to detect anomalies. The proposed method uses a model based on the functional programming paradigm and a well-defined MoC, potentially enabling energy and runtime optimizations with deterministic data parallelism on multicore, GPU, or FPGA architectures. Experimental results on the bitemporal CARABAS-II UWB SAR dataset are evaluated using the synchronous MoC implementation, achieving gains in detection and hardware performance compared to a closed-form model of well-known complexity based on the generalized likelihood ratio test (GLRT). In addition, since the one-lag AR(1) model is a Markov process, its extension to a Markov chain for multitemporal (n-lag) analysis is applicable, potentially improving detection performance while remaining amenable to highly parallelized structures.
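To make the processing chain concrete, the sketch below illustrates one plausible reading of the three stages named above (one-lag AR(1) prediction, neighborhood spatial filtering, and quantile-based anomaly detection) on a pair of co-registered images. It is a minimal illustration, not the article's implementation: the least-squares AR(1) estimate, the window size `win`, and the quantile level `q` are assumptions chosen for readability.

```python
# Minimal sketch of a bitemporal 2D-AR(1) change-detection pipeline.
# Assumptions: the AR(1) estimator, window size, and quantile level below
# are illustrative placeholders, not the article's reported settings.
import numpy as np
from scipy.ndimage import uniform_filter

def change_map(x_prev, x_curr, win=9, q=0.99):
    """Flag changed pixels between two co-registered images."""
    # 1) One-lag AR(1) coefficient via least squares over all pixels,
    #    then the prediction residual (innovation) of the current image.
    phi = np.sum(x_prev * x_curr) / np.sum(x_prev * x_prev)
    resid = x_curr - phi * x_prev

    # 2) Neighborhood "training": local mean/variance via box filters,
    #    a stencil operation that parallelizes well on multicore/GPU/FPGA.
    mu = uniform_filter(resid, size=win)
    var = uniform_filter(resid ** 2, size=win) - mu ** 2
    z = (resid - mu) / np.sqrt(np.maximum(var, 1e-12))

    # 3) Anomaly detection from an estimated quantile of the normalized residual.
    thr = np.quantile(np.abs(z), q)
    return np.abs(z) > thr
```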