Brachytherapy and thermal ablation are relatively new approaches in robot-assisted minimally invasive interventions for treating malignant tumors. Ultrasound remains the most favored choice for imaging feedback owing to its cost-effectiveness, absence of ionizing radiation, and easy accessibility in the operating room. However, it generally does not provide high-contrast, noise-free images. Distortion occurs when the sound waves pass through a medium that contains air and/or when the target organ is deep within the body. The distorted images make it difficult to recognize and localize tumors and surgical tools. A tool such as a bevel-tipped needle often deflects from its planned path during insertion, making it difficult to detect the needle tip from a single perspective view. Target shifting due to cardiac and/or respiratory motion can add further error in reaching the target. This paper describes a comprehensive system that uses robot dexterity to capture 2D ultrasound images in various pre-determined modes, generates 3D ultrasound images from them, and assists in maneuvering a surgical tool. An interactive 3D virtual reality environment is developed that visualizes the various artifacts present at the surgical site in real time. The system mitigates image distortion by grabbing images from multiple positions and orientations to provide a 3D view. Using the methods developed for this application, an accuracy of 1.3 mm in target attainment was achieved in an in-vivo experiment subject to tissue motion. Accuracies of 1.36 mm and 0.93 mm were achieved in ex-vivo experiments with and without externally induced motion, respectively. An integrated ablation-monitor widget visualizes the changes during the complete ablation process and enables evaluation of the procedure in its entirety.
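The 3D reconstruction step mentioned above — compounding robot-swept 2D slices with known probe poses into a voxel volume — can be illustrated with a minimal nearest-neighbor, pixel-based compounding sketch. All names, the 4x4 homogeneous pose convention, and the averaging scheme are illustrative assumptions, not the paper's actual pipeline:

```python
import numpy as np

def compound_slices(slices, poses, volume_shape, voxel_size):
    """Scatter pixels from tracked 2D ultrasound slices into a 3D voxel
    grid (nearest-neighbor pixel-based compounding; overlaps averaged).

    slices: list of 2D arrays (one per acquired frame)
    poses:  list of 4x4 homogeneous transforms, slice plane -> world (mm)
    """
    volume = np.zeros(volume_shape)
    counts = np.zeros(volume_shape)
    for image, pose in zip(slices, poses):
        h, w = image.shape
        # Pixel coordinates in the slice plane (z = 0), scaled to mm.
        us, vs = np.meshgrid(np.arange(w), np.arange(h))
        pts = np.stack([us.ravel() * voxel_size,
                        vs.ravel() * voxel_size,
                        np.zeros(us.size),
                        np.ones(us.size)])           # homogeneous, 4 x N
        world = pose @ pts                            # transform to world
        idx = np.round(world[:3] / voxel_size).astype(int)
        # Keep only points that land inside the voxel grid.
        inside = np.all((idx >= 0) & (idx.T < volume_shape).T, axis=0)
        i, j, k = idx[:, inside]
        np.add.at(volume, (i, j, k), image.ravel()[inside])
        np.add.at(counts, (i, j, k), 1)
    # Average where multiple pixels map to the same voxel.
    return np.divide(volume, counts,
                     out=np.zeros_like(volume), where=counts > 0)
```

In practice such systems typically add hole-filling and interpolation between sparse slices; this sketch only shows how tracked poses tie the individual B-mode frames into a common 3D frame, which is what lets multi-orientation sweeps reduce view-dependent distortion.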