The Air Force Research Laboratory's Human Effectiveness Directorate (AFRL/HE) supports research addressing human factors associated with Unmanned Aerial Vehicle (UAV) operator control stations. Recent research, conducted in collaboration with Rapid Imaging Software, Inc., has focused on determining the value of combining synthetic vision data with live camera video presented on a UAV control station display. The synthetic vision information is constructed from databases (e.g., terrain, cultural features, pre-mission plan), as well as from numerous information updates received via networked communication with other sources (e.g., weather, intel). This information is overlaid conformally, in real time, onto the dynamic camera video image presented to operators. Synthetic vision overlay technology is expected to improve operator situation awareness by highlighting key spatial information elements of interest directly on the video image, such as threat locations, expected target locations, landmarks, and emergency airfields. It may also help maintain an operator's situation awareness during periods of video datalink degradation or dropout and when operating in conditions of poor visibility. Additionally, this technology may serve as an intuitive means of distributed communication between geographically separated users. This paper discusses the tailoring of synthetic overlay technology for several UAV applications. Pertinent human factors issues are detailed, along with the usability, simulation, and flight test evaluations required to determine how best to combine synthetic vision data with live camera video on a ground control station display and to validate that a synthetic vision system is beneficial for UAV applications.