Object detection and tracking in football video is a challenging task with substantial practical and commercial value. The traditional way to extract player trajectories from football matches is to have players carry recording chips, which is expensive and difficult to deploy in amateur stadiums. Some studies instead process football video using cameras alone, but because the targets look similar and frequently occlude one another, these methods can only segment the players and the ball in each frame; they cannot track a target, or can do so only for a short period. We study active object tracking for football games, where a tracker takes visual observations (i.e., frame sequences) as input and produces the corresponding camera control signals (e.g., turn up, turn left) as output. Conventional methods tackle the tracking and camera control tasks separately, so the resulting system is difficult to tune jointly; these methods also require significant human effort for image labeling and expensive trial-and-error system tuning in the real world. To address these issues, we propose in this paper an end-to-end solution via deep reinforcement learning. By building a football match scene in a simulator (Unreal Engine), the entire field can be covered by turning the camera in the simulated scene.
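The observation-to-action interface described above can be sketched as follows. This is an illustrative outline only, not the paper's implementation: the action names, the flat pixel observation, and the untrained linear policy (standing in for the deep network the authors would train with reinforcement learning in the simulator) are all assumptions made for the sketch.

```python
import math
import random

# Hypothetical discrete camera controls; the abstract mentions
# signals such as "turn up" and "turn left".
ACTIONS = ["no-op", "turn-left", "turn-right", "turn-up", "turn-down"]

class ActiveTracker:
    """Minimal sketch of an end-to-end active tracker: it maps a raw
    observation (here a flat list of pixel intensities) to one discrete
    camera-control action by sampling from a softmax over action scores.
    A linear policy with random weights stands in for the deep policy
    that would be trained with reinforcement learning in simulation."""

    def __init__(self, obs_dim, seed=0):
        rng = random.Random(seed)
        # One weight vector per action (placeholder for learned weights).
        self.weights = [[rng.uniform(-0.01, 0.01) for _ in range(obs_dim)]
                        for _ in ACTIONS]

    def act(self, observation):
        # Score each action with a dot product, then softmax-sample.
        scores = [sum(w * x for w, x in zip(ws, observation))
                  for ws in self.weights]
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        total = sum(exps)
        probs = [e / total for e in exps]
        r, acc = random.random(), 0.0
        for action, p in zip(ACTIONS, probs):
            acc += p
            if r < acc:
                return action
        return ACTIONS[-1]
```

In an end-to-end setup of this kind, the same network that perceives the frame also emits the camera command, which is what lets the whole system be tuned jointly by the reinforcement-learning objective rather than hand-tuned as separate tracking and control modules.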