Deep learning has been widely applied to computed tomography, but it typically requires a centralized dataset to train the neural networks. To address this, federated learning has been proposed, which exploits data from different local medical institutions through a privacy-preserving decentralized strategy. However, large amounts of unpaired data are excluded from local model training, and directly aggregating the parameters degrades the performance of the updated global model. To deal with these issues, we present a semi-supervised and semi-centralized federated learning method to improve the performance of the learned global model. Specifically, each local model is first trained locally with an unsupervised strategy for a fixed number of rounds. After that, the parameters of the local models are aggregated on the server to update the global model. The global model is then further trained on a standard dataset of well-paired training samples to stabilize and standardize it. Finally, the global model is distributed back to the local models for the next training round. For short, we refer to the presented federated learning method as “3SC-FL”. Experiments demonstrate that 3SC-FL outperforms the compared methods, both qualitatively and quantitatively.
Federated learning shows great potential in the field of computed tomography imaging by adopting a decentralized, privacy-preserving strategy for local medical institutions. However, directly aggregating the parameters of each local model degrades the generalization performance of the updated global model. In addition, well-paired centralized training datasets can be collected in the real world, but they are not exploited by current federated learning methods. To address these issues, we present a semi-centralized federated learning method to improve the generalization performance of the learned global model. Specifically, each local model is first trained locally for a fixed number of rounds; the parameters are then aggregated on the server to initialize the global model. After that, the global model is further trained on a standard dataset held on the server, which contains well-paired training samples, to stabilize and standardize the global model. For short, we refer to the presented semi-centralized federated learning method as “SC-FL”. Experimental results on different local datasets demonstrate that SC-FL outperforms the competing methods.
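The semi-centralized loop described in these abstracts can be sketched in a few lines. The sketch below is an illustration only, not the papers' implementation: it uses linear regression as a stand-in for each institution's CT model, simple parameter averaging (in the spirit of FedAvg) for the server-side aggregation, and ordinary gradient descent on a synthetic "standard" paired dataset for the server fine-tuning step. All function and variable names (`local_step`, `fedavg`, `make_data`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])  # ground-truth weights for the synthetic task

def make_data(n):
    """Synthetic stand-in for an institution's dataset."""
    X = rng.normal(size=(n, 2))
    y = X @ true_w + 0.1 * rng.normal(size=n)
    return X, y

def local_step(w, X, y, lr=0.1, epochs=20):
    """Full-batch gradient descent on one dataset (stand-in for local training)."""
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def fedavg(weights):
    """Server-side aggregation: plain parameter averaging."""
    return np.mean(weights, axis=0)

# Three local institutions, each with its own private data.
local_data = [make_data(50) for _ in range(3)]
# Well-paired "standard" dataset held centrally on the server.
X_std, y_std = make_data(100)

w_global = np.zeros(2)
for _ in range(5):
    # 1) each institution trains from the current global model
    local_ws = [local_step(w_global.copy(), X, y) for X, y in local_data]
    # 2) the server aggregates the local parameters
    w_global = fedavg(local_ws)
    # 3) the server fine-tunes on the paired standard dataset
    #    (this is the "semi-centralized" step), then redistributes
    w_global = local_step(w_global, X_std, y_std)

print(np.round(w_global, 2))
```

After a few rounds the global model should recover weights close to `true_w`; the server fine-tuning step is what the abstracts credit with stabilizing the aggregated model.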