TY - JOUR
T1 - Industrial federated learning algorithm (P-PFedSGD) for tool wear estimation
AU - Huang, Guan Ying
AU - Lee, Ching Hung
N1 - Publisher Copyright:
© 2024
PY - 2024/9
Y1 - 2024/9
N2 - In recent years, machine learning has been challenged by growing concerns regarding data privacy. This has led to the emergence of federated learning, which trains a model across distributed clients without sharing their data, thereby addressing the privacy issue. However, this scheme may not generalize well to the heterogeneous data of distributed clients, particularly in industrial applications, which has motivated the development of personalized privacy-preserving approaches. In this study, we introduce an index called gradient divergence to quantify heterogeneity in federated learning, which is used to adjust the aggregation weights. A privacy-preserving personalized federated learning algorithm, personalized federated stochastic gradient descent (P-PFedSGD), is developed to improve the performance of federated learning on each client's local dataset while maintaining performance on the other clients' datasets. P-PFedSGD lets clients transmit the local gradient instead of the local model, combining the advantages of personalized and federated learning while preserving data privacy. The developed P-PFedSGD algorithm is applied to tool wear estimation to demonstrate its performance and effectiveness. The results show that the developed approach addresses key challenges of federated learning, such as reducing communication cost and client-side computation, more effectively than comparable algorithms.
KW - Communication cost
KW - Data privacy
KW - Federated learning
KW - Gradient averaging
KW - Tool wear
UR - http://www.scopus.com/inward/record.url?scp=85191476230&partnerID=8YFLogxK
U2 - 10.1016/j.future.2024.04.026
DO - 10.1016/j.future.2024.04.026
M3 - Article
AN - SCOPUS:85191476230
SN - 0167-739X
VL - 158
SP - 150
EP - 157
JO - Future Generation Computer Systems
JF - Future Generation Computer Systems
ER -