Date Approved

10-25-2022

Embargo Period

10-26-2022

Document Type

Thesis

Degree Name

M.S. Computer Science

Department

Computer Science

College

College of Science & Mathematics

Advisor

Shen Shyang Ho, Ph.D.

Committee Member 1

Anthony Breitzman, Ph.D.

Committee Member 2

Nancy Tinkham, Ph.D.

Committee Member 3

Ning Wang, Ph.D.

Keywords

Federated Learning, Few-Shot Learning, Meta Learning

Subject(s)

Machine learning; Artificial intelligence

Disciplines

Computer Sciences

Abstract

The efficient and effective handling of few-shot learning tasks on mobile devices is challenging due to the small training sets and the devices' physical limitations in power and computational resources. In this thesis, we propose a solution that combines federated learning and meta-learning to handle independent few-shot learning tasks across multiple devices (or clients) and a server. In particular, we use Prototypical Networks to perform meta-learning on each device, learning multiple independent few-shot learning models, and we combine these models via federated learning in a centralized, distributed-data architecture so that the resulting global model can subsequently be reused by the clients. We perform extensive experiments to (1) compare three federated learning approaches, namely Federated Averaging (FedAvg), Federated Proximal (FedProx), and Federated Personalization (FedPer), within our proposed framework, and (2) explore the effect of data heterogeneity on few-shot learning performance. Our empirical results show that the proposed approach is feasible: when the few-shot learning tasks are drawn from the same dataset, it improves each device's individual prediction performance and significantly improves the global model (on the server) under any of the federated learning approaches. However, data heterogeneity still degrades the prediction performance of our proposed solution regardless of which federated learning approach is used.
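The record does not include code, but the two ingredients the abstract names can be illustrated with a minimal sketch: FedAvg-style weighted averaging of client parameters, and the prototype computation and nearest-prototype classification of Prototypical Networks. This is an assumption-laden illustration, not the thesis's implementation: it uses plain Python lists as flat parameter vectors and embeddings, and the function names `fedavg`, `prototypes`, and `classify` are hypothetical.

```python
def fedavg(client_weights, client_sizes):
    # FedAvg: average flat parameter vectors, weighting each client's
    # contribution by its local dataset size.
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [sum(w[i] * n / total for w, n in zip(client_weights, client_sizes))
            for i in range(dim)]

def prototypes(embeddings, labels):
    # Prototypical Networks: a class prototype is the mean embedding of
    # that class's support examples.
    protos = {}
    for c in set(labels):
        members = [e for e, l in zip(embeddings, labels) if l == c]
        protos[c] = [sum(vals) / len(members) for vals in zip(*members)]
    return protos

def classify(query, protos):
    # Assign a query embedding to the class with the nearest prototype
    # (Euclidean distance).
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(protos, key=lambda c: dist(query, protos[c]))

# Example: two clients with dataset sizes 3 and 1; the larger client
# dominates the aggregate, as in FedAvg.
global_w = fedavg([[0.0, 2.0], [4.0, 2.0]], [3, 1])  # -> [1.0, 2.0]
```

In the setting the abstract describes, each client would run episodic prototypical-network training locally, then send its embedding-network parameters to the server for aggregation; FedProx and FedPer modify the local objective and the set of shared layers, respectively, but the aggregation step above stays the same shape.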
