References:
NAS for Better Federated Learning Models in Different FL Topologies?
(Another direction might also be enticing: using FL to improve NAS.)
In Mi Zhang’s FL benchmark paper (arXiv ’20), FedML: A Research Library and Benchmark for Federated Learning, I found several important features that an FL system can have, such as computing paradigms, topology, exchanged information, and training procedures.
It also mentions FedNAS as one category of FL algorithm (in Section 4.1). However, based on a survey paper about Fed + NAS, I think there can be many different ways to improve FL using NAS. Recalling Song Han’s Once-for-All paper, the idea there is to adopt NAS to find the best model under machine resource constraints. Similarly, we could probably use NAS to find the best model under FL constraints, and these constraints seem more complicated than machine resource constraints alone: FL has multiple constraints (or features?), such as computing paradigms, topology, exchanged information, and training procedures.
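To make the idea concrete, here is a minimal sketch of constraint-aware architecture selection in the spirit of Once-for-All, transplanted to FL budgets. Everything here is hypothetical illustration: the candidate names, the budget numbers, and the use of model size as a stand-in for per-round communication cost are my own assumptions, not anything from the papers above.

```python
# Hypothetical sketch: pick an architecture under FL constraints.
# Assumption: model size (MB) proxies per-round communication cost,
# and FLOPs proxy what the weakest client can train locally.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    params_mb: float   # model size -> bytes exchanged per FL round
    flops_m: float     # local training compute on a client device
    proxy_acc: float   # accuracy estimate (e.g., from a supernet/predictor)

def select_for_fl(candidates, max_params_mb, max_flops_m):
    """Return the highest-proxy-accuracy model within the FL budget,
    or None if no candidate fits."""
    feasible = [c for c in candidates
                if c.params_mb <= max_params_mb and c.flops_m <= max_flops_m]
    if not feasible:
        return None
    return max(feasible, key=lambda c: c.proxy_acc)

# Toy search space (numbers are made up for illustration)
pool = [
    Candidate("wide",   params_mb=44.0, flops_m=600, proxy_acc=0.92),
    Candidate("medium", params_mb=12.0, flops_m=220, proxy_acc=0.89),
    Candidate("tiny",   params_mb=2.5,  flops_m=60,  proxy_acc=0.83),
]

# e.g., cross-device FL with tight communication and compute budgets
best = select_for_fl(pool, max_params_mb=15.0, max_flops_m=300)
print(best.name)  # -> medium
```

A real FedNAS-style system would, of course, search rather than filter a fixed pool, and the constraints above could be extended with topology- or privacy-derived terms (e.g., number of hops to the aggregator, allowed exchanged information), which is exactly what makes the FL setting richer than single-device deployment.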
So, tracking down this direction, let’s see whether anything has been done on improving FL models using NAS.
New ideas:
If you could revise the fundamental principles of computer system design to improve security... what would you change?