Wenzhi Fang


Hello and welcome! I am currently a first-year PhD student at Purdue University, majoring in Electrical and Computer Engineering, under the supervision of Prof. Christopher G. Brinton. Prior to that, I obtained my master's degree in Communication and Information Systems at ShanghaiTech University. I had a delightful journey at ShanghaiTech, where I was especially fortunate to be advised by Prof. Yong Zhou and Prof. Yuanming Shi. In addition, from Aug. 2022 to Feb. 2023, I was a research intern in the Optimization for Machine Learning lab at KAUST, led by Prof. Peter Richtárik.

Email: fang375@purdue.edu

Address: Rm. 051, BHEE, West Lafayette, IN, 47906, US.
Phone: +1 (765) 694-5334

News

I have been admitted to Purdue ECE, beginning in Fall 2023.

Research

My research interests include

  • Distributed Optimization.

  • Wireless Communication.

  • Federated Learning.

Preprints

  1. W. Fang, D.-J. Han, and C. G. Brinton, "Submodel Partitioning in Hierarchical Federated Learning: Algorithm Design and Convergence Analysis", accepted to 2024 IEEE ICC. [pdf]

In this project, we propose hierarchical independent submodel training (HIST), a new FL methodology that aims to reduce client-side computation/storage costs and communication load in hierarchical networks.


The key idea behind HIST is a hierarchical version of model partitioning: in each round, we divide the global model into disjoint partitions (or submodels) so that each cell is responsible for training only one partition of the model, reducing client-side computation/storage costs and the overall communication load. The code for HIST, covering training of fully connected neural networks and CNNs on FMNIST and CIFAR-10, is provided [here].
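As a rough illustration of one such round (not the released HIST code), the minimal sketch below treats the global model as a flat parameter vector, stands in for each cell's local training with a single gradient step on a toy quadratic objective, and lets the server reassemble the trained partitions; the function name hist_round, the toy losses, and the dimensions are all illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    dim, num_cells, lr = 12, 3, 0.1
    model = rng.standard_normal(dim)
    targets = [rng.standard_normal(dim) for _ in range(num_cells)]  # toy per-cell data

    def hist_round(model):
        # Split the parameter indices into disjoint partitions (submodels) for this round.
        perm = rng.permutation(dim)
        partitions = np.array_split(perm, num_cells)
        new_model = model.copy()
        for cell, idx in enumerate(partitions):
            # Each cell only downloads, trains, and uploads its own partition, so
            # per-cell computation and communication scale as dim / num_cells.
            sub = model[idx]
            grad = sub - targets[cell][idx]      # gradient of 0.5 * ||sub - target||^2
            new_model[idx] = sub - lr * grad     # one local update on the submodel
        return new_model                         # the server stitches the partitions back

    for _ in range(5):
        model = hist_round(model)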


Model partitioning is achieved by partitioning the hidden neurons of the fully connected layers. For CNNs, we partition only the fully connected layers, while the convolutional layers are shared across cells. In particular, we keep the input and output neurons independent of the partitioning and partition the hidden neurons every two layers. As a result, each submodel contains, on average, 1/N of the parameters of the full model.
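To make the partitioning rule concrete, the following self-contained sketch splits the hidden neurons of a one-hidden-layer fully connected network across N cells while keeping the input and output dimensions in every submodel; the function name partition_mlp, the layer sizes, and the random neuron assignment are illustrative assumptions rather than the released implementation.

    import numpy as np

    def partition_mlp(weights, num_cells, rng):
        """Split an MLP {W1: d_in x h, W2: h x d_out} into disjoint submodels."""
        h = weights["W1"].shape[1]
        # Assign each hidden neuron to exactly one cell; input/output neurons stay shared.
        cell_of_neuron = rng.integers(0, num_cells, size=h)
        submodels = []
        for c in range(num_cells):
            idx = np.where(cell_of_neuron == c)[0]
            submodels.append({
                "W1": weights["W1"][:, idx],   # input layer -> this cell's hidden slice
                "W2": weights["W2"][idx, :],   # this cell's hidden slice -> output layer
                "idx": idx,                    # kept so the server can reassemble later
            })
        return submodels

    rng = np.random.default_rng(0)
    full = {"W1": rng.standard_normal((784, 256)), "W2": rng.standard_normal((256, 10))}
    parts = partition_mlp(full, num_cells=4, rng=rng)
    full_size = sum(w.size for w in full.values())
    for c, sm in enumerate(parts):
        sub_size = sm["W1"].size + sm["W2"].size
        print(f"cell {c}: {sub_size / full_size:.2%} of the full model's parameters")

Running this prints that each of the four submodels holds roughly 25% of the full model's parameters, matching the 1/N average noted above.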

Publications

  1. W. Fang, Z. Yu, Y. Jiang, Y. Shi, C. N. Jones, and Y. Zhou, "Communication-Efficient Stochastic Zeroth-Order Optimization for Federated Learning", IEEE Trans. Signal Process., vol. 70, pp. 5058-5073, 2022. [pdf] [code] [slides]

  2. W. Fang, Y. Jiang, Y. Shi, Y. Zhou, W. Chen, and K. Letaief, "Over-the-Air Computation via Reconfigurable Intelligent Surface", IEEE Trans. Commun., vol. 69, no. 12, pp. 8612-8626, Dec. 2021. [pdf] [code]

  3. W. Fang, Y. Zou, H. Zhu, Y. Shi, and Y. Zhou, "Optimal Receive Beamforming for Over-the-Air Computation", in Proc. IEEE SPAWC, Virtual Conference, Sept. 2021. [pdf] [code]

  4. W. Fang, M. Fu, K. Wang, Y. Shi, and Y. Zhou, "Stochastic Beamforming for Reconfigurable Intelligent Surface Aided Over-the-Air Computation", in Proc. IEEE Globecom, Virtual Conference, Dec. 2020. [pdf]

  5. W. Fang, M. Fu, Y. Shi, and Y. Zhou, "Outage Minimization for Intelligent Reflecting Surface Aided MISO Communication Systems via Stochastic Beamforming", in Proc. IEEE SAM, Virtual Conference, Jun. 2020. [pdf]