FedJAX: Federated learning simulation with JAX


Documentation | Paper

NOTE: FedJAX is not an officially supported Google product. FedJAX is still in the early stages and the API will likely continue to change.

What is FedJAX?

FedJAX is a JAX-based open source library for Federated Learning simulations that emphasizes ease-of-use in research. With its simple primitives for implementing federated learning algorithms, prepackaged datasets, models and algorithms, and fast simulation speed, FedJAX aims to make developing and evaluating federated algorithms faster and easier for researchers. FedJAX works on accelerators (GPU and TPU) without much additional effort. Additional details and benchmarks can be found in our paper.

Installation

You will need a moderately recent version of Python; check the PyPI page for the current version requirement.

First, install JAX. For a CPU-only version:

pip install --upgrade pip
pip install --upgrade jax jaxlib  # CPU-only version

For other devices (e.g. GPU), follow the official JAX installation instructions.

Then, install FedJAX from PyPI:

pip install fedjax

Or, to install the latest development version directly from GitHub:

pip install --upgrade git+https://github.com/google/fedjax.git

Getting Started

Below is a simple example to verify FedJAX is installed correctly.

import fedjax
import jax
import jax.numpy as jnp
import numpy as np

# Federated data: a mapping from client ID to that client's examples.
fd = fedjax.InMemoryFederatedData({
    'a': {
        'x': np.array([1.0, 2.0, 3.0]),
        'y': np.array([2.0, 4.0, 6.0]),
    },
    'b': {
        'x': np.array([4.0]),
        'y': np.array([12.0])
    }
})
# Initial model parameters.
params = jnp.array(0.5)
# Mean squared error.
mse_loss = lambda params, batch: jnp.mean(
    (jnp.dot(batch['x'], params) - batch['y'])**2)
# Loss for clients 'a' and 'b'.
print(f"client a loss = {mse_loss(params, fd.get_client('a').all_examples())}")
print(f"client b loss = {mse_loss(params, fd.get_client('b').all_examples())}")

Tutorial notebooks in the documentation provide an introduction to FedJAX.

You can also take a look at some of the working examples in the repository's examples/ directory.
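
To give a feel for what a full example involves, here is a rough sketch of one round of federated averaging on the packaged EMNIST dataset; treat the exact module paths, signatures, and hyperparameter values as illustrative, since they may differ slightly between FedJAX versions:

import fedjax
import jax

# Packaged federated EMNIST dataset and a matching model.
train_fd, test_fd = fedjax.datasets.emnist.load_data(only_digits=False)
model = fedjax.models.emnist.create_conv_model(only_digits=False)

# Gradient of the model's training loss with respect to the parameters.
grad_fn = fedjax.model_grad(model)

# Federated averaging: SGD on the clients, SGD on the server.
client_optimizer = fedjax.optimizers.sgd(learning_rate=0.1)
server_optimizer = fedjax.optimizers.sgd(learning_rate=1.0)
client_batch_hparams = fedjax.ShuffleRepeatBatchHParams(batch_size=20)
algorithm = fedjax.algorithms.fed_avg.federated_averaging(
    grad_fn, client_optimizer, server_optimizer, client_batch_hparams)

# Server state starts from freshly initialized model parameters.
server_state = algorithm.init(model.init(jax.random.PRNGKey(0)))

# One training round over a small, uniformly sampled set of clients.
client_sampler = fedjax.client_samplers.UniformGetClientSampler(
    fd=train_fd, num_clients=10, seed=0)
clients = client_sampler.sample()
server_state, client_diagnostics = algorithm.apply(server_state, clients)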

Citing FedJAX

To cite this repository:

@article{fedjax2021,
  title={{F}ed{JAX}: Federated learning simulation with {JAX}},
  author={Jae Hun Ro and Ananda Theertha Suresh and Ke Wu},
  journal={arXiv preprint arXiv:2108.02117},
  year={2021}
}
