
parallelGSO

A highly scalable parallel version of Galactic Swarm Optimisation Algorithm

Galactic Swarm Optimization is a state-of-the-art meta-heuristic optimization algorithm inspired by the motion of stars, galaxies, and superclusters interacting with each other under the influence of gravity.
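As a rough illustration of that idea (a minimal sketch for this README, not the implementation in this repository), GSO can be viewed as a two-level particle swarm: independent subswarms ("galaxies") first explore the search space with PSO, and their best solutions then form a superswarm that is optimized again. All names and hyper-parameters below are chosen for illustration only:

import numpy as np

def pso(cost, positions, bounds, iters=100, w=0.7, c1=1.5, c2=1.5):
    # Plain PSO over an initial set of positions; returns (best_position, best_error).
    lo, hi = bounds[:, 0], bounds[:, 1]
    vel = np.zeros_like(positions)
    pbest = positions.copy()
    pbest_err = np.array([cost(p) for p in positions])
    gbest = pbest[np.argmin(pbest_err)].copy()
    for _ in range(iters):
        r1, r2 = np.random.rand(*positions.shape), np.random.rand(*positions.shape)
        vel = w * vel + c1 * r1 * (pbest - positions) + c2 * r2 * (gbest - positions)
        positions = np.clip(positions + vel, lo, hi)
        errs = np.array([cost(p) for p in positions])
        better = errs < pbest_err
        pbest[better], pbest_err[better] = positions[better], errs[better]
        gbest = pbest[np.argmin(pbest_err)].copy()
    return gbest, pbest_err.min()

def gso_sketch(cost, bounds, n_galaxies=4, particles_per_galaxy=25, iters=100):
    bounds = np.asarray(bounds, dtype=float)
    dim = len(bounds)
    # Level 1 (exploration): each galaxy runs an independent PSO.
    galaxy_bests = [
        pso(cost,
            np.random.uniform(bounds[:, 0], bounds[:, 1], size=(particles_per_galaxy, dim)),
            bounds, iters)[0]
        for _ in range(n_galaxies)
    ]
    # Level 2 (exploitation): the galaxy bests form a superswarm optimized again with PSO.
    return pso(cost, np.array(galaxy_bests), bounds, iters)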

Train Artificial Neural Networks quickly without backprop!


Take a look at a detailed introduction to our project - HERE


Installation

We recommend Anaconda. To install Anaconda on Linux, you can use this script

For Anaconda Users -

$ conda env update -f env.yaml
$ conda activate pgso

For pip Users -

$ pip install -r requirements.txt

To run Benchmarks

All the test experiments are in the experiments/tests directory. To rerun the benchmarks, do -

# cd into this project's directory, then
$ cd experiments/tests
$ jupyter notebook

You will then find a number of notebooks containing all the different tests.

To run the main experiments, check - Main Experiments (Performance Tests)

For per-cpu utilization benchmarks check - Per CPU Utilisation Experiments

To run Benchmarks against the functions test suite - Benchmarks

To use PGSO (Parallel Galactic Swarm Optimization) as a module, do -

from pgso.gso import GSO as PGSO
PGSO(
    M=<number of processes to be spawned>,
    bounds=<[[-100, 100], [-100, 100]]>,
    num_particles=<number of particles>,
    max_iter=<maximum number of iterations>,
    costfunc=<n-dimensional cost function>
    )

:returns:
    best_position -> 1-d array of n coordinates, e.g. [x, y] or [x, y, z]
    best_error -> the best (minimized) error for the given function
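
For example, minimizing a simple 2-D sphere function could look like this (a usage sketch assuming PGSO returns the (best_position, best_error) pair described above; the parameter values are placeholders to tune for your problem):

from pgso.gso import GSO as PGSO

# Cost function to minimize: the n-dimensional sphere function, optimum at the origin.
def sphere(x):
    return sum(xi ** 2 for xi in x)

best_position, best_error = PGSO(
    M=4,                                # number of processes to spawn
    bounds=[[-100, 100], [-100, 100]],  # one [min, max] pair per dimension
    num_particles=50,
    max_iter=100,
    costfunc=sphere,
    )
print(best_position, best_error)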

Training Artificial Neural Networks

Check out ANN vs PGSO. This directory contains tutorial notebooks for you to get started.
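
The general idea in those notebooks is to treat all of the network's weights as a single particle position and the training loss as the cost function, so the swarm minimizes the loss directly instead of backpropagation computing gradients. A minimal sketch of that setup (a tiny one-hidden-layer network on XOR; everything below is illustrative and assumes the PGSO signature shown earlier, it is not the notebooks' code):

import numpy as np
from pgso.gso import GSO as PGSO

# Toy data: the XOR problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

H = 4                           # hidden units
n_weights = 2 * H + H + H + 1   # W1 (2xH) + b1 (H) + W2 (H) + b2 (1)

def unpack(flat):
    # Slice a flat weight vector back into layer matrices/vectors.
    W1 = flat[:2 * H].reshape(2, H)
    b1 = flat[2 * H:3 * H]
    W2 = flat[3 * H:4 * H]
    b2 = flat[4 * H]
    return W1, b1, W2, b2

def loss(flat):
    # Mean squared error of the network, used as the cost function for the swarm.
    W1, b1, W2, b2 = unpack(np.asarray(flat))
    hidden = np.tanh(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(hidden @ W2 + b2)))
    return float(np.mean((out - y) ** 2))

best_weights, best_error = PGSO(
    M=4,
    bounds=[[-5, 5]] * n_weights,   # one [min, max] pair per weight
    num_particles=60,
    max_iter=200,
    costfunc=loss,
    )
print("training MSE:", best_error)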
