
Releases: Meta-optimization/L2L

Version 2.0.0-beta

30 Oct 11:57
aaeeea0


What's Changed

  • This release brings a new, more flexible runner that enables better parallelization on HPC systems and removes the dependency on JUBE.
  • Added the new Ensemble Kalman Filter optimizer used in: Yegenoglu, A., Krajsek, K., Pier, S.D. and Herty, M., 2020, July. Ensemble Kalman filter optimizing deep neural networks: an alternative approach to non-performing gradient descent. In International Conference on Machine Learning, Optimization, and Data Science (pp. 78-92). Cham: Springer International Publishing.
  • Added a new wrapper for multi-optimizers. This allows optimizees to be deployed on highly parallel hardware and accelerators such as GPUs, as well as on neuromorphic and quantum computing systems, taking full advantage of all available resources.
  • Enhanced test suite.
  • Updated documentation.
  • L2L is now pip-installable as L2LforHPC.
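For readers unfamiliar with the Ensemble Kalman Filter optimizer mentioned above, the core update step can be sketched as follows. This is a minimal, generic EnKF update in NumPy, not the L2L implementation; function and parameter names here (`enkf_update`, `noise`) are illustrative assumptions, and details such as covariance regularization differ in the actual optimizer.

```python
import numpy as np

def enkf_update(ensemble, model_outputs, targets, noise=1e-3, rng=None):
    """One generic Ensemble Kalman Filter update step (illustrative sketch).

    ensemble:      (n_members, n_params) candidate parameter vectors
    model_outputs: (n_members, n_obs) predictions of each ensemble member
    targets:       (n_obs,) desired outputs (the "observations")
    noise:         observation-noise variance (scalar, assumed isotropic)
    """
    rng = np.random.default_rng() if rng is None else rng
    n = ensemble.shape[0]

    # Anomalies (deviations from the ensemble mean)
    X = ensemble - ensemble.mean(axis=0)           # parameter anomalies
    Y = model_outputs - model_outputs.mean(axis=0) # output anomalies

    # Sample covariances estimated from the ensemble
    C_xy = X.T @ Y / (n - 1)                       # params-vs-outputs
    C_yy = Y.T @ Y / (n - 1)                       # outputs-vs-outputs
    R = noise * np.eye(C_yy.shape[0])              # observation noise

    # Kalman gain maps output-space mismatch to parameter-space correction
    K = C_xy @ np.linalg.inv(C_yy + R)

    # Perturbed observations preserve ensemble spread after the update
    perturbed = targets + rng.normal(0.0, np.sqrt(noise),
                                     size=model_outputs.shape)
    return ensemble + (perturbed - model_outputs) @ K.T
```

Because the update uses only ensemble statistics and no gradients, each member's forward pass can be evaluated independently, which is what makes the method attractive for the parallel runner introduced in this release.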

New Contributors
@sdiazpier made their first contribution in #21
@alperyeg made their first contribution in #25
@brenthuisman made their first contribution in #44
@dagush made their first contribution in #67
@neich made their first contribution in #72
@WASAB95 made their first contribution in #84
@HannaMohr made their first contribution in #94
@JoWilhelm made their first contribution in #101
@DeLaVlag made their first contribution in #124