
Release 3.3.0

@drasmuss drasmuss released this 14 Aug 20:03

Compatible with Nengo 3.0.0

Compatible with TensorFlow 2.2.0 - 2.3.0

Added

  • Added support for new Nengo core NeuronType state implementation. (#159)
  • Compatible with TensorFlow 2.3.0. (#159)
  • Added support for nengo.Tanh, nengo.RegularSpiking, nengo.StochasticSpiking, and nengo.PoissonSpiking neuron types. (#159)
  • Added nengo_dl.configure_settings(learning_phase=True/False) configuration option. This mimics the previous behaviour of tf.keras.backend.learning_phase_scope (which was deprecated by TensorFlow). That is, if you would like to override the default behaviour so that, e.g., sim.predict runs in training mode, set nengo_dl.configure_settings(learning_phase=True). (#163)
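The new `learning_phase` option can be sketched as follows (the network contents here are hypothetical; the key line is the `configure_settings` call, which must appear inside the top-level Network's `with` block):

```python
import nengo
import nengo_dl

with nengo.Network() as net:
    # run all simulator functions (including sim.predict) in training mode,
    # overriding the default inference-mode behaviour
    nengo_dl.configure_settings(learning_phase=True)

    # hypothetical network contents
    ens = nengo.Ensemble(10, 1)
    probe = nengo.Probe(ens)
```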

Changed

  • Simulator.evaluate no longer prints any information to stdout in TensorFlow 2.2 in graph mode (due to a TensorFlow issue, see tensorflow/tensorflow#39456). Loss/metric values will still be returned from the function as normal. (#153)
  • A warning will now be raised if activation types are passed to Converter.swap_activations that aren't actually in the model. (#168)
  • Updated the TensorFlow installation instructions in the documentation. (#170)
  • NengoDL will now use TensorFlow's eager mode by default. The previous graph-mode behaviour can be restored by calling tf.compat.v1.disable_eager_execution(), but we cannot guarantee that that behaviour will be supported in the future. (#163)
  • NengoDL will now use TensorFlow's "control flow v2" by default. The previous behaviour can be restored by calling tf.compat.v1.disable_control_flow_v2(), but we cannot guarantee that that behaviour will be supported in the future. (#163)
  • NengoDL will now default to allowing TensorFlow's "soft placement" logic, meaning that even if you specify an explicit device like "/gpu:0", TensorFlow may not allocate an op to that device if there isn't a compatible implementation available. The previous behaviour can be restored by calling tf.config.set_soft_device_placement(False). (#163)
  • Internal NengoDL OpBuilder classes now separate the "pre build" stage from OpBuilder.__init__ (so that the same OpBuilder instance can be re-used across multiple calls, rather than instantiating a new OpBuilder each time). Note that this has no impact on front-end users; it is only relevant to anyone who has implemented a custom build class. The logic that would previously have gone in OpBuilder.__init__ should now go in OpBuilder.build_pre. In addition, the ops argument has been removed from OpBuilder.build_pre; it will be passed to OpBuilder.__init__ instead (and will be available in build_pre as self.ops). Similarly, the ops and config arguments have been removed from build_post, and can instead be accessed through self.ops/self.config. (#163)
  • Minimum TensorFlow version is now 2.2.0. (#163)
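The three TensorFlow defaults above (eager mode, control flow v2, and soft device placement) can each be reverted with standard TensorFlow calls, sketched below. Note, as stated above, we cannot guarantee that the reverted behaviours will remain supported:

```python
import tensorflow as tf

# restore graph mode (instead of the new eager-mode default)
tf.compat.v1.disable_eager_execution()

# restore the previous "control flow v1" behaviour
tf.compat.v1.disable_control_flow_v2()

# disallow soft device placement, so TensorFlow raises an error if an op
# cannot be placed on an explicitly requested device (e.g. "/gpu:0")
tf.config.set_soft_device_placement(False)
```

These calls should be made at the start of a script, before constructing any Simulators.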

Fixed

  • Added support for Sparse transforms in Simulator.get_nengo_params. (#149)
  • Fixed a bug in the TensorGraph log message when logging was enabled. (#151)
  • Updated the KerasWrapper class in the tensorflow-models example to fix a compatibility issue in TensorFlow 2.2. (#153)
  • Fixed handling of Nodes that are not connected to anything else but are probed (this only occurs in Nengo>=3.1.0). (#159)
  • Added more robust support for converting nested Keras models in TensorFlow 2.3. (#161)
  • Fixed a bug when probing slices of certain probeable attributes (those that directly target a Signal in the model). (#164)

Removed

  • Removed nengo_dl.utils.print_op (use tf.print instead). (#163)
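As a migration sketch, tf.print can be dropped into a traced function to inspect tensor values; it works in both eager and graph mode (the function below is purely illustrative):

```python
import tensorflow as tf

@tf.function
def debug_step(x):
    # tf.print replaces the removed nengo_dl.utils.print_op;
    # by default it writes to stderr at trace execution time
    tf.print("intermediate value:", x)
    return x * 2
```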