
Releases: google/TensorNetwork

Removal of the TensorNetwork class

31 Oct 00:09
6e20d13

The TensorNetwork class has been REMOVED and will raise an error if you try to use it. To upgrade your code, please see our simple upgrade guide.

  • Added tn.FiniteMPS class for your standard MPS applications.
  • Added tn.reduce_density method for creating reduced density matrices.
  • Added ignore_edge_order argument to the contractor methods.
  • Added tn.split_edge method.
  • Added support for contracting subgraphs of a network. You can now use the tn.contractors methods to contract a subset of your nodes.
  • Added support for copying a subgraph of the network. When an edge is non-dangling but only one of its nodes is in the given set of nodes, the "copied" edge becomes a dangling edge. This is useful for calculating things like environments and normalizations.
  • Added a tn.NodeCollection context manager. This makes it easy to collect all of the nodes you create into a single list or set, as sketched below.
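
A minimal sketch of the new context manager, assuming the numpy backend and a hypothetical two-node toy network:

```python
import numpy as np
import tensornetwork as tn

# Every node created inside the block is appended to `nodes` automatically.
nodes = []
with tn.NodeCollection(nodes):
    a = tn.Node(np.random.rand(2, 3))
    b = tn.Node(np.random.rand(3, 2))
    a[1] ^ b[0]  # connect the shared axis

print(len(nodes))  # -> 2
```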

Free Node Paradigm

03 Oct 20:28

We have switched to using a "Free Node" paradigm. Examples of what this looks like can be found on our latest README.

Users have told us that the TensorNetwork object is more cumbersome than helpful, so we will remove it entirely in the next release. This release is meant to help users upgrade their code. This is a BREAKING CHANGE. We have made a simple tutorial to help you upgrade your code.

  • The preferred way to import tensornetwork is now import tensornetwork as tn.
  • Deprecated the TensorNetwork object. Prefer creating your nodes directly with tn.Node. All operations such as net.split_node can now be called as tn.split_node.
  • Added the tn.reachable method to get the set of all nodes reachable from a given node or set of nodes. This can usually serve as a drop-in replacement wherever you previously had to pass a TensorNetwork object, for example to the contractors (see the sketch after this list).
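
A minimal sketch of the free-node style, assuming the numpy backend and a hypothetical two-node network:

```python
import numpy as np
import tensornetwork as tn

# Nodes are free-standing; no TensorNetwork object is needed.
a = tn.Node(np.ones((2, 2)))
b = tn.Node(np.ones((2, 2)))
a[1] ^ b[0]  # connect edges directly instead of net.connect

# tn.reachable replaces passing a TensorNetwork object around.
nodes = tn.reachable(a)  # the set {a, b}
result = tn.contractors.greedy(nodes, output_edge_order=[a[0], b[1]])
print(result.tensor)  # the 2x2 contracted result
```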

Change Default Backend to Numpy

12 Sep 22:12
25ef3d6

We have changed the default backend to numpy. This is a BREAKING CHANGE. To keep the old behavior, simply add tensornetwork.set_default_backend("tensorflow") at the top of your main file, as in the snippet below.
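
A minimal sketch of the fix, assuming your code depends on the TensorFlow backend:

```python
import tensornetwork

# Restore the old default once, at the top of your main file; every
# network created afterwards will use the TensorFlow backend.
tensornetwork.set_default_backend("tensorflow")
```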

  • Slice notation has been added for accessing a node's edges. Simply do node[:3] to get the edges for the first 3 axes of a node (see the sketch after this list).
  • Contraction algorithms for optimal, branch and greedy now require an output_edge_order when there is more than one dangling edge in the network. This is to prevent a user from accidentally depending on a non-deterministic edge order after the network contraction.
  • CopyNodes can now be created outside of a TensorNetwork, so net.add_node(tensornetwork.CopyNode(...)) is now the preferred way to add a copy node to a network. We will be removing net.add_copy_node in the future.
  • Added net.split_node_qr and net.split_node_rq methods that perform QR and RQ decompositions of a Node, respectively.
  • Added a net.copy operation that copies a TensorNetwork. The copied nodes keep the same tensor objects as the originals, which makes taking gradients of nodes with respect to the final contracted value much easier.
  • Added a conj option, net.copy(conj=True), which copies the TensorNetwork and conjugates all of the tensors in the network. This is useful for calculating things like reduced density matrices.
  • Added net.save(..) and tensornetwork.load(...) methods for saving and loading a TensorNetwork object.
  • Removed support for Python 3.5. If this is a blocker for you, please raise a GitHub issue to re-add support.
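
A minimal sketch of the edge slicing and the QR split, using the (since removed) TensorNetwork object API and a hypothetical random node:

```python
import numpy as np
import tensornetwork

net = tensornetwork.TensorNetwork()  # numpy is now the default backend
a = net.add_node(np.random.rand(2, 3, 4, 5))

print(a[:3])  # the edges for the first 3 axes of the node

# QR-split the node across a chosen bipartition of its edges.
q, r = net.split_node_qr(a, left_edges=a[:2], right_edges=a[2:])
print(q.shape, r.shape)  # shapes of the Q and R factors
```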

Opt_einsum contraction algorithms

07 Aug 20:27
01cbe6c

Added integration with opt_einsum's deterministic contraction algorithms. See https://optimized-einsum.readthedocs.io/en/latest/path_finding.html for details on how these algorithms work; a usage sketch follows the list below.

  • Added contractors.optimal to find the optimal contraction based on required flops.
  • Added contractors.branch, which uses branch heuristics to determine which paths to explore.
  • Added contractors.greedy, which uses a fast greedy heuristic.
  • Added contractors.auto, which chooses which of the above algorithms to use based on network size.
  • Added contractors.custom, which allows users to develop their own contraction algorithms.
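
A minimal usage sketch on a hypothetical three-node matrix chain, written with the later free-node API for brevity; optimal exhaustively searches contraction orders, while greedy picks the cheapest pairwise contraction at each step:

```python
import numpy as np
import tensornetwork as tn
from tensornetwork import contractors

# A small matrix chain: (5x10) . (10x20) . (20x5).
a = tn.Node(np.random.rand(5, 10))
b = tn.Node(np.random.rand(10, 20))
c = tn.Node(np.random.rand(20, 5))
a[1] ^ b[0]
b[1] ^ c[0]

# optimal, branch, greedy and auto all share this call signature.
result = contractors.auto({a, b, c}, output_edge_order=[a[0], c[1]])
print(result.tensor.shape)  # (5, 5)
```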

PyTorch Support!

30 Jul 18:03
65f73d6
  • Added PyTorch support. Simply do tensornetwork.set_default_backend("pytorch") or TensorNetwork(backend="pytorch") to enable it (see the sketch below).
  • Added edge1 ^ edge2 as an alias for net.connect(edge1, edge2).
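
A minimal sketch, assuming PyTorch is installed and that edges are fetched by node indexing (a[1]):

```python
import torch
import tensornetwork

tensornetwork.set_default_backend("pytorch")           # global default
net = tensornetwork.TensorNetwork(backend="pytorch")   # or per-network

a = net.add_node(torch.ones(2, 3))
b = net.add_node(torch.ones(3, 2))
edge = a[1] ^ b[0]  # the new alias for net.connect(a[1], b[0])
```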

0.0.5

15 Jul 21:59
b5de2d5
  • Added the net.switch_backend method.
  • Fixed an SVD bug when the backend is either numpy or jax.

TensorNetwork 0.0.4 release

21 Jun 23:28
73088ee
  • Added the greedy contraction algorithm. This will greedily contract the lowest cost node pair first.
  • Added the bucket contraction algorithm. This algorithm is optimized for tensor networks with a lot of copy tensors.
  • Upgraded the naive contraction algorithm. Now it should work even after some edges have been contracted.
  • Added the @ operator. Doing node1 @ node2 is equivalent to running net.contract_between(node1, node2) (see the sketch after this list).
  • Added graphviz visualization integration. Simply do tensornetwork.to_graphviz(net) to get a graphviz object that is isomorphic to your network.
  • Added net.remove_node(node) method.
  • Added node.shape and edge.dimension properties.
  • Improved TF 2.0 beta compatibility.
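
A minimal sketch of the @ operator and the graphviz export, assuming the numpy backend is available and the graphviz Python package is installed:

```python
import numpy as np
import tensornetwork

net = tensornetwork.TensorNetwork(backend="numpy")
a = net.add_node(np.random.rand(2, 3))
b = net.add_node(np.random.rand(3, 2))
net.connect(a[1], b[0])

print(tensornetwork.to_graphviz(net))  # graphviz object mirroring the network

c = a @ b  # same as net.contract_between(a, b)
print(c.shape)  # (2, 2)
```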

Multi backend support

23 May 21:23
ac0c770
  • Added support for JAX and NumPy backends.
  • Added a new tensordot2 that compiles ~20% faster than tf.tensordot.