Group Equivariant Convolutional Networks
Created by: chrisrothUT
Hi everybody,
I now have a working implementation of G-CNNs that accepts a generic SymmGroup. GCNN is used exactly like RBMSymm. Here is a 4-layer GCNN on the Kagome lattice:
```python
import numpy as np
import netket as nk

# Kagome lattice: triangular Bravais lattice with a three-site basis
graph = nk.graph.Lattice(
    basis_vectors=[[1.0, 0.0], [-1.0 / 2.0, np.sqrt(3) / 2]],
    extent=[2, 2],
    atoms_coord=[[0.0, 0.0], [-1.0 / 4.0, np.sqrt(3) / 4.0], [1.0 / 2.0, 0.0]],
)

ma = nk.models.GCNN(
    symmetries=graph,
    layers=4,
    features=4,
    dtype=complex,
)
```
And we're off! An N-layer model consists of one DenseSymm layer followed by N-1 DenseEquivariant layers, which convolve over the entire set of group elements (poses) based on their relative symmetry operations. Like RBMSymm, the model produces identical outputs for every automorphism of the lattice, and I've included a test similar to the RBMSymm one.
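For concreteness, a minimal sketch of such an invariance check might look like the following. It assumes the standard Flax `init`/`apply` interface of NetKet models and that the automorphism group can be converted to an array of site permutations (e.g. via `np.asarray`); that conversion is an assumption, not something specified in this PR.

```python
import jax
import numpy as np
import netket as nk

# Same Kagome lattice and model as above
graph = nk.graph.Lattice(
    basis_vectors=[[1.0, 0.0], [-1.0 / 2.0, np.sqrt(3) / 2]],
    extent=[2, 2],
    atoms_coord=[[0.0, 0.0], [-1.0 / 4.0, np.sqrt(3) / 4.0], [1.0 / 2.0, 0.0]],
)
hi = nk.hilbert.Spin(s=1 / 2, N=graph.n_nodes)
ma = nk.models.GCNN(symmetries=graph, layers=4, features=4, dtype=complex)

# Draw a batch of random spin configurations and initialize the parameters
key = jax.random.PRNGKey(0)
samples = hi.random_state(key, 8)
params = ma.init(key, samples)

# Permutation representation of the lattice automorphisms,
# assumed shape (n_symmetries, n_sites)
perms = np.asarray(graph.automorphisms())

out = ma.apply(params, samples)
for p in perms:
    # Acting with any lattice automorphism on the input
    # should leave the log-amplitudes unchanged
    np.testing.assert_allclose(ma.apply(params, samples[:, p]), out, atol=1e-6)
```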
Questions
- What would be a good test for the DenseEquivariant layer?
- Can we figure out a way to call reshape() only once in DenseEquivariant (instead of twice)?
- What are good defaults for initialization?
Future
- Implement restricted convolutions. This requires information about the symmetry group, not just a randomly ordered list of automorphisms.
- In tests on a variety of lattices, we tend to get caught in local minima on frustrated ones, so we should think about adding an annealing method (e.g. optimizing the phases first) to VMC (obviously beyond the scope of this PR).