
WIP: Extract stochastic reconfiguration code from VMC class

Vicentini Filippo requested to merge refactor-vmc into v2.0

Created by: femtobit

The main commit in this PR extracts the SR code from VariationalMonteCarlo to a separate class. So far, the interface of the VMC class is left unchanged. However, this could be a good point to discuss whether it should stay this way for v2.0.

Currently, the method to update the variational parameters is configurable in two ways:

  1. First, the method used to compute the parameter update dp. This can be either Gd or Sr and is selected via several VMC constructor parameters. The actual computation happens in the Gradient and UpdateParameters methods of VariationalMonteCarlo.
  2. Second, the method used to apply dp, which is determined by the AbstractOptimizer passed to VMC. In the simplest case (the Sgd class), the update is of the form p(t+1) = p(t) + a(t) * dp, where a(t) is the current learning rate.
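To make the split between the two steps concrete, here is a minimal sketch of the Sgd-style update from point 2, assuming dp has already been computed by the method from point 1. The function name and signature are illustrative only, not NetKet's actual API:

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Sketch of the Sgd-style update: p(t+1) = p(t) + a(t) * dp,
// where dp is the parameter update from step 1 and a(t) is the
// current learning rate. Hypothetical name, not NetKet's API.
std::vector<double> SgdUpdate(const std::vector<double>& p,
                              const std::vector<double>& dp,
                              double learning_rate) {
  std::vector<double> next(p.size());
  for (std::size_t i = 0; i < p.size(); ++i) {
    next[i] = p[i] + learning_rate * dp[i];
  }
  return next;
}
```

In this picture, swapping the optimizer only changes how dp is applied, while the computation of dp itself stays inside VariationalMonteCarlo.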

Part 2 is under the users' control, since they can pass in a custom optimizer. Part 1 is currently less configurable. We could discuss whether to make it extendable as well by providing a suitable base class with subclasses for the current SR and GD methods. What do you think?
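One possible shape for that extension point is sketched below: an abstract base class for the dp-computation step, with one subclass per method (GD shown here; an SR subclass would solve the S-matrix linear system instead). All names here are hypothetical, chosen for illustration, not part of NetKet:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical base class for the "compute dp" step (point 1 above).
// Subclasses would implement plain gradient descent and SR.
class AbstractGradientMethod {
 public:
  virtual ~AbstractGradientMethod() = default;
  // Computes the parameter update dp from the sampled energy gradient.
  virtual std::vector<double> ComputeUpdate(
      const std::vector<double>& energy_gradient) = 0;
};

// Plain gradient descent: dp is just the negated energy gradient,
// so that p + a * dp moves downhill in energy.
class GdMethod final : public AbstractGradientMethod {
 public:
  std::vector<double> ComputeUpdate(
      const std::vector<double>& energy_gradient) override {
    std::vector<double> dp(energy_gradient.size());
    for (std::size_t i = 0; i < dp.size(); ++i) {
      dp[i] = -energy_gradient[i];
    }
    return dp;
  }
};
```

With something like this, VMC would hold an AbstractGradientMethod alongside the AbstractOptimizer, making both steps of the update user-configurable in the same way.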

(In any case, I think extracting the SR part from the VMC code makes it easier to understand.)
