Fix Squared Operator and add tests + super-simplify LdagL logic

Vicentini Filippo requested to merge pv/squared into master

Created by: PhilipVinc

I recently noticed that the lazy SquaredOperator constructed by doing ha@ha is not being tested for its expect_and_grad.
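
As a rough sketch of the kind of test that was missing (assumed hilbert space, model and sample count; not the actual test added by this PR), one would build the lazy squared operator and ask the variational state for its expectation value and gradient:

```python
import netket as nk

hi = nk.hilbert.Spin(s=1 / 2, N=4)
ha = nk.operator.Ising(hilbert=hi, graph=nk.graph.Chain(4), h=1.0)
ha2 = ha @ ha  # per the description above, this builds the lazy SquaredOperator

vs = nk.vqs.MCState(
    nk.sampler.MetropolisLocal(hi),
    # complex parameters to exercise the C->C code path
    # (the kwarg is called dtype in older NetKet versions)
    nk.models.RBM(param_dtype=complex),
    n_samples=512,
)

energy, grad = vs.expect_and_grad(ha2)  # the code path that was not being tested
```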

Adding such a test, I noticed that:

  • you could not do LazyOperator@densevector, which is used for testing (fixed and merged in another PR);
  • the gradient is off by a factor of 2 for real parameters (or the finite-difference computation is wrong) (fixed and merged in another PR);
  • for complex-valued C->C ansätze the test does not pass (this is the same code path as for non-Hermitian operators, so maybe it's related, or maybe the finite difference is wrong?).

This PR fixes point 3 (the other two have been fixed in other PRs today). It seems that the C->C gradient is also off by a factor of 2: this time I have to divide the complex result by 2. Why? I don't fully understand it yet. It is possibly related to jax.vjp and the way it handles non-holomorphic functions, but I'll try to check again.

If you compute on paper the gradient returned by jax.vjp for a complex-valued ansatz using the formula given here, you see that you get twice the expected gradient when the parameters are complex.
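
As a toy illustration of where such a factor of 2 can come from with complex parameters (not the actual derivation for the ansatz gradient), compare what jax.vjp returns for a simple real-valued, non-holomorphic function against its Wirtinger derivatives:

```python
import jax
import jax.numpy as jnp

def loss(z):
    # real-valued, non-holomorphic function of one complex parameter
    return jnp.abs(z) ** 2  # = x**2 + y**2 for z = x + i*y

z0 = 3.0 + 4.0j

# Wirtinger derivatives of |z|^2, by hand:
#   d/dz |z|^2 = conj(z),   d/dconj(z) |z|^2 = z   (both have modulus |z|)
print("Wirtinger modulus:", jnp.abs(z0))

out, vjp_fun = jax.vjp(loss, z0)
(g,) = vjp_fun(jnp.ones_like(out))
print("vjp modulus      :", jnp.abs(g))  # twice the Wirtinger modulus
```

Whether this convention mismatch is exactly what bites in expect_and_grad still needs to be confirmed, as said above.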

--

Added 25/2/2022

With this logic the Squared gradient is now much more efficient, and it supports arbitrary wavefunctions/density matrices, even non-holomorphic ones. There is therefore no longer any need to keep the special code for Lindblad/density matrices, so I deleted it.
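
For concreteness, a sketch of what the LdagL gradient now looks like with the generic logic (assumed model and sampler choices; names may differ slightly between NetKet versions):

```python
import netket as nk

hi = nk.hilbert.Spin(s=1 / 2, N=3)
ha = nk.operator.Ising(hilbert=hi, graph=nk.graph.Chain(3), h=1.0)
jump_ops = [nk.operator.spin.sigmam(hi, i) for i in range(3)]
lind = nk.operator.LocalLiouvillian(ha, jump_ops)

vs = nk.vqs.MCMixedState(
    nk.sampler.MetropolisLocal(lind.hilbert),  # sampler on the doubled space
    nk.models.NDM(),                           # neural density matrix
    n_samples=512,
)

# LdagL goes through the same generic Squared expect_and_grad path
# as any other squared operator; no Lindblad-specific gradient code.
ldagl, grad = vs.expect_and_grad(lind.H @ lind)
```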

I also added tests checking that the gradient computed for LdagL agrees with the finite-difference gradient (before, this was not checked; I was only verifying that it converged).
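
For reference, a minimal sketch (an assumed helper, not the test code of this PR) of a central finite-difference gradient that handles complex parameters by perturbing real and imaginary parts separately:

```python
import jax.numpy as jnp
from jax.flatten_util import ravel_pytree

def fd_grad(loss, params, eps=1e-5):
    # Central finite differences of a real-valued scalar loss.
    # For complex parameters, real and imaginary directions are perturbed
    # separately, giving df/dx + i*df/dy per parameter.
    flat, unravel = ravel_pytree(params)
    f = lambda v: loss(unravel(v))
    grads = []
    for k in range(flat.size):
        e = jnp.zeros_like(flat).at[k].set(1.0)
        g = (f(flat + eps * e) - f(flat - eps * e)) / (2 * eps)
        if jnp.iscomplexobj(flat):
            g = g + 1j * (f(flat + 1j * eps * e) - f(flat - 1j * eps * e)) / (2 * eps)
        grads.append(g)
    return unravel(jnp.asarray(grads))
```

The output can then be compared entry-wise against the gradient returned by expect_and_grad, modulo the convention factor discussed above.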
