- Apr 25, 2022
-
-
Damian Hofmann authored
-
Filippo Vicentini authored
-
Damian Hofmann authored
-
- Apr 19, 2022
-
-
Damian Hofmann authored
-
- Apr 13, 2022
-
-
dependabot[bot] authored
Bumps [sphinx-autodoc-typehints](https://github.com/tox-dev/sphinx-autodoc-typehints) from 1.17.0 to 1.18.0.
- [Release notes](https://github.com/tox-dev/sphinx-autodoc-typehints/releases)
- [Changelog](https://github.com/tox-dev/sphinx-autodoc-typehints/blob/main/CHANGELOG.md)
- [Commits](https://github.com/tox-dev/sphinx-autodoc-typehints/compare/1.17.0...1.18.0)

---
updated-dependencies:
- dependency-name: sphinx-autodoc-typehints
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
-
- Apr 12, 2022
-
-
Filippo Vicentini authored
-
Filippo Vicentini authored
If you run Examples/Dynamics/Ising1d.py with the RBMModPhase ansatz, which has real parameters but complex output, TDVP breaks down because the gradient is complex while we need to take its real part.
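A minimal sketch of the kind of projection involved, under the assumption of a generic parameter pytree (hypothetical helper, not the actual TDVP fix): for leaves with real parameters, only the real part of a complex gradient is a valid update direction.

```python
import jax
import jax.numpy as jnp

def project_gradient(grad, params):
    # Keep only the real part of the gradient for real-parameter leaves;
    # complex-parameter leaves keep the full complex gradient.
    return jax.tree_util.tree_map(
        lambda g, p: g.real if jnp.isrealobj(p) else g, grad, params
    )
```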
-
- Apr 11, 2022
-
-
Clemens Giuliani authored
* fix unwanted promotion in dense and pytree

Fixes unwanted promotion to double precision of the Oks when working with single-precision parameters.
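As a generic illustration of the kind of unwanted promotion being avoided (not the QGT code itself): with 64-bit mode enabled, mixing a float32 array with a strongly typed float64 NumPy scalar silently promotes the result.

```python
import jax
import jax.numpy as jnp
import numpy as np

jax.config.update("jax_enable_x64", True)  # double precision available

params = jnp.ones(4, dtype=jnp.float32)              # single-precision parameters
shift = np.asarray(0.01)                             # NumPy scalar: strongly typed float64
print((params + shift).dtype)                        # float64 -- silent promotion
print((params + shift.astype(params.dtype)).dtype)   # float32 -- promotion avoided
```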
-
Filippo Vicentini authored
I think this change got lost when @femtobit rebased my driver implementation for TDVP, so here it is. This actually makes the TDVP driver log the times (before, it was not working). It's also useful to log other quantities sometimes. We should document it somewhere, but I'll leave the documentation effort for later.
-
Filippo Vicentini authored
Add a shape check. Fixes #1157.
-
- Apr 08, 2022
-
-
Dian Wu authored
* Change tree_multimap to tree_map
* Also replace it in the tests
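For reference, `jax.tree_util.tree_map` already accepts multiple pytrees with the same structure, so the deprecated `tree_multimap` calls can be swapped one-for-one (a generic illustration, not the repository's diff):

```python
import jax
import jax.numpy as jnp

tree_a = {"w": jnp.ones(3), "b": jnp.zeros(3)}
tree_b = {"w": jnp.full(3, 2.0), "b": jnp.ones(3)}

# Before: jax.tree_util.tree_multimap(lambda a, b: a + b, tree_a, tree_b)
summed = jax.tree_util.tree_map(lambda a, b: a + b, tree_a, tree_b)
```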
-
- Apr 06, 2022
-
-
Filippo Vicentini authored
* update changelog for release
* update slack invite link
-
- Apr 05, 2022
-
- Apr 01, 2022
-
-
Attila Szabó authored
Closes #1133, the issue with GCNNs described in "[RFC] [bug] logsumexp in GCNNs". Adds `nk.nn.logsumexp_cplx`, which wraps the JAX logsumexp but always returns complex results: it handles the logs of negative reals correctly within complex arithmetic, without promoting all real inputs to complex (which might be a memory bottleneck for some applications). The only thing left to settle is whether the new `ensure_cplx` flag should default to True or False. True is a minimally breaking change (hitherto real outputs will come with identically zero imaginary parts), but it removes a silent error we had until now, so I would go for it.
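A minimal sketch of the idea, using the hypothetical name `logsumexp_cplx_sketch` (the actual `nk.nn.logsumexp_cplx` may differ): for real inputs, use `return_sign=True` so that negative sums pick up an iπ term instead of producing NaN, without ever materializing a complex copy of the inputs.

```python
import jax.numpy as jnp
from jax.scipy.special import logsumexp

def logsumexp_cplx_sketch(a, axis=None, b=None):
    """log(sum(b * exp(a))) with a complex-valued result for real inputs."""
    if jnp.iscomplexobj(a) or (b is not None and jnp.iscomplexobj(b)):
        # Complex inputs are already handled correctly by plain logsumexp.
        return logsumexp(a, axis=axis, b=b)
    # Work with |sum| and its sign to avoid promoting the inputs to complex;
    # a negative sum contributes an imaginary part of pi to the logarithm.
    log_abs, sign = logsumexp(a, axis=axis, b=b, return_sign=True)
    return log_abs + jnp.where(sign < 0, jnp.pi * 1j, 0.0 * 1j)
```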
-
Damian Hofmann authored
This PR introduces a flag to compute the split-R̂ diagnostic instead of the plain R̂ currently used. Split-R̂ is able to additionally detect non-stationarity within single chains. See, e.g., Vehtari et al. (arXiv:1903.08008) for a modern discussion of this diagnostic (that paper points out some failure modes that split-R̂ does not cover either and proposes an improved version, but let's take one step at a time here). Here is a very simple example featuring two (identical) linearly increasing, and thus non-stationary, chains:

```python
# stats_example.py
import netket as nk
import numpy as np

r = np.arange(100, dtype=float)
x = np.array([r, r])
print(nk.stats.statistics(x))
```

With split-R̂ enabled (via the NETKET_USE_SPLIT_RHAT flag added in this PR), the non-split R̂ is happy (R̂ ≈ 1.0) because the chains do have identical mean and variance, whereas the split-chain version correctly identifies the failed MC convergence (R̂ >> 1.01).
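For illustration only, a self-contained sketch of the split-chain version of the Gelman–Rubin diagnostic (hypothetical helper, not NetKet's implementation): each chain is split in half and the standard R̂ is computed over the 2M half-chains, so a trend within a chain shows up as between-chain variance.

```python
import numpy as np

def split_rhat_sketch(samples):
    """samples: array of shape (n_chains, n_steps)."""
    n_chains, n_steps = samples.shape
    half = n_steps // 2
    # Split every chain in two halves -> 2 * n_chains shorter chains.
    chains = samples[:, : 2 * half].reshape(2 * n_chains, half)
    within = chains.var(axis=1, ddof=1).mean()          # W
    between = half * chains.mean(axis=1).var(ddof=1)    # B
    var_plus = (half - 1) / half * within + between / half
    return np.sqrt(var_plus / within)

r = np.arange(100, dtype=float)
print(split_rhat_sketch(np.array([r, r])))  # >> 1: non-stationarity detected
```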
-
- Mar 31, 2022
-
-
Filippo Vicentini authored
Our sampling tests do not take into account the fact that samples are autocorrelated; therefore they are doomed to fail. I increase n_sweeps to reduce the autocorrelation and avoid this.
-
Filippo Vicentini authored
Co-authored-by: Damian Hofmann <damian.hofmann@mpsd.mpg.de>
-
- Mar 30, 2022
-
-
Damian Hofmann authored
-
- Mar 29, 2022
-
-
dependabot[bot] authored
Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 4.4.0 to 4.5.0.
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/4.x/CHANGES)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v4.4.0...v4.5.0)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
-
- Mar 18, 2022
-
-
Filippo Vicentini authored
The latest jax release, published earlier today/yesterday, broke some small things because it removed jax.ops.index.
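For reference, the removed `jax.ops.index` / `jax.ops.index_update` API maps onto the `.at[...]` indexed-update syntax (a generic illustration of the migration, not the exact diff):

```python
import jax.numpy as jnp

x = jnp.zeros(5)

# Before: y = jax.ops.index_update(x, jax.ops.index[2], 1.0)
y = x.at[2].set(1.0)

# Before: z = jax.ops.index_add(x, jax.ops.index[1:3], 1.0)
z = x.at[1:3].add(1.0)
```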
-
- Mar 16, 2022
-
-
Clemens Giuliani authored
* support pytrees as diag_shift
* add test
* adapt test for dense and pytree (the problem is finding the correct shape of the shift when e.g. doing the real-imag split; ideally we should convert it in the constructor of the QGT)
* black
* isinstance instead of checking attr
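A minimal sketch of what a pytree-valued `diag_shift` means for a regularized QGT matrix-vector product, with hypothetical names (`matvec` stands in for the unshifted S·v product; this is not NetKet's constructor logic):

```python
import jax

def shifted_matvec(matvec, diag_shift, vec):
    # S v plus a diagonal shift; `diag_shift` may be a scalar or a pytree
    # with the same structure as `vec` (one shift per parameter leaf).
    sv = matvec(vec)
    if jax.tree_util.tree_structure(diag_shift) == jax.tree_util.tree_structure(vec):
        return jax.tree_util.tree_map(lambda s, d, v: s + d * v, sv, diag_shift, vec)
    return jax.tree_util.tree_map(lambda s, v: s + diag_shift * v, sv, vec)
```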
-
- Mar 13, 2022
-
-
Damian Hofmann authored
Currently, using the JaxFramework model framework without haiku installed causes an exception, because HaikuFramework.is_my_module (which tries to import haiku) is called anyway. This PR fixes this by only checking for haiku-based modules if haiku is already loaded.
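The gist of the fix, as a hedged sketch with a hypothetical helper name: detect haiku modules only when haiku is already present in `sys.modules`, so the check never triggers an import of an uninstalled package.

```python
import sys

def looks_like_haiku_module(module):
    # If haiku was never imported, the object cannot be a haiku module,
    # so skip the check instead of importing (and possibly failing).
    if "haiku" not in sys.modules:
        return False
    import haiku as hk
    return isinstance(module, hk.Module)
```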
-
- Mar 07, 2022
-
-
Filippo Vicentini authored
The special code for the steady-state computation is quite old. With the big operator rewrite for chunking I did not realise it immediately, but I had implemented a more efficient version of the squared-operator gradient. This PR simply removes the old special code and falls back to using AD on nkjax.expect to compute the gradient of LdagL. It's also 10% faster.
-
- Mar 02, 2022
-
-
Filippo Vicentini authored
-
- Mar 01, 2022
-
-
Enrico Rinaldi authored
* [fix] Removed extra 's' in link
-
Filippo Vicentini authored
Cherry-picked from #1065 so that the two changes are separated into two different PRs. Computing the gradient of operators that use nkjax.expect instead of the covariance formula (such as SquaredOperator) also had a wrong factor of 2 for C→C models. This is due to the peculiar way Jax handles complex differentiation. This PR fixes it, and a test is added to check the gradient against finite differences. Also, this PR moves the finite-difference functions out into a common test file so that they can be used by other tests.
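A generic sketch of this kind of finite-difference check (hypothetical helper, not the shared test utility itself): compare `jax.grad` against central differences in double precision.

```python
import jax
import jax.numpy as jnp

jax.config.update("jax_enable_x64", True)  # double precision for a meaningful check

def central_fd_grad(f, x, eps=1e-6):
    # Central finite differences of a scalar function of a real vector.
    return jnp.stack(
        [(f(x.at[i].add(eps)) - f(x.at[i].add(-eps))) / (2 * eps) for i in range(x.size)]
    )

f = lambda x: jnp.sum(jnp.sin(x) ** 2)
x = jnp.arange(3.0)
assert jnp.allclose(jax.grad(f)(x), central_fd_grad(f, x), atol=1e-8)
```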
-
- Feb 28, 2022
-
-
Enrico Rinaldi authored
* [fix] Fixed documentation links + Colab link
* [fix] Get Started link
* [fix] netket package in quotes
-
- Feb 24, 2022
-
-
Dian Wu authored
To support Windows, the nastiest issue is that the default size of some ints is 32 bits rather than 64 bits, even on 64-bit Windows. (I didn't test on 32-bit Windows, but come on, it's 2022.) Also, we need to correctly find mpicc, because the default configuration of mpi4py on Windows does not provide it.
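As a small illustration of the 32-bit-default issue (generic NumPy 1.x behaviour, not the project's exact code): on 64-bit Windows the default NumPy integer is the C `long`, i.e. 32 bits, so 64-bit integers must be requested explicitly.

```python
import numpy as np

# On 64-bit Windows (NumPy 1.x), np.arange(10).dtype is int32 by default,
# while on Linux/macOS it is int64; being explicit avoids the mismatch.
indices = np.arange(10, dtype=np.int64)
states = np.zeros(10, dtype=np.int64)
```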
-
- Feb 23, 2022
-
-
Filippo Vicentini authored
-
Dian Wu authored
-
dependabot[bot] authored
* Bump myst-nb from 0.13.1 to 0.13.2

  Bumps [myst-nb](https://github.com/executablebooks/myst-nb) from 0.13.1 to 0.13.2.
  - [Release notes](https://github.com/executablebooks/myst-nb/releases)
  - [Changelog](https://github.com/executablebooks/MyST-NB/blob/master/CHANGELOG.md)
  - [Commits](https://github.com/executablebooks/myst-nb/compare/v0.13.1...v0.13.2)

  ---
  updated-dependencies:
  - dependency-name: myst-nb
    dependency-type: direct:production
    update-type: version-update:semver-patch
  ...

  Signed-off-by: dependabot[bot] <support@github.com>

* Update requirements.txt
* Update requirements.txt

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Filippo Vicentini <filippovicentini@gmail.com>
-
- Feb 22, 2022
-
-
Filippo Vicentini authored
-
Filippo Vicentini authored
-
- Feb 21, 2022
-
-
Filippo Vicentini authored
-
Filippo Vicentini authored
-
Filippo Vicentini authored
-
- Feb 19, 2022
-
-
Filippo Vicentini authored
The QGTs were not splitting the real/imaginary parts of the initial point either. The test added in this PR does not pass on current master.
-
Filippo Vicentini authored
-
- Feb 18, 2022
-
-
Dian Wu authored
-