
Comments (9)

noamwies avatar noamwies commented on June 12, 2024

@XiaoTailong
Thanks, I fixed the issue and updated the PyPI package. Please update to the new package and let me know if it works.

XiaoTailong avatar XiaoTailong commented on June 12, 2024

I installed the new package and the above error is solved, but when I run the Ising_test.py file I still encounter a new error:

Traceback (most recent call last):
File "Ising_test.py", line 169, in
run(params, batch_size_list, epochs_list)
File "Ising_test.py", line 92, in run
true_ground_state_energy = true_ground_state_energy_mapping[params.gamma]
KeyError: None

noamwies avatar noamwies commented on June 12, 2024

@XiaoTailong
Did you mean the examples/ising.py file?
You should run it with a gamma argument, e.g. python3 ising.py --gamma 2.0.
Moreover, it is just an example of API usage; if you want to obtain SOTA results, the experiments folder is a good starting point (run it with --help to see the required arguments).
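For context, the KeyError: None in the traceback happens because the script uses the --gamma value as a dictionary key, and without the flag it stays at its None default. A simplified sketch of that pattern (the argparse setup and the placeholder energy are illustrative, not the exact code of examples/ising.py):

import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--gamma', type=float, default=None)
params = parser.parse_args()

# exact ground state energies are tabulated only for specific gamma values;
# without --gamma, params.gamma is None and the lookup raises KeyError: None
true_ground_state_energy_mapping = {2.0: -1.0}  # placeholder energy, for illustration only
true_ground_state_energy = true_ground_state_energy_mapping[params.gamma]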

XiaoTailong avatar XiaoTailong commented on June 12, 2024

Thanks, I will give it a try!

XiaoTailong avatar XiaoTailong commented on June 12, 2024

Hello! I have run the ising.py experiments. The script aims to obtain the ground-state energy of the Ising model. I wonder if I can use the model to obtain the ground state corresponding to that ground energy. I believe the model is able to output the ground state, but I have no idea how to do it. Can you give me some guidelines?

noamwies avatar noamwies commented on June 12, 2024

@XiaoTailong

If your Hilbert space is small enough, you can use to_log_wave_function_vector to get the explicit full (log) wave function.

import numpy as np
import tensorflow
from flowket.machines import ConvNetAutoregressive2D
from flowket.exact.utils import to_log_wave_function_vector, decimal_array_to_binary_array

input_size = (4, 4)
inputs = tensorflow.keras.layers.Input(shape=input_size, dtype='int8')
convnet = ConvNetAutoregressive2D(inputs, ...)
model = tensorflow.keras.models.Model(inputs=inputs, outputs=convnet.predictions)
model.load_weights("pre-trained weights path")
log_wave_function = to_log_wave_function_vector(model)

# below you can construct the corresponding state for each entry in the wave function
number_of_spins = np.prod(input_size)
num_of_states = 2 ** number_of_spins
states = decimal_array_to_binary_array(np.arange(num_of_states), number_of_spins, False).reshape((num_of_states, ) + input_size)
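If you want amplitudes or Born probabilities rather than log values (for example, to find the most probable configuration), a possible follow-up, assuming to_log_wave_function_vector returns one complex log-amplitude per basis state in the same ordering as states above:

# log of |psi|^2, shifted for numerical stability before exponentiating
log_probs = 2 * np.real(log_wave_function)
log_probs -= log_probs.max()
born_probabilities = np.exp(log_probs)
# re-normalize defensively; an autoregressive machine should already be normalized
born_probabilities /= born_probabilities.sum()
most_probable_state = states[np.argmax(born_probabilities)]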

Otherwise, you can still get specific entries of the (normalized) wave function by

num_of_states = 2
states = np.zeros((num_of_states,) + input_size, dtype=np.int32)
states[0, ...] = np.array([[1, 1, -1, -1], [1, 1, -1, -1], [1, 1, -1, -1], [1, 1, -1, -1]])
states[1, ...] = np.array([[1, 1, -1, -1], [1, -1, 1, -1], [1, -1, 1, -1], [1, 1, -1, -1]])
log_wave_function = model.predict(states)[:, 0]

And sample from the Born probability with FastAutoregressiveSampler

from flowket.samplers import FastAutoregressiveSampler

sampler = FastAutoregressiveSampler(tensorflow.keras.models.Model(inputs=inputs, outputs=convnet.conditional_log_probs), 2**10)
sample = next(sampler)

You may also use flowket.evaluation.evaluate to estimate some operator

from flowket.optimization import VariationalMonteCarlo
from flowket.callbacks.monte_carlo import default_wave_function_stats_callbacks_factory
from flowket.evaluation import evaluate
from flowket.operators import Ising
from flowket.observables.monte_carlo import Observable
from flowket.callbacks.monte_carlo import ObservableStats

generator = VariationalMonteCarlo(model, Ising(input_size, pbc=False, h=2.), sampler, mini_batch_size=2**7)
callbacks = default_wave_function_stats_callbacks_factory(generator, log_in_batch_or_epoch=False)
operator = ... # implements flowket.operators.Operator
callbacks += [ObservableStats(generator, Observable(operator), 'operator_name', log_in_batch_or_epoch=False)]
evaluate(generator, steps, callbacks)  # steps = number of batches to draw from the generator

Finally, note that in all of the above examples you might want to make the wave function invariant to some symmetries by adding the following lines (the Born sampling remains unchanged)

from flowket.machines.ensemble import make_2d_obc_invariants, make_up_down_invariant

evaluation_inputs = tensorflow.keras.layers.Input(shape=input_size, dtype='int8')
obc_input = tensorflow.keras.layers.Input(shape=input_size, dtype='int8')
model = make_2d_obc_invariants(obc_input, model)
model = make_up_down_invariant(evaluation_inputs, model)

XiaoTailong avatar XiaoTailong commented on June 12, 2024

Thank you very much!

XiaoTailong avatar XiaoTailong commented on June 12, 2024

@noamwies
Now I know how to get the explicit form of the ground-state wave function for a small model. For a large model, I think the above method is not feasible. If I understood correctly, in large models I can only obtain the observables' expectation values by sampling the ground-state wave function. If I want to get measurement results, could I regard the measurement operators as the observables? Then I would obtain samples of measurements just as in the netket package examples.

Symmetries, as I expected, can increase the precision of the estimation. But I am not very clear on what symmetries mean for the estimation of the ground-state wave function, or what they mean for the machine learning model. I may not fully understand this physical concept.

noamwies avatar noamwies commented on June 12, 2024

@XiaoTailong
Sure, if you want to measure a netket operator you can wrap it with NetketOperatorWrapper and pass the result as the observable

from flowket.optimization import VariationalMonteCarlo
from flowket.callbacks.monte_carlo import default_wave_function_stats_callbacks_factory
from flowket.evaluation import evaluate
from flowket.operators import Ising, NetketOperatorWrapper
from flowket.observables.monte_carlo import Observable
from flowket.callbacks.monte_carlo import ObservableStats

generator = VariationalMonteCarlo(model, Ising(input_size, pbc=False, h=2.), sampler, mini_batch_size=2**7)
callbacks = default_wave_function_stats_callbacks_factory(generator, log_in_batch_or_epoch=False)
netket_operator = ...
operator = NetketOperatorWrapper(netket_operator, input_size)
callbacks += [ObservableStats(generator, Observable(operator), 'operator_name', log_in_batch_or_epoch=False)]
evaluate(generator, steps, callbacks)

Regarding the symmetries, I was just saying that empirically make_2d_obc_invariants and make_up_down_invariant give you a wave function that is closer to the ground state.
I guess you can think of this as an ensemble method, which is quite common in machine learning.
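To make the ensemble picture concrete, here is a rough, framework-agnostic sketch (not flowket's actual implementation) of symmetrizing a wave function by averaging its amplitudes over a small symmetry group, here just the identity and the global up/down spin flip:

import numpy as np

def symmetrized_psi(psi, states):
    """Average the amplitudes of a wave function over a set of symmetry
    transformations, so the result is invariant under those transformations."""
    transforms = [lambda s: s, lambda s: -s]  # identity and global spin flip
    return np.mean([psi(t(states)) for t in transforms], axis=0)

Here psi is any callable mapping a batch of spin configurations to complex amplitudes; each transformed copy acts like another member of an ensemble, and averaging them empirically tends to give a wave function closer to the ground state.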
