Conversation
andrea-pasquale left a comment:

Thanks @Edoardo-Pedicillo, just a few minor changes.

Co-authored-by: Andrea Pasquale <andreapasquale97@gmail.com>
src/boostvqe/boost.py (outdated)

>     Learning rate decay factor used when the optimizer is SGD (stochastic gradient descent).
>
>     nboost (int, default: 1):
>         Number of times DBI (Deterministic Boost Iteration) is applied in the optimization process.
DBI = double-bracket iteration. Is this part of the CMA optimizer?
I think this is the number of times DBQA is used along a single, long VQE training.
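Under that reading, `nboost` would split the epoch budget of one long VQE training into boosted segments. A minimal sketch of such a schedule (the function name and the equal-segment rule are illustrative assumptions, not the boostvqe API):

```python
def boosting_schedule(total_epochs: int, nboost: int) -> list[int]:
    """Epochs at which a DBQA boost would be applied: the training is
    split into (nboost + 1) equal segments, with one boost between
    consecutive segments. Illustrative only, not the boostvqe API."""
    segment = total_epochs // (nboost + 1)
    return [segment * k for k in range(1, nboost + 1)]

# e.g. 300 epochs with nboost=2 -> boosts after epochs 100 and 200
```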
src/boostvqe/boost.py (outdated)

>     if args.optimizer_options is None:
>         opt_options = {}
>     dbi_steps (int, default: 1):
Suggested change:
- dbi_steps (int, default: 1):
+ dbqa_steps (int, default: 1):
This code should be capable of running both DBI (dense matrices) and GCI (qibo.Circuit).
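One way to read that requirement is a single entry point that dispatches on the input type. A hedged sketch (the function name and the duck-typing check are assumptions, not the boostvqe API; the `queue` attribute is assumed here to identify a qibo.Circuit-like object):

```python
import numpy as np

def select_boost_mode(target) -> str:
    """Pick a boosting backend from the input type (illustrative only):
    dense matrices go to DBI, circuit-like objects to GCI."""
    if isinstance(target, np.ndarray):
        return "DBI"  # double-bracket iteration on a dense Hamiltonian matrix
    if hasattr(target, "queue"):  # duck-typing a qibo.Circuit-like object
        return "GCI"  # group-commutator iteration built as a circuit
    raise TypeError(f"unsupported input type: {type(target)!r}")
```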
src/boostvqe/boost.py (outdated)

>     dbi_steps (int, default: 1):
>         Number of DBI iterations performed each time DBI is called.
>
>     stepsize (float, default: 0.01):
Suggested change:
- stepsize (float, default: 0.01):
+ dbr_duration (float, default: 0.01):
Each DBI step is a DBR (double-bracket rotation), and in the strategies that follow we always say "DBR duration".
I think `stepsize` has been removed from the arguments but still remains in the docstring; one of the two should be fixed.
src/boostvqe/boost.py (outdated)

>     if args.optimizer_options is None:
>         opt_options = {}
>     dbi_steps (int, default: 1):
>         Number of DBI iterations performed each time DBI is called.
Number of rotations, namely DBQA steps.
>     store_h (bool, default: False):
>         If this flag is set, the Hamiltonian `H` is stored at each iteration.
>
>     hamiltonian (str, default: "XXZ"):
After merging #77 this can also become a symbolic Hamiltonian.
Yes, this is also proposed by @Edoardo-Pedicillo at line 148.
>     mode:
>         Define the DBI Generator.
>     """
Suggested change (add before the closing `"""`):
+ please_be_verbose:
+     Flag which switches on verbose reporting: after each DBQA step, a report on energy gain, fidelity, and circuit depth is printed and stored in the output_folder as a log file.
This should also be done in a separate PR.

OK, let's move ahead without it.
Thank you @Edoardo-Pedicillo! I like the overall code structure and, e.g., the clarity of the new README. I removed Sam as reviewer.

Will this close #83, where the number of Trotter steps and DBQA steps should be different?
Some fixes to run VQE trainings + derivative wrt RBS parameter
Co-authored-by: Matteo Robbiati <62071516+MatteoRobbiati@users.noreply.github.com>