Commit 0b47034

fix some docs
1 parent 382c626 commit 0b47034

2 files changed: +7 -5 lines changed

docs/source/advance.rst

Lines changed: 5 additions & 3 deletions
@@ -5,7 +5,9 @@ Advanced Usage
 MPS Simulator
 ----------------
 
-Very straightforward to use, we provide the same set of API for ``MPSCircuit`` as ``Circuit``,
+TensorCircuit-NG provides Matrix Product State (MPS) simulation as an efficient alternative to exact simulation for quantum circuits. MPS simulation can handle larger quantum systems by trading off accuracy for computational efficiency.
+
+The MPS simulator is very straightforward to use: we provide the same set of APIs for ``MPSCircuit`` as for ``Circuit``;
 the only new line is to set the bond dimension for the new simulator.
 
 .. code-block:: python
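The code example following this directive is truncated by the diff context. As a rough sketch of the intended usage, assuming the ``set_split_rules`` method and the ``max_singular_values`` key of ``MPSCircuit`` (neither shown in this diff):

.. code-block:: python

    import tensorcircuit as tc

    n = 20  # more qubits than exact simulation handles comfortably

    c = tc.MPSCircuit(n)
    # assumed API: cap the bond dimension; larger values trade
    # speed and memory for higher simulation fidelity
    c.set_split_rules({"max_singular_values": 32})

    for i in range(n):
        c.h(i)
    for i in range(n - 1):
        c.cnot(i, i + 1)

    # the API mirrors ``Circuit``
    print(c.expectation((tc.gates.z(), [0])))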
@@ -70,7 +72,7 @@ Split Two-qubit Gates
 
 The two-qubit gates applied on the circuit can be decomposed via SVD, which may further improve the optimality of the contraction pathfinding.
 
-`split` configuration can be set at circuit-level or gate-level.
+The ``split`` configuration can be set at the circuit level or the gate level.
 
 .. code-block:: python
 
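The accompanying example is likewise truncated. A minimal sketch, assuming ``split`` takes a dict with a ``max_singular_values`` key as elsewhere in TensorCircuit-NG:

.. code-block:: python

    import tensorcircuit as tc

    # assumed convention for the split configuration dict
    split_conf = {
        "max_singular_values": 2,  # number of singular values kept in the SVD
    }

    # circuit level: applies to every two-qubit gate on this circuit
    c = tc.Circuit(6, split=split_conf)
    c.cnot(0, 1)

    # gate level: decompose only this particular gate
    c2 = tc.Circuit(6)
    c2.exp1(0, 1, theta=0.3, unitary=tc.gates._zz_matrix, split=split_conf)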
@@ -153,7 +155,7 @@ Jitted Function Save/Load
 
 To reuse the jitted function, we can save it on the disk via support from the TensorFlow `SavedModel <https://www.tensorflow.org/guide/saved_model>`_. That is to say, only jitted quantum functions on the TensorFlow backend can be saved on the disk.
 
-We wrap the tf-backend `SavedModel` as very easy-to-use function :py:meth:`tensorcircuit.keras.save_func` and :py:meth:`tensorcircuit.keras.load_func`.
+We wrap the tf-backend ``SavedModel`` as the very easy-to-use functions :py:meth:`tensorcircuit.keras.save_func` and :py:meth:`tensorcircuit.keras.load_func`.
 
 For a JAX-backend quantum function, one can first transform it into a tf-backend function via JAX's experimental support: `jax2tf <https://github.com/google/jax/tree/main/jax/experimental/jax2tf>`_.
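A minimal save/reload round trip with these wrappers might look as follows; the ``save_func(f, path)`` and ``load_func(path)`` signatures and the need to trace the ``tf.function`` once before saving are assumptions, not shown in this diff:

.. code-block:: python

    import tensorflow as tf
    import tensorcircuit as tc

    K = tc.set_backend("tensorflow")

    @tf.function
    def energy(param):
        c = tc.Circuit(2)
        c.rx(0, theta=param[0])
        c.cnot(0, 1)
        return K.real(c.expectation((tc.gates.z(), [1])))

    param = tf.ones([1])
    energy(param)  # assumed: trace the graph once before saving

    tc.keras.save_func(energy, "./saved_energy")
    energy_loaded = tc.keras.load_func("./saved_energy")
    print(energy_loaded(param))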

docs/source/sharpbits.rst

Lines changed: 2 additions & 2 deletions
@@ -236,8 +236,8 @@ Vmap (vectorized map) outside a grad-like function may cause incorrect results
 Grad over vmap function
 ~~~~~~~~~~~~~~~~~~~~~~~~~
 
-A related issue is the different behavior for `K.grad(K.vmap(f))` on different backends. For tensorflow backend, the function to be differentiated has a scalar output which is the sum of all outputs.
+A related issue is the different behavior of ``K.grad(K.vmap(f))`` on different backends. For the TensorFlow backend, the function to be differentiated has a scalar output, which is the sum of all outputs.
 
 However, for the Jax backend, the function simply raises an error, as only scalar-output functions can be differentiated; no implicit sum over the vectorized ``f`` is assumed. For non-scalar outputs, one should use `jacrev` or `jacfwd` to get the gradient information.
 
-Specifically, `K.grad(K.vmap(f))` on TensorFlow backend is equilvalent to `K.grad(K.append(K.vamp(f), K.sum))` on Jax backend.
+Specifically, ``K.grad(K.vmap(f))`` on the TensorFlow backend is equivalent to ``K.grad(K.append(K.vmap(f), K.sum))`` on the Jax backend.
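
A minimal sketch of this backend difference, assuming ``K.append`` composes functions as the text above describes:

.. code-block:: python

    import tensorcircuit as tc

    K = tc.set_backend("tensorflow")

    def f(x):
        return x ** 2  # scalar in, scalar out

    x = K.ones([3])

    # TensorFlow backend: outputs are implicitly summed before differentiation
    g = K.grad(K.vmap(f))
    print(g(x))  # gradient of sum_j x_j**2 -> [2., 2., 2.]

    # assumed: K.append(fn, post) applies post to fn's output;
    # this backend-agnostic form also works on the Jax backend
    g2 = K.grad(K.append(K.vmap(f), K.sum))
    print(g2(x))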
