@@ -101,7 +101,7 @@ class BacktrackingOfThought(synalinks.Module):
                     instructions=self.instructions,
                     use_inputs_schema=self.use_inputs_schema,
                     use_outputs_schema=self.use_outputs_schema,
-                    name=self.name + "_thinking_generator",
+                    name=self.name + f"_thinking_generator_{i}",
                 )
             )
         self.critique = []
@@ -115,7 +115,7 @@ class BacktrackingOfThought(synalinks.Module):
                     instructions=self.instructions,
                     use_inputs_schema=self.use_inputs_schema,
                     use_outputs_schema=self.use_outputs_schema,
-                    name=self.name + "_critique_generator",
+                    name=self.name + f"_critique_generator_{i}",
                 )
             )
         # This is going to be the final generator
@@ -257,7 +257,7 @@ if __name__ == "__main__":
 
 First, let's explain the `__init__()` function. When implementing modules that
 use a `Generator`, you want to externalize the generator's parameters
-(`prompt_template`, `hints`, `examples`, `use_inputs_schema`, `use_outputs_schema`)
+(`prompt_template`, `instructions`, `examples`, `use_inputs_schema`, `use_outputs_schema`)
to give maximum flexibility to your module when possible.
Then, you have to include the default arguments of a module (`name`, `description`, `trainable`)
that will be provided to `super().__init__()`.
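The externalization pattern this hunk documents can be sketched as a standalone toy. This is an illustrative simplification, not synalinks' actual API: the class names and the exact constructor signature here are assumptions made for the example.

```python
# Toy illustration (not synalinks code): externalize the sub-generator's
# parameters and forward the standard module arguments (name, description,
# trainable) to the parent constructor.
class Module:
    def __init__(self, name=None, description=None, trainable=True):
        self.name = name or self.__class__.__name__.lower()
        self.description = description
        self.trainable = trainable

class BacktrackingToy(Module):
    def __init__(self, instructions=None, examples=None,
                 use_inputs_schema=False, use_outputs_schema=False,
                 name=None, description=None, trainable=True):
        # Default module arguments go to the parent class.
        super().__init__(name=name, description=description, trainable=trainable)
        # Externalized generator parameters give callers full control.
        self.instructions = instructions
        self.examples = examples or []
        self.use_inputs_schema = use_inputs_schema
        self.use_outputs_schema = use_outputs_schema

m = BacktrackingToy(instructions="think step by step", name="btot")
assert m.name == "btot" and m.trainable
```

The point of the pattern is that any parameter a sub-component accepts is also accepted by the wrapping module, so users never have to subclass just to tweak a prompt or schema flag.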
@@ -281,7 +281,7 @@ As a rule of thumb, the variables should be anything that evolves over time during
 inference/training. These variables could be updated by the module itself, or by
 the optimizer if you have an optimizer designed for that. They will be serialized
 when you save your program so you can recover the state of your program by loading
-a JSON file. In this example, the variables are encapsulated in the `Generator`.
+a JSON file. In this example, the variables are encapsulated in the `Generator` module.
 
 ### The `call()` function
 
@@ -306,12 +306,12 @@ backtracking logic:
 
 The `compute_output_spec()` function is responsible for defining the output data model
 of the module/program. It allows the system to understand the structure of the data
-produced by this module. Its input is always a `SymbolicDataModel`.
+produced by this module. Its input is always a `SymbolicDataModel`, a placeholder that only contains a JSON schema serving as a data specification.
 
 In this example, `compute_output_spec()` returns a `SymbolicDataModel` based on the module's
 schema by calling the modules sequentially, indicating the expected structure of the output data.
 
-As a rule of thumb, if you access a data model field (using `get()`) you will have to
+As a rule of thumb, if you access a data model field in your call (using `get()`) you will have to
 implement it; otherwise, Synalinks will infer the output spec by running the call
 function with symbolic data models. If you have any doubt, do not implement it and the system will
 raise an error if you need to.
@@ -324,7 +324,7 @@ any trainable variables, and restoring it later.
 
 - The `get_config()` method should return a dictionary containing all the information needed
   to recreate the module. This includes the module's configuration and any serialized
-  sub-components like the language model or critique program in this case.
+  sub-components like the language model in this case.
 - The `from_config()` class method should be able to reconstruct the module from the
   configuration dictionary returned by `get_config()`.
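The round-trip contract between the two methods can be sketched as a minimal toy. The class and field names below are illustrative assumptions, not synalinks code; only the contract itself (`from_config(get_config())` rebuilds an equivalent module) comes from the bullets above.

```python
# Illustrative toy (not synalinks' API): the get_config / from_config
# serialization round-trip described above.
class ToyModule:
    def __init__(self, name="toy", k=3):
        self.name = name
        self.k = k

    def get_config(self):
        # Everything needed to recreate the module.
        return {"name": self.name, "k": self.k}

    @classmethod
    def from_config(cls, config):
        # Rebuild the module from its configuration dictionary.
        return cls(**config)

m = ToyModule(name="btot", k=5)
m2 = ToyModule.from_config(m.get_config())
assert m2.name == "btot" and m2.k == 5
```

In the real module, nested components such as the language model would be serialized inside the returned dictionary and deserialized again in `from_config()`.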
330330
0 commit comments