@param rating: Rating rescaled to the interval [-1.0, 1.0], where -1.0 means the worst possible rating, 0.0 means neutral, and 1.0 means an absolutely positive rating. For example, in the case of 5-star evaluations, the formula rating = (numStars-3)/2 may be used for the conversion.

- Optional parameters (given as dictionary C{optional}):
+ Optional parameters:

@param timestamp: UTC timestamp of the rating as an ISO8601-1 pattern or UTC epoch time. The default value is the current time.

- @param cascadeCreate: Sets whether the given user/item should be created if not present in the database.
+ @param cascade_create: Sets whether the given user/item should be created if not present in the database.

"""
self.user_id = user_id
self.item_id = item_id
self.rating = rating
- self.timestamp = optional.get('timestamp')
- self.cascade_create = optional.get('cascadeCreate')
- for par in optional:
-     if not par in {"timestamp", "cascadeCreate"}:
-         raise ValueError("Unknown parameter %s was given to the request" % par)
recombee_api_client/api_requests/batch.py (7 additions, 7 deletions)
@@ -6,13 +6,13 @@ class Batch(Request):
"""
Batch request for submitting an arbitrary sequence of requests

In many cases it may be desirable to execute multiple requests at once. For example, when synchronizing the catalog of items periodically, you would otherwise have to execute a sequence of thousands of separate POST requests, which is very inefficient and may take a very long time to complete. Most notably, network latencies can make the execution of such a sequence very slow, and even if it is executed in multiple parallel threads, there is still unreasonable overhead caused by HTTP(S). To avoid these problems, batch processing may be used, encapsulating a sequence of requests into a single HTTP request.

Batch processing allows you to submit an arbitrary sequence of requests in the form of a JSON array. Any type of request from the documentation may be used in the batch, and the batch may combine different types of requests arbitrarily.

Note that:

- executing the requests in a batch is equivalent to executing them one by one individually; there are, however, many optimizations that make batch execution as fast as possible,
- the status code of the batch request itself is 200 even if the individual requests result in errors - you have to inspect the code values in the resulting array,
- if the status code of the whole batch is not 200, there is an error in the batch request itself; in such a case, the returned error message should help you resolve the problem,
- currently, batch size is limited to **10,000** requests; if you wish to execute an even larger number of requests, please split them into multiple batches.
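Because a single batch is capped at 10,000 requests and the envelope always returns 200, a caller typically chunks its request list and inspects the per-request codes itself. A rough sketch follows; `RecombeeClient` and `Batch` are this library's classes, but the exact shape of the per-request results (a dict carrying a `code` field) is an assumption here and may differ between client versions.

```python
# Sketch: split a large synchronization job into batches of at most 10,000
# requests and check each individual result. The per-request result shape
# (a dict with a "code" field) is assumed from the docstring above.
from recombee_api_client.api_client import RecombeeClient
from recombee_api_client.api_requests import Batch

BATCH_LIMIT = 10_000  # documented maximum number of requests per batch


def send_in_batches(client: RecombeeClient, requests: list) -> list:
    results = []
    for start in range(0, len(requests), BATCH_LIMIT):
        chunk = requests[start:start + BATCH_LIMIT]
        responses = client.send(Batch(chunk))
        # The batch itself returns 200 even when individual requests fail,
        # so the per-request codes have to be inspected explicitly.
        for req, res in zip(chunk, responses):
            if isinstance(res, dict) and res.get("code", 200) >= 400:
                print("request %r failed with code %s" % (req, res["code"]))
        results.extend(responses)
    return results
```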
@param user_id: ID of the user who made the cart addition.

@param item_id: ID of the item which was added to the cart.

- Optional parameters (given as dictionary C{optional}):
+ Optional parameters:

@param timestamp: Unix timestamp of the cart addition. If the `timestamp` is omitted, then all the cart additions with the given `userId` and `itemId` are deleted.

"""
self.user_id = user_id
self.item_id = item_id
- self.timestamp = optional.get('timestamp')
- for par in optional:
-     if not par in {"timestamp"}:
-         raise ValueError("Unknown parameter %s was given to the request" % par)
@param user_id: ID of the user who made the detail view.

@param item_id: ID of the item whose details were viewed.

- Optional parameters (given as dictionary C{optional}):
+ Optional parameters:

@param timestamp: Unix timestamp of the detail view. If the `timestamp` is omitted, then all the detail views with the given `userId` and `itemId` are deleted.

"""
self.user_id = user_id
self.item_id = item_id
- self.timestamp = optional.get('timestamp')
- for par in optional:
-     if not par in {"timestamp"}:
-         raise ValueError("Unknown parameter %s was given to the request" % par)