
Commit 7678cb3

[Cogsvcs] Personalizer - troubleshooting

1 parent 1e5d75b commit 7678cb3

1 file changed, +22 -7 lines changed

articles/cognitive-services/personalizer/troubleshooting.md

```diff
@@ -17,28 +17,37 @@ This article contains answers to frequently asked troubleshooting questions abou
 
 ## Transaction errors
 
-### I get a HTTP 429 (Too many requests) response from the service. What can I do?
+### I get an HTTP 429 (Too many requests) response from the service. What can I do?
 
-If you picked a free price tier when you created the Personalizer instance, there is a quota limit on the number of Rank requests that are allowed. Please review your api call rate for the Rank api calls (in the Metrics pane) and adjust the pricing tier (in the Pricing Tier pane) if your call volume is expected to increase beyond the threshold for chosen pricing tier.
+If you picked a free price tier when you created the Personalizer instance, there is a quota limit on the number of Rank requests that are allowed. Review your API call rate for the Rank API (in the Metrics pane in the Azure portal for your Personalizer resource) and adjust the pricing tier (in the Pricing Tier pane) if your call volume is expected to increase beyond the threshold for the chosen pricing tier.
 
 ### I'm getting a 5xx error on Rank or Reward APIs. What should I do?
 
-These issues should be transparent. If they continue, please contact support.
+These issues should be transparent. If they continue, contact support through the Azure portal for your Personalizer resource.
 
 
 ## Learning loop
 
+<!--
+
+### How do I import a learning policy?
+
+
+-->
+
 ### The learning loop doesn't seem to learn. How do I fix this?
 
 The learning loop needs a few thousand Reward calls before Rank calls prioritize effectively.
 
 If you are unsure about how your learning loop is currently behaving, run an [offline evaluation](concepts-offline-evaluation.md), and apply the corrected learning policy.
-<!--
+
 ### I keep getting rank results with all the same probabilities for all items. How do I know Personalizer is learning?
 
-Personalizer returns the same probabilities in a rank result when …. This is usually happening because …. You can avoid it by…..
--->
-### The learning loop was learning but seems to not learn any more, and the quality of the Rank results isn't that good. What should I do?
+Personalizer returns the same probabilities in a Rank API result when it has just started and has an _empty_ model, or when you reset the Personalizer Loop, and your model is still within your **Model update frequency** period.
+
+When the new update period begins, the updated model is used, and you’ll see the probabilities change.
+
+### The learning loop was learning but seems to not learn anymore, and the quality of the Rank results isn't that good. What should I do?
 
 * Make sure you've completed and applied one evaluation in the Azure portal.
 * Make sure all rewards are sent and processed.
```
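The 429 answer above suggests watching your Rank call rate and adjusting the pricing tier; beyond that, callers commonly retry throttled requests with exponential backoff. A minimal sketch (not part of the commit; `send` is a stand-in for whatever HTTP call you make to the Rank endpoint):

```python
import time

def call_with_backoff(send, max_retries=5, base_delay=1.0):
    """Retry a request while the service returns HTTP 429 (Too many requests).

    `send` is any zero-argument callable returning (status_code, body).
    The wait doubles after each 429 response (exponential backoff).
    """
    delay = base_delay
    for _ in range(max_retries):
        status, body = send()
        if status != 429:
            return status, body
        time.sleep(delay)  # wait before retrying the throttled call
        delay *= 2         # back off: 1s, 2s, 4s, ...
    return status, body    # still throttled after max_retries attempts
```

If the service includes a `Retry-After` header on the 429 response, honoring it instead of a fixed backoff schedule is preferable.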
```diff
@@ -54,6 +63,12 @@ You can find the time when the model was last updated in the **Model and Learnin
 
 This is typically due to timestamps, user IDs or some other fine grained features sent in.
 
+### I created an offline evaluation and it succeeded almost instantly. Why is that? I don’t see any results?
+
+The offline evaluation uses the trained model data from the events in that time period. If you did not send any data in the time period between start and end time of the evaluation, it will complete without any results. Submit a new offline evaluation by selecting a time range with events you know were sent to Personalizer.
+
+
+
 ## Security
 
 ### The API key for my loop has been compromised. What can I do?
```
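The learning-loop answers above hinge on every Rank event receiving a Reward. As an illustration (not part of the commit, with request shapes simplified; the exact fields and URLs are in the Personalizer REST API reference), the `eventId` returned from a Rank call must be echoed back in the matching Reward call:

```python
import uuid

def build_rank_request(actions, context_features):
    """Assemble a Rank request body; the eventId links this call to the
    Reward sent later for the same user interaction."""
    return {
        "eventId": str(uuid.uuid4()),
        "contextFeatures": context_features,
        "actions": actions,
    }

def build_reward(event_id, score):
    """Reward scores are typically in [0, 1]; send one per ranked event,
    within the loop's reward wait time, or the configured default applies."""
    if not 0.0 <= score <= 1.0:
        raise ValueError("reward score should be in [0, 1]")
    return {"eventId": event_id, "value": score}
```

Dropping or delaying Rewards is one reason a loop appears to stop learning, which is why the checklist above asks you to confirm all rewards are sent and processed.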
