Hi,
The Raw Logs API documentation says:
| Response Field | Description |
|---|---|
| request_time | Time in milliseconds for request to complete |
| origin_time | How long it takes MaxCDN to retrieve the file (if cache status was a MISS) |
As I understand it, that means `request_time` is how long it took to serve the content to the browser, and `origin_time` is the time required to load the resource from upstream in the case of a cache miss. Please let me know if I've misunderstood here :)
When looking through the values returned from the API, I see the following:
- Almost all cache HITs have a `request_time` of zero. I did see a few that had decimal results like `0.044`.
- For cache MISSes, both `request_time` and `origin_time` are in the 0.xx range.
For example, using the ruby client:
```ruby
[29] pry(main)> client.get('v3/reporting/logs.json', limit: 5, status: '200',
                           cache_status: 'MISS')['records'].
                  map { |r| [r['request_time'], r['origin_time']] }
=> [[0.052, 0.052],
    [0.018, 0.018],
    [0.581, 0.58],
    [0.048, 0.048],
    [0.035, 0.035]]
```

As a result, I have several questions/issues:
- The docs are incorrect and these units are seconds, right?
- Why are all the responses for cache HITs zero? Even if these units are seconds, I would expect to see response times in the low milliseconds. A recorded response time of zero is not useful for tracking performance.
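To make the check reproducible without hitting the API, here's a self-contained sketch of what I'm doing: the field names (`cache_status`, `request_time`) come from the API response above, but the sample record values are made up for illustration. It counts zero `request_time` entries per cache status, and shows what converting to the documented milliseconds would look like if the units really are seconds:

```ruby
# Sample records mimicking the shape of
# client.get('v3/reporting/logs.json', ...)['records'].
# Values are illustrative only, not real log data.
records = [
  { 'cache_status' => 'HIT',  'request_time' => 0.0 },
  { 'cache_status' => 'HIT',  'request_time' => 0.044 },
  { 'cache_status' => 'MISS', 'request_time' => 0.052 },
]

# Count records with a request_time of exactly zero, per cache status.
zero_counts = records.group_by { |r| r['cache_status'] }
                     .transform_values { |rs| rs.count { |r| r['request_time'].zero? } }
puts zero_counts.inspect  # => {"HIT"=>1, "MISS"=>0}

# If the raw values are seconds, converting to the documented
# milliseconds would be a straight multiply:
to_ms = ->(seconds) { (seconds * 1000).round(1) }
puts to_ms.call(0.044)  # => 44.0
```

With real data, I'd expect `zero_counts['HIT']` to dominate, which is the behavior I'm asking about above.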