Raw Logs API: request_time / origin_time units #14

@ledbettj

Description

Hi,

On the Raw Logs API Documentation it says:

Response Field | Description
-------------- | -----------
request_time   | Time in milliseconds for request to complete
origin_time    | How long it takes MaxCDN to retrieve the file (if cache status was a MISS)

As I understand it, that means request_time is how long it took to serve the content to the browser, and origin_time is the time required to load the resource from upstream in the case of a cache miss. Please let me know if I've misunderstood here :)

When looking through the values returned from the API, I see the following:

  • Almost all cache HITs have a request_time of zero. I did see a few that had decimal results like 0.044.
  • For cache MISSes, both request_time and origin_time are in the 0.xx range.

For example, using the ruby client:

[29] pry(main)> client.get('v3/reporting/logs.json', limit: 5, status: '200',
                           cache_status: 'MISS')['records'].
                  map { |r| [r['request_time'], r['origin_time']] }
=> [[0.052, 0.052],
    [0.018, 0.018],
    [0.581, 0.58],
    [0.048, 0.048],
    [0.035, 0.035]]
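If these values are in fact seconds, converting them to the milliseconds the docs describe is trivial. Here is a minimal sketch of a hypothetical helper (`times_in_ms` is my own name, not part of the maxcdn gem) applied to records shaped like the output above:

```ruby
# Hypothetical helper: convert request_time / origin_time from seconds
# (what the API appears to return) to milliseconds (what the docs claim).
def times_in_ms(records)
  records.map do |r|
    {
      'request_time_ms' => (r['request_time'].to_f * 1000).round(1),
      'origin_time_ms'  => (r['origin_time'].to_f * 1000).round(1)
    }
  end
end

# Sample records mirroring the pry output above.
records = [
  { 'request_time' => 0.052, 'origin_time' => 0.052 },
  { 'request_time' => 0.581, 'origin_time' => 0.58 }
]

times_in_ms(records)
# => [{"request_time_ms"=>52.0, "origin_time_ms"=>52.0},
#     {"request_time_ms"=>581.0, "origin_time_ms"=>580.0}]
```

Of course, this only makes sense once the units are confirmed; for cache HITs that report exactly 0, no conversion recovers the lost precision.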

As a result I have several questions/issues:

  • The docs are incorrect and these units are seconds, right?
  • Why are all the responses for cache HITs zero? Even if these units are seconds, I would expect to see response times in the low milliseconds. A recorded response time of zero is not useful for tracking performance.
