Hey. I found a way to do this.

If you already have archives set up that you can rehydrate from, it's all in how you set up the file name and folder, and the sink particulars. I'll explain below:

In your "transforms", you'll need to format the logs destined for cloud storage with this structure:

transforms:
  format_datadog_style:
    type: "remap"
    inputs:
      - my_upstream_input
    source: |
      # Generate a unique ID (timestamp + random component)
      ._id = encode_base64(to_string(now()) + "-" + uuid_v4())

      # Store the timestamp once
      ts = format_timestamp!(now(), format: "%+")

      # Create nested attributes structure - use ! to abort on failure …
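For the sink side the original explanation is cut off, so here is a minimal sketch of what the "file name and folder" setup could look like with an `aws_s3` sink. The bucket name and prefix path below are assumptions, not from the original; the idea is that Datadog's rehydration scanner looks for gzipped, newline-delimited JSON under `dt=<YYYYMMDD>/hour=<HH>/`-style prefixes, so `key_prefix` uses Vector's strftime templating to mimic that layout:

```yaml
sinks:
  datadog_style_archive:
    type: "aws_s3"
    inputs:
      - format_datadog_style
    bucket: "my_archive_bucket"     # assumption: your own archive bucket
    # Mimic Datadog's archive folder layout so rehydration can find the files
    key_prefix: "my-logs/dt=%Y%m%d/hour=%H/"
    compression: "gzip"             # Datadog archives are gzip-compressed
    framing:
      method: "newline_delimited"   # one JSON event per line
    encoding:
      codec: "json"
```

Adjust the `key_prefix` to match wherever your existing archives live; the folder structure is what lets the rehydration process locate the right time slices.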

Answer selected by pront