[Components] geckoboard #13394 #15908
Conversation
vercel bot: 3 Skipped Deployments (the latest updates on your projects, via Vercel for Git).
Important: Review skipped. Review was skipped due to path filters.

⛔ Files ignored due to path filters (1): pnpm-lock.yaml. CodeRabbit blocks several paths by default; you can override this behavior by explicitly including those paths in the path filters.

Walkthrough

This pull request deletes the existing TypeScript app module (components/geckoboard/app/geckoboard.app.ts) and introduces a new geckoboard.app.mjs app module plus three actions (Append to Dataset, Create Dataset, and Delete Dataset) that call the Geckoboard API via axios.

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant Action as "Action Module (Append/Create/Delete)"
    participant App as "Geckoboard App Module"
    participant API as "Geckoboard API"

    User->>Action: Trigger dataset operation
    Action->>App: Call corresponding method (e.g., appendToDataset)
    App->>API: Send HTTP request with axios
    API-->>App: Return response
    App-->>Action: Process and return response
    Action-->>User: Provide summary message
```
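To make the flow concrete, here is a minimal sketch of the app module surface the diagram implies. The Basic-auth shape (API key as username, empty password) matches the code reviewed below; the base URL and the `/datasets` listing path are assumptions for illustration, not confirmed from the diff.

```js
import { axios } from "@pipedream/platform";

export default {
  type: "app",
  app: "geckoboard",
  methods: {
    _baseUrl() {
      // Assumed Geckoboard API host; confirm against the actual component
      return "https://api.geckoboard.com";
    },
    async _makeRequest({ $ = this, path, ...otherOpts }) {
      // HTTP Basic auth: API key as username, empty password (matches the reviewed code)
      return axios($, {
        ...otherOpts,
        url: this._baseUrl() + path,
        auth: {
          username: `${this.$auth.api_key}`,
          password: "",
        },
      });
    },
    async getDatasets(args = {}) {
      // Listing path assumed for illustration
      return this._makeRequest({
        path: "/datasets",
        ...args,
      });
    },
  },
};
```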
Actionable comments posted: 3
🧹 Nitpick comments (7)
components/geckoboard/actions/append-to-dataset/append-to-dataset.mjs (1)
24-25: Add error handling for dataset retrieval

The additionalProps method calls `this.app.getDatasets()` without error handling. If the API call fails, the user won't get a clear error message. Consider adding try/catch:

```diff
  async additionalProps(existingProps) {
    const datasetId = this.datasetId?.value || this.datasetId;
-   const datasets = await this.app.getDatasets();
+   let datasets;
+   try {
+     datasets = await this.app.getDatasets();
+   } catch (error) {
+     console.error("Failed to retrieve datasets:", error);
+     return existingProps;
+   }
```

components/geckoboard/geckoboard.app.mjs (6)
7-11: Improve fields property definition and documentation

The fields property uses a JSON string format, which can be error-prone for users to manually format. Consider enhancing the documentation to provide clearer guidelines on valid field types, required attributes, and formatting requirements:

```diff
  fields: {
    type: "string",
    label: "fields",
-   description: "JSON containing the fields of the dataset, i.e.: `{ \"amount\": { \"type\": \"number\", \"name\": \"Amount\", \"optional\": false }, \"timestamp\": { \"type\": \"datetime\", \"name\": \"Date\" } }`",
+   description: "JSON containing the fields of the dataset. Each field requires a 'type' and 'name'. Valid types include: 'number', 'datetime', 'string', etc. Example: `{ \"amount\": { \"type\": \"number\", \"name\": \"Amount\", \"optional\": false }, \"timestamp\": { \"type\": \"datetime\", \"name\": \"Date\" } }`",
  },
```
34-50: Enhance the _makeRequest method with better error handling

The current implementation delegates error handling entirely to axios; adding specific error handling would improve debugging and the user experience:

```diff
  async _makeRequest(opts = {}) {
    const {
      $ = this,
      path,
      auth,
+     headers = {},
      ...otherOpts
    } = opts;
-   return axios($, {
-     ...otherOpts,
-     url: this._baseUrl() + path,
-     auth: {
-       username: `${this.$auth.api_key}`,
-       password: "",
-       ...auth,
-     },
-   });
+   try {
+     return await axios($, {
+       ...otherOpts,
+       url: this._baseUrl() + path,
+       headers: {
+         "Content-Type": "application/json",
+         ...headers,
+       },
+       auth: {
+         username: `${this.$auth.api_key}`,
+         password: "",
+         ...auth,
+       },
+     });
+   } catch (error) {
+     const status = error.response?.status;
+     const message = error.response?.data?.message || error.message;
+     throw new Error(`Geckoboard API error (${status}): ${message}`);
+   }
  },
```
51-59: Ensure data validation in appendToDataset method

The appendToDataset method currently does not validate the input data structure before sending it to the API. Consider adding validation to prevent errors:

```diff
  async appendToDataset({
    datasetId,
+   data,
    ...args
  }) {
+   if (!datasetId) {
+     throw new Error("Dataset ID is required");
+   }
+   if (!data || !Array.isArray(data.data)) {
+     throw new Error("Data must be provided with a 'data' array property");
+   }
    return this._makeRequest({
      path: `/datasets/${datasetId}/data`,
      method: "post",
+     data,
      ...args,
    });
  },
```
60-68: Enhance createDataset method with validation

Like appendToDataset, the createDataset method should validate the required parameters before making the API request:

```diff
  async createDataset({
    id,
+   fields,
    ...args
  }) {
+   if (!id) {
+     throw new Error("Dataset ID is required");
+   }
+   if (!fields) {
+     throw new Error("Fields definition is required");
+   }
    return this._makeRequest({
      path: `/datasets/${id}`,
      method: "put",
+     data: { fields },
      ...args,
    });
  },
```
69-77: Validate datasetId in deleteDataset method

The deleteDataset method should verify that a datasetId is provided before attempting the deletion:

```diff
  async deleteDataset({
    datasetId,
    ...args
  }) {
+   if (!datasetId) {
+     throw new Error("Dataset ID is required");
+   }
    return this._makeRequest({
      path: `/datasets/${datasetId}`,
      method: "delete",
      ...args,
    });
  },
```
1-85: Ensure proper documentation for API methods

The methods in this module lack JSDoc comments that would help users understand the expected parameters and return values. Consider adding comprehensive documentation for each method:

```diff
+/**
+ * Makes an authenticated request to the Geckoboard API
+ * @param {Object} opts - Request options
+ * @param {Object} [opts.$] - Context reference for axios
+ * @param {string} opts.path - API endpoint path
+ * @param {Object} [opts.auth] - Optional auth overrides
+ * @param {Object} [opts.headers] - Additional headers
+ * @returns {Promise<Object>} API response
+ */
 async _makeRequest(opts = {}) {
   // ...
 },

+/**
+ * Appends data to an existing dataset
+ * @param {Object} params - Parameters
+ * @param {string} params.datasetId - ID of the dataset to append to
+ * @param {Object} params.data - Data object with a 'data' array of records to append
+ * @returns {Promise<Object>} API response
+ */
 async appendToDataset({
   // ...
 }),

 // Add similar JSDoc comments for other methods
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)
- `pnpm-lock.yaml` is excluded by `!**/pnpm-lock.yaml`
📒 Files selected for processing (7)
- components/geckoboard/.gitignore (0 hunks)
- components/geckoboard/actions/append-to-dataset/append-to-dataset.mjs (1 hunks)
- components/geckoboard/actions/create-dataset/create-dataset.mjs (1 hunks)
- components/geckoboard/actions/delete-dataset/delete-dataset.mjs (1 hunks)
- components/geckoboard/app/geckoboard.app.ts (0 hunks)
- components/geckoboard/geckoboard.app.mjs (1 hunks)
- components/geckoboard/package.json (1 hunks)
💤 Files with no reviewable changes (2)
- components/geckoboard/.gitignore
- components/geckoboard/app/geckoboard.app.ts
⏰ Context from checks skipped due to timeout of 90000ms (3)
- GitHub Check: pnpm publish
- GitHub Check: Publish TypeScript components
- GitHub Check: Verify TypeScript components
🔇 Additional comments (8)
components/geckoboard/package.json (3)
3-3: LGTM: Version updated correctly

The version bump from 0.0.2 to 0.1.0 properly reflects the addition of new features according to semantic versioning principles.
15-16: LGTM: Dependencies added correctly

The addition of the @pipedream/platform dependency with the appropriate version constraint is necessary for the new functionality.
5-5: ✅ Verification successful
Verify main entry point path
The main entry point has been changed from "dist/app/geckoboard.app.mjs" to "geckoboard.app.mjs", which suggests a change in file structure. Please verify that this new path is correct and the file exists in the expected location.
🏁 Script executed:
```bash
#!/bin/bash
# Check if the main entry point file exists
if [ -f "components/geckoboard/geckoboard.app.mjs" ]; then
  echo "✅ Main entry point file exists at the specified path"
else
  echo "❌ Main entry point file not found at the specified path"
fi
```

Length of output: 168
Main entry point verified.

The file components/geckoboard/geckoboard.app.mjs exists at the specified path, confirming that the change from "dist/app/geckoboard.app.mjs" to "geckoboard.app.mjs" is correct.

components/geckoboard/actions/create-dataset/create-dataset.mjs (1)
1-35: LGTM: Well-structured action implementation

The action is well structured and follows the Pipedream component pattern correctly. It imports the app module, defines appropriate properties, and implements a run method that calls the necessary API function.
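For context, such an action roughly follows the pattern sketched below. The `run` body matches the snippet quoted later in this review; the key, version, and prop definitions here are abbreviated assumptions, not the file's exact contents.

```js
import app from "../../geckoboard.app.mjs";

export default {
  key: "geckoboard-create-dataset", // illustrative key
  name: "Create Dataset",
  type: "action",
  version: "0.0.1", // illustrative
  props: {
    app,
    id: {
      type: "string",
      label: "Dataset ID", // abbreviated; see the app module for the real definitions
    },
    fields: {
      // Reuses the app-level "fields" prop (the JSON string reviewed above)
      propDefinition: [app, "fields"],
    },
  },
  async run({ $ }) {
    const response = await this.app.createDataset({
      $,
      id: this.id,
      data: {
        fields: JSON.parse(this.fields),
      },
    });
    $.export("$summary", "Successfully created dataset");
    return response;
  },
};
```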
components/geckoboard/actions/delete-dataset/delete-dataset.mjs (1)
1-26: LGTM: Clean and well-structured implementation

The delete dataset action is implemented correctly, with proper error handling patterns through the use of the Pipedream `$` context, clear variable naming, and a well-structured API interaction.
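For illustration, the pattern described here would look roughly like this; the key, version, and summary wording are assumptions, while the `$` context pass-through and summary export follow the conventions noted above.

```js
import app from "../../geckoboard.app.mjs";

export default {
  key: "geckoboard-delete-dataset", // illustrative key
  name: "Delete Dataset",
  type: "action",
  version: "0.0.1", // illustrative
  props: {
    app,
    datasetId: {
      // Assumes the app module defines the datasetId prop with async options
      propDefinition: [app, "datasetId"],
    },
  },
  async run({ $ }) {
    // Pass $ through so errors surface in the step's execution context
    const response = await this.app.deleteDataset({
      $,
      datasetId: this.datasetId,
    });
    $.export("$summary", `Successfully deleted dataset ${this.datasetId}`);
    return response;
  },
};
```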
components/geckoboard/actions/append-to-dataset/append-to-dataset.mjs (2)

20-40: LGTM: Dynamic property generation based on dataset fields

The dynamic property generation based on the selected dataset's fields is a good approach that provides a better user experience by adapting the UI to the specific dataset structure.
60-68: LGTM: Proper API interaction

The method correctly calls the appendToDataset function with the required parameters and formats the data according to the expected structure.
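For reference, the API expects appended records wrapped in a `data` array (as the validation suggestion above also assumes), so a call built by this action would look roughly like the following. The dataset ID and field names are hypothetical.

```js
// Inside an action's run({ $ }) - dataset ID and field names are hypothetical
const response = await this.app.appendToDataset({
  $,
  datasetId: "sales.by_day",
  data: {
    // Records are wrapped in a "data" array in the request body
    data: [
      {
        amount: 120.5,
        timestamp: "2024-01-15T00:00:00Z",
      },
    ],
  },
});
```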
components/geckoboard/geckoboard.app.mjs (1)
78-83: ❌ Incorrect review comment
Add pagination support to getDatasets method.
The current implementation of getDatasets doesn't handle pagination, which could be an issue if there are many datasets. Geckoboard API likely supports pagination parameters.
🌐 Web query:
Does Geckoboard API support pagination for listing datasets?

💡 Result:
The Geckoboard Datasets API does not appear to support pagination for listing datasets based on the available documentation[1][5][7][16]. While the API includes features like rate limiting (30 requests per minute)[1] and record count limits (5,000 records per dataset)[1], there's no mention of pagination mechanisms like `page`, `offset`/`limit`, or cursor-based navigation when retrieving datasets.

Key observations:
- API documentation explicitly covers data handling operations (PUT/POST/DELETE) but doesn't list pagination parameters for dataset enumeration[1][5].
- Rate limiting is implemented at 30 requests per minute[1], suggesting datasets are designed to be managed in bulk rather than paginated chunks.
- Third-party integrations like Parabola and CData Connect Cloud discuss pagination for other APIs[3][20], but no examples show pagination being applied to Geckoboard dataset listings.
- Client libraries like the Python SDK[15] provide methods for dataset creation/modification but no paginated listing functionality.
For large-scale dataset management, the recommended approach is to:
- Use unique identifiers and timestamps for incremental updates[8][12]
- Adhere to the 5,000-record limit per dataset[1]
- Implement client-side batch processing if needed[16]
Developers requiring paginated access would need to implement custom logic using timestamps or unique keys, as the API doesn't provide native support for this feature[3][12].
Citations:
- 1: https://developer.geckoboard.com
- 2: https://support.geckoboard.com/en/articles/6055495-create-a-new-dataset-in-geckoboard
- 3: https://parabola.io/product/integration/api
- 4: https://support.geckoboard.com/en/articles/6055605-use-make-to-post-new-data-to-datasets-created-in-geckoboard
- 5: https://support.geckoboard.com/en/articles/10306899-how-to-use-the-datasets-api
- 6: https://cloud.cdata.com/docs/APIExample.html
- 7: https://www.geckoboard.com/academy/data-sources/datasets/
- 8: https://github.com/geckoboard/servicetitan-to-dataset
- 9: https://teamrelated.com/projects
- 10: https://support.geckoboard.com/en/articles/6055496-use-zapier-to-post-new-data-to-datasets-created-in-geckoboard
- 11: https://docs.senetco.io/dev/streaming/Geckoboard/
- 12: https://support.geckoboard.com/en/articles/6055500-track-more-metrics-using-parabola-with-geckoboard
- 13: https://support.geckoboard.com/en/articles/6174980-pre-built-collection-for-testing-datasets-api-and-custom-widgets
- 14: https://support.geckoboard.com/en/articles/6055498-track-more-metrics-using-make-with-geckoboard
- 15: https://github.com/geckoboard/geckoboard-python
- 16: https://support.geckoboard.com/en/articles/6055542-connect-data-to-geckoboard-from-any-source
- 17: https://support.geckoboard.com/en/articles/6055592-more-ways-to-get-data-from-your-database
- 18: https://support.geckoboard.com/en/articles/8304379-display-xero-metrics-in-geckoboard-using-datasets-and-zapier
- 19: https://support.geckoboard.com/en/collections/3715006-connect-to-our-datasets-api
- 20: https://www.cdata.com/kb/tech/api-cloud-geckoboard.rst
Geckoboard API: Pagination Not Supported
After reviewing the official Geckoboard documentation and recent discussions, it's clear that the Datasets API does not offer native pagination parameters (like `page`, `offset`, or cursor-based options) for listing datasets. The current implementation of getDatasets (in components/geckoboard/geckoboard.app.mjs, lines 78-83) is therefore aligned with the API's design.

- The API doesn't support native pagination—even though handling large datasets might be necessary, any pagination would need to be implemented client-side.
- Rather than adding unsupported pagination parameters, consider documenting this API limitation or applying custom batch-processing logic (see the sketch below) if large-data handling becomes an issue.
Please update the review request accordingly.
Likely an incorrect or invalid review comment.
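Since the API caps datasets at 5,000 records and offers no pagination, the client-side batch processing recommended above might look like the following sketch. The helper name and the 500-record chunk size are illustrative choices, not part of the component.

```js
// Hypothetical helper: append records in chunks rather than one oversized request.
// The 500-record batch size is an arbitrary illustrative choice; the dataset
// itself is still capped at 5,000 records by the API.
async function appendInBatches(app, $, datasetId, records, batchSize = 500) {
  for (let i = 0; i < records.length; i += batchSize) {
    const batch = records.slice(i, i + batchSize);
    await app.appendToDataset({
      $,
      datasetId,
      data: {
        data: batch,
      },
    });
  }
}
```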
```js
async run({ $ }) {
  const response = await this.app.createDataset({
    $,
    id: this.id,
    data: {
      fields: JSON.parse(this.fields),
    },
  });
  $.export("$summary", "Successfully created dataset");
  return response;
},
```
🛠️ Refactor suggestion
Add error handling for JSON parsing
The current implementation uses JSON.parse(this.fields) without any error handling. If the user provides invalid JSON, this will throw an uncaught exception.
Consider adding a try/catch block to handle potential JSON parsing errors:
```diff
  async run({ $ }) {
+   let parsedFields;
+   try {
+     parsedFields = JSON.parse(this.fields);
+   } catch (error) {
+     throw new Error(`Invalid JSON format for fields: ${error.message}`);
+   }
+
    const response = await this.app.createDataset({
      $,
      id: this.id,
      data: {
-       fields: JSON.parse(this.fields),
+       fields: parsedFields,
      },
    });
    $.export("$summary", "Successfully created dataset");
    return response;
  },
```
```js
for (const key of Object.keys(this)) {
  if (![
    "app",
    "datasetId",
  ].includes(key)) {
    let value = this[key];

    if (!isNaN(value)) {
      value = parseFloat(value);
    }

    data[key] = value;
  }
}
```
🛠️ Refactor suggestion
Improve property iteration and value conversion
The current approach of iterating over all properties of `this` could include methods or internal properties that shouldn't be part of the data payload. Additionally, the automatic conversion of any numeric string to a float might not be appropriate for all cases.
Consider a more targeted approach:
```diff
  async run({ $ }) {
    const data = {};
+
+   // Get dynamic field definitions
+   const props = await this.additionalProps({});
+   const fieldKeys = Object.keys(props);

-   for (const key of Object.keys(this)) {
-     if (!["app", "datasetId"].includes(key)) {
-       let value = this[key];
-
-       if (!isNaN(value)) {
-         value = parseFloat(value);
-       }
-
-       data[key] = value;
-     }
-   }
+   // Only process fields from the dataset schema
+   for (const key of fieldKeys) {
+     if (this[key] === undefined) continue;
+
+     let value = this[key];
+
+     // Convert numeric strings to numbers when appropriate
+     if (typeof value === 'string' && !isNaN(value) && value.trim() !== '') {
+       value = parseFloat(value);
+     }
+
+     data[key] = value;
+   }
```

This approach only processes fields that were explicitly defined by the dataset schema and provides more robust number conversion.
```js
async options() {
  const response = await this.getDatasets();
  const datasets = response.data;
  return datasets.map(({ id }) => ({
    value: id,
  }));
},
```
🛠️ Refactor suggestion
Add error handling to the datasetId options function.
The async options() function for datasetId does not include error handling. If the API call fails, it could lead to an uncaught exception.
```diff
 async options() {
-  const response = await this.getDatasets();
-  const datasets = response.data;
-  return datasets.map(({ id }) => ({
-    value: id,
-  }));
+  try {
+    const response = await this.getDatasets();
+    const datasets = response.data;
+    return datasets.map(({ id }) => ({
+      value: id,
+      label: id,
+    }));
+  } catch (error) {
+    console.error("Error fetching datasets:", error);
+    return [];
+  }
 },
```
Hi @lcaresia lgtm! Ready for QA!
/approve

/approve

@jcortes When you approve, can you merge too? I have to update this branch every time and then need another approval.
WHY

Summary by CodeRabbit

New Features
- Added Geckoboard actions to append data to a dataset, create a dataset, and delete a dataset.

Chores
- Replaced the legacy TypeScript app module with geckoboard.app.mjs and bumped the package version to 0.1.0.