
refactor(source-shiftbase): Migrate to manifest-only YAML connector#72899

Open
Aaron ("AJ") Steers (aaronsteers) wants to merge 42 commits into master from
devin/1770321883-migrate-source-shiftbase-manifest-only

Conversation


@aaronsteers Aaron ("AJ") Steers (aaronsteers) commented Feb 5, 2026

This PR targets the following PR:


What

Migrates the Shiftbase source connector from a Python-based implementation to a manifest-only declarative YAML connector, removing all custom Python code in favor of a single manifest.yaml.

Additionally adds an optional schedule_report_end_date spec field for the Schedule Detail Report stream to support fetching future scheduled records (defaults to 1 month into the future).

Requested by Aaron ("AJ") Steers (@aaronsteers). Link to Devin run.

How

  • Replaced source_shiftbase/source.py, streams.py, and all Python infrastructure with a declarative manifest.yaml
  • Multi-account support via ListPartitionRouter iterating over config.accounts array
  • Parent-child stream dependencies using SubstreamPartitionRouter (departments→employees, employees→employee_time_distribution)
  • Child streams access the account's access_token via the parent_slice chain (e.g. {{ stream_partition.parent_slice.account.access_token }} for employees, {{ stream_partition.parent_slice.parent_slice.account.access_token }} for employee_time_distribution)
  • account_id propagated to child streams via extra_fields in ParentStreamConfig (departments→employees→employee_time_distribution)
  • Report streams (timesheet_detail, schedule_detail) use DatetimeBasedCursor with start_time_option/end_time_option injecting from/to into POST body
  • employee_time_distribution uses DatetimeBasedCursor with datetime_format: "%Y", step: P1Y, and start_time_option to inject the year request parameter directly
  • All stream schemas extracted to a centralized definitions.schemas section with $ref pointers for easier review
  • Composite primary keys include account_id (for streams that have it in their schema) or account_name (for streams using ListPartitionRouter without account_id) to ensure uniqueness across multi-account syncs
  • Updated metadata.yaml to use source-declarative-manifest:7.6.5 base image with language:manifest-only and cdk:low-code tags
  • Removed all Python files: source_shiftbase/, main.py, pyproject.toml, poetry.lock, requirements.txt, unit_tests/, integration_tests/
  • Updated acceptance-test-config.yml with bypass reasons for all test types (no test credentials available)
  • Merged connector-level docs into docs/integrations/sources/shiftbase.md
  • Added optional schedule_report_end_date spec field (format: YYYY-MM-DD) used by schedule_detail_report stream as end_datetime; defaults to day_delta(30, '%Y-%m-%d') (approximately 1 month into the future) when not provided
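
The multi-account and parent-child routing described above can be sketched roughly as follows. This is an illustrative fragment only, assuming the low-code CDK's standard component shapes; the exact keys, paths, and nesting in the real manifest.yaml may differ:

```yaml
definitions:
  departments_stream:
    type: DeclarativeStream
    name: departments
    retriever:
      type: SimpleRetriever
      partition_router:
        # One partition per entry in the user-supplied accounts array
        type: ListPartitionRouter
        values: "{{ config['accounts'] }}"
        cursor_field: account
  employees_stream:
    type: DeclarativeStream
    name: employees
    retriever:
      type: SimpleRetriever
      requester:
        type: HttpRequester
        request_headers:
          # Token resolved from the parent partition's account object
          Authorization: "API {{ stream_partition.parent_slice.account.access_token }}"
      partition_router:
        type: SubstreamPartitionRouter
        parent_stream_configs:
          - type: ParentStreamConfig
            stream: "#/definitions/departments_stream"
            parent_key: id
            partition_field: department_id
            extra_fields:
              # Propagates account_id from parent records into child slices
              - ["account_id"]
```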

Review guide

⚠️ This has NOT been validated against the live API (no test credentials available). Careful structural review is needed.

Items resolved during review:

  • PKs validated: All streams now have composite primary keys with account_id or account_name for multi-account uniqueness. employees PK includes department_id. employees_report has no PK (userId not unique across daily syncs).
  • Incremental sync cleanup: end_datetime removed from all incremental syncs (CDK defaults to now). start_time_option/end_time_option used for request param injection instead of manual interpolation.
  • account_id propagation: Child streams (employees, employee_time_distribution) receive account_id from parent via extra_fields.
  • Schema refactoring: All 10 inline schemas extracted to definitions.schemas section with $ref references.
  • employee_time_distribution: datetime_format changed to "%Y" with start_time_option injecting the year param directly instead of string slicing.
  • format: date added to start_date spec field for date picker UI.

Updates since last revision:

  • schedule_report_end_date spec field added: Optional config field explicitly scoped to the Schedule Detail Report stream. When provided, it sets the end_datetime for the schedule_detail_report cursor. When omitted, defaults to 30 days from the current date via day_delta(30, '%Y-%m-%d'). This enables fetching future scheduled records. Field was named schedule_report_end_date (rather than generic end_date) to make its stream-specific scope clear.
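
A minimal sketch of how that end boundary can be expressed as a single interpolated expression. The `end_datetime` expression is the one quoted in this PR; the `cursor_field` and surrounding keys are assumptions, not copied from the actual manifest:

```yaml
incremental_sync:
  type: DatetimeBasedCursor
  cursor_field: date
  datetime_format: "%Y-%m-%d"
  start_datetime: "{{ config['start_date'] }}"
  # Falls back to 30 days from today when the optional field is omitted,
  # so future scheduled records are included by default.
  end_datetime: "{{ config.get('schedule_report_end_date', day_delta(30, '%Y-%m-%d')) }}"
```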

Human review checklist:

  1. manifest.yaml — parent_slice chain for grandchild streams

    • employee_time_distribution uses {{ stream_partition.parent_slice.parent_slice.account.access_token }} (two levels deep). This pattern relies on the CDK propagating the full partition context through nested SubstreamPartitionRouter layers. Untested live.
  2. manifest.yaml — Report stream schemas

    • The connector sends export: json in POST body, which returns flat JSON records (not the raw columns/rows format shown by default in API docs). Schemas were transcribed from the original Python connector. Verify field accuracy if possible.
  3. manifest.yaml — _year_partition synthetic cursor

    • Uses a synthetic cursor field that doesn't exist in the API response — added via AddFields transformation. Verify this pattern works for year-by-year iteration without breaking state management.
  4. manifest.yaml — employees_report has no PK

    • userId is not unique across daily syncs, so no primary key is defined. This means no deduplication in append+dedup mode.
  5. README.md

    • Still references Python/Poetry development workflow (inherited from base PR). Should be updated to reflect manifest-only development.
  6. Cloud UI — accounts field not highlighted as required

    • The schema is correct (required: [accounts, start_date]), but the Cloud UI may not display the required indicator for array-type fields with nested objects. This appears to be a UI rendering behavior, not a schema issue.
  7. manifest.yaml — schedule_report_end_date / day_delta Jinja expression

    • schedule_detail_report uses {{ config.get('schedule_report_end_date', day_delta(30, '%Y-%m-%d')) }} for end_datetime. Verify that config.get() and day_delta with a format argument resolve correctly in the CDK's Jinja interpolation context. Note that 30 days is an approximation of "1 month". Untested live.
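
For reviewers of item 3 above, the synthetic-cursor pattern looks roughly like this. A hedged sketch only: the AddFields value expression and the cursor keys are assumptions about how the manifest wires the transformation to the year-based cursor, not verbatim manifest contents:

```yaml
transformations:
  - type: AddFields
    fields:
      # Synthesize a cursor field that is absent from the API response,
      # using the start of the current one-year slice
      - path: ["_year_partition"]
        value: "{{ stream_interval.start_time }}"
incremental_sync:
  type: DatetimeBasedCursor
  cursor_field: _year_partition
  datetime_format: "%Y"
  step: P1Y
  cursor_granularity: P1Y
  start_time_option:
    # Inject the slice's year directly as the `year` request parameter
    type: RequestOption
    inject_into: request_parameter
    field_name: year
```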

User Impact

  • Connector behavior should remain the same (same streams, same sync modes)
  • Reduced maintenance burden with manifest-only approach
  • No Python dependencies to manage
  • New optional schedule_report_end_date config field allows the Schedule Detail Report stream to fetch future scheduled records (defaults to 1 month ahead)

Can this PR be safely reverted and rolled back?

  • YES 💚

Link to Devin session: https://app.devin.ai/sessions/f6d1ddc1eb074fddbbb41a42271999be

@devin-ai-integration

🤖 Devin AI Engineer

I'll be helping with this pull request! Here's what you should know:

✅ I will automatically:

  • Address comments on this PR. Add '(aside)' to your comment to have me ignore it.
  • Look at CI failures and help fix them

Note: I can only respond to comments from users who have write access to this repository.

⚙️ Control Options:

  • Disable automatic comment and CI monitoring

@github-actions

github-actions bot commented Feb 5, 2026

👋 Greetings, Airbyte Team Member!

Here are some helpful tips and reminders for your convenience.


PR Slash Commands

Airbyte Maintainers (that's you!) can execute the following slash commands on your PR:

  • 🛠️ Quick Fixes
    • /format-fix - Fixes most formatting issues.
    • /bump-version - Bumps connector versions, scraping changelog description from the PR title.
  • ❇️ AI Testing and Review:
    • /ai-docs-review - Provides AI-powered documentation recommendations for PRs with connector changes.
    • /ai-prove-fix - Runs prerelease readiness checks, including testing against customer connections.
    • /ai-canary-prerelease - Rolls out prerelease to 5-10 connections for canary testing.
    • /ai-review - AI-powered PR review for connector safety and quality gates.
  • 🚀 Connector Releases:
    • /publish-connectors-prerelease - Publishes pre-release connector builds (tagged as {version}-preview.{git-sha}) for all modified connectors in the PR.
    • /bump-progressive-rollout-version - Bumps connector version with an RC suffix (2.16.10-rc.1) for progressive rollouts (enableProgressiveRollout: true).
      • Example: /bump-progressive-rollout-version changelog="Add new feature for progressive rollout"
  • ☕️ JVM connectors:
    • /update-connector-cdk-version connector=<CONNECTOR_NAME> - Updates the specified connector to the latest CDK version.
      Example: /update-connector-cdk-version connector=destination-bigquery
    • /bump-bulk-cdk-version bump=patch changelog='foo' - Bump the Bulk CDK's version. bump can be major/minor/patch.
  • 🐍 Python connectors:
    • /poe connector source-example lock - Run the Poe lock task on the source-example connector, committing the results back to the branch.
    • /poe source example lock - Alias for /poe connector source-example lock.
    • /poe source example use-cdk-branch my/branch - Pin the source-example CDK reference to the branch name specified.
    • /poe source example use-cdk-latest - Update the source-example CDK dependency to the latest available version.
  • ⚙️ Admin commands:
    • /force-merge reason="<REASON>" - Force merges the PR using admin privileges, bypassing CI checks. Requires a reason.
      Example: /force-merge reason="CI is flaky, tests pass locally"


### Prerequisites

* Python (`^3.9`)

[markdownlint] reported by reviewdog 🐶
MD004/ul-style Unordered list style [Expected: dash; Actual: asterisk]

### Prerequisites

* Python (`^3.9`)
* Poetry (`^1.7`) - installation instructions [here](https://python-poetry.org/docs/#installation)

[markdownlint] reported by reviewdog 🐶
MD004/ul-style Unordered list style [Expected: dash; Actual: asterisk]

* Python (`^3.9`)
* Poetry (`^1.7`) - installation instructions [here](https://python-poetry.org/docs/#installation)



[markdownlint] reported by reviewdog 🐶
MD012/no-multiple-blanks Multiple consecutive blank lines [Expected: 1; Actual: 2]

* Poetry (`^1.7`) - installation instructions [here](https://python-poetry.org/docs/#installation)




[markdownlint] reported by reviewdog 🐶
MD012/no-multiple-blanks Multiple consecutive blank lines [Expected: 1; Actual: 3]

### Installing the connector

From this connector directory, run:
```bash

[markdownlint] reported by reviewdog 🐶
MD031/blanks-around-fences Fenced code blocks should be surrounded by blank lines [Context: "```bash"]

### Running as a docker container

Then run any of the connector commands as follows:
```

[markdownlint] reported by reviewdog 🐶
MD031/blanks-around-fences Fenced code blocks should be surrounded by blank lines [Context: "```"]

### Running as a docker container

Then run any of the connector commands as follows:
```

[markdownlint] reported by reviewdog 🐶
MD040/fenced-code-language Fenced code blocks should have a language specified [Context: "```"]


### Dependency Management

All of your dependencies should be managed via Poetry.

[markdownlint] reported by reviewdog 🐶
MD009/no-trailing-spaces Trailing spaces [Expected: 0 or 2; Actual: 1]

## Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-shiftbase test`

[markdownlint] reported by reviewdog 🐶
MD032/blanks-around-lists Lists should be surrounded by blank lines [Context: "1. Make sure your changes are ..."]


You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-shiftbase test`
2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)):

[markdownlint] reported by reviewdog 🐶
MD009/no-trailing-spaces Trailing spaces [Expected: 0 or 2; Actual: 1]

Comment on lines +28 to +31
* Python (`^3.9`)
* Poetry (`^1.7`) - installation instructions [here](https://python-poetry.org/docs/#installation)



[markdownlint-fix] reported by reviewdog 🐶

Suggested change
```diff
-* Python (`^3.9`)
-* Poetry (`^1.7`) - installation instructions [here](https://python-poetry.org/docs/#installation)
+- Python (`^3.9`)
+- Poetry (`^1.7`) - installation instructions [here](https://python-poetry.org/docs/#installation)
```

### Installing the connector

From this connector directory, run:
```bash

[markdownlint-fix] reported by reviewdog 🐶

Suggested change
```bash
```bash

poetry install --with dev
```



[markdownlint-fix] reported by reviewdog 🐶

Suggested change

Note that any directory named `secrets` is gitignored across the entire Airbyte repo, so there is no danger of accidentally checking in sensitive information.
See `sample_files/sample_config.json` for a sample config file.



[markdownlint-fix] reported by reviewdog 🐶

Suggested change


1. Install [`airbyte-ci`](https://github.com/airbytehq/airbyte/blob/master/airbyte-ci/connectors/pipelines/README.md)
2. Run the following command to build the docker image:
```bash

[markdownlint-fix] reported by reviewdog 🐶

Suggested change
```bash
```bash


An image will be available on your host with the tag `airbyte/source-shiftbase:dev`.



[markdownlint-fix] reported by reviewdog 🐶

Suggested change

### Running as a docker container

Then run any of the connector commands as follows:
```

[markdownlint-fix] reported by reviewdog 🐶

Suggested change
```
```


### Dependency Management

All of your dependencies should be managed via Poetry.

[markdownlint-fix] reported by reviewdog 🐶

Suggested change
All of your dependencies should be managed via Poetry.
All of your dependencies should be managed via Poetry.

## Publishing a new version of the connector

You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-shiftbase test`

[markdownlint-fix] reported by reviewdog 🐶

Suggested change
1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-shiftbase test`
1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-shiftbase test`


You've checked out the repo, implemented a million dollar feature, and you're ready to share your changes with the world. Now what?
1. Make sure your changes are passing our test suite: `airbyte-ci connectors --name=source-shiftbase test`
2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)):

[markdownlint-fix] reported by reviewdog 🐶

Suggested change
2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)):
2. Bump the connector version (please follow [semantic versioning for connectors](https://docs.airbyte.com/contributing-to-airbyte/resources/pull-requests-handbook/#semantic-versioning-for-connectors)):

…config

Co-Authored-By: AJ Steers <aj@airbyte.io>
@github-actions

github-actions bot commented Feb 5, 2026

source-shiftbase Connector Test Results

3 tests in 1 suite (1 file): 1 ✅ passed, 2 💤 skipped, 0 ❌ failed, in 3s ⏱️

Results for commit 2048581.

♻️ This comment has been updated with latest results.

@github-actions

github-actions bot commented Feb 5, 2026

Deploy preview for airbyte-docs ready!

✅ Preview
https://airbyte-docs-b2up8erz5-airbyte-growth.vercel.app

Built with commit 2048581.
This pull request is being automatically deployed with vercel-action

@darynaishchenko

For streams that have account_id in their record schemas, use account_id in the PK list instead of account_name.

…ccount_id in schema

Co-Authored-By: AJ Steers <aj@airbyte.io>
@devin-ai-integration

Daryna Ishchenko (@darynaishchenko) Done — updated PKs to use account_id instead of account_name for the 6 streams that have account_id in their schema:

  • departments: [id, account_id]
  • employees: [id, account_id]
  • employee_time_distribution: [employeeId, year, account_id]
  • absentees: [id, account_id]
  • shifts: [id, account_id]
  • users: [id, account_id]

The remaining streams (availabilities, timesheet_detail_report, schedule_detail_report) don't have account_id in their schema so they still use account_name.


Devin session

…_id to employees PK, add _year_partition doc comment

Co-Authored-By: AJ Steers <aj@airbyte.io>
@devin-ai-integration

Re: Comment 65 (employees_report schema vs API docs)

The API docs at https://developer.shiftbase.com/docs/core/4d05f64e94419-employees-report show two response formats:

  1. Raw format (default): Returns {data: {columns: [...], rows: [...]}} — a tabular format with column metadata and row arrays
  2. JSON format: Returns a flat array of objects with named fields like {userId, userName, userEmployeeNr, year, weeknumber, start, end}

Our connector sends export: json in the request body (line 407 of manifest.yaml), so it receives the JSON format — not the raw/tabular format that the docs show by default. The schema in our manifest matches the JSON response format.

The JSON example in the docs only shows 7 fields (userId, userName, userEmployeeNr, year, weeknumber, start, end), but the actual API likely returns more fields when no columns filter is specified. The additional fields in our schema (like endContractLocation, userCurrentStatus, userHireDate, etc.) were defined in the original Python connector (PR #72829) and were presumably based on actual API responses during development.

Without test credentials to verify, I've kept the schema as-is from the original connector. If you'd like, I can trim it to only the 7 fields shown in the docs example, but that might lose valid data fields.


Devin session

@devin-ai-integration

Re: Comment 67 (timesheet_detail_report schema vs API docs)

Same situation as the employees_report — the API docs at https://developer.shiftbase.com/docs/core/5612d41bb72b1-timesheet-detail-report show the "Raw format" response by default (columns/rows tabular structure), but our connector sends export: json in the request body, which returns flat JSON records with named fields like {userId, userEmployeeNr, timesheetDate, timesheetStarttime, ...}.

The schema in our manifest matches the JSON response format, not the raw format shown in the docs. The fields were transcribed from the original Python connector (PR #72829) which was presumably tested against the actual API.

The API docs have a "JSON format" response example toggle (next to "Raw format") that shows the flat record format matching our schema.


Devin session

@devin-ai-integration

Re: Comment 69 (accounts field not highlighted as required in Cloud UI)

I investigated the spec and the schema declaration is correct:

```yaml
spec:
  type: Spec
  connection_specification:
    $schema: http://json-schema.org/draft-07/schema#
    title: Shiftbase Spec
    type: object
    required:
    - accounts
    - start_date
    properties:
      accounts:
        type: array
        ...
```

The required: [accounts, start_date] is properly placed at the connection_specification level alongside properties, which is valid JSON Schema.

This is likely a UI rendering behavior specific to array-type fields with nested object items. The Cloud UI / Connector Builder may not display the "required" asterisk for complex array fields the same way it does for simple scalar fields (string, number, etc.). The webapp code that handles this lives in airbyte-platform-internal, so the fix would need to be there if it's indeed a UI rendering gap.

The schema itself is correct — the field IS required and validation should still enforce it even if the visual indicator is missing. If this is a known issue, it might be worth filing a bug against the platform UI.


Devin session

- - id
- - department_id
- - account_id
retriever:

can we use /definitions/base_requester here and overwrite only the differing parts? also apply this refactoring to the other streams that don't use /definitions/base_requester

…ribution streams

Co-Authored-By: AJ Steers <aj@airbyte.io>
@devin-ai-integration

Daryna Ishchenko (@darynaishchenko) Re: using $ref for base_requester — Done in commit 334730c.

Refactored employees_stream and employee_time_distribution_stream to use $ref: "#/definitions/base_requester" with only the differing parts overridden (path, request_headers for auth, request_parameters). This removes the duplicated error_handler, url_base, http_method, and type blocks from both streams.

All other streams already used $ref: "#/definitions/base_requester".
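
The $ref-with-overrides pattern described here can be illustrated as below. The url_base, path, and header values are placeholders for illustration, not copied from the actual manifest:

```yaml
definitions:
  base_requester:
    type: HttpRequester
    url_base: https://api.shiftbase.com/api
    http_method: GET
  employees_requester:
    $ref: "#/definitions/base_requester"
    # Only the differing parts are overridden; the rest is inherited
    path: employees
    request_headers:
      Authorization: "API {{ stream_partition.parent_slice.account.access_token }}"
```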


Devin session


@devin-ai-integration devin-ai-integration bot left a comment


Devin Review found 1 new potential issue.

View 17 additional findings in Devin Review.


Comment on lines +262 to +264
primary_key:
- - id
- - account_name

🔴 availabilities_stream uses mutable user-provided account_name in primary key instead of stable account_id

The availabilities_stream primary key is [id, account_name] (line 263-264), but account_name is a user-provided configuration string (manifest.yaml:1604-1607), not a stable API-returned identifier. Every other standard GET-based stream in this connector uses account_id (an API-returned field) in its composite primary key: departments_stream (manifest.yaml:57-58), absentees_stream (manifest.yaml:218-219), shifts_stream (manifest.yaml:307-308), users_stream (manifest.yaml:338-339). If the user changes the account_name value in their config between syncs, records from the same availability will produce different primary keys, breaking deduplication and causing duplicates in the destination. The Shiftbase API consistently returns account_id in its standard entity endpoints (as shown by all other streams' schemas), so availabilities almost certainly does too — but the schema (manifest.yaml:949-992) omits account_id entirely.

Prompt for agents
In airbyte-integrations/connectors/source-shiftbase/manifest.yaml, fix the availabilities_stream to use account_id instead of account_name in its primary key, consistent with all other standard GET streams:

1. At line 262-264, change the primary_key from [id, account_name] to [id, account_id]
2. In the availabilities_stream schema (lines 949-992), add an account_id property of type [null, string]
3. Verify that the Shiftbase availabilities API endpoint returns account_id in the Availability object (consistent with departments, absentees, shifts, users endpoints)


The observation is valid in principle — account_name is user-provided and mutable, so account_id would be more stable for deduplication.

However, the availabilities schema currently doesn't include account_id because we haven't confirmed the Shiftbase /availabilities endpoint returns it. During the earlier review cycle, Daryna Ishchenko (@darynaishchenko) instructed us to use account_id in PKs for streams where it exists in the schema — and we did that for departments, absentees, shifts, and users. Availabilities was left with account_name because its schema doesn't have account_id.

Adding account_id to the schema without API verification risks referencing a field that doesn't exist in the response, which would produce null PK values. Deferring to the human reviewer on whether to add it based on API knowledge.


Devin session

@github-actions

github-actions bot commented Mar 6, 2026

Pre-release Connector Publish Started

Publishing pre-release build for connector source-shiftbase.
PR: #72899

Pre-release versions will be tagged as {version}-preview.86cac29
and are available for version pinning via the scoped_configuration API.

View workflow run
Pre-release Publish: FAILED

Docker image (pre-release):
airbyte/source-shiftbase:0.0.1-preview.86cac29

Docker Hub: https://hub.docker.com/layers/airbyte/source-shiftbase/0.0.1-preview.86cac29

Registry JSON:

@github-actions

github-actions bot commented Mar 6, 2026

Pre-release Connector Publish Started

Publishing pre-release build for connector source-shiftbase.
PR: #72899

Pre-release versions will be tagged as {version}-preview.7e3b5f2
and are available for version pinning via the scoped_configuration API.

View workflow run
Pre-release Publish: FAILED

Docker image (pre-release):
airbyte/source-shiftbase:0.0.1-preview.7e3b5f2

Docker Hub: https://hub.docker.com/layers/airbyte/source-shiftbase/0.0.1-preview.7e3b5f2

Registry JSON:

@DanyloGL

This PR looks good, but account details are still not marked as required during source setup on Cloud. Trying to resolve it. Devin created a new PR to fix it here.

@devin-ai-integration

❌ Cannot revive Devin session - the session is too old. Please start a new session instead.


@devin-ai-integration devin-ai-integration bot left a comment


Devin Review found 1 new potential issue.

View 17 additional findings in Devin Review.


Comment on lines +363 to +365
employees_report_stream:
type: DeclarativeStream
name: employees_report

🟡 employees_report_stream is missing primary_key, preventing dedup sync modes

The employees_report_stream (line 363) does not define a primary_key, unlike all other 9 streams in the manifest. The schema at manifest.yaml:1168-1239 includes userId which is a natural unique identifier for employee reports. Without a primary_key, this stream cannot support Incremental Append+Deduped or Full Refresh Overwrite with deduplication. Given that every other stream in the connector defines a primary key, this appears to be an oversight.

Suggested change
```diff
 employees_report_stream:
   type: DeclarativeStream
   name: employees_report
+  primary_key:
+  - - userId
+  - - account_name
```

@darynaishchenko

Daryna Ishchenko (darynaishchenko) commented Mar 17, 2026

/publish-connectors-prerelease

Pre-release Connector Publish Started

Publishing pre-release build for connector source-shiftbase.
PR: #72899

Pre-release versions will be tagged as {version}-preview.94cf612
and are available for version pinning via the scoped_configuration API.

View workflow run
Pre-release Publish: FAILED

Docker image (pre-release):
airbyte/source-shiftbase:0.0.1-preview.94cf612

Docker Hub: https://hub.docker.com/layers/airbyte/source-shiftbase/0.0.1-preview.94cf612

Registry JSON:

devin-ai-integration bot and others added 2 commits March 27, 2026 17:07
…eport stream

Add optional end_date config field used by the schedule_detail_report stream
to fetch future scheduled records. Defaults to 1 month (30 days) into the
future via day_delta(30) when not provided by the user.

Co-Authored-By: gl_anatolii.yatsuk <gl_anatolii.yatsuk@airbyte.io>
Rename the spec field to schedule_report_end_date to make it explicit
that this parameter is only used by the schedule_detail_report stream.

Co-Authored-By: gl_anatolii.yatsuk <gl_anatolii.yatsuk@airbyte.io>

6 participants