ci: link validation in GH actions - Fix hashFiles() empty hash issue #6264

Open · wants to merge 1 commit into master
34 changes: 33 additions & 1 deletion .github/actions/validate-links/action.yml
@@ -30,12 +30,44 @@ outputs:
runs:
using: 'composite'
steps:
- name: Generate file-specific cache key
if: inputs.cache-enabled == 'true'
id: cache-key
shell: bash
run: |
# Create a hash based only on the files being validated
files="${{ inputs.files }}"
if [ -n "$files" ]; then
# Convert space-separated file list to array and process each file
file_list=($files)
file_data=""
for file in "${file_list[@]}"; do
if [ -f "$file" ]; then
# Get file modification time and size for hashing
file_info=$(ls -l "$file" | awk '{print $5, $6, $7, $8}')
Comment on lines +46 to +47
Copilot AI Jul 29, 2025

Using ls -l output for file metadata is fragile and platform-dependent. Consider using the stat command instead (stat -c "%s %Y" "$file" on the GNU/Linux runners, stat -f "%z %m" "$file" on BSD/macOS) for more reliable file size and modification time extraction.

Suggested change
# Get file modification time and size for hashing
file_info=$(ls -l "$file" | awk '{print $5, $6, $7, $8}')
# Get file size and modification time for hashing
file_info=$(stat -c "%s %Y" "$file")
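A hedged aside, separate from both the PR and Copilot's suggestion: the flags quoted in the comment text (stat -f "%z %m") are BSD/macOS syntax, while the suggested change uses the GNU form that GitHub's Ubuntu runners ship. If the action ever needed to work on both userlands, the lookup could be bridged with a sketch like this (hypothetical helper, not in the PR):

# Hypothetical helper, not in the PR: pick whichever stat dialect the runner has.
file_metadata() {
  local file="$1"
  if stat --version >/dev/null 2>&1; then
    stat -c '%s %Y' "$file"    # GNU coreutils (Ubuntu runners): size in bytes, mtime as epoch seconds
  else
    stat -f '%z %m' "$file"    # BSD/macOS stat: the same two fields
  fi
}

file_info=$(file_metadata "$file")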


file_data="${file_data}${file}:${file_info}\n"

Copilot AI Jul 29, 2025

The literal \n in the string concatenation may not work as expected in all shells. Use $'\n' or a proper newline character instead.

Suggested change
file_data="${file_data}${file}:${file_info}\n"
file_data="${file_data}${file}:${file_info}$'\n'"


fi
done

if [ -n "$file_data" ]; then
file_hash=$(echo -e "$file_data" | sha256sum | cut -d' ' -f1)
else
file_hash="no-files"
fi

echo "file-hash=$file_hash" >> $GITHUB_OUTPUT
echo "Generated cache key for files: $files"
echo "File hash: $file_hash"
else
echo "file-hash=no-files" >> $GITHUB_OUTPUT
fi

- name: Restore link validation cache
if: inputs.cache-enabled == 'true'
uses: actions/cache@v4
with:
path: .cache/link-validation
key: ${{ inputs.cache-key }}-${{ runner.os }}-${{ hashFiles('content/**/*.md', 'content/**/*.html') }}
key: ${{ inputs.cache-key }}-${{ runner.os }}-${{ steps.cache-key.outputs.file-hash }}
restore-keys: |
${{ inputs.cache-key }}-${{ runner.os }}-
${{ inputs.cache-key }}-
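Put together with the workflow change below (cache-key: link-validation-${{ matrix.product }}), the restore step searches keys in the order sketched here; actions/cache matches restore-keys by prefix and falls back to the most recently created cache that matches. The product name and hash are made-up examples, not values from this repo:

# Hypothetical lookup order for cache-key=link-validation-telegraf on a Linux runner:
link-validation-telegraf-Linux-9f2ac41e...   # exact key: same files, same size/mtime metadata
link-validation-telegraf-Linux-              # any earlier hash for this product on Linux
link-validation-telegraf-                    # any earlier cache for this product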
2 changes: 1 addition & 1 deletion .github/workflows/pr-link-validation.yml
@@ -119,7 +119,7 @@ jobs:
files: ${{ matrix.files || needs.setup.outputs.all-files }}
product-name: ${{ matrix.product }}
cache-enabled: ${{ matrix.cacheEnabled || 'true' }}
cache-key: link-validation-${{ hashFiles(matrix.files || needs.setup.outputs.all-files) }}
cache-key: link-validation-${{ matrix.product }}
timeout: 900

report:
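For context on this change: hashFiles() only accepts glob patterns relative to the workspace and returns an empty string when no files match. matrix.files is a runtime, space-separated list of concrete paths, which hashFiles() treats as a single literal pattern, so the old expression evaluated to '' and every matrix job ended up with the same key, link-validation-. Keying the workflow on matrix.product and computing the per-file hash inside the composite action (above) avoids the collision. A hypothetical step illustrating the behavior (paths are examples only):

# Hypothetical illustration, not part of this PR: hashFiles() resolves glob
# patterns relative to the workspace and returns '' when nothing matches.
- name: Compare hashFiles() inputs
  shell: bash
  run: |
    echo "glob pattern: '${{ hashFiles('content/**/*.md') }}'"             # 64-char hash when files match
    echo "path list:    '${{ hashFiles('content/a.md content/b.md') }}'"   # '': one literal path, matches nothing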
2 changes: 1 addition & 1 deletion content/example.md
@@ -5,7 +5,7 @@ weight: 1
related:
- /influxdb/v2/write-data/
- /influxdb/v2/write-data/quick-start
- https://influxdata.com, This is an external link
- https://github.com/influxdata/docs-v2, This is an external link
test_only: true # Custom parameter to indicate test-only content
---

103 changes: 13 additions & 90 deletions cypress.config.js
@@ -8,8 +8,8 @@ import {
initializeReport,
readBrokenLinksReport,
saveCacheStats,
saveValidationStrategy,
} from './cypress/support/link-reporter.js';
import { createCypressCacheTasks } from './cypress/support/link-cache.js';

export default defineConfig({
e2e: {
@@ -31,7 +31,7 @@ export default defineConfig({
}
});

// Register cache tasks
const cacheTasks = createCypressCacheTasks();

on('task', {
// Cache management tasks
...cacheTasks,

// Fetch the product list configured in /data/products.yml
getData(filename) {
return new Promise((resolve, reject) => {
@@ -93,6 +99,12 @@ export default defineConfig({
return initializeReport();
},

// Save cache statistics for the reporter
saveCacheStatsForReporter(stats) {
saveCacheStats(stats);
return null;
},

// Special case domains are now handled directly in the test without additional reporting
// This task is kept for backward compatibility but doesn't do anything special
reportSpecialCaseLink(linkData) {
@@ -180,95 +192,6 @@ export default defineConfig({
}
},

// Cache and incremental validation tasks
saveCacheStatistics(stats) {
try {
saveCacheStats(stats);
return true;
} catch (error) {
console.error(`Error saving cache stats: ${error.message}`);
return false;
}
},

saveValidationStrategy(strategy) {
try {
saveValidationStrategy(strategy);
return true;
} catch (error) {
console.error(`Error saving validation strategy: ${error.message}`);
return false;
}
},

runIncrementalValidation(filePaths) {
return new Promise(async (resolve, reject) => {
try {
console.log('Loading incremental validator module...');

// Use CommonJS require for better compatibility
const {
IncrementalValidator,
} = require('./.github/scripts/incremental-validator.cjs');
console.log('✅ Incremental validator loaded successfully');

const validator = new IncrementalValidator();
const results = await validator.validateFiles(filePaths);
resolve(results);
} catch (error) {
console.error(`Incremental validation error: ${error.message}`);
console.error(`Stack: ${error.stack}`);

// Don't fail the entire test run due to cache issues
// Fall back to validating all files
console.warn('Falling back to validate all files without cache');
resolve({
validationStrategy: {
unchanged: [],
changed: filePaths.map((filePath) => ({
filePath,
fileHash: 'unknown',
links: [],
})),
newLinks: [],
total: filePaths.length,
},
filesToValidate: filePaths.map((filePath) => ({
filePath,
fileHash: 'unknown',
})),
cacheStats: {
totalFiles: filePaths.length,
cacheHits: 0,
cacheMisses: filePaths.length,
hitRate: 0,
},
});
}
});
},

cacheValidationResults(filePath, fileHash, results) {
return new Promise(async (resolve, reject) => {
try {
const {
IncrementalValidator,
} = require('./.github/scripts/incremental-validator.cjs');
const validator = new IncrementalValidator();
const success = await validator.cacheResults(
filePath,
fileHash,
results
);
resolve(success);
} catch (error) {
console.error(`Cache validation results error: ${error.message}`);
// Don't fail if caching fails - just continue without cache
resolve(false);
}
});
},

filePathToUrl(filePath) {
return new Promise(async (resolve, reject) => {
try {