content/actions/reference/workflows-and-actions/workflow-syntax.md
Lines changed: 0 additions & 2 deletions
@@ -1002,8 +1002,6 @@ All `include` combinations are processed after `exclude`. This allows you to use
## `jobs.<job_id>.strategy.fail-fast`
`jobs.<job_id>.strategy.fail-fast`applies to the entire matrix. If `jobs.<job_id>.strategy.fail-fast` is set to `true` or its expression evaluates to `true`, {% data variables.product.github %} will cancel all in-progress and queued jobs in the matrix if any job in the matrix fails. This property defaults to `true`.
{% data reusables.actions.jobs.section-using-a-build-matrix-for-your-jobs-failfast %}
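The fail-fast behavior described above can be illustrated with a minimal workflow sketch. The job name and matrix values here are hypothetical, not taken from the documentation:

```yaml
# Hypothetical workflow illustrating jobs.<job_id>.strategy.fail-fast.
name: matrix-example
on: push

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      # With fail-fast: true (the default), one failing matrix job cancels
      # all other in-progress and queued jobs in the matrix.
      # Set to false to let every combination run to completion.
      fail-fast: false
      matrix:
        node: [18, 20, 22]
    steps:
      - run: echo "Testing on Node ${{ matrix.node }}"
```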
content/code-security/code-scanning/creating-an-advanced-setup-for-code-scanning/codeql-code-scanning-for-compiled-languages.md
Lines changed: 16 additions & 1 deletion
@@ -267,7 +267,22 @@ If you added manual build steps for compiled languages and {% data variables.pro
When you enable default setup for a repository that contains C/C++ code, the build mode is set to `none` automatically.
-> [!NOTE] Support of build mode `none` for C/C++ codebases is currently in {% data variables.release-phases.public_preview %} and subject to change.
+### No build for C/C++
+{% data variables.product.prodname_codeql %} will infer C/C++ compilation units from source file extensions. For each source file found, compilation flags and include paths are inferred by inspecting the codebase, without the need for a working build command.
+#### Accuracy of no build analysis for C/C++
+Creating a {% data variables.product.prodname_codeql %} C/C++ database without a build may produce less accurate results than using `autobuild` or manual build steps in some cases; for example, if:
+* The code depends heavily on custom macros/defines not available in existing headers
+* The codebase has many external dependencies
+You can ensure a more accurate analysis by taking the following steps:
+* Place custom macros and defines in header files that are included in relevant source files
+* Ensure external dependencies (headers) are available in system include directories or in the workspace
+* Run the extraction on the target platform. For example, choose a Windows runner to analyze Windows projects, giving access to platform-specific headers and compilers
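A no-build C/C++ analysis along these lines can be sketched as an advanced-setup workflow. This is a hedged example assuming the `github/codeql-action` `init` step's `languages` and `build-mode` inputs; action version pins and the runner choice are illustrative:

```yaml
# Sketch: CodeQL advanced setup for C/C++ with build mode `none`.
name: codeql-no-build
on: push

jobs:
  analyze:
    runs-on: ubuntu-latest
    permissions:
      security-events: write
    steps:
      - uses: actions/checkout@v4
      - uses: github/codeql-action/init@v3
        with:
          languages: c-cpp
          # No working build command is required: compilation units,
          # flags, and include paths are inferred from the codebase.
          build-mode: none
      - uses: github/codeql-action/analyze@v3
```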
content/copilot/reference/ai-models/model-hosting.md
Lines changed: 2 additions & 1 deletion
@@ -35,6 +35,7 @@ When using OpenAI's models, input requests and output responses continue to run
Used for:
+* {% data variables.copilot.copilot_claude_haiku_45 %}
* {% data variables.copilot.copilot_claude_sonnet_45 %}
* {% data variables.copilot.copilot_claude_opus_41 %}
* {% data variables.copilot.copilot_claude_opus %}
@@ -43,7 +44,7 @@ Used for:
* {% data variables.copilot.copilot_claude_sonnet_37 %} Thinking
* {% data variables.copilot.copilot_claude_sonnet_40 %}
-{% data variables.copilot.copilot_claude_opus_41 %} is hosted by Anthropic PBC. {% data variables.copilot.copilot_claude_opus %} and {% data variables.copilot.copilot_claude_sonnet_40 %} are hosted by Anthropic PBC and Google Cloud Platform. {% data variables.copilot.copilot_claude_sonnet_45 %} and {% data variables.copilot.copilot_claude_sonnet_37 %} are hosted by Amazon Web Services, Anthropic PBC, and Google Cloud Platform. {% data variables.copilot.copilot_claude_sonnet_35 %} is hosted exclusively by Amazon Web Services. {% data variables.product.github %} has provider agreements in place to ensure data is not used for training. Additional details for each provider are included below:
+{% data variables.copilot.copilot_claude_haiku_45 %} and {% data variables.copilot.copilot_claude_opus_41 %} are hosted by Anthropic PBC. {% data variables.copilot.copilot_claude_opus %} and {% data variables.copilot.copilot_claude_sonnet_40 %} are hosted by Anthropic PBC and Google Cloud Platform. {% data variables.copilot.copilot_claude_sonnet_45 %} and {% data variables.copilot.copilot_claude_sonnet_37 %} are hosted by Amazon Web Services, Anthropic PBC, and Google Cloud Platform. {% data variables.copilot.copilot_claude_sonnet_35 %} is hosted exclusively by Amazon Web Services. {% data variables.product.github %} has provider agreements in place to ensure data is not used for training. Additional details for each provider are included below:
* Amazon Bedrock: Amazon makes the [following data commitments](https://docs.aws.amazon.com/bedrock/latest/userguide/data-protection.html): _Amazon Bedrock doesn't store or log your prompts and completions. Amazon Bedrock doesn't use your prompts and completions to train any AWS models and doesn't distribute them to third parties_.
* Anthropic PBC: {% data variables.product.github %} maintains a [zero data retention agreement](https://privacy.anthropic.com/en/articles/8956058-i-have-a-zero-retention-agreement-with-anthropic-what-products-does-it-apply-to) with Anthropic.