@@ -308,7 +308,15 @@ If you added manual build steps for compiled languages and {% data variables.pro

## Building C/C++

{% ifversion codeql-no-build %}{% data variables.product.prodname_codeql %} supports build modes `autobuild` or `manual` for C/C++ code.
{% ifversion codeql-no-build %}{% data variables.product.prodname_codeql %} supports build modes {% ifversion codeql-no-build-c-cpp %}`none`, {% endif %}`autobuild` or `manual` for C/C++ code.

{% ifversion codeql-no-build-c-cpp %}

When you enable default setup for a repository that contains C/C++ code, the build mode is set to `none` automatically.

> [!NOTE] Support for build mode `none` for C/C++ codebases is currently in {% data variables.release-phases.public_preview %} and subject to change.

{% endif %}
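
As an illustrative aside (not part of the documented steps), default setup can also be enabled programmatically. The sketch below assumes the `gh` CLI, the code scanning default setup REST endpoint, and the `c-cpp` language identifier; `OWNER/REPO` is a placeholder.

```shell
# Sketch only: enable code scanning default setup for a repository that contains C/C++ code.
# Once default setup is enabled, the build mode for C/C++ is set to `none` automatically.
gh api --method PATCH /repos/OWNER/REPO/code-scanning/default-setup \
  -f state=configured \
  -f query_suite=default \
  -f "languages[]=c-cpp"
```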

### Autobuild summary for C/C++{% endif %}

@@ -201,10 +201,10 @@ In addition, for {% data variables.code-scanning.no_build_support %}, there is a
The {% data variables.product.prodname_codeql_cli %} includes autobuilders for {% data variables.code-scanning.compiled_languages %} code. {% data variables.product.prodname_codeql %} autobuilders allow you to build projects for compiled languages without specifying any build commands. When an autobuilder is invoked, {% data variables.product.prodname_codeql %} examines the source for evidence of a build system and attempts to run the optimal set of commands required to extract a database. For more information, see [AUTOTITLE](/code-security/code-scanning/creating-an-advanced-setup-for-code-scanning/codeql-code-scanning-for-compiled-languages#about-autobuild).

An autobuilder is invoked automatically when you execute `codeql database create` for a compiled language if you don’t include a
`--command` option{% ifversion codeql-no-build %} or set `--build-mode none`{% endif %}. For example, for a C/C++ codebase, you could simply run:
`--command` option{% ifversion codeql-no-build %} or set `--build-mode none`{% endif %}. For example, for a Swift codebase, you could simply run:

```shell
codeql database create --language=cpp <output-folder>/cpp-database
codeql database create --language=swift <output-folder>/swift-database
```

If a codebase uses a standard build system, relying on an autobuilder is often the simplest way to create a database. For sources that require non-standard build steps, you may need to explicitly define each step in the command line.
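
As a sketch of the two alternatives mentioned above for a C/C++ codebase (assuming a {% data variables.product.prodname_codeql_cli %} release in which build mode `none` is available for C/C++, which is in public preview; `make` stands in for whatever build command the project actually uses):

```shell
# Sketch: create a C/C++ database without building the code (build mode `none`).
codeql database create --language=cpp --build-mode=none <output-folder>/cpp-database

# Sketch: create a C/C++ database with an explicit build command for a non-standard build.
codeql database create --language=cpp --command=make <output-folder>/cpp-database
```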
@@ -8,7 +8,7 @@ topics:
- Copilot
---

By default, {% data variables.copilot.copilot_chat_short %} uses a base model to provide fast, capable responses for a wide range of tasks, such as summarization, knowledge-based questions, reasoning, math, and coding.
By default, {% data variables.copilot.copilot_chat_short %} uses {% data variables.copilot.copilot_gpt_41 %} to provide fast, capable responses for a wide range of tasks, such as summarization, knowledge-based questions, reasoning, math, and coding.

However, you are not limited to using this model. You can choose from a selection of other models, each with its own particular strengths. You may have a favorite model that you like to use, or you might prefer to use a particular model for inquiring about a specific subject.

@@ -21,9 +21,8 @@ Changing the model that's used by {% data variables.copilot.copilot_chat_short %
{% webui %}

> [!NOTE]
> * Multiple model support in {% data variables.copilot.copilot_chat_short %} is in {% data variables.release-phases.public_preview %} and is subject to change.
> * Support for GPT-4.5 is only available on {% data variables.copilot.copilot_pro_plus_short %}{% ifversion copilot-enterprise %} and {% data variables.copilot.copilot_enterprise_short %}{% endif %}.
> * You can only use an alternative AI model in the immersive view of {% data variables.copilot.copilot_chat_short %}. This is the full-page version of {% data variables.copilot.copilot_chat_short %} that's displayed at [https://github.com/copilot](https://github.com/copilot). The {% data variables.copilot.copilot_chat_short %} panel always uses the default model.
> * Support for {% data variables.copilot.copilot_gpt_45 %}, {% data variables.copilot.copilot_claude_opus %}, and {% data variables.copilot.copilot_o3 %} is only available on {% data variables.copilot.copilot_pro_plus_short %}{% ifversion copilot-enterprise %} and {% data variables.copilot.copilot_enterprise_short %}{% endif %}.
> * You can only use an alternative AI model in the immersive view of {% data variables.copilot.copilot_chat_short %} on GitHub.com. This is the full-page version of {% data variables.copilot.copilot_chat_short %} that's displayed at [https://github.com/copilot](https://github.com/copilot). The {% data variables.copilot.copilot_chat_short %} panel always uses the default model.

## AI models for {% data variables.copilot.copilot_chat_short %}

@@ -35,14 +34,14 @@ The following models are currently available in the immersive mode of {% data va
* {% data variables.copilot.copilot_claude_sonnet_35 %}
* {% data variables.copilot.copilot_claude_sonnet_37 %}
* {% data variables.copilot.copilot_claude_sonnet_37 %} Thinking
* {% data variables.copilot.copilot_claude_sonnet_40 %}
* {% data variables.copilot.copilot_claude_opus %}
* {% data variables.copilot.copilot_claude_sonnet_40 %} (preview)
* {% data variables.copilot.copilot_claude_opus %} (preview)
* {% data variables.copilot.copilot_gemini_flash %}
* {% data variables.copilot.copilot_gemini_25_pro %} (preview)
* {% data variables.copilot.copilot_o1 %}
* {% data variables.copilot.copilot_o3 %}
* {% data variables.copilot.copilot_o1 %} (preview)
* {% data variables.copilot.copilot_o3 %} (preview)
* {% data variables.copilot.copilot_o3_mini %}
* {% data variables.copilot.copilot_o4_mini %}
* {% data variables.copilot.copilot_o4_mini %} (preview)

For more information about these models, see [AUTOTITLE](/copilot/using-github-copilot/ai-models/choosing-the-right-ai-model-for-your-task).

@@ -58,11 +57,11 @@ These instructions are for {% data variables.product.prodname_copilot_short %} o

> [!NOTE] If you use {% data variables.copilot.copilot_extensions_short %}, they may override the model you select.

1. In the top right of any page on {% data variables.product.github %}, click {% octicon "triangle-down" aria-label="The downwards triangle icon" %} beside the **{% octicon "copilot" aria-hidden="true" aria-label="copilot" %}** icon and click **Immersive** in the dropdown menu.
1. In the top right of any page on {% data variables.product.github %}, click the **{% octicon "copilot" aria-hidden="true" aria-label="copilot" %}** icon.

![Screenshot of the 'Immersive' button, highlighted with a dark orange outline.](/assets/images/help/copilot/copilot-immersive-button.png)
![Screenshot of the 'Copilot' button, highlighted with a dark orange outline.](/assets/images/help/copilot/copilot-icon-top-right.png)

1. At the top of the immersive view, select the **CURRENT-MODEL** {% octicon "chevron-down" aria-hidden="true" aria-label="chevron-down" %} dropdown menu, then click the AI model of your choice.
1. At the bottom of the immersive view, select the **CURRENT-MODEL** {% octicon "chevron-down" aria-hidden="true" aria-label="chevron-down" %} dropdown menu, then click the AI model of your choice.

1. Optionally, after submitting a prompt, you can regenerate the same prompt using a different model by clicking the retry icon ({% octicon "sync" aria-label="The re-run icon" %}) below the response. The new response will use your selected model and maintain the full context of the conversation.

@@ -72,7 +71,7 @@ These instructions are for {% data variables.product.prodname_copilot_short %} o

> [!NOTE]
> * Multiple model support in {% data variables.copilot.copilot_chat_short %} is in {% data variables.release-phases.public_preview %} and is subject to change.
> * Support for GPT-4.5 is only available on {% data variables.copilot.copilot_pro_plus_short %}{% ifversion copilot-enterprise %} and {% data variables.copilot.copilot_enterprise_short %}{% endif %}.
> * Support for {% data variables.copilot.copilot_gpt_45 %}, {% data variables.copilot.copilot_claude_opus %}, and {% data variables.copilot.copilot_o3 %} is only available on {% data variables.copilot.copilot_pro_plus_short %}{% ifversion copilot-enterprise %} and {% data variables.copilot.copilot_enterprise_short %}{% endif %}.

## AI models for {% data variables.copilot.copilot_chat_short %}

@@ -84,13 +83,14 @@ The following models are currently available through multi-model {% data variabl
* {% data variables.copilot.copilot_claude_sonnet_35 %}
* {% data variables.copilot.copilot_claude_sonnet_37 %}
* {% data variables.copilot.copilot_claude_sonnet_37 %} Thinking
* {% data variables.copilot.copilot_claude_sonnet_40 %}
* {% data variables.copilot.copilot_claude_opus %}
* {% data variables.copilot.copilot_claude_sonnet_40 %} (preview)
* {% data variables.copilot.copilot_claude_opus %} (preview)
* {% data variables.copilot.copilot_gemini_flash %}
* {% data variables.copilot.copilot_o1 %}
* {% data variables.copilot.copilot_o3 %}
* {% data variables.copilot.copilot_gemini_25_pro %} (preview)
* {% data variables.copilot.copilot_o1 %} (preview)
* {% data variables.copilot.copilot_o3 %} (preview)
* {% data variables.copilot.copilot_o3_mini %}
* {% data variables.copilot.copilot_o4_mini %}
* {% data variables.copilot.copilot_o4_mini %} (preview)

For more information about these models, see [AUTOTITLE](/copilot/using-github-copilot/ai-models/choosing-the-right-ai-model-for-your-task).

@@ -109,17 +109,22 @@ These instructions are for {% data variables.product.prodname_vscode_shortname %

{% visualstudio %}

> [!NOTE] Multiple model support in {% data variables.copilot.copilot_chat_short %} is in {% data variables.release-phases.public_preview %} and is subject to change.

## AI models for {% data variables.copilot.copilot_chat_short %}

The following models are currently available through multi-model {% data variables.copilot.copilot_chat_short %}:

* {% data variables.copilot.copilot_gpt_4o %}
* {% data variables.copilot.copilot_gpt_41 %}
* {% data variables.copilot.copilot_gpt_45 %} (preview)
* {% data variables.copilot.copilot_claude_sonnet_35 %}
* {% data variables.copilot.copilot_claude_sonnet_37 %}
* {% data variables.copilot.copilot_o1 %}
* {% data variables.copilot.copilot_claude_sonnet_37 %} Thinking
* {% data variables.copilot.copilot_gemini_flash %}
* {% data variables.copilot.copilot_gemini_25_pro %} (preview)
* {% data variables.copilot.copilot_o1 %} (preview)
* {% data variables.copilot.copilot_o3 %} (preview)
* {% data variables.copilot.copilot_o3_mini %}
* {% data variables.copilot.copilot_o4_mini %} (preview)

For more information about these models, see [AUTOTITLE](/copilot/using-github-copilot/ai-models/choosing-the-right-ai-model-for-your-task).

@@ -142,7 +147,7 @@ To use multi-model {% data variables.copilot.copilot_chat_short %}, you must use

> [!NOTE]
> * Multiple model support in {% data variables.copilot.copilot_chat_short %} is in {% data variables.release-phases.public_preview %} and is subject to change.
> * Support for GPT-4.5 is only available on {% data variables.copilot.copilot_pro_plus_short %}{% ifversion copilot-enterprise %} and {% data variables.copilot.copilot_enterprise_short %}{% endif %}.
> * Support for {% data variables.copilot.copilot_gpt_45 %} and {% data variables.copilot.copilot_o3 %} is only available on {% data variables.copilot.copilot_pro_plus_short %}{% ifversion copilot-enterprise %} and {% data variables.copilot.copilot_enterprise_short %}{% endif %}.

## AI models for {% data variables.copilot.copilot_chat_short %}

@@ -181,20 +186,24 @@ These instructions are for the JetBrains IDEs. For instructions on different cli

> [!NOTE]
> * Multiple model support in {% data variables.copilot.copilot_chat_short %} is in {% data variables.release-phases.public_preview %} and is subject to change.
> * Support for GPT-4.5 is only available on {% data variables.copilot.copilot_pro_plus_short %}{% ifversion copilot-enterprise %} and {% data variables.copilot.copilot_enterprise_short %}{% endif %}.
> * Support for {% data variables.copilot.copilot_gpt_45 %} and {% data variables.copilot.copilot_o3 %} is only available on {% data variables.copilot.copilot_pro_plus_short %}{% ifversion copilot-enterprise %} and {% data variables.copilot.copilot_enterprise_short %}{% endif %}.

## AI models for {% data variables.copilot.copilot_chat_short %}

The following models are currently available through multi-model {% data variables.copilot.copilot_chat_short %}:

* {% data variables.copilot.copilot_gpt_4o %}
* {% data variables.copilot.copilot_gpt_41 %}
* {% data variables.copilot.copilot_gpt_45 %} (preview)
* {% data variables.copilot.copilot_claude_sonnet_35 %}
* {% data variables.copilot.copilot_claude_sonnet_37 %}
* {% data variables.copilot.copilot_claude_sonnet_37 %} Thinking
* {% data variables.copilot.copilot_gemini_flash %}
* {% data variables.copilot.copilot_gemini_25_pro %} (preview)
* {% data variables.copilot.copilot_o1 %} (preview)
* {% data variables.copilot.copilot_o3 %} (preview)
* {% data variables.copilot.copilot_o3_mini %}
* {% data variables.copilot.copilot_o4_mini %} (preview)

For more information about these models, see [AUTOTITLE](/copilot/using-github-copilot/ai-models/choosing-the-right-ai-model-for-your-task).

@@ -216,7 +225,7 @@ These instructions are for the Eclipse IDE. For instructions on different client

> [!NOTE]
> * Multiple model support in {% data variables.copilot.copilot_chat_short %} is in {% data variables.release-phases.public_preview %} and is subject to change.
> * Support for GPT-4.5 is only available on {% data variables.copilot.copilot_pro_plus_short %}{% ifversion copilot-enterprise %} and {% data variables.copilot.copilot_enterprise_short %}{% endif %}.
> * Support for {% data variables.copilot.copilot_gpt_45 %} and {% data variables.copilot.copilot_o3 %} is only available on {% data variables.copilot.copilot_pro_plus_short %}{% ifversion copilot-enterprise %} and {% data variables.copilot.copilot_enterprise_short %}{% endif %}.

## AI models for {% data variables.copilot.copilot_chat_short %}

5 changes: 5 additions & 0 deletions data/features/codeql-no-build-c-cpp.yml
@@ -0,0 +1,5 @@
# Reference: #16543 (C/C++ public preview)

versions:
  fpt: '*'
  ghec: '*'
2 changes: 1 addition & 1 deletion data/variables/code-scanning.yml
@@ -7,7 +7,7 @@ codeql_workflow: 'CodeQL analysis workflow'
tool_status_page: 'tool status page'

# List of compiled languages supported for `no-build` extraction
no_build_support: '{% ifversion codeql-no-build-csharp %}C# and{% endif %} Java'
no_build_support: '{% ifversion codeql-no-build-c-cpp %}C/C++, {% endif %}{% ifversion codeql-no-build-csharp %}C# and{% endif %} Java'

# List of compiled languages
compiled_languages: 'C/C++, C#, Go, Java, Kotlin, and Swift'
9 changes: 8 additions & 1 deletion src/links/scripts/rendered-content-link-checker.ts
@@ -570,9 +570,16 @@ function flawIssueDisplay(flaws: LinkFlaw[], opts: Options, mentionExternalExclu
'For more information, see [Fixing broken links in GitHub user docs](https://github.com/github/docs/blob/main/src/links/lib/README.md).'
}

return `${flawsToDisplay} broken${
output = `${flawsToDisplay} broken${
opts.commentLimitToExternalLinks ? ' **external** ' : ' '
}links found in [this](${opts.actionUrl}) workflow.\n${output}`

// GitHub comment bodies are limited to 65,536 characters, so truncate well below that limit
if (output.length > 60000) {
output = output.slice(0, 60000) + '\n\n---\n\nOUTPUT TRUNCATED'
}

return output
}

function printGlobalCacheHitRatio(core: CoreInject) {