
Feature Request: Additional Configuration of System Prompts #161

@Broken-Admin

Description

Is your feature request related to a problem? Please describe.
OpenAI allows for custom system prompts when prompting ChatGPT through their web interface.

I believe implementing a similar feature here would add useful functionality, for example by letting the user specify the system prompt per model, per directory, or via a command-line option.

Describe the solution you'd like
I am attempting to implement a solution that integrates cleanly into the existing codebase.

It would be necessary to implement new logic in the new, one-shot, and load commands, alongside additional configuration-file reading from a particular user directory that specifies custom system prompts.
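
As a rough sketch of what the prompt-loading side could look like (every path, file name, and function name below is an assumption for illustration, not something that exists in terminalgpt today):

```python
# Hypothetical sketch only -- the prompts.json location and this helper are
# assumptions, not part of the current terminalgpt codebase.
import json
import os
from typing import Optional

USER_PROMPTS_PATH = os.path.expanduser("~/.config/terminalgpt/prompts.json")


def load_system_prompt(name: str = "default",
                       path: str = USER_PROMPTS_PATH) -> Optional[str]:
    """Return the named system prompt from the user's prompt file, if present."""
    if not os.path.isfile(path):
        return None
    with open(path, encoding="utf-8") as fh:
        prompts = json.load(fh)  # e.g. {"default": "...", "gpt-4": "..."}
    return prompts.get(name)
```

The new, one-shot, and load commands could then call something like load_system_prompt() and fall back to the prompt currently hard-coded in config.py when nothing is found.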

Describe alternatives you've considered
The current alternative is for the user to modify config.py directly and then run terminalgpt.

This works, but it requires editing the file each time the user wants to change the system prompt. That is inconvenient when a small amount of additional logic could let the system prompt be configured per user context.

Additional context
I am willing to contribute my time to the codebase in order to add this feature.

Currently I am stuck on the particulars of how best to implement this feature, as there are multiple ways to go about it.

Any additional commentary or ideas, such as the best place to store, load, and process user prompts, would be appreciated.
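
To make the discussion concrete, one possible resolution order (command-line option, then a per-directory file, then a per-model entry in the user's prompt file, then the existing default) might look like the following. It builds on the hypothetical load_system_prompt() sketch above, and none of these flags or file names exist in terminalgpt yet:

```python
# Purely illustrative -- every flag, path, and file name here is an assumption.
import os
from typing import Optional


def resolve_system_prompt(cli_prompt: Optional[str], model: str, default: str) -> str:
    # 1. An explicit command-line option wins.
    if cli_prompt:
        return cli_prompt
    # 2. A per-directory override file in the current working directory.
    local_file = os.path.join(os.getcwd(), ".terminalgpt-prompt")
    if os.path.isfile(local_file):
        with open(local_file, encoding="utf-8") as fh:
            return fh.read().strip()
    # 3. A per-model entry in the user's prompt file (see the sketch above).
    per_model = load_system_prompt(name=model)
    if per_model:
        return per_model
    # 4. Fall back to the prompt currently defined in config.py.
    return default
```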
