diff --git a/DESCRIPTION b/DESCRIPTION index b13adce..06d3b9b 100644 --- a/DESCRIPTION +++ b/DESCRIPTION @@ -23,7 +23,7 @@ BugReports: https://github.com/coatless-rpkg/searcher/issues Depends: R (>= 3.3.0) License: GPL (>= 2) Encoding: UTF-8 -RoxygenNote: 7.3.1 +RoxygenNote: 7.3.2 Roxygen: list(markdown = TRUE) Suggests: testthat (>= 2.1.0), @@ -31,3 +31,11 @@ Suggests: knitr, rmarkdown VignetteBuilder: knitr +Collate: + 'ai-prompts.R' + 'index-sites.R' + 'ai-search-functions.R' + 'defunct-functions.R' + 'search-functions.R' + 'searcher-package.R' + 'utilities.R' diff --git a/NAMESPACE b/NAMESPACE index 42a948c..9932e1c 100644 --- a/NAMESPACE +++ b/NAMESPACE @@ -1,5 +1,18 @@ # Generated by roxygen2: do not edit by hand +export(ai_prompt) +export(ai_prompt_active) +export(ai_prompt_clear) +export(ai_prompt_list) +export(ai_prompt_register) +export(ai_prompt_remove) +export(ask_bing_copilot) +export(ask_chatgpt) +export(ask_claude) +export(ask_copilot) +export(ask_meta_ai) +export(ask_mistral) +export(ask_perplexity) export(search_bb) export(search_bing) export(search_bitbucket) diff --git a/NEWS.md b/NEWS.md index 01fd5d7..5bd51a4 100644 --- a/NEWS.md +++ b/NEWS.md @@ -2,6 +2,23 @@ ## Features +- Added GenAI Search Portals: + - `ask_chatgpt()`: Searches with ChatGPT + - `ask_claude()`: Searches with Claude AI + - `ask_perplexity()`: Searches with Perplexity AI + - `ask_mistral()`: Searches with Mistral AI + - `ask_copilot()`: Searches with Microsoft Bing's Copilot + - `ask_meta_ai()`: Searches with Meta AI +- Added an AI Prompt Management System with Persona Prompts + - `ai_prompt()`: Set a prompt for the AI + - `ai_prompt_active()`: View the active prompt + - `ai_prompt_clear()`: Clear the active prompt + - `ai_prompt_list()`: List all prompts + - `ai_prompt_register()`: Add a custom prompt + - `ai_prompt_remove()`: Remove a prompt +- Added new vignettes: + - `search-with-ai-assistants.Rmd`: Overview of the AI Searching Techniques + - `managing-ai-prompts.Rmd`: Overview of `searcher`'s AI Prompt Management System - Added searcher logo ([#40](https://github.com/coatless-rpkg/searcher/pull/40)) diff --git a/R/ai-prompts.R b/R/ai-prompts.R new file mode 100644 index 0000000..bc5b636 --- /dev/null +++ b/R/ai-prompts.R @@ -0,0 +1,288 @@ +#' AI Prompt Management System +#' +#' @description +#' A set of functions to manage and apply prompts for AI search services. +#' These functions allow you to create, maintain, and use a library of +#' effective prompts for different AI assistants and scenarios. +#' +#' @details +#' The prompt management system works with multiple levels of prompts: +#' +#' 1. System-level prompt: Set with `ai_prompt()`, applies across all AI services +#' 2. Service-specific prompts: Set with `options()` or in function calls +#' 3. Default prompts: Built-in prompts that ship with the package +#' +#' When a search is performed, prompts are applied in this order, with the +#' query at the end. +#' +#' @name ai_prompt_management +NULL + +# Global variable to store the current system prompt +.searcher_system_prompt <- new.env(parent = emptyenv()) +.searcher_system_prompt$active <- NULL +.searcher_system_prompt$name <- NULL + +# Default prompt library +.searcher_prompt_library <- list( + general = "As an R expert, help with the following question or error:", + debugging = "You are an R debugging expert. First identify what's wrong with this code without fixing it. Then explain why it's wrong and what concept I'm misunderstanding. 
Finally, provide a working solution with an explanation of why it works.", + learning = "As an R educator teaching a diverse classroom, explain this concept in multiple ways: 1) Start with an intuitive explanation a beginner would understand, 2) Provide a simple working example, 3) Explain how this concept connects to other R concepts, 4) Show a more advanced practical application with commented code.", + package_selection = "As an unbiased R consultant familiar with the entire CRAN ecosystem, compare the top 3-4 R packages for this task. For each package, discuss: 1) Key strengths and limitations, 2) Ease of use and learning curve, 3) Community support and maintenance status, 4) Performance characteristics, 5) Unique features. Conclude with situational recommendations.", + code_review = "As a senior R developer conducting a code review: 1) Note what the code does correctly, 2) Identify potential issues in correctness, performance, readability, and maintainability, 3) Suggest specific improvements with before/after code examples, 4) If relevant, mention R idioms or functions that would simplify the code.", + stats_analysis = "As both a statistician and R programmer, help me with this analysis task. First, explain the appropriate statistical approach and why it's suitable. Then, provide an R implementation with explanations of each step, how to interpret the outputs, and diagnostic checks.", + visualization = "As a data visualization expert who specializes in R: 1) Recommend 2-3 visualization types that would best represent this data and explain why, 2) For the most appropriate visualization, provide ggplot2 code with a clear aesthetic mapping rationale, 3) Suggest specific customizations to improve readability." +) + +#' Set or View Active System-level AI Prompt +#' +#' @description +#' Sets a system-level prompt to be used with all AI search functions. +#' When called with no arguments, returns the currently active prompt. +#' +#' @param prompt_name Name of a prompt from the prompt library, or a custom prompt text. +#' Use `ai_prompt_list()` to see available prompt names. +#' If NULL, returns the current active prompt without changing it. +#' +#' @return Invisibly returns the active prompt text. If called without arguments, +#' returns the active prompt visibly. 
+#' +#' @examples +#' \dontrun{ +#' # Set a predefined prompt +#' ai_prompt("debugging") +#' +#' # Set a custom prompt +#' ai_prompt("Explain this R error in simple terms with examples:") +#' +#' # Check current active prompt +#' ai_prompt() +#' +#' # Clear the system prompt +#' ai_prompt(NULL) +#' } +#' +#' @export +ai_prompt <- function(prompt_name = NULL) { + # If no prompt_name is provided, return the current active prompt + if (is.null(prompt_name)) { + return(ai_prompt_active()) + } + + # If prompt_name is NA, clear the prompt + if (identical(prompt_name, NA)) { + .searcher_system_prompt$active <- NULL + .searcher_system_prompt$name <- NULL + message("System prompt cleared.") + return(invisible(NULL)) + } + + # Check if prompt_name is in the library + if (prompt_name %in% names(.searcher_prompt_library)) { + prompt_text <- .searcher_prompt_library[[prompt_name]] + prompt_source <- prompt_name + } else { + # Assume it's a custom prompt text + prompt_text <- prompt_name + prompt_source <- "custom" + } + + # Set the prompt + .searcher_system_prompt$active <- prompt_text + .searcher_system_prompt$name <- prompt_source + + message("Set system prompt to: ", ifelse(prompt_source == "custom", + "custom prompt", + paste0('"', prompt_source, '"'))) + invisible(prompt_text) +} + +#' Get Currently Active System Prompt +#' +#' @description +#' Returns the currently active system-level prompt, if any. +#' +#' @return The active prompt text, or NULL if no system prompt is set. +#' +#' @examples +#' \dontrun{ +#' # Check current active prompt +#' ai_prompt_active() +#' } +#' +#' @export +ai_prompt_active <- function() { + active_prompt <- .searcher_system_prompt$active + if (is.null(active_prompt)) { + message("No system prompt is currently active.") + return(NULL) + } else { + source_info <- .searcher_system_prompt$name + if (source_info == "custom") { + message("Active system prompt (custom):") + } else { + message("Active system prompt (", source_info, "):") + } + return(active_prompt) + } +} + +#' List Available AI Prompts +#' +#' @description +#' Lists all available prompts in the prompt library. +#' +#' @return A named list of available prompts. +#' +#' @examples +#' \dontrun{ +#' # List all available prompts +#' ai_prompt_list() +#' } +#' +#' @export +ai_prompt_list <- function() { + if (length(.searcher_prompt_library) == 0) { + message("No prompts found in the library.") + return(invisible(NULL)) + } + + cat("Available AI prompts:\n\n") + for (name in names(.searcher_prompt_library)) { + cat(sprintf("- %s\n", name)) + } + + invisible(.searcher_prompt_library) +} + +#' Register a New AI Prompt +#' +#' @description +#' Adds a new prompt to the prompt library. +#' +#' @param name Name for the new prompt. +#' @param prompt_text The prompt text to register. +#' @param overwrite Whether to overwrite an existing prompt with the same name. Default is FALSE. +#' +#' @return Invisibly returns the updated prompt library. 
+#' +#' @examples +#' \dontrun{ +#' # Register a new prompt +#' ai_prompt_register( +#' "tidyverse", +#' paste("As a tidyverse expert, explain how to solve this problem using", +#' "dplyr, tidyr, and other tidyverse packages:" +#' ) +#' ) +#' } +#' +#' @export +ai_prompt_register <- function(name, prompt_text, overwrite = FALSE) { + if (!is.character(name) || length(name) != 1) { + stop("Prompt name must be a single character string.") + } + + if (!is.character(prompt_text) || length(prompt_text) != 1) { + stop("Prompt text must be a single character string.") + } + + if (name %in% names(.searcher_prompt_library) && !overwrite) { + stop("A prompt named '", name, "' already exists. Use overwrite = TRUE to replace it.") + } + + .searcher_prompt_library[[name]] <- prompt_text + message("Registered prompt: ", name) + invisible(.searcher_prompt_library) +} + +#' Remove an AI Prompt from the Library +#' +#' @description +#' Removes a prompt from the prompt library. +#' +#' @param name Name of the prompt to remove. +#' +#' @return Invisibly returns the updated prompt library. +#' +#' @examples +#' \dontrun{ +#' # Remove a prompt +#' ai_prompt_remove("tidyverse") +#' } +#' +#' @export +ai_prompt_remove <- function(name) { + if (!name %in% names(.searcher_prompt_library)) { + stop("No prompt named '", name, "' found in the library.") + } + + .searcher_prompt_library[[name]] <- NULL + message("Removed prompt: ", name) + + # If the removed prompt was active, clear it + if (!is.null(.searcher_system_prompt$name) && .searcher_system_prompt$name == name) { + .searcher_system_prompt$active <- NULL + .searcher_system_prompt$name <- NULL + message("The removed prompt was active and has been cleared.") + } + + invisible(.searcher_prompt_library) +} + +#' Clear the Active System Prompt +#' +#' @description +#' Clears the currently active system-level prompt. +#' +#' @return Invisibly returns NULL. 
+#' +#' @examples +#' \dontrun{ +#' # Clear the system prompt +#' ai_prompt_clear() +#' } +#' +#' @export +ai_prompt_clear <- function() { + .searcher_system_prompt$active <- NULL + .searcher_system_prompt$name <- NULL + message("System prompt cleared.") + invisible(NULL) +} + +# Helper function to get the active prompts with sources +get_prompt_info <- function(service_name, user_prompt) { + system_prompt <- .searcher_system_prompt$active + system_prompt_name <- .searcher_system_prompt$name + + service_option_name <- paste0("searcher.", service_name, "_prompt") + service_default_prompt <- getOption(service_option_name, "") + + prompts <- list() + prompt_sources <- character() + + # Add system prompt if present + if (!is.null(system_prompt) && nchar(system_prompt) > 0) { + prompts <- c(prompts, system_prompt) + if (system_prompt_name == "custom") { + prompt_sources <- c(prompt_sources, "system (custom)") + } else { + prompt_sources <- c(prompt_sources, paste0("system (", system_prompt_name, ")")) + } + } + + # Add user prompt if present, or service default if not + if (!is.null(user_prompt) && nchar(user_prompt) > 0) { + prompts <- c(prompts, user_prompt) + prompt_sources <- c(prompt_sources, "function call") + } else if (nchar(service_default_prompt) > 0) { + prompts <- c(prompts, service_default_prompt) + prompt_sources <- c(prompt_sources, paste0("default (", service_name, ")")) + } + + list( + prompts = prompts, + sources = prompt_sources + ) +} diff --git a/R/ai-search-functions.R b/R/ai-search-functions.R new file mode 100644 index 0000000..db3ff72 --- /dev/null +++ b/R/ai-search-functions.R @@ -0,0 +1,164 @@ +#' Searcher Function Generator for AI Services +#' +#' Creates a search function specifically for interacting with AI services. +#' Unlike regular search functions, AI searchers support custom prompts +#' that can guide how the AI responds to queries. +#' +#' @param site Name of the site to search on (e.g., "chatgpt", "claude") +#' +#' @return A function that can be used to search the specified AI service with optional +#' prompt customization +#' @keywords internal +#' +#' @details +#' The returned function will apply prompts in the following priority order: +#' +#' 1. System-level prompt set via `ai_prompt()` (if any) +#' 2. Function call prompt or service-specific option +#' 3. 
The query itself +#' +#' The returned function accepts two parameters: +#' +#' - `query`: The question or request to send to the AI (defaults to last error message) +#' - `prompt`: Custom prompt to guide the AI's response (optional) +#' +#' @keywords internal +#' @include index-sites.R +ai_searcher = function(site) { + + entry = site_details(site) + site_name = tolower(site) + + function(query = geterrmessage(), prompt = NULL) { + + if (!valid_query(query)) { + message("`query` must contain only 1 element that is not empty.") + return(invisible("")) + } + + # Get prompt information + prompt_info <- get_prompt_info(site_name, prompt) + prompts <- prompt_info$prompts + sources <- prompt_info$sources + + # Display information about which prompts are being used + if (length(prompts) > 0) { + message("Using prompts: ", paste(sources, collapse = ", ")) + } + + # Combine all prompts and the query + if (length(prompts) > 0) { + combined_prompt <- paste(prompts, collapse = " ") + query <- paste0(combined_prompt, " ", query) + } + + # AI search doesn't use the R language suffix - directly use the query + browse_url(entry$site_url, query, entry$suffix) + } +} + +########################### Start Search with Generative AI + +#' Search Generative AI Services from R +#' +#' Opens a browser to query various generative AI assistants directly from R. +#' These functions allow you to ask questions, get code help, or search for information +#' using popular AI services. +#' +#' @param query Contents of string to send to the AI. Default is the last error message. +#' @param prompt Optional prompt prefix to add before your query to guide how the AI +#' responds. If NULL, uses the service-specific default prompt option. +#' +#' @return The generated search URL or an empty string. +#' +#' @rdname search_genai +#' @export +#' @seealso [search_site()] +#' @examples +#' \dontrun{ +#' # Basic AI queries +#' ask_chatgpt("How to join two dataframes in R?") +#' ask_claude("Explain what purrr::map_df does") +#' ask_perplexity("Compare dplyr vs data.table") +#' +#' # Using custom prompts +#' ask_mistral("Find bug: ggplot(mtcars, aes(x=mpg, y=hp) + geom_point()", +#' prompt = "Debug this code step by step:") +#' +#' # Searching the last error +#' tryCatch( +#' median("not a number"), +#' error = function(e) ask_chatgpt() +#' ) +#' +#' # Setting default prompts +#' options( +#' searcher.chatgpt_prompt = "You are an R viz expert. Help with:", +#' searcher.claude_prompt = "As an R statistics expert, answer:" +#' ) +#' } +#' +#' @section ChatGPT Search: +#' The `ask_chatgpt()` function opens a browser with OpenAI's ChatGPT interface and your query using: +#' `https://chat.openai.com/?model=auto&q=` +#' +#' You can customize the AI's behavior by setting a prompt prefix through: +#' 1. The `prompt` parameter for per-call customization +#' 2. The `options(searcher.chatgpt_prompt = "...")` setting for persistent customization +ask_chatgpt = ai_searcher("chatgpt") + +#' @rdname search_genai +#' @export +#' @section Claude Search: +#' The `ask_claude()` function opens Anthropic's Claude AI assistant with your query using: +#' `https://claude.ai/new?q=` +#' +#' Claude can be directed to respond in specific ways by using the prompt parameter or by +#' setting a default prompt via `options()`. 
+ask_claude = ai_searcher("claude") + +#' @rdname search_genai +#' @export +#' @section Perplexity Search: +#' The `ask_perplexity()` function searches with Perplexity AI using: +#' `https://www.perplexity.ai/search?q=&focus=internet&copilot=false` +#' +#' Perplexity AI provides answers with citations to sources, making it particularly +#' useful for research-oriented queries. +ask_perplexity = ai_searcher("perplexity") + +#' @rdname search_genai +#' @export +#' @section Mistral Search: +#' The `ask_mistral()` function launches Mistral AI with your query using: +#' `https://chat.mistral.ai/chat?q=` +#' +#' The default prompt can be customized through the `searcher.mistral_prompt` option. +ask_mistral = ai_searcher("mistral") + +#' @rdname search_genai +#' @export +#' @section Bing Copilot Search: +#' The `ask_bing_copilot()` and `search_copilot()` functions both search +#' Microsoft Bing Copilot using: +#' `https://www.bing.com/search?showconv=1&sendquery=1&q=` +#' +#' Bing Copilot combines search results with AI-generated responses, making it +#' useful for queries that benefit from web information. +ask_bing_copilot = ai_searcher("copilot") + +#' @rdname search_genai +#' @export +ask_copilot = ask_bing_copilot + +#' @rdname search_genai +#' @export +#' @section Meta AI Search: +#' The `ask_meta_ai()` function searches Meta AI using: +#' `https://www.meta.ai/?q=` +#' +#' Meta AI provides general-purpose AI assistance with a focus on conversational +#' responses. +ask_meta_ai = ai_searcher("meta") + +########################### End Search with Generative AI diff --git a/R/index-sites.R b/R/index-sites.R index 72f1840..1efc1a8 100644 --- a/R/index-sites.R +++ b/R/index-sites.R @@ -85,7 +85,22 @@ site_index = "rseek", "https://rseek.org/?q=", keywords = keyword_entry("", "tidyverse") - ) + ), + site_entry("chatgpt", "https://chat.openai.com/?model=auto&q=", + keywords = NULL), + site_entry("claude", "https://claude.ai/new?q=", + keywords = NULL), + site_entry("perplexity", "https://www.perplexity.ai/search?q=", + keywords = NULL, + suffix = "&focus=internet&copilot=false"), + site_entry("mistral", "https://chat.mistral.ai/chat?q=", + keywords = NULL), + site_entry("bing copilot", "https://www.bing.com/search?showconv=1&sendquery=1&q=", + "copilot", + keywords = NULL), + site_entry("meta ai", "https://www.meta.ai/?q=", + "meta", + keywords = NULL) ) site_name_matrix = function() { diff --git a/R/search-functions.R b/R/search-functions.R index f275cf1..569646a 100644 --- a/R/search-functions.R +++ b/R/search-functions.R @@ -9,8 +9,11 @@ #' `"github"`, `"grep"`, and `"bitbucket"`. #' @param query Contents of string to search. Default is the error message. #' @param rlang Search for results written in R. Default is `TRUE` +#' @param prompt Optional prompt prefix to add before your query to guide how the AI +#' responds. If `NULL`, uses the service-specific default prompt option. #' -#' @return The generated search URL or an empty string. +#' @return +#' The generated search URL or an empty string. 
#' #' @rdname search_site #' @export @@ -91,9 +94,18 @@ search_site = function(query, "gh", "grep", "bitbucket", - "bb" + "bb", + "chatgpt", + "claude", + "perplexity", + "mistral", + "bing copilot", + "copilot", + "meta ai", + "meta" ), - rlang = TRUE) { + rlang = TRUE, + prompt = NULL) { site = tolower(site) site = match.arg(site) @@ -116,7 +128,15 @@ search_site = function(query, gh = search_github(query, rlang), grep = search_grep(query, rlang), bitbucket = , # empty case carried below - bb = search_bitbucket(query, rlang) + bb = search_bitbucket(query, rlang), + chatgpt = ask_chatgpt(query, prompt), + claude = ask_claude(query, prompt), + perplexity = ask_perplexity(query, prompt), + mistral = ask_mistral(query, prompt), + `bing copilot` = , # empty case carried below + copilot = ask_bing_copilot(query, prompt), + `meta ai` = , # empty case carried below, + meta = ask_meta_ai(query, prompt) ) } @@ -165,6 +185,7 @@ searcher = function(site, keyword = getOption("searcher.default_keyword")) { } } + ########################### Start Search Engines #' @rdname search_site @@ -336,3 +357,6 @@ search_bitbucket = searcher("bb") search_bb = search_bitbucket ########################### End Search Code Repos + + + diff --git a/R/searcher-package.R b/R/searcher-package.R index 0ab413f..34180c9 100644 --- a/R/searcher-package.R +++ b/R/searcher-package.R @@ -17,7 +17,15 @@ searcher_default_options = list( searcher.launch_delay = 0.5, searcher.use_rstudio_viewer = FALSE, - searcher.default_keyword = "base" + searcher.default_keyword = "base", + + # Default AI prompts + searcher.chatgpt_prompt = "You are an R programming expert. Please answer questions concisely with code examples when appropriate.", + searcher.claude_prompt = "You are an R programming assistant. Focus on providing clear explanations and efficient code solutions.", + searcher.perplexity_prompt = "Answer with a focus on R programming and statistics. Include reliable sources when possible.", + searcher.mistral_prompt = "As an R expert, please help with this question. Include code examples if relevant.", + searcher.bing_copilot_prompt = "Please help with this R programming question. Provide working code examples.", + searcher.metaai_prompt = "You're an R programming assistant. Answer questions with practical examples." ) .onLoad = function(libname, pkgname) { diff --git a/README.Rmd b/README.Rmd index be49cc9..b12fd06 100644 --- a/README.Rmd +++ b/README.Rmd @@ -21,12 +21,12 @@ knitr::opts_chunk$set( The goal of `searcher` is to provide a search interface directly inside of _R_. -For example, to look up `rcpp example numeric vector` +For example, to look up `rcpp example numeric vector` or `ggplot2 fix axis labels` call one of the `search_*()` functions to -automatically have a web browser open, go to a search site, and type the query. -By default, the search functions will attempt to search the last error on call +automatically have a web browser open, go to a search site, and type the query. +`searcher` also provides direct integration with AI assistants, allowing you to send queries to ChatGPT, Claude, and other AI services with R-optimized prompts. +By default, the search functions will attempt to search the last error on call if no query is specified. - ![](https://i.imgur.com/Zq2rg6G.gif) ## Installation @@ -106,6 +106,42 @@ search_bitbucket("assertions") search_bitbucket("assertions", rlang = FALSE) # or search_bb(...) ``` +## AI Assistants + +The package also provides functions to query AI assistants directly from R. 
+These functions open a browser with your query pre-filled, using customizable +prompts that help the AI give more effective responses for R programming: + +```r +# Get coding help from AI assistants +ask_chatgpt("How to create a ggplot scatterplot with regression line?") +ask_claude("Explain what purrr::map_df does") +ask_perplexity("Compare dplyr vs data.table performance") +ask_mistral("How to handle missing data in R?") +ask_bing_copilot("Write a function to calculate the median") +ask_meta_ai("What are the best R packages for time series analysis?") + +# Search with an error message +tryCatch( + cor(mtcrs), # Intentional typo + error = function(e) ask_claude() # Will search the error message +) +``` + +All AI search functions accept an optional `prompt` parameter that guides how +the AI responds: + +```r +# Adding specific instructions to the prompt +ask_chatgpt( + "Fix this code: ggplot(mtcars, aes(x=mpg, y=hp) + geom_point()", + prompt = "You are an R debugging expert. Explain what's wrong step by step." +) +``` + +See `vignette("search-with-ai-assistants")` for more details on using AI +assistants in searches through `searcher`. + ## Search Errors `searcher` offers preliminary support for automatically or manually @@ -165,6 +201,12 @@ Presently, the following options are available: viewer pane instead of a web browser. Default is `FALSE`. - `searcher.default_keyword`: Suffix keyword to focus search results between either `"base"` or `"tidyverse"`. Default is `"base"`. +- `searcher.chatgpt_prompt`: Default prompt for ChatGPT queries, used if no specific prompt is provided. +- `searcher.claude_prompt`: Default prompt for Claude queries. +- `searcher.perplexity_prompt`: Default prompt for Perplexity queries. +- `searcher.mistral_prompt`: Default prompt for Mistral AI queries. +- `searcher.bing_copilot_prompt`: Default prompt for Bing Copilot queries. +- `searcher.meta_ai_prompt`: Default prompt for Meta AI queries. To set one of these options, please create the `.Rprofile` by typing into _R_: @@ -179,12 +221,37 @@ From there, add: options( searcher.launch_delay = 0, searcher.use_rstudio_viewer = FALSE, - searcher.default_keyword = "tidyverse" + searcher.default_keyword = "tidyverse", + searcher.chatgpt_prompt = "You are an R programming expert. Please provide concise answers with code examples.", ## Additional options. ) } ``` +## AI Prompt Management + +For those who frequently use AI assistants, searcher provides a prompt management system: + +```r +# List available prompts +ai_prompt_list() + +# Set a system-level prompt for all AI services +ai_prompt("debugging") # Use a predefined prompt for debugging + +# Create custom prompts +ai_prompt_register("my_prompt", "As an R expert analyzing the mtcars dataset...") + +# Check active prompt +ai_prompt_active() + +# Clear active prompt +ai_prompt_clear() +``` + +See `vignette("managing-ai-prompts")` for more details on the prompt management system. + + ## Motivation The idea for `searcher` began as a project to automatically search errors and diff --git a/README.md b/README.md index 89b6e39..13f7e87 100644 --- a/README.md +++ b/README.md @@ -15,9 +15,10 @@ The goal of `searcher` is to provide a search interface directly inside of *R*. For example, to look up `rcpp example numeric vector` or `ggplot2 fix axis labels` call one of the `search_*()` functions to automatically have a web browser open, go to a search site, and type the -query. 
By default, the search functions will attempt to search the last -error on call if no query is specified. - +query. `searcher` also provides direct integration with AI assistants, +allowing you to send queries to ChatGPT, Claude, and other AI services +with R-optimized prompts. By default, the search functions will attempt +to search the last error on call if no query is specified. ![](https://i.imgur.com/Zq2rg6G.gif) ## Installation @@ -101,6 +102,43 @@ search_bitbucket("assertions") search_bitbucket("assertions", rlang = FALSE) # or search_bb(...) ``` +## AI Assistants + +The package also provides functions to query AI assistants directly from +R. These functions open a browser with your query pre-filled, using +customizable prompts that help the AI give more effective responses for +R programming: + +``` r +# Get coding help from AI assistants +ask_chatgpt("How to create a ggplot scatterplot with regression line?") +ask_claude("Explain what purrr::map_df does") +ask_perplexity("Compare dplyr vs data.table performance") +ask_mistral("How to handle missing data in R?") +ask_bing_copilot("Write a function to calculate the median") +ask_meta_ai("What are the best R packages for time series analysis?") + +# Search with an error message +tryCatch( + cor(mtcrs), # Intentional typo + error = function(e) ask_claude() # Will search the error message +) +``` + +All AI search functions accept an optional `prompt` parameter that +guides how the AI responds: + +``` r +# Adding specific instructions to the prompt +ask_chatgpt( + "Fix this code: ggplot(mtcars, aes(x=mpg, y=hp) + geom_point()", + prompt = "You are an R debugging expert. Explain what's wrong step by step." +) +``` + +See `vignette("search-with-ai-assistants")` for more details on using AI +assistants in searches through `searcher`. + ## Search Errors `searcher` offers preliminary support for automatically or manually @@ -164,6 +202,14 @@ Presently, the following options are available: viewer pane instead of a web browser. Default is `FALSE`. - `searcher.default_keyword`: Suffix keyword to focus search results between either `"base"` or `"tidyverse"`. Default is `"base"`. +- `searcher.chatgpt_prompt`: Default prompt for ChatGPT queries, used if + no specific prompt is provided. +- `searcher.claude_prompt`: Default prompt for Claude queries. +- `searcher.perplexity_prompt`: Default prompt for Perplexity queries. +- `searcher.mistral_prompt`: Default prompt for Mistral AI queries. +- `searcher.bing_copilot_prompt`: Default prompt for Bing Copilot + queries. +- `searcher.meta_ai_prompt`: Default prompt for Meta AI queries. To set one of these options, please create the `.Rprofile` by typing into *R*: @@ -179,12 +225,38 @@ From there, add: options( searcher.launch_delay = 0, searcher.use_rstudio_viewer = FALSE, - searcher.default_keyword = "tidyverse" + searcher.default_keyword = "tidyverse", + searcher.chatgpt_prompt = "You are an R programming expert. Please provide concise answers with code examples.", ## Additional options. 
) } ``` +## AI Prompt Management + +For those who frequently use AI assistants, searcher provides a prompt +management system: + +``` r +# List available prompts +ai_prompt_list() + +# Set a system-level prompt for all AI services +ai_prompt("debugging") # Use a predefined prompt for debugging + +# Create custom prompts +ai_prompt_register("my_prompt", "As an R expert analyzing the mtcars dataset...") + +# Check active prompt +ai_prompt_active() + +# Clear active prompt +ai_prompt_clear() +``` + +See `vignette("managing-ai-prompts")` for more details on the prompt +management system. + ## Motivation The idea for `searcher` began as a project to automatically search diff --git a/man/ai_prompt.Rd b/man/ai_prompt.Rd new file mode 100644 index 0000000..3904755 --- /dev/null +++ b/man/ai_prompt.Rd @@ -0,0 +1,37 @@ +% Generated by roxygen2: do not edit by hand +% Please edit documentation in R/ai-prompts.R +\name{ai_prompt} +\alias{ai_prompt} +\title{Set or View Active System-level AI Prompt} +\usage{ +ai_prompt(prompt_name = NULL) +} +\arguments{ +\item{prompt_name}{Name of a prompt from the prompt library, or a custom prompt text. +Use \code{ai_prompt_list()} to see available prompt names. +If NULL, returns the current active prompt without changing it.} +} +\value{ +Invisibly returns the active prompt text. If called without arguments, +returns the active prompt visibly. +} +\description{ +Sets a system-level prompt to be used with all AI search functions. +When called with no arguments, returns the currently active prompt. +} +\examples{ +\dontrun{ +# Set a predefined prompt +ai_prompt("debugging") + +# Set a custom prompt +ai_prompt("Explain this R error in simple terms with examples:") + +# Check current active prompt +ai_prompt() + +# Clear the system prompt +ai_prompt(NULL) +} + +} diff --git a/man/ai_prompt_active.Rd b/man/ai_prompt_active.Rd new file mode 100644 index 0000000..c6ad770 --- /dev/null +++ b/man/ai_prompt_active.Rd @@ -0,0 +1,21 @@ +% Generated by roxygen2: do not edit by hand +% Please edit documentation in R/ai-prompts.R +\name{ai_prompt_active} +\alias{ai_prompt_active} +\title{Get Currently Active System Prompt} +\usage{ +ai_prompt_active() +} +\value{ +The active prompt text, or NULL if no system prompt is set. +} +\description{ +Returns the currently active system-level prompt, if any. +} +\examples{ +\dontrun{ +# Check current active prompt +ai_prompt_active() +} + +} diff --git a/man/ai_prompt_clear.Rd b/man/ai_prompt_clear.Rd new file mode 100644 index 0000000..cb1ceac --- /dev/null +++ b/man/ai_prompt_clear.Rd @@ -0,0 +1,21 @@ +% Generated by roxygen2: do not edit by hand +% Please edit documentation in R/ai-prompts.R +\name{ai_prompt_clear} +\alias{ai_prompt_clear} +\title{Clear the Active System Prompt} +\usage{ +ai_prompt_clear() +} +\value{ +Invisibly returns NULL. +} +\description{ +Clears the currently active system-level prompt. +} +\examples{ +\dontrun{ +# Clear the system prompt +ai_prompt_clear() +} + +} diff --git a/man/ai_prompt_list.Rd b/man/ai_prompt_list.Rd new file mode 100644 index 0000000..78acd2c --- /dev/null +++ b/man/ai_prompt_list.Rd @@ -0,0 +1,21 @@ +% Generated by roxygen2: do not edit by hand +% Please edit documentation in R/ai-prompts.R +\name{ai_prompt_list} +\alias{ai_prompt_list} +\title{List Available AI Prompts} +\usage{ +ai_prompt_list() +} +\value{ +A named list of available prompts. +} +\description{ +Lists all available prompts in the prompt library. 
+} +\examples{ +\dontrun{ +# List all available prompts +ai_prompt_list() +} + +} diff --git a/man/ai_prompt_management.Rd b/man/ai_prompt_management.Rd new file mode 100644 index 0000000..74b3170 --- /dev/null +++ b/man/ai_prompt_management.Rd @@ -0,0 +1,21 @@ +% Generated by roxygen2: do not edit by hand +% Please edit documentation in R/ai-prompts.R +\name{ai_prompt_management} +\alias{ai_prompt_management} +\title{AI Prompt Management System} +\description{ +A set of functions to manage and apply prompts for AI search services. +These functions allow you to create, maintain, and use a library of +effective prompts for different AI assistants and scenarios. +} +\details{ +The prompt management system works with multiple levels of prompts: +\enumerate{ +\item System-level prompt: Set with \code{ai_prompt()}, applies across all AI services +\item Service-specific prompts: Set with \code{options()} or in function calls +\item Default prompts: Built-in prompts that ship with the package +} + +When a search is performed, prompts are applied in this order, with the +query at the end. +} diff --git a/man/ai_prompt_register.Rd b/man/ai_prompt_register.Rd new file mode 100644 index 0000000..458d593 --- /dev/null +++ b/man/ai_prompt_register.Rd @@ -0,0 +1,33 @@ +% Generated by roxygen2: do not edit by hand +% Please edit documentation in R/ai-prompts.R +\name{ai_prompt_register} +\alias{ai_prompt_register} +\title{Register a New AI Prompt} +\usage{ +ai_prompt_register(name, prompt_text, overwrite = FALSE) +} +\arguments{ +\item{name}{Name for the new prompt.} + +\item{prompt_text}{The prompt text to register.} + +\item{overwrite}{Whether to overwrite an existing prompt with the same name. Default is FALSE.} +} +\value{ +Invisibly returns the updated prompt library. +} +\description{ +Adds a new prompt to the prompt library. +} +\examples{ +\dontrun{ +# Register a new prompt +ai_prompt_register( + "tidyverse", + paste("As a tidyverse expert, explain how to solve this problem using", + "dplyr, tidyr, and other tidyverse packages:" + ) +) +} + +} diff --git a/man/ai_prompt_remove.Rd b/man/ai_prompt_remove.Rd new file mode 100644 index 0000000..863d19d --- /dev/null +++ b/man/ai_prompt_remove.Rd @@ -0,0 +1,24 @@ +% Generated by roxygen2: do not edit by hand +% Please edit documentation in R/ai-prompts.R +\name{ai_prompt_remove} +\alias{ai_prompt_remove} +\title{Remove an AI Prompt from the Library} +\usage{ +ai_prompt_remove(name) +} +\arguments{ +\item{name}{Name of the prompt to remove.} +} +\value{ +Invisibly returns the updated prompt library. +} +\description{ +Removes a prompt from the prompt library. +} +\examples{ +\dontrun{ +# Remove a prompt +ai_prompt_remove("tidyverse") +} + +} diff --git a/man/ai_searcher.Rd b/man/ai_searcher.Rd new file mode 100644 index 0000000..3c76553 --- /dev/null +++ b/man/ai_searcher.Rd @@ -0,0 +1,35 @@ +% Generated by roxygen2: do not edit by hand +% Please edit documentation in R/ai-search-functions.R +\name{ai_searcher} +\alias{ai_searcher} +\title{Searcher Function Generator for AI Services} +\usage{ +ai_searcher(site) +} +\arguments{ +\item{site}{Name of the site to search on (e.g., "chatgpt", "claude")} +} +\value{ +A function that can be used to search the specified AI service with optional +prompt customization +} +\description{ +Creates a search function specifically for interacting with AI services. +Unlike regular search functions, AI searchers support custom prompts +that can guide how the AI responds to queries. 
+} +\details{ +The returned function will apply prompts in the following priority order: +\enumerate{ +\item System-level prompt set via \code{ai_prompt()} (if any) +\item Function call prompt or service-specific option +\item The query itself +} + +The returned function accepts two parameters: +\itemize{ +\item \code{query}: The question or request to send to the AI (defaults to last error message) +\item \code{prompt}: Custom prompt to guide the AI's response (optional) +} +} +\keyword{internal} diff --git a/man/search_genai.Rd b/man/search_genai.Rd new file mode 100644 index 0000000..9d20fcd --- /dev/null +++ b/man/search_genai.Rd @@ -0,0 +1,125 @@ +% Generated by roxygen2: do not edit by hand +% Please edit documentation in R/ai-search-functions.R +\name{ask_chatgpt} +\alias{ask_chatgpt} +\alias{ask_claude} +\alias{ask_perplexity} +\alias{ask_mistral} +\alias{ask_bing_copilot} +\alias{ask_copilot} +\alias{ask_meta_ai} +\title{Search Generative AI Services from R} +\usage{ +ask_chatgpt(query = geterrmessage(), prompt = NULL) + +ask_claude(query = geterrmessage(), prompt = NULL) + +ask_perplexity(query = geterrmessage(), prompt = NULL) + +ask_mistral(query = geterrmessage(), prompt = NULL) + +ask_bing_copilot(query = geterrmessage(), prompt = NULL) + +ask_copilot(query = geterrmessage(), prompt = NULL) + +ask_meta_ai(query = geterrmessage(), prompt = NULL) +} +\arguments{ +\item{query}{Contents of string to send to the AI. Default is the last error message.} + +\item{prompt}{Optional prompt prefix to add before your query to guide how the AI +responds. If NULL, uses the service-specific default prompt option.} +} +\value{ +The generated search URL or an empty string. +} +\description{ +Opens a browser to query various generative AI assistants directly from R. +These functions allow you to ask questions, get code help, or search for information +using popular AI services. +} +\section{ChatGPT Search}{ + +The \code{ask_chatgpt()} function opens a browser with OpenAI's ChatGPT interface and your query using: +\verb{https://chat.openai.com/?model=auto&q=} + +You can customize the AI's behavior by setting a prompt prefix through: +\enumerate{ +\item The \code{prompt} parameter for per-call customization +\item The \code{options(searcher.chatgpt_prompt = "...")} setting for persistent customization +} +} + +\section{Claude Search}{ + +The \code{ask_claude()} function opens Anthropic's Claude AI assistant with your query using: +\verb{https://claude.ai/new?q=} + +Claude can be directed to respond in specific ways by using the prompt parameter or by +setting a default prompt via \code{options()}. +} + +\section{Perplexity Search}{ + +The \code{ask_perplexity()} function searches with Perplexity AI using: +\verb{https://www.perplexity.ai/search?q=&focus=internet&copilot=false} + +Perplexity AI provides answers with citations to sources, making it particularly +useful for research-oriented queries. +} + +\section{Mistral Search}{ + +The \code{ask_mistral()} function launches Mistral AI with your query using: +\verb{https://chat.mistral.ai/chat?q=} + +The default prompt can be customized through the \code{searcher.mistral_prompt} option. +} + +\section{Bing Copilot Search}{ + +The \code{ask_bing_copilot()} and \code{search_copilot()} functions both search +Microsoft Bing Copilot using: +\verb{https://www.bing.com/search?showconv=1&sendquery=1&q=} + +Bing Copilot combines search results with AI-generated responses, making it +useful for queries that benefit from web information. 
+} + +\section{Meta AI Search}{ + +The \code{ask_meta_ai()} function searches Meta AI using: +\verb{https://www.meta.ai/?q=} + +Meta AI provides general-purpose AI assistance with a focus on conversational +responses. +} + +\examples{ +\dontrun{ +# Basic AI queries +ask_chatgpt("How to join two dataframes in R?") +ask_claude("Explain what purrr::map_df does") +ask_perplexity("Compare dplyr vs data.table") + +# Using custom prompts +ask_mistral("Find bug: ggplot(mtcars, aes(x=mpg, y=hp) + geom_point()", + prompt = "Debug this code step by step:") + +# Searching the last error +tryCatch( + median("not a number"), + error = function(e) ask_chatgpt() +) + +# Setting default prompts +options( + searcher.chatgpt_prompt = "You are an R viz expert. Help with:", + searcher.claude_prompt = "As an R statistics expert, answer:" +) +} + +} +\seealso{ +\code{\link[=search_site]{search_site()}} +} diff --git a/man/search_site.Rd b/man/search_site.Rd index 3daf552..4c0f781 100644 --- a/man/search_site.Rd +++ b/man/search_site.Rd @@ -28,8 +28,10 @@ search_site( query, site = c("google", "bing", "duckduckgo", "ddg", "startpage", "sp", "qwant", "rseek", "rstudio community", "rscom", "twitter", "stackoverflow", "so", "github", "gh", - "grep", "bitbucket", "bb"), - rlang = TRUE + "grep", "bitbucket", "bb", "chatgpt", "claude", "perplexity", "mistral", + "bing copilot", "copilot", "meta ai", "meta"), + rlang = TRUE, + prompt = NULL ) search_google(query = geterrmessage(), rlang = TRUE) @@ -81,6 +83,9 @@ search_bb(query = geterrmessage(), rlang = TRUE) \code{"github"}, \code{"grep"}, and \code{"bitbucket"}.} \item{rlang}{Search for results written in R. Default is \code{TRUE}} + +\item{prompt}{Optional prompt prefix to add before your query to guide how the AI +responds. If \code{NULL}, uses the service-specific default prompt option.} } \value{ The generated search URL or an empty string. diff --git a/vignettes/managing-ai-prompts.Rmd b/vignettes/managing-ai-prompts.Rmd new file mode 100644 index 0000000..df50e90 --- /dev/null +++ b/vignettes/managing-ai-prompts.Rmd @@ -0,0 +1,233 @@ +--- +title: "Managing AI Prompts in `searcher`" +author: "James Joseph Balamuta" +date: "`r Sys.Date()`" +output: rmarkdown::html_vignette +vignette: > + %\VignetteIndexEntry{Managing AI Prompts} + %\VignetteEngine{knitr::rmarkdown} + %\VignetteEncoding{UTF-8} +--- + +```{r, include = FALSE} +knitr::opts_chunk$set( + collapse = TRUE, + comment = "#>", + eval = FALSE +) +``` + + +# Introduction + +The `searcher` package includes a powerful prompt management system for working +with AI assistants. This vignette explains how to use this system to create, +manage, and apply effective prompts when using AI search functions. It +complements the main `vignette("search-with-ai-assistants")`, which provides a broader +overview of using AI services with R. + +## What are Prompts? + +Prompts are instructions that guide how an AI assistant should respond to your +queries. A well-crafted prompt can dramatically improve the quality and +relevance of AI responses for R programming tasks. + +# The Prompt Management System + +The `searcher` package provides a multi-level prompt system: + +1. **System-level prompts**: Set with `ai_prompt()`, these apply across all AI services +2. **Service-specific prompts**: Set with `options()` or within function calls +3. **Default prompts**: Built-in prompts that ship with the package + +When you search with an AI assistant, prompts are applied in this order +(with the query at the end). 
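+
+For a quick illustration of the ordering, the sketch below (using the
+built-in `general` prompt and an example query) sends the system-level
+prompt first, then the service default set via `options()`, and finally
+the query itself:
+
+```r
+ai_prompt("general")                                        # system-level prompt, applied first
+options(searcher.perplexity_prompt = "Cite R sources:")     # service default, applied next
+ask_perplexity("How to reshape data from wide to long?")    # the query is appended last
+```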
+ +## Built-in Prompt Library + +The package comes with several pre-defined prompts optimized for different R +programming scenarios: + +```r +# List available prompts +ai_prompt_list() +``` + +This will show prompts like: + +- `general`: Balanced for general R questions and errors +- `debugging`: For identifying and fixing bugs in R code +- `learning`: For learning R concepts with progressive complexity +- `package_selection`: For comparing R packages with balanced analysis +- `code_review`: For evaluating and improving R code +- `stats_analysis`: For statistical analysis in R +- `visualization`: For creating effective data visualizations + +## Setting an Active Prompt + +To set a system-level prompt that will be used with all AI search functions: + +```r +# Use a built-in prompt +ai_prompt("debugging") + +# Then search with any AI service +ask_chatgpt("Error: object 'mtcrs' not found") # Note the typo +``` + +You'll see a message indicating which prompt is being used: + +``` +Using prompts: system (debugging) +Searching query in a web browser... +``` + +The AI will receive both your prompt instruction and the query, helping it +respond more effectively. + +## Using Custom Prompts + +You can also use a custom prompt text directly: + +```r +# Set a custom prompt +ai_prompt("As an R package developer, explain this error in terms of how R handles namespaces:") + +# Check the active prompt +ai_prompt_active() +``` + +## Checking and Clearing Prompts + +You can check the currently active prompt or clear it: + +```r +# Check currently active prompt +ai_prompt() +# or +ai_prompt_active() + +# Clear the active prompt +ai_prompt(NA) +# or +ai_prompt_clear() +``` + +## Extending the Prompt Library + +You can add your own prompts to the library: + +```r +# Register a new prompt +ai_prompt_register( + "shiny_expert", + "As a Shiny app developer, explain how to implement this UI feature or fix this reactive issue:" +) + +# Use your new prompt +ai_prompt("shiny_expert") +``` + +## Removing Prompts + +To remove a prompt from the library: + +```r +ai_prompt_remove("shiny_expert") +``` + +# Advanced Usage + +In this section, we will explore more advanced usage of the prompt system. +This includes layering multiple prompts, creating a session prompt library, +and managing prompts in your `.Rprofile` for persistent settings. + +## Prompt Layering + +The prompt system supports layering multiple prompts: + +```r +# Set a system-level prompt +ai_prompt("debugging") + +# Use with service-specific default prompt +options(searcher.claude_prompt = "Focus on tidyverse solutions:") + +# Then use with a function-call prompt +ask_claude("Error in filter(data, x > 0): object 'x' not found", + prompt = "Explain in simple terms:") +``` + +This will use all three prompts in order (debugging, "Focus on tidyverse solutions", and "Explain in simple terms") before the query. + +## Creating a Session Prompt Library + +For more advanced usage, you can create a custom prompt library in your R session: + +```r +# Create custom prompts for different projects +ai_prompt_register("my_package", "As an R package developer reviewing the 'mypackage' codebase.") +ai_prompt_register("data_cleaning", "Analyzing the customer_data.csv dataset with missing values.") +ai_prompt_register("reporting", "Create a Quarto document report for business stakeholders.") + +# Switch between contexts as you work +ai_prompt("my_package") +# ... work on package development ... + +ai_prompt("data_cleaning") +# ... work on data cleaning ... 
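+
+ai_prompt("reporting")
+# ... write up results for stakeholders ...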
+``` + +## Prompt Management in .Rprofile + +For persistent prompt management, you can add code to your `.Rprofile` +This file is executed every time you start R, allowing you to set up your environment. +Most users will have a `.Rprofile` file in their home directory. You can create +or edit this file to include your prompt management code. + +To create or edit the `.Rprofile` file, you can use the following command in R: + +```r +# Creating or editing the file: +file.edit("~/.Rprofile") +``` + +Then, you can add the following code to set up your prompt management system: + +```r +# In .Rprofile +.First <- function() { + if (requireNamespace("searcher", quietly = TRUE)) { + # Register custom prompts + searcher::ai_prompt_register( + "work", + "As an R analyst at XYZ company working with our sales database:", + overwrite = TRUE + ) + + # Set default prompts for different AI services + options( + searcher.chatgpt_prompt = "Provide R code with tidyverse packages:", + searcher.claude_prompt = "Give me both base R and tidyverse solutions:" + ) + + # Set a default system-level prompt if desired + # searcher::ai_prompt("debugging") + } +} +``` + +# Conclusion + +The prompt management system in `searcher` provides a flexible and powerful way to +create, manage, and apply prompts for AI search functions. Through using +system-level prompts, service-specific prompts, and custom prompts, you can +tailor the AI's responses to your specific needs and context. This +approach enhances the quality of AI-generated responses and, subsequently, +helps you save time and improve the efficiency of your R workflow. + +The system presented in this vignette transforms the custom prompts described +in `vignette("search-with-ai-assistants")` from one-off tools into a systematic +library that can be maintained, shared, and reused. This represents a shift from +ad-hoc prompting to a more deliberate approach that treats prompts as valuable +assets in your R programming toolkit. diff --git a/vignettes/search-with-ai-assistants.Rmd b/vignettes/search-with-ai-assistants.Rmd new file mode 100644 index 0000000..58de50e --- /dev/null +++ b/vignettes/search-with-ai-assistants.Rmd @@ -0,0 +1,332 @@ +--- +title: "Using AI Assistants with searcher" +author: "James Joseph Balamuta" +date: "`r Sys.Date()`" +output: rmarkdown::html_vignette +vignette: > + %\VignetteIndexEntry{Using AI Assistants with searcher} + %\VignetteEngine{knitr::rmarkdown} + %\VignetteEncoding{UTF-8} +--- + +```{r, include = FALSE} +knitr::opts_chunk$set( + collapse = TRUE, + comment = "#>", + eval = FALSE +) +``` + +# Introduction + +The `searcher` package now enables direct interaction with various AI assistants +directly from your R environment. This vignette explains how to use these +features to enhance your R programming workflow by leveraging AI-powered +assistance. + +# Available AI Services + +The package supports the following AI services: + +1. **ChatGPT** (`ask_chatgpt()`) - OpenAI's popular large language model +2. **Claude** (`ask_claude()`) - Anthropic's assistant known for longer context and thoughtful responses +3. **Perplexity** (`ask_perplexity()`) - Research-focused AI with internet search capabilities +4. **Mistral** (`ask_mistral()`) - Mistral AI's assistant with strong reasoning capabilities +5. **Bing Copilot** (`ask_bing_copilot()` or `ask_copilot()`) - Microsoft's AI assistant with web search integration +6. 
**Meta AI** (`ask_meta_ai()`) - Meta's conversational AI assistant + +# Basic Usage + +To use any of these AI services, simply call the corresponding function with your query: + +```r +library(searcher) + +# Ask ChatGPT a question about R +ask_chatgpt("How do I create a scatterplot with ggplot2?") + +# Get Claude to explain a statistical concept +ask_claude("Explain GAMs (Generalized Additive Models) in R") + +# Research time series forecasting methods +ask_perplexity("What are the best time series forecasting packages in R?") + +# Debug a problematic R function +ask_mistral("Debug this function: calculate_median <- function(x) mean(x)") + +# Compare programming approaches with Bing Copilot +ask_copilot("Compare data.table vs dplyr for large datasets") + +# Ask Meta AI about best practices +ask_metaai("What's the best way to handle missing data in R?") +``` + +# Customizing AI Behavior with Prompts + +One powerful feature of the AI search functions is the ability to customize +how the AI responds by using prompts. This can be done in two ways: + +## 1. Per-call Prompts + +For one-time customization, provide a `prompt` parameter to any AI search +function: + +```r +# Ask for step-by-step debugging +ask_chatgpt("Why doesn't this work? mtcars %>% filter(cyl = 4)", + prompt = "You are an R debugging expert. Identify the error and explain step by step:") + +# Request tidyverse-focused solutions +ask_claude("How to reshape data?", + prompt = "Answer using tidyverse packages, particularly tidyr:") + +# Ask for educational examples +ask_perplexity("How to implement PCA in R?", + prompt = "Provide a beginner-friendly tutorial with examples:") +``` + +## 2. Default Prompts via Options + +For persistent customization, set default prompts in your `.Rprofile` or at the +beginning of your session: + +```r +# Set default prompts +options( + searcher.chatgpt_prompt = "As an R data visualization expert, please help with:", + searcher.claude_prompt = "Provide reproducible R code examples for:", + searcher.perplexity_prompt = "Provide evidence-based R solutions with references:" +) + +# Now all queries will use these default prompts +ask_chatgpt("How to create a heatmap?") +ask_claude("Efficient way to merge multiple dataframes") +``` + +# Error Handling Integration + +A particularly useful feature is the ability to automatically search your errors +with AI assistants: + +```r +# Set Claude as your error handler +options(error = ask_claude) + +# Now any error will automatically be sent to Claude +fibonacci <- function(n) { + if(n <= 0) return(0) + if(n == 1) return(1) + return(fibonacci(n-1) + fibonacci(n-2)) +} + +# This will cause a stack overflow error that will be automatically searched +fibonacci(1000) + +# You can also manually search the last error +ask_chatgpt() # Searches the last error message +``` + + +# Effective Prompting Strategies + +Well-crafted prompts are essential for getting the most out of your interactions +with AI assistants. By providing clear, structured prompts, you can +dramatically improve the quality, relevance, and usefulness of AI responses. +The prompts can also be tailored to the specific AI service you are using, as +each has its own strengths and weaknesses or "quirks." +This section explores effective prompt strategies +for different R programming scenarios. + +## Understanding Prompts + +Prompts serve as instructions that guide how an AI assistant should respond to +your query. A good prompt typically includes: + +1. 
**Role specification**: Who the AI should act as (e.g., "You are an R programming expert") +2. **Task definition**: What the AI should do (e.g., "Debug this code") +3. **Output format**: How responses should be structured (e.g., "Provide step-by-step explanations") +4. **Context**: Any relevant background information (e.g., "This is for a beginner's tutorial") + +## Tailoring Prompts to Different AI Services + +Different AI services have different strengths and characteristics. Varying +your prompts or selecting the right service can yield better results. Here are some +general guidelines for tailoring prompts to specific AI services: + +- **ChatGPT**: Responds well to direct, structured instructions and specific constraints +- **Claude**: Excels with narrative prompts and nuanced instructions about reasoning +- **Perplexity**: Works best with prompts that request citations and multiple perspectives +- **Mistral**: Benefits from clear, concise prompts with explicit output formatting +- **Bing Copilot**: Responds well to prompts that involve web knowledge and current information +- **Meta AI**: Still a newcomer, but generally effective with straightforward queries + +## Debugging Code + +When debugging R code, effective prompts should guide the AI to identify the +specific issue and provide a clear explanation. + +```r +# Less effective prompt +ask_chatgpt(prompt = "Fix this code:") + +# More effective prompt +ask_chatgpt(prompt = "You are an R debugging expert. First identify what's wrong with this code without fixing it. Then explain why it's wrong and what concept I'm misunderstanding. Finally, provide a working solution with an explanation of why it works.") +``` + +This improved prompt is effective because it: + +- Establishes a clear role (debugging expert) +- Structures the analysis process (identify → explain → solve) +- Focuses on learning (explaining the underlying concept) + +## Learning New Concepts + +When using AI to learn R concepts, prompts should encourage clear explanations +with progressive complexity. + +```r +# Less effective prompt +ask_chatgpt(prompt = "Explain this R concept:") + +# More effective prompt +ask_chatgpt(prompt = "As an R educator teaching a diverse classroom, explain this concept in multiple ways: 1) Start with an intuitive explanation a beginner would understand, 2) Provide a simple working example, 3) Explain how this concept connects to other R concepts like {relevant_concepts}, 4) Show a more advanced practical application with commented code.") +``` + +This approach works well because it: + +- Creates a teaching scenario that encourages clear communication +- Requests multiple perspectives (from basic to advanced) +- Asks for concrete examples, not just theory +- Encourages connections to existing knowledge + +## Selecting Packages and Tools + +When seeking advice on package selection, prompts should encourage comprehensive, balanced comparisons. + +```r +# Less effective prompt +ask_perplexity(prompt = "What package should I use for this?") + +# More effective prompt +ask_perplexity(prompt = "As an unbiased R consultant familiar with the entire CRAN ecosystem, compare the top 3-4 R packages for this task. For each package, discuss: 1) Key strengths and limitations, 2) Ease of use and learning curve, 3) Community support and maintenance status, 4) Performance characteristics, 5) Unique features. Conclude with situational recommendations (when each would be the best choice) rather than a single recommendation. 
Include citations to benchmarks or articles where relevant.") +``` + +This strategy works because it: + +- Explicitly requests multiple options instead of a single answer +- Defines specific comparison criteria +- Asks for situation-dependent recommendations +- Requests evidence/citations to support claims + +## Code Review and Improvement + +For code review prompts, focus on balancing constructive criticism with actionable improvements. + +```r +# Less effective prompt +ask_mistral(prompt = "Review this code:") + +# More effective prompt +ask_mistral(prompt = "As a senior R developer conducting a code review: 1) Note what the code does correctly, 2) Identify potential issues in correctness, performance, readability, and maintainability, 3) Suggest specific improvements with before/after code examples, 4) If relevant, mention R idioms or functions that would simplify the code, 5) Rate the code on a 1-10 scale for efficiency, readability, and robustness.") +``` + +This approach is effective because it: + +- Balances positive feedback with constructive criticism +- Covers multiple dimensions of code quality +- Provides specific examples, not just general advice +- Includes a structured evaluation framework + +## Complex Statistical Analysis + +When seeking help with statistical methods in R, prompts should emphasize both +theoretical understanding and practical implementation. + +```r +# Less effective prompt +ask_claude(prompt = "Help me analyze this data:") + +# More effective prompt +ask_claude(prompt = "As both a statistician and R programmer, help me with this analysis task. First, explain the appropriate statistical approach and why it's suitable for this situation. Then, provide an R implementation with explanations of: 1) Required packages, 2) Data preparation steps, 3) The analysis code with comments explaining each step, 4) How to interpret the outputs, 5) Diagnostic checks to validate assumptions, 6) Potential limitations of this approach. Show output examples where helpful.") +``` + +This works well because it: + +- Connects statistical theory with R implementation +- Provides a complete workflow from preparation to interpretation +- Includes validation and limitations +- Balances code with explanations + +## Creating Visualizations + +For data visualization queries, prompts should focus on design principles, not +just code implementation. Though, you do want to specify if the visualization should be +created with a specific package (e.g., `ggplot2`, `plotly`, etc.). + +```r +# Less effective prompt +ask_chatgpt(prompt = "Create a visualization of this data:") + +# More effective prompt +ask_chatgpt(prompt = "As a data visualization expert who specializes in R: 1) Recommend 2-3 visualization types that would best represent this data and explain why, 2) For the most appropriate visualization, provide ggplot2 code with a clear aesthetic mapping rationale, 3) Suggest specific customizations to improve readability and visual appeal, 4) Explain how the visualization could be modified to highlight different aspects of the data. Follow ggplot2 best practices and modern data visualization principles.") +``` + +This approach is effective because it: + +- Focuses on visualization strategy, not just implementation +- Explains the reasoning behind design choices +- Provides customization options +- Grounds recommendations in best practices + +## Experimenting with Different Approaches + +The power of prompt engineering comes from experimentation. 
Consider how these +two prompts would produce different results for the same query about handling +missing data: + +```r +# Technical focus +ask_claude(prompt = + "As an R package developer with deep knowledge of data structures, explain all approaches to handling missing values in R, including their algorithmic implementations, performance characteristics, and edge cases." +) + +# Applied focus +ask_claude(prompt = + "As a data scientist who regularly cleans messy datasets, share your practical workflow for handling missing values in R. Include code examples using both base R and tidyverse approaches, focusing on real-world scenarios and decision criteria for when to use each technique." +) +``` + +Both are well-structured prompts, but they would yield different responses +focused on either technical depth or practical application. + +# Advanced Prompt Management + +While the prompt examples in this vignette provide useful templates, the +`searcher` package offers a more powerful and flexible prompt management system. +This system allows you to: + +- Maintain a library of specialized prompts for different tasks +- Set system-wide prompts that apply across all AI services +- Layer multiple prompts for precise control +- Save and reuse your most effective prompts + +For comprehensive documentation on these advanced features, see the dedicated +vignette: + +```r +vignette("managing-ai-prompts") +``` + +With the prompt management system, you can move beyond single-use prompts to +create a personalized library of AI instructions tailored to your specific R +workflows and projects. + +# Conclusion + +The AI assistant integration in `searcher` provides a way to access +AI help directly from your R environment without needing an API key or +external setup. By customizing prompts, you can tailor the AI's responses to +your specific needs, making it an even more powerful tool for R programming, +data analysis, and problem-solving. Though, keep in mind that AI services have +different strengths or "quirks", so experiment with each to find which works +best for your particular needs.