diff --git a/.changes/unreleased/ENHANCEMENTS-20250806-104456.yaml b/.changes/unreleased/ENHANCEMENTS-20250806-104456.yaml new file mode 100644 index 00000000..fa80257d --- /dev/null +++ b/.changes/unreleased/ENHANCEMENTS-20250806-104456.yaml @@ -0,0 +1,6 @@ +kind: ENHANCEMENTS +body: Add support for Terraform Search files. This provides block and attribute completion, hover, and diagnostics along with syntax validation for Terraform Search files. +time: 2025-08-06T10:44:56.893693+05:30 +custom: + Issue: "2007" + Repository: terraform-ls diff --git a/docs/USAGE.md b/docs/USAGE.md index 10702f11..fa951764 100644 --- a/docs/USAGE.md +++ b/docs/USAGE.md @@ -10,6 +10,7 @@ The following filetypes are supported by the Terraform Language Server: - `terraform-vars` - variable files (`*.tfvars`) - `terraform-stack` - standard `*.tfcomponent.hcl` and `*.tfstack.hcl` files - `terraform-deploy` - standard `*.tfdeploy.hcl` files +- `terraform-search` - standard `*.tfquery.hcl` files _NOTE:_ Clients should be configured to follow the above language ID conventions and do **not** send `*.tf.json`, `*.tfvars.json` nor Packer HCL config diff --git a/docs/architecture.md b/docs/architecture.md index 78fa5fe3..a51bde17 100644 --- a/docs/architecture.md +++ b/docs/architecture.md @@ -60,6 +60,7 @@ We currently have several features: - `*.tfvars` and `*.tfvars.json` files are handled in the `variables` feature - `.terraform/` and `.terraform.lock.hcl` related operations are handled in the `rootmodules` feature - `*.tfcomponent.hcl`, `*.tfstack.hcl` and `*.tfdeploy.hcl` files are handled in the `stacks` feature +- `*.tfquery.hcl` files are handled in the `search` feature A feature can provide data to the external consumers through methods. For example, the `variables` feature needs a list of variables from the `modules` feature. There should be no direct import from feature packages (we could enforce this by using `internal/`, but we won't for now) into other parts of the codebase. The "hot path" service mentioned above takes care of initializing each feature at the start of a new LS session. @@ -93,13 +94,23 @@ The `jobs` package of each feature contains all the different indexing jobs need ### Stack Feature Jobs - `ParseStackConfiguration` - parses `*.tfcomponent.hcl`, `*.tfstack.hcl` and `*.tfdeploy.hcl` files to turn `[]byte` into `hcl` types (AST) -- `LoadStackMetadata` - uses [`earlydecoder`](https://pkg.go.dev/github.com/hashicorp/terraform-schema@main/stacks/earlydecoder) to do early TF version-agnostic decoding to obtain metadata (variables, outputs etc.) which can be used to do more detailed decoding in hot-path within `hcl-lang` decoder +- `LoadStackMetadata` - uses [`earlydecoder`](https://pkg.go.dev/github.com/hashicorp/terraform-schema@main/earlydecoder/stacks) to do early TF version-agnostic decoding to obtain metadata (variables, outputs etc.) 
which can be used to do more detailed decoding in hot-path within `hcl-lang` decoder
- `PreloadEmbeddedSchema` – loads provider schemas based on provider requirements from the bundled schemas
- `DecodeReferenceTargets` - uses `hcl-lang` decoder to collect reference targets within `*.tfcomponent.hcl`, `*.tfstack.hcl` and `*.tfdeploy.hcl`
- `DecodeReferenceOrigins` - uses `hcl-lang` decoder to collect reference origins within `*.tfcomponent.hcl`, `*.tfstack.hcl` and `*.tfdeploy.hcl`
- `SchemaStackValidation` - does schema-based validation of module files (`*.tfcomponent.hcl`, `*.tfstack.hcl` and `*.tfdeploy.hcl`) and produces diagnostics associated with any "invalid" parts of code
- `ReferenceValidation` - does validation based on (mis)matched reference origins and targets, to flag up "orphaned" references

+### Search Feature Jobs
+
+- `ParseSearchConfiguration` - parses `*.tfquery.hcl` files to turn `[]byte` into `hcl` types (AST)
+- `LoadSearchMetadata` - uses [`earlydecoder`](https://pkg.go.dev/github.com/hashicorp/terraform-schema@main/earlydecoder/search) to do early TF version-agnostic decoding to obtain metadata (variables, lists etc.) which can be used to do more detailed decoding in hot-path within `hcl-lang` decoder
+- `PreloadEmbeddedSchema` – loads provider schemas based on provider requirements from the bundled schemas
+- `DecodeReferenceTargets` - uses `hcl-lang` decoder to collect reference targets within `*.tfquery.hcl`
+- `DecodeReferenceOrigins` - uses `hcl-lang` decoder to collect reference origins within `*.tfquery.hcl`
+- `SchemaSearchValidation` - does schema-based validation of search files (`*.tfquery.hcl`) and produces diagnostics associated with any "invalid" parts of code
+- `ReferenceValidation` - does validation based on (mis)matched reference origins and targets, to flag up "orphaned" references
+
### Adding a new feature / "language"

The existing `variables` feature is a good starting point when introducing a new language. Usually you need to roughly follow these steps to get a minimal working example:
diff --git a/docs/language-clients.md b/docs/language-clients.md
index 34b1dd19..a0401f89 100644
--- a/docs/language-clients.md
+++ b/docs/language-clients.md
@@ -10,6 +10,7 @@ The following file types are currently supported and language IDs expected:
- `terraform-vars` - variable files (`*.tfvars`)
- `terraform-stack` - standard `*.tfcomponent.hcl` and `*.tfstack.hcl` files
- `terraform-deploy` - standard `*.tfdeploy.hcl` files
+- `terraform-search` - standard `*.tfquery.hcl` files

Client can choose to highlight other files locally, but such other files must
**not** be sent to the server as the server isn't equipped to handle those.

@@ -46,7 +47,7 @@ This allows IntelliSense to remain accurate e.g. when switching branches
in VCS or when there are any other changes made to these files outside
the editor.

If the client implements file watcher, it should watch for any changes
-in `**/*.tf`, `**/*.tfvars`, `*.tfstack.hcl`, `**/*.tfstack.hcl` and `**/*.tfstack.hcl` files in the workspace.
+in `**/*.tf`, `**/*.tfvars`, `**/*.tfstack.hcl`, `**/*.tfcomponent.hcl`, `**/*.tfdeploy.hcl` and `**/*.tfquery.hcl` files in the workspace.
Client should **not** send changes for any other files.
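To make the language ID conventions above concrete, here is a minimal, self-contained Go sketch of the suffix matching they imply. The helper name `languageIDForFilename` is illustrative only and not part of the server's API; the server's own matching lives in the per-feature `ast` packages (e.g. `ast.IsSearchFilename` introduced in this change).

package main

import (
	"fmt"
	"strings"
)

// languageIDForFilename maps a filename to the documented language ID
// conventions. Order matters: more specific suffixes are checked first,
// so "*.tfvars" is not swallowed by the "*.tf" case.
func languageIDForFilename(name string) string {
	switch {
	case strings.HasSuffix(name, ".tfquery.hcl"):
		return "terraform-search"
	case strings.HasSuffix(name, ".tfdeploy.hcl"):
		return "terraform-deploy"
	case strings.HasSuffix(name, ".tfcomponent.hcl"), strings.HasSuffix(name, ".tfstack.hcl"):
		return "terraform-stack"
	case strings.HasSuffix(name, ".tfvars"):
		return "terraform-vars"
	case strings.HasSuffix(name, ".tf"):
		return "terraform"
	}
	// anything else should not be sent to the server at all
	return ""
}

func main() {
	fmt.Println(languageIDForFilename("queries.tfquery.hcl")) // terraform-search
}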
diff --git a/go.mod b/go.mod index f9b1a8ee..9b493e95 100644 --- a/go.mod +++ b/go.mod @@ -19,7 +19,7 @@ require ( github.com/hashicorp/terraform-exec v0.23.0 github.com/hashicorp/terraform-json v0.26.0 github.com/hashicorp/terraform-registry-address v0.3.0 - github.com/hashicorp/terraform-schema v0.0.0-20250616115602-34f2164294a0 + github.com/hashicorp/terraform-schema v0.0.0-20250813054210-80c1a7ad993e github.com/mcuadros/go-defaults v1.2.0 github.com/mh-cbon/go-fmt-fail v0.0.0-20160815164508-67765b3fbcb5 github.com/mitchellh/cli v1.1.5 diff --git a/go.sum b/go.sum index d9b648b5..8622edb5 100644 --- a/go.sum +++ b/go.sum @@ -126,8 +126,10 @@ github.com/hashicorp/terraform-json v0.26.0 h1:+BnJavhRH+oyNWPnfzrfQwVWCZBFMvjdi github.com/hashicorp/terraform-json v0.26.0/go.mod h1:eyWCeC3nrZamyrKLFnrvwpc3LQPIJsx8hWHQ/nu2/v4= github.com/hashicorp/terraform-registry-address v0.3.0 h1:HMpK3nqaGFPS9VmgRXrJL/dzHNdheGVKk5k7VlFxzCo= github.com/hashicorp/terraform-registry-address v0.3.0/go.mod h1:jRGCMiLaY9zii3GLC7hqpSnwhfnCN5yzvY0hh4iCGbM= -github.com/hashicorp/terraform-schema v0.0.0-20250616115602-34f2164294a0 h1:fpu271clSg0mDkfy7CYr1fs3ntT9AEioutKZR5r1n2s= -github.com/hashicorp/terraform-schema v0.0.0-20250616115602-34f2164294a0/go.mod h1:si3wjikcavAEF1QIx+p+tk5EvVubBpzu9sl8YasITTs= +github.com/hashicorp/terraform-schema v0.0.0-20250811094623-b86d395f9616 h1:9hb9XBtSpp9rlR6gAFgXFy8wJE7W/woPwhkIfab4VuI= +github.com/hashicorp/terraform-schema v0.0.0-20250811094623-b86d395f9616/go.mod h1:Lye3Lm/aJnhNDGkzYg4BVxIxK95NT4gKdwto+xbDP+c= +github.com/hashicorp/terraform-schema v0.0.0-20250813054210-80c1a7ad993e h1:TSFPetmLvdog9jSlOsaQyXHYlTI5dRbqu1JKNYiNnhs= +github.com/hashicorp/terraform-schema v0.0.0-20250813054210-80c1a7ad993e/go.mod h1:nnx41+GPagX9rK6V0ZLKAM+ws5nPxO1G50DhXt44ZhQ= github.com/hashicorp/terraform-svchost v0.1.1 h1:EZZimZ1GxdqFRinZ1tpJwVxxt49xc/S52uzrw4x0jKQ= github.com/hashicorp/terraform-svchost v0.1.1/go.mod h1:mNsjQfZyf/Jhz35v6/0LWcv26+X7JPS+buii2c9/ctc= github.com/hexops/autogold v1.3.1 h1:YgxF9OHWbEIUjhDbpnLhgVsjUDsiHDTyDfy2lrfdlzo= diff --git a/internal/features/search/ast/search.go b/internal/features/search/ast/search.go new file mode 100644 index 00000000..d31577a3 --- /dev/null +++ b/internal/features/search/ast/search.go @@ -0,0 +1,119 @@ +// Copyright (c) HashiCorp, Inc. 
+// SPDX-License-Identifier: MPL-2.0
+
+package ast
+
+import (
+	"strings"
+
+	"github.com/hashicorp/hcl/v2"
+	globalAst "github.com/hashicorp/terraform-ls/internal/terraform/ast"
+)
+
+type Filename interface {
+	String() string
+	IsJSON() bool
+	IsIgnored() bool
+}
+
+// SearchFilename is a custom type for search configuration files
+type SearchFilename string
+
+func (sf SearchFilename) String() string {
+	return string(sf)
+}
+
+func (sf SearchFilename) IsJSON() bool {
+	return strings.HasSuffix(string(sf), ".json")
+}
+
+func (sf SearchFilename) IsIgnored() bool {
+	return globalAst.IsIgnoredFile(string(sf))
+}
+
+func IsSearchFilename(name string) bool {
+	return strings.HasSuffix(name, ".tfquery.hcl") ||
+		strings.HasSuffix(name, ".tfquery.json")
+}
+
+// FilenameFromName returns a valid SearchFilename, or nil if the name
+// does not match the search file naming conventions
+func FilenameFromName(name string) Filename {
+	if IsSearchFilename(name) {
+		return SearchFilename(name)
+	}
+
+	return nil
+}
+
+type Files map[Filename]*hcl.File
+
+func (sf Files) Copy() Files {
+	m := make(Files, len(sf))
+	for name, file := range sf {
+		m[name] = file
+	}
+	return m
+}
+
+func (sf Files) AsMap() map[string]*hcl.File {
+	m := make(map[string]*hcl.File, len(sf))
+	for name, file := range sf {
+		m[name.String()] = file
+	}
+	return m
+}
+
+type Diagnostics map[Filename]hcl.Diagnostics
+
+func (sd Diagnostics) Copy() Diagnostics {
+	m := make(Diagnostics, len(sd))
+	for name, diags := range sd {
+		m[name] = diags
+	}
+	return m
+}
+
+// AutoloadedOnly returns only diagnostics that are not from ignored files
+func (sd Diagnostics) AutoloadedOnly() Diagnostics {
+	diags := make(Diagnostics)
+	for name, f := range sd {
+		if !name.IsIgnored() {
+			diags[name] = f
+		}
+	}
+	return diags
+}
+
+func (sd Diagnostics) AsMap() map[string]hcl.Diagnostics {
+	m := make(map[string]hcl.Diagnostics, len(sd))
+	for name, diags := range sd {
+		m[name.String()] = diags
+	}
+	return m
+}
+
+func (sd Diagnostics) Count() int {
+	count := 0
+	for _, diags := range sd {
+		count += len(diags)
+	}
+	return count
+}
+
+func DiagnosticsFromMap(m map[string]hcl.Diagnostics) Diagnostics {
+	sd := make(Diagnostics, len(m))
+	for name, diags := range m {
+		sd[FilenameFromName(name)] = diags
+	}
+	return sd
+}
+
+type SourceDiagnostics map[globalAst.DiagnosticSource]Diagnostics
+
+func (svd SourceDiagnostics) Count() int {
+	count := 0
+	for _, diags := range svd {
+		count += diags.Count()
+	}
+	return count
+}
diff --git a/internal/features/search/decoder/path_reader.go b/internal/features/search/decoder/path_reader.go
new file mode 100644
index 00000000..1d7811f6
--- /dev/null
+++ b/internal/features/search/decoder/path_reader.go
@@ -0,0 +1,166 @@
+// Copyright (c) HashiCorp, Inc.
+// SPDX-License-Identifier: MPL-2.0
+
+package decoder
+
+import (
+	"context"
+	"fmt"
+
+	"github.com/hashicorp/go-version"
+	"github.com/hashicorp/hcl-lang/decoder"
+	"github.com/hashicorp/hcl-lang/lang"
+	"github.com/hashicorp/hcl-lang/reference"
+	"github.com/hashicorp/hcl-lang/schema"
+	"github.com/hashicorp/hcl/v2"
+	"github.com/hashicorp/terraform-ls/internal/features/search/ast"
+	"github.com/hashicorp/terraform-ls/internal/features/search/state"
+	ilsp "github.com/hashicorp/terraform-ls/internal/lsp"
+	tfaddr "github.com/hashicorp/terraform-registry-address"
+	tfmod "github.com/hashicorp/terraform-schema/module"
+	tfschema "github.com/hashicorp/terraform-schema/schema"
+	searchSchema "github.com/hashicorp/terraform-schema/schema/search"
+	tfsearch "github.com/hashicorp/terraform-schema/search"
+)
+
+type PathReader struct {
+	StateReader  StateReader
+	ModuleReader ModuleReader
+	RootReader   RootReader
+}
+
+var _ decoder.PathReader = &PathReader{}
+
+type CombinedReader struct {
+	ModuleReader
+	StateReader
+	RootReader
+}
+
+type StateReader interface {
+	List() ([]*state.SearchRecord, error)
+	SearchRecordByPath(modPath string) (*state.SearchRecord, error)
+	ProviderSchema(modPath string, addr tfaddr.Provider, vc version.Constraints) (*tfschema.ProviderSchema, error)
+}
+
+type ModuleReader interface {
+	// LocalModuleMeta returns the module metadata for a local module. This is
+	// the result of the [earlydecoder] when processing module files
+	LocalModuleMeta(modPath string) (*tfmod.Meta, error)
+}
+
+type RootReader interface {
+	InstalledModulePath(rootPath string, normalizedSource string) (string, bool)
+
+	TerraformVersion(modPath string) *version.Version
+}
+
+// PathContext returns a PathContext for the given path based on the language ID
+func (pr *PathReader) PathContext(path lang.Path) (*decoder.PathContext, error) {
+	record, err := pr.StateReader.SearchRecordByPath(path.Path)
+	if err != nil {
+		return nil, err
+	}
+
+	switch path.LanguageID {
+	case ilsp.Search.String():
+		return searchPathContext(record, CombinedReader{
+			StateReader:  pr.StateReader,
+			ModuleReader: pr.ModuleReader,
+			RootReader:   pr.RootReader,
+		})
+	}
+
+	return nil, fmt.Errorf("unknown language ID: %q", path.LanguageID)
+}
+
+func searchPathContext(record *state.SearchRecord, stateReader CombinedReader) (*decoder.PathContext, error) {
+	resolvedVersion := tfschema.ResolveVersion(stateReader.TerraformVersion(record.Path()), record.Meta.CoreRequirements)
+
+	sm := searchSchema.NewSearchSchemaMerger(mustCoreSchemaForVersion(resolvedVersion))
+	sm.SetStateReader(stateReader)
+
+	meta := &tfsearch.Meta{
+		Path:                 record.Path(),
+		CoreRequirements:     record.Meta.CoreRequirements,
+		Lists:                record.Meta.Lists,
+		Variables:            record.Meta.Variables,
+		Filenames:            record.Meta.Filenames,
+		ProviderReferences:   record.Meta.ProviderReferences,
+		ProviderRequirements: record.Meta.ProviderRequirements,
+	}
+
+	mergedSchema, err := sm.SchemaForSearch(meta)
+	if err != nil {
+		return nil, err
+	}
+
+	pathCtx := &decoder.PathContext{
+		Schema:           mergedSchema,
+		ReferenceOrigins: make(reference.Origins, 0),
+		ReferenceTargets: make(reference.Targets, 0),
+		Files:            make(map[string]*hcl.File),
+		Validators:       searchValidators,
+	}
+
+	// Copy over reference origins and targets from the record,
+	// keeping only those which belong to search files
+	for _, origin := range record.RefOrigins {
+		if ast.IsSearchFilename(origin.OriginRange().Filename) {
+			pathCtx.ReferenceOrigins = append(pathCtx.ReferenceOrigins, origin)
+		}
+	}
+
+	for _, target := range record.RefTargets {
+		if target.RangePtr != nil && ast.IsSearchFilename(target.RangePtr.Filename) {
+			pathCtx.ReferenceTargets = append(pathCtx.ReferenceTargets, target)
+		} else if target.RangePtr == nil {
+			// targets without a range are kept as well
+			pathCtx.ReferenceTargets = append(pathCtx.ReferenceTargets, target)
+		}
+	}
+
+	for name, f := range record.ParsedFiles {
+		if _, ok := name.(ast.SearchFilename); ok {
+			pathCtx.Files[name.String()] = f
+		}
+	}
+
+	return pathCtx, nil
+}
+
+func (pr *PathReader) Paths(ctx context.Context) []lang.Path {
+	paths := make([]lang.Path, 0)
+
+	searchRecords, err := pr.StateReader.List()
+	if err != nil {
+		return paths
+	}
+
+	for _, record := range searchRecords {
+		foundSearch := false
+		for name := range record.ParsedFiles {
+			if _, ok := name.(ast.SearchFilename); ok {
+				foundSearch = true
+				break
+			}
+		}
+
+		if foundSearch {
+			paths = append(paths, lang.Path{
+				Path:       record.Path(),
+				LanguageID: ilsp.Search.String(),
+			})
+		}
+	}
+
+	return paths
+}
+
+func mustCoreSchemaForVersion(v *version.Version) *schema.BodySchema {
+	s, err := searchSchema.CoreSearchSchemaForVersion(v)
+	if err != nil {
+		// this should never happen
+		panic(err)
+	}
+	return s
+}
diff --git a/internal/features/search/decoder/validators.go b/internal/features/search/decoder/validators.go
new file mode 100644
index 00000000..01b7a834
--- /dev/null
+++ b/internal/features/search/decoder/validators.go
@@ -0,0 +1,19 @@
+// Copyright (c) HashiCorp, Inc.
+// SPDX-License-Identifier: MPL-2.0
+
+package decoder
+
+import (
+	"github.com/hashicorp/hcl-lang/validator"
+)
+
+var searchValidators = []validator.Validator{
+	validator.BlockLabelsLength{},
+	validator.DeprecatedAttribute{},
+	validator.DeprecatedBlock{},
+	validator.MaxBlocks{},
+	validator.MinBlocks{},
+	validator.MissingRequiredAttribute{},
+	validator.UnexpectedAttribute{},
+	validator.UnexpectedBlock{},
+}
diff --git a/internal/features/search/events.go b/internal/features/search/events.go
new file mode 100644
index 00000000..d785e1ef
--- /dev/null
+++ b/internal/features/search/events.go
@@ -0,0 +1,286 @@
+// Copyright (c) HashiCorp, Inc.
+// SPDX-License-Identifier: MPL-2.0
+
+package search
+
+import (
+	"context"
+	"os"
+	"path/filepath"
+
+	lsctx "github.com/hashicorp/terraform-ls/internal/context"
+	"github.com/hashicorp/terraform-ls/internal/document"
+	"github.com/hashicorp/terraform-ls/internal/features/search/ast"
+	"github.com/hashicorp/terraform-ls/internal/features/search/jobs"
+	"github.com/hashicorp/terraform-ls/internal/job"
+	"github.com/hashicorp/terraform-ls/internal/lsp"
+	"github.com/hashicorp/terraform-ls/internal/protocol"
+	"github.com/hashicorp/terraform-ls/internal/schemas"
+	globalAst "github.com/hashicorp/terraform-ls/internal/terraform/ast"
+	"github.com/hashicorp/terraform-ls/internal/terraform/module/operation"
+)
+
+func (f *SearchFeature) discover(path string, files []string) error {
+	for _, file := range files {
+		if globalAst.IsIgnoredFile(file) {
+			continue
+		}
+
+		if ast.IsSearchFilename(file) {
+			f.logger.Printf("discovered search file in %s", path)
+
+			err := f.store.AddIfNotExists(path)
+			if err != nil {
+				return err
+			}
+
+			break
+		}
+	}
+
+	return nil
+}
+
+func (f *SearchFeature) didOpen(ctx context.Context, dir document.DirHandle, languageID string) (job.IDs, error) {
+	ids := make(job.IDs, 0)
+	path := dir.Path()
+	f.logger.Printf("did open %q %q", path, languageID)
+
+	// We need to decide if the path is relevant to us
+	if languageID != lsp.Search.String() {
+		return ids, nil
+	}
+
+	// Add to state as path is relevant
+	err := f.store.AddIfNotExists(path)
+	if err != nil {
+		return ids, err
+	}
+
+	decodeIds, err := f.decodeSearch(ctx, dir, false, true)
+	if err != nil {
+		return ids, err
+	}
+	ids = append(ids, decodeIds...)
+
+	return ids, nil
+}
+
+func (f *SearchFeature) didChange(ctx context.Context, dir document.DirHandle) (job.IDs, error) {
+	hasSearchRecord := f.store.Exists(dir.Path())
+	if !hasSearchRecord {
+		return job.IDs{}, nil
+	}
+
+	return f.decodeSearch(ctx, dir, true, true)
+}
+
+func (f *SearchFeature) didChangeWatched(ctx context.Context, rawPath string, changeType protocol.FileChangeType, isDir bool) (job.IDs, error) {
+	ids := make(job.IDs, 0)
+
+	switch changeType {
+	case protocol.Deleted:
+		// We don't know whether file or dir is being deleted
+		// 1st we just blindly try to look it up as a directory
+		hasSearchRecord := f.store.Exists(rawPath)
+		if hasSearchRecord {
+			f.removeIndexedSearch(rawPath)
+			return ids, nil
+		}
+
+		// 2nd we try again assuming it is a file
+		parentDir := filepath.Dir(rawPath)
+		hasSearchRecord = f.store.Exists(parentDir)
+		if !hasSearchRecord {
+			// Nothing relevant found in the feature state
+			return ids, nil
+		}
+
+		// and check the parent directory still exists
+		fi, err := os.Stat(parentDir)
+		if err != nil {
+			if os.IsNotExist(err) {
+				// if not, we remove the indexed search configuration
+				f.removeIndexedSearch(rawPath)
+				return ids, nil
+			}
+			f.logger.Printf("error checking existence (%q deleted): %s", parentDir, err)
+			return ids, nil
+		}
+		if !fi.IsDir() {
+			// Should never happen
+			f.logger.Printf("error: %q (deleted) is not a directory", parentDir)
+			return ids, nil
+		}
+
+		// If the parent directory exists, we just need to check if there
+		// are open documents for the path and the path is a search
+		// configuration path. If so, we need to reparse the search files.
+		dir := document.DirHandleFromPath(parentDir)
+		hasOpenDocs, err := f.stateStore.DocumentStore.HasOpenDocuments(dir)
+		if err != nil {
+			f.logger.Printf("error when checking for open documents in path (%q deleted): %s", rawPath, err)
+		}
+		if !hasOpenDocs {
+			return ids, nil
+		}
+
+		return f.decodeSearch(ctx, dir, true, true)
+
+	case protocol.Changed:
+		fallthrough
+	case protocol.Created:
+		var dir document.DirHandle
+		if isDir {
+			dir = document.DirHandleFromPath(rawPath)
+		} else {
+			docHandle := document.HandleFromPath(rawPath)
+			dir = docHandle.Dir
+		}
+
+		// Check if there are open documents for the path and the path is
+		// a search configuration path. If so, we need to reparse the
+		// search files.
+		hasOpenDocs, err := f.stateStore.DocumentStore.HasOpenDocuments(dir)
+		if err != nil {
+			f.logger.Printf("error when checking for open documents in path (%q changed): %s", rawPath, err)
+		}
+		if !hasOpenDocs {
+			return ids, nil
+		}
+
+		hasSearchRecord := f.store.Exists(dir.Path())
+		if !hasSearchRecord {
+			return ids, nil
+		}
+
+		return f.decodeSearch(ctx, dir, true, true)
+	}
+
+	return nil, nil
+}
+
+func (f *SearchFeature) decodeSearch(ctx context.Context, dir document.DirHandle, ignoreState bool, isFirstLevel bool) (job.IDs, error) {
+	ids := make(job.IDs, 0)
+	path := dir.Path()
+
+	parseId, err := f.stateStore.JobStore.EnqueueJob(ctx, job.Job{
+		Dir: dir,
+		Func: func(ctx context.Context) error {
+			return jobs.ParseSearchConfiguration(ctx, f.fs, f.store, path)
+		},
+		Type:        operation.OpTypeParseSearchConfiguration.String(),
+		IgnoreState: ignoreState,
+	})
+	if err != nil {
+		return ids, err
+	}
+	ids = append(ids, parseId)
+
+	// Changes to a setting currently require an LS restart, so the LS
+	// setting context cannot change during the execution of a job. That's
+	// why we can extract it here and use it in Defer.
+	// See https://github.com/hashicorp/terraform-ls/issues/1008
+	// We can safely ignore the error here. If we can't get the options from
+	// the context, validationOptions.EnableEnhancedValidation will be false
+	// by default. So we don't run the validation jobs.
+	validationOptions, _ := lsctx.ValidationOptions(ctx)
+
+	metaId, err := f.stateStore.JobStore.EnqueueJob(ctx, job.Job{
+		Dir: dir,
+		Func: func(ctx context.Context) error {
+			return jobs.LoadSearchMetadata(ctx, f.store, f.moduleFeature, f.logger, path)
+		},
+		Type:        operation.OpTypeLoadSearchMetadata.String(),
+		DependsOn:   job.IDs{parseId},
+		IgnoreState: ignoreState,
+		Defer: func(ctx context.Context, jobErr error) (job.IDs, error) {
+			deferIds := make(job.IDs, 0)

+			if jobErr != nil {
+				f.logger.Printf("loading search metadata returned error: %s", jobErr)
+			}
+
+			// Reference collection jobs will depend on this one, so we move it here in advance
+			eSchemaId, err := f.stateStore.JobStore.EnqueueJob(ctx, job.Job{
+				Dir: dir,
+				Func: func(ctx context.Context) error {
+					return jobs.PreloadEmbeddedSchema(ctx, f.logger, schemas.FS,
+						f.store, f.stateStore.ProviderSchemas, path)
+				},
+				Type:        operation.OpTypeSearchPreloadEmbeddedSchema.String(),
+				IgnoreState: ignoreState,
+			})
+			if err != nil {
+				return deferIds, err
+			}
+			deferIds = append(deferIds, eSchemaId)
+
+			refTargetsId, err := f.stateStore.JobStore.EnqueueJob(ctx, job.Job{
+				Dir: dir,
+				Func: func(ctx context.Context) error {
+					return jobs.DecodeReferenceTargets(ctx, f.store, f.moduleFeature, f.rootFeature, path)
+				},
+				Type:        operation.OpTypeDecodeReferenceTargets.String(),
+				DependsOn:   job.IDs{eSchemaId},
+				IgnoreState: ignoreState,
+			})
+			if err != nil {
+				return deferIds, err
+			}
+			deferIds = append(deferIds, refTargetsId)
+
+			refOriginsId, err := f.stateStore.JobStore.EnqueueJob(ctx, job.Job{
+				Dir: dir,
+				Func: func(ctx context.Context) error {
+					return jobs.DecodeReferenceOrigins(ctx, f.store, f.moduleFeature, f.rootFeature, path)
+				},
+				Type:        operation.OpTypeDecodeReferenceOrigins.String(),
+				DependsOn:   job.IDs{eSchemaId},
+				IgnoreState: ignoreState,
+			})
+			if err != nil {
+				return deferIds, err
+			}
+			deferIds = append(deferIds, refOriginsId)
+
+			if validationOptions.EnableEnhancedValidation {
+				_, err := f.stateStore.JobStore.EnqueueJob(ctx, job.Job{
+					Dir: dir,
+					Func: func(ctx context.Context) error {
+						return jobs.SchemaSearchValidation(ctx, f.store, f.moduleFeature, f.rootFeature, dir.Path())
+					},
+					Type:        operation.OpTypeSchemaSearchValidation.String(),
+					DependsOn:   job.IDs{refOriginsId, refTargetsId},
+					IgnoreState: ignoreState,
+				})
+				if err != nil {
+					return deferIds, err
+				}
+			}
+
+			return deferIds, nil
+		},
+	})
+	if err != nil {
+		return ids, err
+	}
+	ids = append(ids, metaId)
+
+	return ids, nil
+}
+
+func (f *SearchFeature) removeIndexedSearch(rawPath string) {
+	searchHandle := document.DirHandleFromPath(rawPath)
+
+	err := f.stateStore.JobStore.DequeueJobsForDir(searchHandle)
+	if err != nil {
+		f.logger.Printf("failed to dequeue jobs for search: %s", err)
+		return
+	}
+
+	err = f.store.Remove(rawPath)
+	if err != nil {
+		f.logger.Printf("failed to remove search from state: %s", err)
+		return
+	}
+}
diff --git a/internal/features/search/jobs/metadata.go b/internal/features/search/jobs/metadata.go
new file mode 100644
index 00000000..f6765830
--- /dev/null
+++ b/internal/features/search/jobs/metadata.go
@@ -0,0 +1,131 @@
+// Copyright (c) HashiCorp, Inc.
+// SPDX-License-Identifier: MPL-2.0
+
+package jobs
+
+import (
+	"context"
+	"log"
+
+	tfaddr "github.com/hashicorp/terraform-registry-address"
+	"github.com/hashicorp/terraform-schema/module"
+
+	"github.com/hashicorp/terraform-ls/internal/document"
+	"github.com/hashicorp/terraform-ls/internal/features/search/ast"
+	searchDecoder "github.com/hashicorp/terraform-ls/internal/features/search/decoder"
+	"github.com/hashicorp/terraform-ls/internal/features/search/state"
+	"github.com/hashicorp/terraform-ls/internal/job"
+	globalAst "github.com/hashicorp/terraform-ls/internal/terraform/ast"
+	"github.com/hashicorp/terraform-ls/internal/terraform/module/operation"
+	earlydecoder "github.com/hashicorp/terraform-schema/earlydecoder/search"
+	tfsearch "github.com/hashicorp/terraform-schema/search"
+)
+
+// LoadSearchMetadata loads data about the search configuration in a
+// version-independent way that enables us to decode the rest of the
+// configuration, e.g. by knowing provider versions, etc.
+func LoadSearchMetadata(ctx context.Context, searchStore *state.SearchStore, moduleFeature searchDecoder.ModuleReader, logger *log.Logger, searchPath string) error {
+	record, err := searchStore.SearchRecordByPath(searchPath)
+	if err != nil {
+		return err
+	}
+
+	// TODO: Avoid parsing if upstream (parsing) job reported no changes
+
+	// Avoid parsing if it is already in progress or already known
+	if record.MetaState != operation.OpStateUnknown && !job.IgnoreState(ctx) {
+		return job.StateNotChangedErr{Dir: document.DirHandleFromPath(searchPath)}
+	}
+
+	err = searchStore.SetMetaState(searchPath, operation.OpStateLoading)
+	if err != nil {
+		return err
+	}
+
+	meta, diags := earlydecoder.LoadSearch(record.Path(), record.ParsedFiles.AsMap())
+
+	err = loadSearchModuleSources(meta, moduleFeature, searchPath)
+	if err != nil {
+		logger.Printf("loading search module sources returned error: %s", err)
+	}
+
+	var mErr error
+	sErr := searchStore.UpdateMetadata(searchPath, meta, mErr)
+	if sErr != nil {
+		return sErr
+	}
+
+	if len(diags) == 0 {
+		// no new diagnostics, so return early
+		return mErr
+	}
+
+	// Merge the new diagnostics with the existing ones
+	existingDiags, ok := record.Diagnostics[globalAst.HCLParsingSource]
+	if !ok {
+		existingDiags = make(ast.Diagnostics)
+	} else {
+		existingDiags = existingDiags.Copy()
+	}
+
+	for fileName, diagnostic := range diags {
+		// Convert the filename to an AST filename
+		fn := ast.FilenameFromName(fileName)
+
+		// Append the new diagnostics for this file to any existing ones
+		existingDiags[fn] = existingDiags[fn].Extend(diagnostic)
+	}
+
+	sErr = searchStore.UpdateDiagnostics(searchPath, globalAst.HCLParsingSource, existingDiags)
+	if sErr != nil {
+		return sErr
+	}
+
+	return mErr
+}
+
+func loadSearchModuleSources(searchMeta *tfsearch.Meta, moduleFeature searchDecoder.ModuleReader, path string) error {
+	// load metadata from the adjacent Terraform module
+	modMeta, err := moduleFeature.LocalModuleMeta(path)
+	if err != nil {
+		return err
+	}
+
+	if modMeta != nil {
+		if searchMeta.CoreRequirements == nil {
+			searchMeta.CoreRequirements = modMeta.CoreRequirements
+		}
+		if searchMeta.ProviderRequirements == nil {
+			searchMeta.ProviderRequirements = make(tfsearch.ProviderRequirements)
+		}
+		// Copy provider requirements
+		for provider, constraints := range modMeta.ProviderRequirements {
+			searchMeta.ProviderRequirements[provider] = constraints
+		}
+
+		// Resolve aliased provider references via the module's
+		// un-aliased reference of the same local name
+		for rf := range searchMeta.ProviderReferences {
+			src := modMeta.ProviderReferences[module.ProviderRef{
+				LocalName: rf.LocalName,
+			}]
+			if rf.Alias != "" {
+				searchMeta.ProviderReferences[tfsearch.ProviderRef{
+					LocalName: rf.LocalName,
+					Alias:     rf.Alias,
+				}] = src
+			}
+		}
+		// Convert from module provider references to search provider references
+		for moduleProviderRef, provider := range modMeta.ProviderReferences {
+			searchProviderRef := tfsearch.ProviderRef{
+				LocalName: moduleProviderRef.LocalName,
+				Alias:     moduleProviderRef.Alias,
+			}
+			if searchMeta.ProviderReferences == nil {
+				searchMeta.ProviderReferences = make(map[tfsearch.ProviderRef]tfaddr.Provider)
+			}
+			searchMeta.ProviderReferences[searchProviderRef] = provider
+		}
+	}
+
+	return nil
+}
diff --git a/internal/features/search/jobs/parse_search.go b/internal/features/search/jobs/parse_search.go
new file mode 100644
index 00000000..0521d70c
--- /dev/null
+++ b/internal/features/search/jobs/parse_search.go
@@ -0,0 +1,97 @@
+// Copyright (c) HashiCorp, Inc.
+// SPDX-License-Identifier: MPL-2.0
+
+package jobs
+
+import (
+	"context"
+	"path/filepath"
+
+	lsctx "github.com/hashicorp/terraform-ls/internal/context"
+	"github.com/hashicorp/terraform-ls/internal/document"
+	"github.com/hashicorp/terraform-ls/internal/features/search/ast"
+	"github.com/hashicorp/terraform-ls/internal/features/search/parser"
+	"github.com/hashicorp/terraform-ls/internal/features/search/state"
+	"github.com/hashicorp/terraform-ls/internal/job"
+	"github.com/hashicorp/terraform-ls/internal/lsp"
+	globalAst "github.com/hashicorp/terraform-ls/internal/terraform/ast"
+	"github.com/hashicorp/terraform-ls/internal/terraform/module/operation"
+	"github.com/hashicorp/terraform-ls/internal/uri"
+)
+
+// ParseSearchConfiguration parses the whole Search configuration,
+// i.e. turns bytes of `*.tfquery.hcl` files into AST ([*hcl.File]).
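+//
+// On a textDocument/didChange request for an already parsed directory it
+// re-parses only the changed file; otherwise it parses every search file
+// in the directory (see the two branches below).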
+func ParseSearchConfiguration(ctx context.Context, fs ReadOnlyFS, searchStore *state.SearchStore, searchPath string) error {
+	record, err := searchStore.SearchRecordByPath(searchPath)
+	if err != nil {
+		return err
+	}
+
+	// TODO: Avoid parsing if the content matches existing AST
+
+	// Avoid parsing if it is already in progress or already known
+	if record.DiagnosticsState[globalAst.HCLParsingSource] != operation.OpStateUnknown && !job.IgnoreState(ctx) {
+		return job.StateNotChangedErr{Dir: document.DirHandleFromPath(searchPath)}
+	}
+
+	var files ast.Files
+	var diags ast.Diagnostics
+	rpcContext := lsctx.DocumentContext(ctx)
+
+	isMatchingLanguageId := rpcContext.LanguageID == lsp.Search.String()
+
+	// Only parse the file that's being changed/opened, unless this is 1st-time parsing
+	if record.DiagnosticsState[globalAst.HCLParsingSource] == operation.OpStateLoaded &&
+		rpcContext.IsDidChangeRequest() &&
+		isMatchingLanguageId {
+		// the file has already been parsed, so only examine this file and not the whole directory
+		err = searchStore.SetDiagnosticsState(searchPath, globalAst.HCLParsingSource, operation.OpStateLoading)
+		if err != nil {
+			return err
+		}
+
+		filePath, err := uri.PathFromURI(rpcContext.URI)
+		if err != nil {
+			return err
+		}
+		fileName := filepath.Base(filePath)
+
+		pFile, fDiags, err := parser.ParseFile(fs, filePath)
+		if err != nil {
+			return err
+		}
+		existingFiles := record.ParsedFiles.Copy()
+		existingFiles[ast.FilenameFromName(fileName)] = pFile
+		files = existingFiles
+
+		existingDiags, ok := record.Diagnostics[globalAst.HCLParsingSource]
+		if !ok {
+			existingDiags = make(ast.Diagnostics)
+		} else {
+			existingDiags = existingDiags.Copy()
+		}
+		existingDiags[ast.FilenameFromName(fileName)] = fDiags
+		diags = existingDiags
+	} else {
+		// this is the first time the directory is parsed, so parse all search files in it
+		err = searchStore.SetDiagnosticsState(searchPath, globalAst.HCLParsingSource, operation.OpStateLoading)
+		if err != nil {
+			return err
+		}
+
+		files, diags, err = parser.ParseFiles(fs, searchPath)
+	}
+
+	sErr := searchStore.UpdateParsedFiles(searchPath, files, err)
+	if sErr != nil {
+		return sErr
+	}
+
+	sErr = searchStore.UpdateDiagnostics(searchPath, globalAst.HCLParsingSource, diags)
+	if sErr != nil {
+		return sErr
+	}
+
+	return err
+}
diff --git a/internal/features/search/jobs/parse_search_test.go b/internal/features/search/jobs/parse_search_test.go
new file mode 100644
index 00000000..67af7cd3
--- /dev/null
+++ b/internal/features/search/jobs/parse_search_test.go
@@ -0,0 +1,120 @@
+// Copyright (c) HashiCorp, Inc.
+// SPDX-License-Identifier: MPL-2.0
+
+package jobs
+
+import (
+	"context"
+	"path/filepath"
+	"testing"
+
+	lsctx "github.com/hashicorp/terraform-ls/internal/context"
+	"github.com/hashicorp/terraform-ls/internal/features/search/ast"
+	"github.com/hashicorp/terraform-ls/internal/features/search/state"
+	"github.com/hashicorp/terraform-ls/internal/filesystem"
+	"github.com/hashicorp/terraform-ls/internal/job"
+	ilsp "github.com/hashicorp/terraform-ls/internal/lsp"
+	globalState "github.com/hashicorp/terraform-ls/internal/state"
+	globalAst "github.com/hashicorp/terraform-ls/internal/terraform/ast"
+	"github.com/hashicorp/terraform-ls/internal/uri"
+)
+
+func TestParseSearchConfiguration(t *testing.T) {
+	ctx := context.Background()
+	gs, err := globalState.NewStateStore()
+	if err != nil {
+		t.Fatal(err)
+	}
+	ss, err := state.NewSearchStore(gs.ChangeStore, gs.ProviderSchemas)
+	if err != nil {
+		t.Fatal(err)
+	}
+
+	testData, err := filepath.Abs("testdata")
+	if err != nil {
+		t.Fatal(err)
+	}
+	testFs := filesystem.NewFilesystem(gs.DocumentStore)
+
+	simpleSearchPath := filepath.Join(testData, "simple-search")
+
+	err = ss.Add(simpleSearchPath)
+	if err != nil {
+		t.Fatal(err)
+	}
+
+	ctx = lsctx.WithDocumentContext(ctx, lsctx.Document{})
+	err = ParseSearchConfiguration(ctx, testFs, ss, simpleSearchPath)
+	if err != nil {
+		t.Fatal(err)
+	}
+
+	before, err := ss.SearchRecordByPath(simpleSearchPath)
+	if err != nil {
+		t.Fatal(err)
+	}
+
+	// Verify that files were parsed
+	if len(before.ParsedFiles) == 0 {
+		t.Fatal("expected parsed files, got none")
+	}
+
+	// Check that both config.tfquery.hcl and variables.tfquery.hcl were parsed
+	configFile := ast.SearchFilename("config.tfquery.hcl")
+	variablesFile := ast.SearchFilename("variables.tfquery.hcl")
+
+	if _, exists := before.ParsedFiles[configFile]; !exists {
+		t.Fatal("expected config.tfquery.hcl to be parsed")
+	}
+
+	if _, exists := before.ParsedFiles[variablesFile]; !exists {
+		t.Fatal("expected variables.tfquery.hcl to be parsed")
+	}
+
+	// ignore job state for next test
+	ctx = job.WithIgnoreState(ctx, true)
+
+	// Test single file change parsing (simulating didChange request)
+	configURI, err := filepath.Abs(filepath.Join(simpleSearchPath, "config.tfquery.hcl"))
+	if err != nil {
+		t.Fatal(err)
+	}
+	changeCtx := lsctx.WithDocumentContext(ctx, lsctx.Document{
+		Method:     "textDocument/didChange",
+		LanguageID: ilsp.Search.String(),
+		URI:        uri.FromPath(configURI),
+	})
+
+	err = ParseSearchConfiguration(changeCtx, testFs, ss, simpleSearchPath)
+	if err != nil {
+		t.Fatal(err)
+	}
+
+	after, err := ss.SearchRecordByPath(simpleSearchPath)
+	if err != nil {
+		t.Fatal(err)
+	}
+
+	// Test that config.tfquery.hcl was re-parsed (pointer should be different)
+	if before.ParsedFiles[configFile] == after.ParsedFiles[configFile] {
+		t.Fatal("config.tfquery.hcl should have been re-parsed")
+	}
+
+	// Test that variables.tfquery.hcl was not re-parsed (pointer should be the same)
+	if before.ParsedFiles[variablesFile] != after.ParsedFiles[variablesFile] {
+		t.Fatal("variables.tfquery.hcl should not have been re-parsed")
+	}
+
+	// Verify diagnostics are tracked for the changed file. Note that
+	// hcl.Diagnostics is a slice, so two distinct local variables can
+	// never share an address and an address comparison would always be
+	// false; the re-parse itself is already verified via the file
+	// pointer comparisons above.
+	_, beforeOk := before.Diagnostics[globalAst.HCLParsingSource][configFile]
+	_, afterOk := after.Diagnostics[globalAst.HCLParsingSource][configFile]
+
+	if !beforeOk || !afterOk {
+		t.Fatal("expected diagnostics for config.tfquery.hcl")
+	}
+}
diff --git a/internal/features/search/jobs/references.go b/internal/features/search/jobs/references.go
new file mode 100644
index 00000000..a9af4730
--- /dev/null
+++ b/internal/features/search/jobs/references.go
@@ -0,0 +1,125 @@
+// Copyright (c) HashiCorp, Inc.
+// SPDX-License-Identifier: MPL-2.0
+
+package jobs
+
+import (
+	"context"
+
+	"github.com/hashicorp/hcl-lang/decoder"
+	"github.com/hashicorp/hcl-lang/lang"
+	"github.com/hashicorp/hcl-lang/reference"
+	idecoder "github.com/hashicorp/terraform-ls/internal/decoder"
+	"github.com/hashicorp/terraform-ls/internal/document"
+	sdecoder "github.com/hashicorp/terraform-ls/internal/features/search/decoder"
+	"github.com/hashicorp/terraform-ls/internal/features/search/state"
+	"github.com/hashicorp/terraform-ls/internal/job"
+	ilsp "github.com/hashicorp/terraform-ls/internal/lsp"
+	"github.com/hashicorp/terraform-ls/internal/terraform/module/operation"
+)
+
+// DecodeReferenceTargets collects reference targets,
+// using previously parsed AST (via [ParseSearchConfiguration]),
+// core schema of appropriate version (as obtained via [GetTerraformVersion])
+// and provider schemas ([PreloadEmbeddedSchema] or [ObtainSchema]).
+//
+// For example it tells us that a variable block between certain LOC
+// can be referred to as var.foobar. This is useful e.g. during completion,
+// go-to-definition or go-to-references.
+func DecodeReferenceTargets(ctx context.Context, searchStore *state.SearchStore, moduleReader sdecoder.ModuleReader, rootReader sdecoder.RootReader, searchPath string) error {
+	mod, err := searchStore.SearchRecordByPath(searchPath)
+	if err != nil {
+		return err
+	}
+
+	// TODO: Avoid collection if upstream jobs reported no changes
+
+	// Avoid collection if it is already in progress or already done
+	if mod.RefTargetsState != operation.OpStateUnknown && !job.IgnoreState(ctx) {
+		return job.StateNotChangedErr{Dir: document.DirHandleFromPath(searchPath)}
+	}
+
+	err = searchStore.SetReferenceTargetsState(searchPath, operation.OpStateLoading)
+	if err != nil {
+		return err
+	}
+
+	d := decoder.NewDecoder(&sdecoder.PathReader{
+		StateReader:  searchStore,
+		ModuleReader: moduleReader,
+		RootReader:   rootReader,
+	})
+	d.SetContext(idecoder.DecoderContext(ctx))
+
+	searchDecoder, err := d.Path(lang.Path{
+		Path:       searchPath,
+		LanguageID: ilsp.Search.String(),
+	})
+	if err != nil {
+		return err
+	}
+	searchTargets, rErr := searchDecoder.CollectReferenceTargets()
+
+	targets := make(reference.Targets, 0)
+	targets = append(targets, searchTargets...)
+
+	sErr := searchStore.UpdateReferenceTargets(searchPath, targets, rErr)
+	if sErr != nil {
+		return sErr
+	}
+
+	return rErr
+}
+
+// DecodeReferenceOrigins collects reference origins,
+// using previously parsed AST (via [ParseSearchConfiguration]),
+// core schema of appropriate version (as obtained via [GetTerraformVersion])
+// and provider schemas ([PreloadEmbeddedSchema] or [ObtainSchema]).
+//
+// For example it tells us that there is a reference address var.foobar
+// at a particular LOC. This can be later matched with targets
+// (as obtained via [DecodeReferenceTargets]) during hover or go-to-definition.
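+//
+// For example, with the simple-search testdata in this feature, the
+// expression `count = var.number_variable` in config.tfquery.hcl yields
+// an origin var.number_variable, which is later matched against the
+// target collected from the corresponding variable block in
+// variables.tfquery.hcl.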
+func DecodeReferenceOrigins(ctx context.Context, searchStore *state.SearchStore, moduleReader sdecoder.ModuleReader, rootReader sdecoder.RootReader, searchPath string) error { + mod, err := searchStore.SearchRecordByPath(searchPath) + if err != nil { + return err + } + + // TODO: Avoid collection if upstream jobs reported no changes + + // Avoid collection if it is already in progress or already done + if mod.RefOriginsState != operation.OpStateUnknown && !job.IgnoreState(ctx) { + return job.StateNotChangedErr{Dir: document.DirHandleFromPath(searchPath)} + } + + err = searchStore.SetReferenceOriginsState(searchPath, operation.OpStateLoading) + if err != nil { + return err + } + + d := decoder.NewDecoder(&sdecoder.PathReader{ + StateReader: searchStore, + ModuleReader: moduleReader, + RootReader: rootReader, + }) + d.SetContext(idecoder.DecoderContext(ctx)) + + searchDecoder, err := d.Path(lang.Path{ + Path: searchPath, + LanguageID: ilsp.Search.String(), + }) + if err != nil { + return err + } + searchOrigins, rErr := searchDecoder.CollectReferenceOrigins() + + origins := make(reference.Origins, 0) + origins = append(origins, searchOrigins...) + + sErr := searchStore.UpdateReferenceOrigins(searchPath, origins, rErr) + if sErr != nil { + return sErr + } + + return rErr +} diff --git a/internal/features/search/jobs/schema.go b/internal/features/search/jobs/schema.go new file mode 100644 index 00000000..aae677e5 --- /dev/null +++ b/internal/features/search/jobs/schema.go @@ -0,0 +1,60 @@ +// Copyright (c) HashiCorp, Inc. +// SPDX-License-Identifier: MPL-2.0 + +package jobs + +import ( + "context" + "io/fs" + "log" + + "github.com/hashicorp/terraform-ls/internal/document" + "github.com/hashicorp/terraform-ls/internal/features/search/state" + "github.com/hashicorp/terraform-ls/internal/job" + globalState "github.com/hashicorp/terraform-ls/internal/state" + "github.com/hashicorp/terraform-ls/internal/terraform/module/operation" +) + +func PreloadEmbeddedSchema(ctx context.Context, logger *log.Logger, fs fs.ReadDirFS, searchStore *state.SearchStore, schemaStore *globalState.ProviderSchemaStore, searchPath string) error { + record, err := searchStore.SearchRecordByPath(searchPath) + + if err != nil { + return err + } + + // Avoid preloading schema if it is already in progress or already known + if record.PreloadEmbeddedSchemaState != operation.OpStateUnknown && !job.IgnoreState(ctx) { + return job.StateNotChangedErr{Dir: document.DirHandleFromPath(searchPath)} + } + + err = searchStore.SetPreloadEmbeddedSchemaState(searchPath, operation.OpStateLoading) + if err != nil { + return err + } + defer searchStore.SetPreloadEmbeddedSchemaState(searchPath, operation.OpStateLoaded) + + pReqs, err := searchStore.ProviderRequirementsForModule(searchPath) + if err != nil { + return err + } + + missingReqs, err := schemaStore.MissingSchemas(pReqs) + if err != nil { + return err + } + + if len(missingReqs) == 0 { + // avoid preloading any schemas if we already have all + return nil + } + + for _, pAddr := range missingReqs { + err := globalState.PreloadSchemaForProviderAddr(ctx, pAddr, fs, schemaStore, logger) + if err != nil { + return err + } + } + + return nil + +} diff --git a/internal/features/search/jobs/schema_test.go b/internal/features/search/jobs/schema_test.go new file mode 100644 index 00000000..c08a8ff4 --- /dev/null +++ b/internal/features/search/jobs/schema_test.go @@ -0,0 +1,596 @@ +// Copyright (c) HashiCorp, Inc. 
+// SPDX-License-Identifier: MPL-2.0 + +package jobs + +import ( + "bytes" + "compress/gzip" + "context" + "errors" + "io/fs" + "log" + "sync" + "testing" + "testing/fstest" + + "github.com/hashicorp/go-version" + lsctx "github.com/hashicorp/terraform-ls/internal/context" + "github.com/hashicorp/terraform-ls/internal/document" + "github.com/hashicorp/terraform-ls/internal/features/search/state" + "github.com/hashicorp/terraform-ls/internal/filesystem" + "github.com/hashicorp/terraform-ls/internal/job" + globalState "github.com/hashicorp/terraform-ls/internal/state" + "github.com/hashicorp/terraform-ls/internal/terraform/module/operation" + tfaddr "github.com/hashicorp/terraform-registry-address" + tfmod "github.com/hashicorp/terraform-schema/module" + tfschema "github.com/hashicorp/terraform-schema/schema" +) + +// Mock implementation of ModuleReader for testing +type mockModuleReader struct{} + +func (m *mockModuleReader) LocalModuleMeta(path string) (*tfmod.Meta, error) { + // Return module metadata with provider requirements that match what the search config uses + randomAddr := tfaddr.MustParseProviderSource("hashicorp/random") + awsAddr := tfaddr.MustParseProviderSource("hashicorp/aws") + unknownAddr := tfaddr.MustParseProviderSource("hashicorp/unknown") + + return &tfmod.Meta{ + ProviderRequirements: tfmod.ProviderRequirements{ + randomAddr: version.MustConstraints(version.NewConstraint("1.0.0")), + awsAddr: version.MustConstraints(version.NewConstraint("3.0.0")), + unknownAddr: version.MustConstraints(version.NewConstraint("5.0.0")), + }, + ProviderReferences: map[tfmod.ProviderRef]tfaddr.Provider{ + {LocalName: "aws"}: awsAddr, + {LocalName: "random"}: randomAddr, + {LocalName: "unknown"}: unknownAddr, + }, + }, nil +} + +// Mock implementation for tests that need empty provider requirements +type emptyMockModuleReader struct{} + +func (m *emptyMockModuleReader) LocalModuleMeta(path string) (*tfmod.Meta, error) { + return &tfmod.Meta{ + ProviderRequirements: tfmod.ProviderRequirements{}, + ProviderReferences: map[tfmod.ProviderRef]tfaddr.Provider{}, + }, nil +} + +func TestPreloadEmbeddedSchema_basic(t *testing.T) { + ctx := context.Background() + dataDir := "data" + schemasFS := fstest.MapFS{ + dataDir: &fstest.MapFile{Mode: fs.ModeDir}, + dataDir + "/registry.terraform.io": &fstest.MapFile{Mode: fs.ModeDir}, + dataDir + "/registry.terraform.io/hashicorp": &fstest.MapFile{Mode: fs.ModeDir}, + dataDir + "/registry.terraform.io/hashicorp/random": &fstest.MapFile{Mode: fs.ModeDir}, + dataDir + "/registry.terraform.io/hashicorp/random/1.0.0": &fstest.MapFile{Mode: fs.ModeDir}, + dataDir + "/registry.terraform.io/hashicorp/random/1.0.0/schema.json.gz": &fstest.MapFile{ + Data: gzipCompressBytes(t, []byte(randomSchemaJSON)), + }, + } + + gs, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + ss, err := state.NewSearchStore(gs.ChangeStore, gs.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + searchPath := "testsearch" + + fs := filesystem.NewFilesystem(gs.DocumentStore) + + err = ss.Add(searchPath) + if err != nil { + t.Fatal(err) + } + ctx = lsctx.WithDocumentContext(ctx, lsctx.Document{}) + err = ParseSearchConfiguration(ctx, fs, ss, searchPath) + if err != nil { + t.Fatal(err) + } + + mockReader := &mockModuleReader{} + err = LoadSearchMetadata(ctx, ss, mockReader, log.Default(), searchPath) + if err != nil { + t.Fatal(err) + } + + err = PreloadEmbeddedSchema(ctx, log.Default(), schemasFS, ss, gs.ProviderSchemas, searchPath) + if err != nil { + 
t.Fatal(err) + } + + // verify schema was loaded + pAddr := tfaddr.MustParseProviderSource("hashicorp/random") + vc := version.MustConstraints(version.NewConstraint(">= 1.0.0")) + + // ask for schema for an unrelated path to avoid path-based matching + s, err := gs.ProviderSchemas.ProviderSchema("unknown-path", pAddr, vc) + if err != nil { + t.Fatal(err) + } + if s == nil { + t.Fatalf("expected non-nil schema for %s %s", pAddr, vc) + } + + _, ok := s.Provider.Attributes["test"] + if !ok { + t.Fatalf("expected test attribute in provider schema, not found") + } +} + +func TestPreloadEmbeddedSchema_unknownProviderOnly(t *testing.T) { + ctx := context.Background() + dataDir := "data" + schemasFS := fstest.MapFS{ + dataDir: &fstest.MapFile{Mode: fs.ModeDir}, + } + + gs, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + ss, err := state.NewSearchStore(gs.ChangeStore, gs.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + searchPath := "testsearch" + + fs := filesystem.NewFilesystem(gs.DocumentStore) + + err = ss.Add(searchPath) + if err != nil { + t.Fatal(err) + } + ctx = lsctx.WithDocumentContext(ctx, lsctx.Document{}) + err = ParseSearchConfiguration(ctx, fs, ss, searchPath) + if err != nil { + t.Fatal(err) + } + + mockReader := &mockModuleReader{} + err = LoadSearchMetadata(ctx, ss, mockReader, log.Default(), searchPath) + if err != nil { + t.Fatal(err) + } + + err = PreloadEmbeddedSchema(ctx, log.Default(), schemasFS, ss, gs.ProviderSchemas, searchPath) + if err != nil { + t.Fatal(err) + } +} + +func TestPreloadEmbeddedSchema_idempotency(t *testing.T) { + ctx := context.Background() + dataDir := "data" + schemasFS := fstest.MapFS{ + dataDir: &fstest.MapFile{Mode: fs.ModeDir}, + dataDir + "/registry.terraform.io": &fstest.MapFile{Mode: fs.ModeDir}, + dataDir + "/registry.terraform.io/hashicorp": &fstest.MapFile{Mode: fs.ModeDir}, + dataDir + "/registry.terraform.io/hashicorp/random": &fstest.MapFile{Mode: fs.ModeDir}, + dataDir + "/registry.terraform.io/hashicorp/random/1.0.0": &fstest.MapFile{Mode: fs.ModeDir}, + dataDir + "/registry.terraform.io/hashicorp/random/1.0.0/schema.json.gz": &fstest.MapFile{ + Data: gzipCompressBytes(t, []byte(randomSchemaJSON)), + }, + } + + gs, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + ss, err := state.NewSearchStore(gs.ChangeStore, gs.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + searchPath := "testsearch" + + fs := filesystem.NewFilesystem(gs.DocumentStore) + + err = ss.Add(searchPath) + if err != nil { + t.Fatal(err) + } + ctx = lsctx.WithDocumentContext(ctx, lsctx.Document{}) + err = ParseSearchConfiguration(ctx, fs, ss, searchPath) + if err != nil { + t.Fatal(err) + } + + mockReader := &mockModuleReader{} + err = LoadSearchMetadata(ctx, ss, mockReader, log.Default(), searchPath) + if err != nil { + t.Fatal(err) + } + + // first + err = PreloadEmbeddedSchema(ctx, log.Default(), schemasFS, ss, gs.ProviderSchemas, searchPath) + if err != nil { + t.Fatal(err) + } + + // second - testing search state + err = PreloadEmbeddedSchema(ctx, log.Default(), schemasFS, ss, gs.ProviderSchemas, searchPath) + if err != nil { + if !errors.Is(err, job.StateNotChangedErr{Dir: document.DirHandleFromPath(searchPath)}) { + t.Fatal(err) + } + } + + ctx = job.WithIgnoreState(ctx, true) + // third - testing requirement matching + err = PreloadEmbeddedSchema(ctx, log.Default(), schemasFS, ss, gs.ProviderSchemas, searchPath) + if err != nil { + t.Fatal(err) + } +} + +func TestPreloadEmbeddedSchema_raceCondition(t 
*testing.T) { + ctx := context.Background() + dataDir := "data" + schemasFS := fstest.MapFS{ + dataDir: &fstest.MapFile{Mode: fs.ModeDir}, + dataDir + "/registry.terraform.io": &fstest.MapFile{Mode: fs.ModeDir}, + dataDir + "/registry.terraform.io/hashicorp": &fstest.MapFile{Mode: fs.ModeDir}, + dataDir + "/registry.terraform.io/hashicorp/random": &fstest.MapFile{Mode: fs.ModeDir}, + dataDir + "/registry.terraform.io/hashicorp/random/1.0.0": &fstest.MapFile{Mode: fs.ModeDir}, + dataDir + "/registry.terraform.io/hashicorp/random/1.0.0/schema.json.gz": &fstest.MapFile{ + Data: gzipCompressBytes(t, []byte(randomSchemaJSON)), + }, + } + + gs, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + ss, err := state.NewSearchStore(gs.ChangeStore, gs.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + searchPath := "testsearch" + + fs := filesystem.NewFilesystem(gs.DocumentStore) + + err = ss.Add(searchPath) + if err != nil { + t.Fatal(err) + } + ctx = lsctx.WithDocumentContext(ctx, lsctx.Document{}) + err = ParseSearchConfiguration(ctx, fs, ss, searchPath) + if err != nil { + t.Fatal(err) + } + + mockReader := &mockModuleReader{} + err = LoadSearchMetadata(ctx, ss, mockReader, log.Default(), searchPath) + if err != nil { + t.Fatal(err) + } + + var wg sync.WaitGroup + wg.Add(2) + go func() { + defer wg.Done() + err := PreloadEmbeddedSchema(ctx, log.Default(), schemasFS, ss, gs.ProviderSchemas, searchPath) + if err != nil && !errors.Is(err, job.StateNotChangedErr{Dir: document.DirHandleFromPath(searchPath)}) { + t.Error(err) + } + }() + go func() { + defer wg.Done() + err := PreloadEmbeddedSchema(ctx, log.Default(), schemasFS, ss, gs.ProviderSchemas, searchPath) + if err != nil && !errors.Is(err, job.StateNotChangedErr{Dir: document.DirHandleFromPath(searchPath)}) { + t.Error(err) + } + }() + wg.Wait() +} + +func TestPreloadEmbeddedSchema_noProviderRequirements(t *testing.T) { + ctx := context.Background() + dataDir := "data" + schemasFS := fstest.MapFS{ + dataDir: &fstest.MapFile{Mode: fs.ModeDir}, + } + + gs, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + ss, err := state.NewSearchStore(gs.ChangeStore, gs.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + searchPath := "testsearch" + + fs := filesystem.NewFilesystem(gs.DocumentStore) + + err = ss.Add(searchPath) + if err != nil { + t.Fatal(err) + } + ctx = lsctx.WithDocumentContext(ctx, lsctx.Document{}) + err = ParseSearchConfiguration(ctx, fs, ss, searchPath) + if err != nil { + t.Fatal(err) + } + + emptyMockReader := &emptyMockModuleReader{} + err = LoadSearchMetadata(ctx, ss, emptyMockReader, log.Default(), searchPath) + if err != nil { + t.Fatal(err) + } + + err = PreloadEmbeddedSchema(ctx, log.Default(), schemasFS, ss, gs.ProviderSchemas, searchPath) + if err != nil { + t.Fatal(err) + } +} + +func TestPreloadEmbeddedSchema_invalidSearchPath(t *testing.T) { + ctx := context.Background() + dataDir := "data" + schemasFS := fstest.MapFS{ + dataDir: &fstest.MapFile{Mode: fs.ModeDir}, + } + + gs, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + ss, err := state.NewSearchStore(gs.ChangeStore, gs.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + + // Don't add the search path to the store, causing SearchRecordByPath to fail + searchPath := "nonexistent" + + err = PreloadEmbeddedSchema(ctx, log.Default(), schemasFS, ss, gs.ProviderSchemas, searchPath) + if err == nil { + t.Fatal("expected error for invalid search path") + } +} + +func 
TestPreloadEmbeddedSchema_alreadyHasSchemas(t *testing.T) { + ctx := context.Background() + dataDir := "data" + schemasFS := fstest.MapFS{ + dataDir: &fstest.MapFile{Mode: fs.ModeDir}, + } + + gs, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + ss, err := state.NewSearchStore(gs.ChangeStore, gs.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + searchPath := "testsearch" + + fs := filesystem.NewFilesystem(gs.DocumentStore) + + err = ss.Add(searchPath) + if err != nil { + t.Fatal(err) + } + ctx = lsctx.WithDocumentContext(ctx, lsctx.Document{}) + err = ParseSearchConfiguration(ctx, fs, ss, searchPath) + if err != nil { + t.Fatal(err) + } + + mockReader := &mockModuleReader{} + err = LoadSearchMetadata(ctx, ss, mockReader, log.Default(), searchPath) + if err != nil { + t.Fatal(err) + } + + // Pre-load the schema into the store + pAddr := tfaddr.MustParseProviderSource("hashicorp/aws") + err = gs.ProviderSchemas.AddPreloadedSchema(pAddr, version.Must(version.NewVersion("3.0.0")), &tfschema.ProviderSchema{}) + if err != nil { + t.Fatal(err) + } + + // This should return early since we already have the schema + err = PreloadEmbeddedSchema(ctx, log.Default(), schemasFS, ss, gs.ProviderSchemas, searchPath) + if err != nil { + t.Fatal(err) + } +} + +func TestPreloadEmbeddedSchema_stateLoading(t *testing.T) { + ctx := context.Background() + dataDir := "data" + schemasFS := fstest.MapFS{ + dataDir: &fstest.MapFile{Mode: fs.ModeDir}, + } + + gs, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + ss, err := state.NewSearchStore(gs.ChangeStore, gs.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + searchPath := "testsearch" + + fs := filesystem.NewFilesystem(gs.DocumentStore) + + err = ss.Add(searchPath) + if err != nil { + t.Fatal(err) + } + ctx = lsctx.WithDocumentContext(ctx, lsctx.Document{}) + err = ParseSearchConfiguration(ctx, fs, ss, searchPath) + if err != nil { + t.Fatal(err) + } + + mockReader := &mockModuleReader{} + err = LoadSearchMetadata(ctx, ss, mockReader, log.Default(), searchPath) + if err != nil { + t.Fatal(err) + } + + // Set the state to loading manually + err = ss.SetPreloadEmbeddedSchemaState(searchPath, operation.OpStateLoading) + if err != nil { + t.Fatal(err) + } + + // This should return early with StateNotChangedErr since state is already loading + err = PreloadEmbeddedSchema(ctx, log.Default(), schemasFS, ss, gs.ProviderSchemas, searchPath) + if err == nil { + t.Fatal("expected StateNotChangedErr when state is already loading") + } + + expectedErr := job.StateNotChangedErr{Dir: document.DirHandleFromPath(searchPath)} + if !errors.Is(err, expectedErr) { + t.Fatalf("expected StateNotChangedErr, got: %v", err) + } +} + +func TestPreloadEmbeddedSchema_multipleProviders(t *testing.T) { + ctx := context.Background() + dataDir := "data" + schemasFS := fstest.MapFS{ + dataDir: &fstest.MapFile{Mode: fs.ModeDir}, + dataDir + "/registry.terraform.io": &fstest.MapFile{Mode: fs.ModeDir}, + dataDir + "/registry.terraform.io/hashicorp": &fstest.MapFile{Mode: fs.ModeDir}, + dataDir + "/registry.terraform.io/hashicorp/random": &fstest.MapFile{Mode: fs.ModeDir}, + dataDir + "/registry.terraform.io/hashicorp/random/1.0.0": &fstest.MapFile{Mode: fs.ModeDir}, + dataDir + "/registry.terraform.io/hashicorp/random/1.0.0/schema.json.gz": &fstest.MapFile{ + Data: gzipCompressBytes(t, []byte(randomSchemaJSON)), + }, + dataDir + "/registry.terraform.io/hashicorp/aws": &fstest.MapFile{Mode: fs.ModeDir}, + dataDir + 
"/registry.terraform.io/hashicorp/aws/3.0.0": &fstest.MapFile{Mode: fs.ModeDir}, + dataDir + "/registry.terraform.io/hashicorp/aws/3.0.0/schema.json.gz": &fstest.MapFile{ + Data: gzipCompressBytes(t, []byte(awsSchemaJSON)), + }, + } + + gs, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + ss, err := state.NewSearchStore(gs.ChangeStore, gs.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + searchPath := "testsearch" + + fs := filesystem.NewFilesystem(gs.DocumentStore) + + err = ss.Add(searchPath) + if err != nil { + t.Fatal(err) + } + ctx = lsctx.WithDocumentContext(ctx, lsctx.Document{}) + err = ParseSearchConfiguration(ctx, fs, ss, searchPath) + if err != nil { + t.Fatal(err) + } + + mockReader := &mockModuleReader{} + err = LoadSearchMetadata(ctx, ss, mockReader, log.Default(), searchPath) + if err != nil { + t.Fatal(err) + } + + err = PreloadEmbeddedSchema(ctx, log.Default(), schemasFS, ss, gs.ProviderSchemas, searchPath) + if err != nil { + t.Fatal(err) + } + + // verify both schemas were loaded + randomAddr := tfaddr.MustParseProviderSource("hashicorp/random") + awsAddr := tfaddr.MustParseProviderSource("hashicorp/aws") + vc := version.MustConstraints(version.NewConstraint(">= 1.0.0")) + + // Check random provider schema + s, err := gs.ProviderSchemas.ProviderSchema("unknown-path", randomAddr, vc) + if err != nil { + t.Fatal(err) + } + if s == nil { + t.Fatalf("expected non-nil schema for %s %s", randomAddr, vc) + } + + // Check AWS provider schema + awsVC := version.MustConstraints(version.NewConstraint(">= 3.0.0")) + s, err = gs.ProviderSchemas.ProviderSchema("unknown-path", awsAddr, awsVC) + if err != nil { + t.Fatal(err) + } + if s == nil { + t.Fatalf("expected non-nil schema for %s %s", awsAddr, awsVC) + } +} + +func gzipCompressBytes(t *testing.T, b []byte) []byte { + var compressedBytes bytes.Buffer + gw := gzip.NewWriter(&compressedBytes) + _, err := gw.Write(b) + if err != nil { + t.Fatal(err) + } + err = gw.Close() + if err != nil { + t.Fatal(err) + } + return compressedBytes.Bytes() +} + +var randomSchemaJSON = `{ + "format_version": "1.0", + "provider_schemas": { + "registry.terraform.io/hashicorp/random": { + "provider": { + "version": 0, + "block": { + "attributes": { + "test": { + "type": "string", + "description": "Test description", + "description_kind": "markdown", + "optional": true + } + }, + "description_kind": "plain" + } + } + } + } +}` + +var awsSchemaJSON = `{ + "format_version": "1.0", + "provider_schemas": { + "registry.terraform.io/hashicorp/aws": { + "provider": { + "version": 0, + "block": { + "attributes": { + "region": { + "type": "string", + "description": "AWS region", + "description_kind": "markdown", + "optional": true + } + }, + "description_kind": "plain" + } + } + } + } +}` diff --git a/internal/features/search/jobs/testdata/invalid-search/config.tfquery.hcl b/internal/features/search/jobs/testdata/invalid-search/config.tfquery.hcl new file mode 100644 index 00000000..4284ff46 --- /dev/null +++ b/internal/features/search/jobs/testdata/invalid-search/config.tfquery.hcl @@ -0,0 +1,4 @@ + +list "concept_pet" "invalid_name" { + +} diff --git a/internal/features/search/jobs/testdata/invalid-search/variables.tfquery.hcl b/internal/features/search/jobs/testdata/invalid-search/variables.tfquery.hcl new file mode 100644 index 00000000..fd761bb4 --- /dev/null +++ b/internal/features/search/jobs/testdata/invalid-search/variables.tfquery.hcl @@ -0,0 +1,11 @@ +variable { + type = string +} + +locals { + test = 1 +} + +provider 
"aws" { + region = var.region +} diff --git a/internal/features/search/jobs/testdata/simple-search/config.tfquery.hcl b/internal/features/search/jobs/testdata/simple-search/config.tfquery.hcl new file mode 100644 index 00000000..3a89f7c0 --- /dev/null +++ b/internal/features/search/jobs/testdata/simple-search/config.tfquery.hcl @@ -0,0 +1,22 @@ + +locals { + number_local = 500 +} + +locals { + include_resource_variable_2 = false +} + +provider "aws" { + alias = "this" +} + +list "concept_pet" "name_1" { + provider = aws.this + limit = local.number_local + include_resource = var.include_resource_variable + count = var.number_variable + config { + + } +} diff --git a/internal/features/search/jobs/testdata/simple-search/variables.tfquery.hcl b/internal/features/search/jobs/testdata/simple-search/variables.tfquery.hcl new file mode 100644 index 00000000..d7c4512a --- /dev/null +++ b/internal/features/search/jobs/testdata/simple-search/variables.tfquery.hcl @@ -0,0 +1,9 @@ +variable "include_resource_variable" { + default = true + type = bool +} + +variable "number_variable" { + default = 0 + type = number +} \ No newline at end of file diff --git a/internal/features/search/jobs/types.go b/internal/features/search/jobs/types.go new file mode 100644 index 00000000..6052cff7 --- /dev/null +++ b/internal/features/search/jobs/types.go @@ -0,0 +1,13 @@ +// Copyright (c) HashiCorp, Inc. +// SPDX-License-Identifier: MPL-2.0 + +package jobs + +import "io/fs" + +type ReadOnlyFS interface { + fs.FS + ReadDir(name string) ([]fs.DirEntry, error) + ReadFile(name string) ([]byte, error) + Stat(name string) (fs.FileInfo, error) +} diff --git a/internal/features/search/jobs/validation.go b/internal/features/search/jobs/validation.go new file mode 100644 index 00000000..156315e0 --- /dev/null +++ b/internal/features/search/jobs/validation.go @@ -0,0 +1,106 @@ +// Copyright (c) HashiCorp, Inc. 
+// SPDX-License-Identifier: MPL-2.0 + +package jobs + +import ( + "context" + "path" + + "github.com/hashicorp/hcl-lang/decoder" + "github.com/hashicorp/hcl-lang/lang" + "github.com/hashicorp/hcl/v2" + lsctx "github.com/hashicorp/terraform-ls/internal/context" + idecoder "github.com/hashicorp/terraform-ls/internal/decoder" + "github.com/hashicorp/terraform-ls/internal/document" + "github.com/hashicorp/terraform-ls/internal/features/search/ast" + searchDecoder "github.com/hashicorp/terraform-ls/internal/features/search/decoder" + "github.com/hashicorp/terraform-ls/internal/features/search/state" + "github.com/hashicorp/terraform-ls/internal/job" + ilsp "github.com/hashicorp/terraform-ls/internal/lsp" + globalAst "github.com/hashicorp/terraform-ls/internal/terraform/ast" + "github.com/hashicorp/terraform-ls/internal/terraform/module/operation" +) + +func SchemaSearchValidation(ctx context.Context, searchStore *state.SearchStore, moduleFeature searchDecoder.ModuleReader, rootFeature searchDecoder.RootReader, searchPath string) error { + rpcContext := lsctx.DocumentContext(ctx) + + record, err := searchStore.SearchRecordByPath(searchPath) + if err != nil { + return err + } + + // Avoid validation if it is already in progress or already finished + if record.DiagnosticsState[globalAst.SchemaValidationSource] != operation.OpStateUnknown && !job.IgnoreState(ctx) { + return job.StateNotChangedErr{Dir: document.DirHandleFromPath(searchPath)} + } + + err = searchStore.SetDiagnosticsState(searchPath, globalAst.SchemaValidationSource, operation.OpStateLoading) + if err != nil { + return err + } + + d := decoder.NewDecoder(&searchDecoder.PathReader{ + StateReader: searchStore, + ModuleReader: moduleFeature, + RootReader: rootFeature, + }) + d.SetContext(idecoder.DecoderContext(ctx)) + + var rErr error + if rpcContext.Method == "textDocument/didChange" { + // We validate only the file that has changed + // This means only creating a decoder for the file type that has changed + decoder, err := d.Path(lang.Path{ + Path: searchPath, + LanguageID: rpcContext.LanguageID, + }) + if err != nil { + return err + } + + filename := path.Base(rpcContext.URI) + + var fileDiags hcl.Diagnostics + fileDiags, rErr = decoder.ValidateFile(ctx, filename) + + diags, ok := record.Diagnostics[globalAst.SchemaValidationSource] + if !ok { + diags = make(ast.Diagnostics) + } + diags[ast.FilenameFromName(filename)] = fileDiags + + sErr := searchStore.UpdateDiagnostics(searchPath, globalAst.SchemaValidationSource, diags) + if sErr != nil { + return sErr + } + } else { + // We validate the whole search, and so need to create decoders for + // all the file types in the search + searchDecoder, err := d.Path(lang.Path{ + Path: searchPath, + LanguageID: ilsp.Search.String(), + }) + if err != nil { + return err + } + + diags := make(lang.DiagnosticsMap) + + searchDiags, err := searchDecoder.Validate(ctx) + if err != nil { + // TODO: Should we really return here or continue with the other decoders? + // Is this really a complete fail case? Shouldn't a failure in one search file + // not prevent the remaining files from being validated?
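+ // For now we fail fast and return the first error to the caller.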
+ return err + } + diags = diags.Extend(searchDiags) + + sErr := searchStore.UpdateDiagnostics(searchPath, globalAst.SchemaValidationSource, ast.DiagnosticsFromMap(diags)) + if sErr != nil { + return sErr + } + } + + return rErr +} diff --git a/internal/features/search/jobs/validation_test.go b/internal/features/search/jobs/validation_test.go new file mode 100644 index 00000000..b4e76002 --- /dev/null +++ b/internal/features/search/jobs/validation_test.go @@ -0,0 +1,137 @@ +// Copyright (c) HashiCorp, Inc. +// SPDX-License-Identifier: MPL-2.0 + +package jobs + +import ( + "context" + "path/filepath" + "testing" + + "github.com/hashicorp/go-version" + lsctx "github.com/hashicorp/terraform-ls/internal/context" + "github.com/hashicorp/terraform-ls/internal/features/search/state" + "github.com/hashicorp/terraform-ls/internal/filesystem" + ilsp "github.com/hashicorp/terraform-ls/internal/lsp" + globalState "github.com/hashicorp/terraform-ls/internal/state" + "github.com/hashicorp/terraform-ls/internal/terraform/ast" + tfmod "github.com/hashicorp/terraform-schema/module" +) + +type ModuleReaderMock struct{} + +func (m ModuleReaderMock) LocalModuleMeta(modulePath string) (*tfmod.Meta, error) { + return nil, nil +} + +type RootReaderMock struct{} + +func (r RootReaderMock) InstalledModuleCalls(modPath string) (map[string]tfmod.InstalledModuleCall, error) { + return nil, nil +} + +func (r RootReaderMock) TerraformVersion(modPath string) *version.Version { + return nil +} + +func (r RootReaderMock) InstalledModulePath(rootPath string, normalizedSource string) (string, bool) { + return "", false +} + +func TestSchemaSearchValidation_FullSearch(t *testing.T) { + ctx := context.Background() + gs, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + ms, err := state.NewSearchStore(gs.ChangeStore, gs.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + + testData, err := filepath.Abs("testdata") + if err != nil { + t.Fatal(err) + } + searchPath := filepath.Join(testData, "invalid-search") + + err = ms.Add(searchPath) + if err != nil { + t.Fatal(err) + } + + fs := filesystem.NewFilesystem(gs.DocumentStore) + ctx = lsctx.WithDocumentContext(ctx, lsctx.Document{ + Method: "textDocument/didOpen", + LanguageID: ilsp.Search.String(), + URI: "file:///test/variables.tfquery.hcl", + }) + err = ParseSearchConfiguration(ctx, fs, ms, searchPath) + if err != nil { + t.Fatal(err) + } + err = SchemaSearchValidation(ctx, ms, ModuleReaderMock{}, RootReaderMock{}, searchPath) + if err != nil { + t.Fatal(err) + } + + record, err := ms.SearchRecordByPath(searchPath) + if err != nil { + t.Fatal(err) + } + + expectedCount := 3 + diagsCount := record.Diagnostics[ast.SchemaValidationSource].Count() + if diagsCount != expectedCount { + t.Fatalf("expected %d diagnostics, %d given", expectedCount, diagsCount) + } +} + +func TestSchemaSearchValidation_SingleFile(t *testing.T) { + ctx := context.Background() + gs, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + ss, err := state.NewSearchStore(gs.ChangeStore, gs.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + + testData, err := filepath.Abs("testdata") + if err != nil { + t.Fatal(err) + } + searchPath := filepath.Join(testData, "invalid-search") + + err = ss.Add(searchPath) + if err != nil { + t.Fatal(err) + } + + fs := filesystem.NewFilesystem(gs.DocumentStore) + ctx = lsctx.WithDocumentContext(ctx, lsctx.Document{ + Method: "textDocument/didChange", + LanguageID: ilsp.Search.String(), + URI: 
"file:///test/config.tfquery.hcl", + }) + err = ParseSearchConfiguration(ctx, fs, ss, searchPath) + if err != nil { + t.Fatal(err) + } + err = SchemaSearchValidation(ctx, ss, ModuleReaderMock{}, RootReaderMock{}, searchPath) + if err != nil { + t.Fatal(err) + } + + record, err := ss.SearchRecordByPath(searchPath) + if err != nil { + t.Fatal(err) + } + + expectedCount := 2 + diagsCount := record.Diagnostics[ast.SchemaValidationSource].Count() + if diagsCount != expectedCount { + t.Fatalf("expected %d diagnostics, %d given", expectedCount, diagsCount) + } +} diff --git a/internal/features/search/parser/search.go b/internal/features/search/parser/search.go new file mode 100644 index 00000000..64959070 --- /dev/null +++ b/internal/features/search/parser/search.go @@ -0,0 +1,67 @@ +// Copyright (c) HashiCorp, Inc. +// SPDX-License-Identifier: MPL-2.0 + +package parser + +import ( + "path/filepath" + + "github.com/hashicorp/hcl/v2" + "github.com/hashicorp/terraform-ls/internal/features/search/ast" + "github.com/hashicorp/terraform-ls/internal/terraform/parser" +) + +func ParseFiles(fs parser.FS, searchPath string) (ast.Files, ast.Diagnostics, error) { + files := make(ast.Files, 0) + diags := make(ast.Diagnostics, 0) + + infos, err := fs.ReadDir(searchPath) + if err != nil { + return nil, nil, err + } + + for _, info := range infos { + if info.IsDir() { + // We only care about files + continue + } + + name := info.Name() + if !ast.IsSearchFilename(name) { + continue + } + + fullPath := filepath.Join(searchPath, name) + + src, err := fs.ReadFile(fullPath) + if err != nil { + // If a file isn't accessible, continue with reading the + // remaining module files + continue + } + + filename := ast.FilenameFromName(name) + f, pDiags := parser.ParseFile(src, filename) + + diags[filename] = pDiags + if f != nil { + files[filename] = f + } + } + + return files, diags, nil +} + +func ParseFile(fs parser.FS, filePath string) (*hcl.File, hcl.Diagnostics, error) { + src, err := fs.ReadFile(filePath) + if err != nil { + // If a file isn't accessible, return + return nil, nil, err + } + + name := filepath.Base(filePath) + filename := ast.FilenameFromName(name) + f, pDiags := parser.ParseFile(src, filename) + + return f, pDiags, nil +} diff --git a/internal/features/search/search_feature.go b/internal/features/search/search_feature.go new file mode 100644 index 00000000..0ff482d7 --- /dev/null +++ b/internal/features/search/search_feature.go @@ -0,0 +1,145 @@ +// Copyright (c) HashiCorp, Inc. 
+// SPDX-License-Identifier: MPL-2.0 + +package search + +import ( + "context" + "io" + "log" + + "github.com/hashicorp/hcl-lang/decoder" + "github.com/hashicorp/hcl-lang/lang" + "github.com/hashicorp/terraform-ls/internal/eventbus" + "github.com/hashicorp/terraform-ls/internal/features/modules/jobs" + searchDecoder "github.com/hashicorp/terraform-ls/internal/features/search/decoder" + "github.com/hashicorp/terraform-ls/internal/features/search/state" + "github.com/hashicorp/terraform-ls/internal/job" + "github.com/hashicorp/terraform-ls/internal/langserver/diagnostics" + globalState "github.com/hashicorp/terraform-ls/internal/state" +) + +type SearchFeature struct { + store *state.SearchStore + stateStore *globalState.StateStore + bus *eventbus.EventBus + fs jobs.ReadOnlyFS + logger *log.Logger + stopFunc context.CancelFunc + + moduleFeature searchDecoder.ModuleReader + rootFeature searchDecoder.RootReader +} + +func NewSearchFeature(bus *eventbus.EventBus, stateStore *globalState.StateStore, fs jobs.ReadOnlyFS, moduleFeature searchDecoder.ModuleReader, rootFeature searchDecoder.RootReader) (*SearchFeature, error) { + store, err := state.NewSearchStore(stateStore.ChangeStore, stateStore.ProviderSchemas) + if err != nil { + return nil, err + } + discardLogger := log.New(io.Discard, "", 0) + + return &SearchFeature{ + store: store, + bus: bus, + fs: fs, + stateStore: stateStore, + logger: discardLogger, + stopFunc: func() {}, + moduleFeature: moduleFeature, + rootFeature: rootFeature, + }, nil +} + +func (f *SearchFeature) SetLogger(logger *log.Logger) { + f.logger = logger + f.store.SetLogger(logger) +} + +// Start starts the feature's separate goroutine. +// It listens to various events from the EventBus and performs corresponding actions. +func (f *SearchFeature) Start(ctx context.Context) { + ctx, cancelFunc := context.WithCancel(ctx) + f.stopFunc = cancelFunc + + topic := "feature.search" + + didOpenDone := make(chan job.IDs, 10) + didChangeDone := make(chan job.IDs, 10) + didChangeWatchedDone := make(chan job.IDs, 10) + + discover := f.bus.OnDiscover(topic, nil) + didOpen := f.bus.OnDidOpen(topic, didOpenDone) + didChange := f.bus.OnDidChange(topic, didChangeDone) + didChangeWatched := f.bus.OnDidChangeWatched(topic, didChangeWatchedDone) + + go func() { + for { + select { + case discover := <-discover: + // TODO? collect errors + f.discover(discover.Path, discover.Files) + case didOpen := <-didOpen: + // TODO? collect errors + spawnedIds, _ := f.didOpen(didOpen.Context, didOpen.Dir, didOpen.LanguageID) + didOpenDone <- spawnedIds + case didChange := <-didChange: + // TODO? collect errors + spawnedIds, _ := f.didChange(didChange.Context, didChange.Dir) + didChangeDone <- spawnedIds + case didChangeWatched := <-didChangeWatched: + // TODO? 
collect errors + spawnedIds, _ := f.didChangeWatched(didChangeWatched.Context, didChangeWatched.RawPath, didChangeWatched.ChangeType, didChangeWatched.IsDir) + didChangeWatchedDone <- spawnedIds + + case <-ctx.Done(): + return + } + } + }() +} + +func (f *SearchFeature) Stop() { + f.stopFunc() + f.logger.Print("stopped search feature") +} + +func (f *SearchFeature) PathContext(path lang.Path) (*decoder.PathContext, error) { + pathReader := &searchDecoder.PathReader{ + StateReader: f.store, + ModuleReader: f.moduleFeature, + RootReader: f.rootFeature, + } + + return pathReader.PathContext(path) +} + +func (f *SearchFeature) Paths(ctx context.Context) []lang.Path { + pathReader := &searchDecoder.PathReader{ + StateReader: f.store, + ModuleReader: f.moduleFeature, + RootReader: f.rootFeature, + } + + return pathReader.Paths(ctx) +} + +func (f *SearchFeature) Diagnostics(path string) diagnostics.Diagnostics { + diags := diagnostics.NewDiagnostics() + + mod, err := f.store.SearchRecordByPath(path) + if err != nil { + return diags + } + + for source, dm := range mod.Diagnostics { + diags.Append(source, dm.AutoloadedOnly().AsMap()) + } + + return diags +} + +func (f *SearchFeature) Telemetry(path string) map[string]interface{} { + properties := make(map[string]interface{}) + properties["search"] = true + return properties +} diff --git a/internal/features/search/state/schema.go b/internal/features/search/state/schema.go new file mode 100644 index 00000000..d1137699 --- /dev/null +++ b/internal/features/search/state/schema.go @@ -0,0 +1,48 @@ +// Copyright (c) HashiCorp, Inc. +// SPDX-License-Identifier: MPL-2.0 + +package state + +import ( + "io" + "log" + + "github.com/hashicorp/go-memdb" + globalState "github.com/hashicorp/terraform-ls/internal/state" +) + +const ( + searchTableName = "search" +) + +var dbSchema = &memdb.DBSchema{ + Tables: map[string]*memdb.TableSchema{ + searchTableName: { + Name: searchTableName, + Indexes: map[string]*memdb.IndexSchema{ + "id": { + Name: "id", + Unique: true, + Indexer: &memdb.StringFieldIndex{Field: "path"}, + }, + }, + }, + }, +} + +func NewSearchStore(changeStore *globalState.ChangeStore, providerSchemasStore *globalState.ProviderSchemaStore) (*SearchStore, error) { + db, err := memdb.NewMemDB(dbSchema) + if err != nil { + return nil, err + } + + discardLogger := log.New(io.Discard, "", 0) + + return &SearchStore{ + db: db, + tableName: searchTableName, + logger: discardLogger, + changeStore: changeStore, + providerSchemasStore: providerSchemasStore, + }, nil +} diff --git a/internal/features/search/state/search_meta.go b/internal/features/search/state/search_meta.go new file mode 100644 index 00000000..ce5abf05 --- /dev/null +++ b/internal/features/search/state/search_meta.go @@ -0,0 +1,61 @@ +// Copyright (c) HashiCorp, Inc. 
+// SPDX-License-Identifier: MPL-2.0 + +package state + +import ( + "github.com/hashicorp/go-version" + tfaddr "github.com/hashicorp/terraform-registry-address" + tfsearch "github.com/hashicorp/terraform-schema/search" +) + +// SearchMetadata contains the result of the early decoding of a search; +// it is used to obtain the correct provider and related module schemas +type SearchMetadata struct { + CoreRequirements version.Constraints + Filenames []string + + Lists map[string]tfsearch.List + Variables map[string]tfsearch.Variable + + ProviderReferences map[tfsearch.ProviderRef]tfaddr.Provider + ProviderRequirements tfsearch.ProviderRequirements +} + +func (sm SearchMetadata) Copy() SearchMetadata { + newSm := SearchMetadata{ + CoreRequirements: sm.CoreRequirements, + Filenames: sm.Filenames, + } + + if sm.Lists != nil { + newSm.Lists = make(map[string]tfsearch.List, len(sm.Lists)) + for k, v := range sm.Lists { + newSm.Lists[k] = v + } + } + + if sm.Variables != nil { + newSm.Variables = make(map[string]tfsearch.Variable, len(sm.Variables)) + for k, v := range sm.Variables { + newSm.Variables[k] = v + } + } + + if sm.ProviderReferences != nil { + newSm.ProviderReferences = make(map[tfsearch.ProviderRef]tfaddr.Provider, len(sm.ProviderReferences)) + for ref, provider := range sm.ProviderReferences { + newSm.ProviderReferences[ref] = provider + } + } + + if sm.ProviderRequirements != nil { + newSm.ProviderRequirements = make(tfsearch.ProviderRequirements, len(sm.ProviderRequirements)) + for provider, vc := range sm.ProviderRequirements { + // version.Constraints is never mutated in this context + newSm.ProviderRequirements[provider] = vc + } + } + + return newSm +} diff --git a/internal/features/search/state/search_record.go b/internal/features/search/state/search_record.go new file mode 100644 index 00000000..6144f958 --- /dev/null +++ b/internal/features/search/state/search_record.go @@ -0,0 +1,110 @@ +// Copyright (c) HashiCorp, Inc. +// SPDX-License-Identifier: MPL-2.0 + +package state + +import ( + "github.com/hashicorp/hcl-lang/reference" + "github.com/hashicorp/hcl/v2" + "github.com/hashicorp/terraform-ls/internal/features/search/ast" + globalAst "github.com/hashicorp/terraform-ls/internal/terraform/ast" + "github.com/hashicorp/terraform-ls/internal/terraform/module/operation" +) + +// SearchRecord represents a single search in the state +// /some/path/lambda-multi-account-search +type SearchRecord struct { + path string + + // PreloadEmbeddedSchemaState tracks whether we tried loading all provider + // schemas from our embedded schema data + PreloadEmbeddedSchemaState operation.OpState + + Meta SearchMetadata + MetaErr error + MetaState operation.OpState + + // ParsedFiles is a map of all the parsed files for the search, + // i.e. all of its *.tfquery.hcl files.
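+ // Keys are filenames as returned by ast.FilenameFromName (e.g. ast.SearchFilename).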
+ ParsedFiles ast.Files + ParsingErr error + Diagnostics ast.SourceDiagnostics + DiagnosticsState globalAst.DiagnosticSourceState + + RefTargets reference.Targets + RefTargetsErr error + RefTargetsState operation.OpState + + RefOrigins reference.Origins + RefOriginsErr error + RefOriginsState operation.OpState +} + +func (m *SearchRecord) Path() string { + return m.path +} + +func (m *SearchRecord) Copy() *SearchRecord { + if m == nil { + return nil + } + + newRecord := &SearchRecord{ + path: m.path, + + PreloadEmbeddedSchemaState: m.PreloadEmbeddedSchemaState, + + Meta: m.Meta.Copy(), + MetaErr: m.MetaErr, + MetaState: m.MetaState, + ParsingErr: m.ParsingErr, + DiagnosticsState: m.DiagnosticsState.Copy(), + + RefTargets: m.RefTargets.Copy(), + RefTargetsErr: m.RefTargetsErr, + RefTargetsState: m.RefTargetsState, + + RefOrigins: m.RefOrigins.Copy(), + RefOriginsErr: m.RefOriginsErr, + RefOriginsState: m.RefOriginsState, + } + + if m.ParsedFiles != nil { + newRecord.ParsedFiles = make(ast.Files, len(m.ParsedFiles)) + for name, f := range m.ParsedFiles { + // hcl.File is practically immutable once it comes out of parser + newRecord.ParsedFiles[name] = f + } + } + + if m.Diagnostics != nil { + newRecord.Diagnostics = make(ast.SourceDiagnostics, len(m.Diagnostics)) + + for source, searchDiags := range m.Diagnostics { + newRecord.Diagnostics[source] = make(ast.Diagnostics, len(searchDiags)) + + for name, diags := range searchDiags { + newRecord.Diagnostics[source][name] = make(hcl.Diagnostics, len(diags)) + copy(newRecord.Diagnostics[source][name], diags) + } + } + } + + return newRecord +} + +func newSearch(searchPath string) *SearchRecord { + return &SearchRecord{ + path: searchPath, + PreloadEmbeddedSchemaState: operation.OpStateUnknown, + RefOriginsState: operation.OpStateUnknown, + RefTargetsState: operation.OpStateUnknown, + MetaState: operation.OpStateUnknown, + DiagnosticsState: globalAst.DiagnosticSourceState{ + globalAst.HCLParsingSource: operation.OpStateUnknown, + globalAst.SchemaValidationSource: operation.OpStateUnknown, + globalAst.ReferenceValidationSource: operation.OpStateUnknown, + globalAst.TerraformValidateSource: operation.OpStateUnknown, + }, + } +} diff --git a/internal/features/search/state/search_store.go b/internal/features/search/state/search_store.go new file mode 100644 index 00000000..218a2bf6 --- /dev/null +++ b/internal/features/search/state/search_store.go @@ -0,0 +1,471 @@ +// Copyright (c) HashiCorp, Inc. 
+// SPDX-License-Identifier: MPL-2.0 + +package state + +import ( + "log" + + "github.com/hashicorp/go-memdb" + "github.com/hashicorp/go-version" + "github.com/hashicorp/hcl-lang/reference" + "github.com/hashicorp/terraform-ls/internal/document" + "github.com/hashicorp/terraform-ls/internal/features/search/ast" + globalState "github.com/hashicorp/terraform-ls/internal/state" + globalAst "github.com/hashicorp/terraform-ls/internal/terraform/ast" + "github.com/hashicorp/terraform-ls/internal/terraform/module/operation" + tfaddr "github.com/hashicorp/terraform-registry-address" + tfschema "github.com/hashicorp/terraform-schema/schema" + tfsearch "github.com/hashicorp/terraform-schema/search" +) + +type SearchStore struct { + db *memdb.MemDB + tableName string + logger *log.Logger + + changeStore *globalState.ChangeStore + providerSchemasStore *globalState.ProviderSchemaStore +} + +func (s *SearchStore) SetLogger(logger *log.Logger) { + s.logger = logger +} + +func (s *SearchStore) Add(searchPath string) error { + txn := s.db.Txn(true) + defer txn.Abort() + + err := s.add(txn, searchPath) + if err != nil { + return err + } + txn.Commit() + + return nil +} + +func (s *SearchStore) Remove(searchPath string) error { + txn := s.db.Txn(true) + defer txn.Abort() + + oldObj, err := txn.First(s.tableName, "id", searchPath) + if err != nil { + return err + } + + if oldObj == nil { + // already removed + return nil + } + + oldRecord := oldObj.(*SearchRecord) + err = s.queueRecordChange(oldRecord, nil) + if err != nil { + return err + } + + _, err = txn.DeleteAll(s.tableName, "id", searchPath) + if err != nil { + return err + } + + txn.Commit() + return nil +} + +func (s *SearchStore) List() ([]*SearchRecord, error) { + txn := s.db.Txn(false) + + it, err := txn.Get(s.tableName, "id") + if err != nil { + return nil, err + } + + searchRecords := make([]*SearchRecord, 0) + for item := it.Next(); item != nil; item = it.Next() { + search := item.(*SearchRecord) + searchRecords = append(searchRecords, search) + } + + return searchRecords, nil +} + +func (s *SearchStore) SearchRecordByPath(path string) (*SearchRecord, error) { + txn := s.db.Txn(false) + + mod, err := searchByPath(txn, path) + if err != nil { + return nil, err + } + + return mod, nil +} + +func (s *SearchStore) Exists(path string) bool { + txn := s.db.Txn(false) + + obj, err := txn.First(s.tableName, "id", path) + if err != nil { + return false + } + + return obj != nil +} + +func (s *SearchStore) AddIfNotExists(path string) error { + txn := s.db.Txn(true) + defer txn.Abort() + + _, err := searchByPath(txn, path) + if err == nil { + return nil + } + + if globalState.IsRecordNotFound(err) { + err := s.add(txn, path) + if err != nil { + return err + } + + txn.Commit() + return nil + } + + return err +} + +func (s *SearchStore) SetDiagnosticsState(path string, source globalAst.DiagnosticSource, state operation.OpState) error { + txn := s.db.Txn(true) + defer txn.Abort() + + record, err := searchCopyByPath(txn, path) + if err != nil { + return err + } + record.DiagnosticsState[source] = state + + err = txn.Insert(s.tableName, record) + if err != nil { + return err + } + + txn.Commit() + return nil +} + +func (s *SearchStore) UpdateParsedFiles(path string, pFiles ast.Files, pErr error) error { + txn := s.db.Txn(true) + defer txn.Abort() + + mod, err := searchCopyByPath(txn, path) + if err != nil { + return err + } + + mod.ParsedFiles = pFiles + + mod.ParsingErr = pErr + + err = txn.Insert(s.tableName, mod) + if err != nil { + return err + } + + 
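+ // Commit the insert; the deferred Abort above is a no-op once committed.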
txn.Commit() + return nil +} + +func (s *SearchStore) UpdateDiagnostics(path string, source globalAst.DiagnosticSource, diags ast.Diagnostics) error { + txn := s.db.Txn(true) + txn.Defer(func() { + s.SetDiagnosticsState(path, source, operation.OpStateLoaded) + }) + defer txn.Abort() + + oldMod, err := searchByPath(txn, path) + if err != nil { + return err + } + + mod := oldMod.Copy() + if mod.Diagnostics == nil { + mod.Diagnostics = make(ast.SourceDiagnostics) + } + mod.Diagnostics[source] = diags + + err = txn.Insert(s.tableName, mod) + if err != nil { + return err + } + + err = s.queueRecordChange(oldMod, mod) + if err != nil { + return err + } + + txn.Commit() + return nil +} + +func (s *SearchStore) SetMetaState(path string, state operation.OpState) error { + txn := s.db.Txn(true) + defer txn.Abort() + + search, err := searchCopyByPath(txn, path) + if err != nil { + return err + } + + search.MetaState = state + err = txn.Insert(s.tableName, search) + if err != nil { + return err + } + + txn.Commit() + return nil +} + +func (s *SearchStore) UpdateMetadata(path string, meta *tfsearch.Meta, mErr error) error { + txn := s.db.Txn(true) + txn.Defer(func() { + s.SetMetaState(path, operation.OpStateLoaded) + }) + defer txn.Abort() + + oldRecord, err := searchByPath(txn, path) + if err != nil { + return err + } + + record := oldRecord.Copy() + record.Meta = SearchMetadata{ + Lists: meta.Lists, + Variables: meta.Variables, + Filenames: meta.Filenames, + ProviderReferences: meta.ProviderReferences, + ProviderRequirements: meta.ProviderRequirements, + CoreRequirements: meta.CoreRequirements, + } + record.MetaErr = mErr + + err = txn.Insert(s.tableName, record) + if err != nil { + return err + } + + err = s.queueRecordChange(oldRecord, record) + if err != nil { + return err + } + + txn.Commit() + return nil +} + +func (s *SearchStore) SetPreloadEmbeddedSchemaState(path string, state operation.OpState) error { + txn := s.db.Txn(true) + defer txn.Abort() + + record, err := searchCopyByPath(txn, path) + if err != nil { + return err + } + + record.PreloadEmbeddedSchemaState = state + err = txn.Insert(s.tableName, record) + if err != nil { + return err + } + + txn.Commit() + return nil +} + +func (s *SearchStore) SetReferenceTargetsState(path string, state operation.OpState) error { + txn := s.db.Txn(true) + defer txn.Abort() + + record, err := searchCopyByPath(txn, path) + if err != nil { + return err + } + + record.RefTargetsState = state + err = txn.Insert(s.tableName, record) + if err != nil { + return err + } + + txn.Commit() + return nil +} + +func (s *SearchStore) UpdateReferenceTargets(path string, refs reference.Targets, rErr error) error { + txn := s.db.Txn(true) + txn.Defer(func() { + s.SetReferenceTargetsState(path, operation.OpStateLoaded) + }) + defer txn.Abort() + + record, err := searchCopyByPath(txn, path) + if err != nil { + return err + } + + record.RefTargets = refs + record.RefTargetsErr = rErr + + err = txn.Insert(s.tableName, record) + if err != nil { + return err + } + + txn.Commit() + return nil +} + +func (s *SearchStore) SetReferenceOriginsState(path string, state operation.OpState) error { + txn := s.db.Txn(true) + defer txn.Abort() + + search, err := searchCopyByPath(txn, path) + if err != nil { + return err + } + + search.RefOriginsState = state + err = txn.Insert(s.tableName, search) + if err != nil { + return err + } + + txn.Commit() + return nil +} + +func (s *SearchStore) UpdateReferenceOrigins(path string, origins reference.Origins, roErr error) error { + txn := 
s.db.Txn(true) + txn.Defer(func() { + s.SetReferenceOriginsState(path, operation.OpStateLoaded) + }) + defer txn.Abort() + + search, err := searchCopyByPath(txn, path) + if err != nil { + return err + } + + search.RefOrigins = origins + search.RefOriginsErr = roErr + + err = txn.Insert(s.tableName, search) + if err != nil { + return err + } + + txn.Commit() + return nil +} + +func (s *SearchStore) add(txn *memdb.Txn, searchPath string) error { + // TODO: Introduce Exists method to Txn? + obj, err := txn.First(s.tableName, "id", searchPath) + if err != nil { + return err + } + if obj != nil { + return &globalState.AlreadyExistsError{ + Idx: searchPath, + } + } + + record := newSearch(searchPath) + err = txn.Insert(s.tableName, record) + if err != nil { + return err + } + + err = s.queueRecordChange(nil, record) + if err != nil { + return err + } + + return nil +} + +func searchByPath(txn *memdb.Txn, path string) (*SearchRecord, error) { + obj, err := txn.First(searchTableName, "id", path) + if err != nil { + return nil, err + } + if obj == nil { + return nil, &globalState.RecordNotFoundError{ + Source: path, + } + } + return obj.(*SearchRecord), nil +} + +func searchCopyByPath(txn *memdb.Txn, path string) (*SearchRecord, error) { + record, err := searchByPath(txn, path) + if err != nil { + return nil, err + } + + return record.Copy(), nil +} + +func (s *SearchStore) queueRecordChange(oldRecord, newRecord *SearchRecord) error { + changes := globalState.Changes{} + + oldDiags, newDiags := 0, 0 + if oldRecord != nil { + oldDiags = oldRecord.Diagnostics.Count() + } + if newRecord != nil { + newDiags = newRecord.Diagnostics.Count() + } + // Comparing diagnostics accurately could be expensive + // so we just treat any non-empty diags as a change + if oldDiags > 0 || newDiags > 0 { + changes.Diagnostics = true + } + + var dir document.DirHandle + if oldRecord != nil { + dir = document.DirHandleFromPath(oldRecord.Path()) + } else { + dir = document.DirHandleFromPath(newRecord.Path()) + } + + return s.changeStore.QueueChange(dir, changes) +} + +func (s *SearchStore) ProviderSchema(modPath string, addr tfaddr.Provider, vc version.Constraints) (*tfschema.ProviderSchema, error) { + return s.providerSchemasStore.ProviderSchema(modPath, addr, vc) +} + +func (s *SearchStore) ProviderRequirementsForModule(modPath string) (tfsearch.ProviderRequirements, error) { + return s.providerRequirementsForModule(modPath, 0) +} + +func (s *SearchStore) providerRequirementsForModule(searchPath string, level int) (tfsearch.ProviderRequirements, error) { + mod, err := s.SearchRecordByPath(searchPath) + if err != nil { + // It's possible that a record for this search path doesn't exist yet, + // so we just ignore it if it can't be found. This allows us to still + // return provider requirements for the remaining search paths + return tfsearch.ProviderRequirements{}, nil + } + + level++ + + requirements := make(tfsearch.ProviderRequirements, 0) + for k, v := range mod.Meta.ProviderRequirements { + requirements[k] = v + } + + return requirements, nil +} diff --git a/internal/features/search/state/search_store_test.go b/internal/features/search/state/search_store_test.go new file mode 100644 index 00000000..e7b117b3 --- /dev/null +++ b/internal/features/search/state/search_store_test.go @@ -0,0 +1,776 @@ +// Copyright (c) HashiCorp, Inc. 
+// SPDX-License-Identifier: MPL-2.0 + +package state + +import ( + "errors" + "path/filepath" + "testing" + + "github.com/google/go-cmp/cmp" + "github.com/hashicorp/go-version" + "github.com/hashicorp/hcl-lang/reference" + "github.com/hashicorp/hcl/v2" + "github.com/hashicorp/hcl/v2/hclparse" + "github.com/hashicorp/hcl/v2/hclsyntax" + "github.com/hashicorp/terraform-ls/internal/features/search/ast" + globalState "github.com/hashicorp/terraform-ls/internal/state" + globalAst "github.com/hashicorp/terraform-ls/internal/terraform/ast" + "github.com/hashicorp/terraform-ls/internal/terraform/module/operation" + tfaddr "github.com/hashicorp/terraform-registry-address" + tfsearch "github.com/hashicorp/terraform-schema/search" + "github.com/zclconf/go-cty-debug/ctydebug" +) + +var cmpOpts = cmp.Options{ + cmp.AllowUnexported(SearchRecord{}), + cmp.AllowUnexported(hclsyntax.Body{}), + cmp.Comparer(func(x, y version.Constraint) bool { + return x.String() == y.String() + }), + cmp.Comparer(func(x, y hcl.File) bool { + return (x.Body == y.Body && + cmp.Equal(x.Bytes, y.Bytes)) + }), + ctydebug.CmpOptions, +} + +func TestSearchStore_Add_duplicate(t *testing.T) { + globalStore, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + s, err := NewSearchStore(globalStore.ChangeStore, globalStore.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + + searchPath := t.TempDir() + + err = s.Add(searchPath) + if err != nil { + t.Fatal(err) + } + + err = s.Add(searchPath) + if err == nil { + t.Fatal("expected error for duplicate entry") + } + existsError := &globalState.AlreadyExistsError{} + if !errors.As(err, &existsError) { + t.Fatalf("unexpected error: %s", err) + } +} + +func TestSearchStore_SearchRecordByPath(t *testing.T) { + globalStore, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + s, err := NewSearchStore(globalStore.ChangeStore, globalStore.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + + searchPath := t.TempDir() + + err = s.Add(searchPath) + if err != nil { + t.Fatal(err) + } + + record, err := s.SearchRecordByPath(searchPath) + if err != nil { + t.Fatal(err) + } + + expectedRecord := &SearchRecord{ + path: searchPath, + PreloadEmbeddedSchemaState: operation.OpStateUnknown, + RefOriginsState: operation.OpStateUnknown, + RefTargetsState: operation.OpStateUnknown, + MetaState: operation.OpStateUnknown, + DiagnosticsState: globalAst.DiagnosticSourceState{ + globalAst.HCLParsingSource: operation.OpStateUnknown, + globalAst.SchemaValidationSource: operation.OpStateUnknown, + globalAst.ReferenceValidationSource: operation.OpStateUnknown, + globalAst.TerraformValidateSource: operation.OpStateUnknown, + }, + } + if diff := cmp.Diff(expectedRecord, record, cmpOpts); diff != "" { + t.Fatalf("unexpected record: %s", diff) + } +} + +func TestSearchStore_List(t *testing.T) { + globalStore, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + s, err := NewSearchStore(globalStore.ChangeStore, globalStore.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + + tmpDir := t.TempDir() + + searchPaths := []string{ + filepath.Join(tmpDir, "alpha"), + filepath.Join(tmpDir, "beta"), + filepath.Join(tmpDir, "gamma"), + } + for _, searchPath := range searchPaths { + err := s.Add(searchPath) + if err != nil { + t.Fatal(err) + } + } + + searches, err := s.List() + if err != nil { + t.Fatal(err) + } + + expectedRecords := []*SearchRecord{ + { + path: filepath.Join(tmpDir, "alpha"), + PreloadEmbeddedSchemaState: operation.OpStateUnknown, + 
RefOriginsState: operation.OpStateUnknown, + RefTargetsState: operation.OpStateUnknown, + MetaState: operation.OpStateUnknown, + DiagnosticsState: globalAst.DiagnosticSourceState{ + globalAst.HCLParsingSource: operation.OpStateUnknown, + globalAst.SchemaValidationSource: operation.OpStateUnknown, + globalAst.ReferenceValidationSource: operation.OpStateUnknown, + globalAst.TerraformValidateSource: operation.OpStateUnknown, + }, + }, + { + path: filepath.Join(tmpDir, "beta"), + PreloadEmbeddedSchemaState: operation.OpStateUnknown, + RefOriginsState: operation.OpStateUnknown, + RefTargetsState: operation.OpStateUnknown, + MetaState: operation.OpStateUnknown, + DiagnosticsState: globalAst.DiagnosticSourceState{ + globalAst.HCLParsingSource: operation.OpStateUnknown, + globalAst.SchemaValidationSource: operation.OpStateUnknown, + globalAst.ReferenceValidationSource: operation.OpStateUnknown, + globalAst.TerraformValidateSource: operation.OpStateUnknown, + }, + }, + { + path: filepath.Join(tmpDir, "gamma"), + PreloadEmbeddedSchemaState: operation.OpStateUnknown, + RefOriginsState: operation.OpStateUnknown, + RefTargetsState: operation.OpStateUnknown, + MetaState: operation.OpStateUnknown, + DiagnosticsState: globalAst.DiagnosticSourceState{ + globalAst.HCLParsingSource: operation.OpStateUnknown, + globalAst.SchemaValidationSource: operation.OpStateUnknown, + globalAst.ReferenceValidationSource: operation.OpStateUnknown, + globalAst.TerraformValidateSource: operation.OpStateUnknown, + }, + }, + } + + if diff := cmp.Diff(expectedRecords, searches, cmpOpts); diff != "" { + t.Fatalf("unexpected records: %s", diff) + } +} + +func TestSearchStore_Remove(t *testing.T) { + globalStore, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + s, err := NewSearchStore(globalStore.ChangeStore, globalStore.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + + searchPath := t.TempDir() + + err = s.Add(searchPath) + if err != nil { + t.Fatal(err) + } + + // Verify it exists + if !s.Exists(searchPath) { + t.Fatal("expected search to exist before removal") + } + + err = s.Remove(searchPath) + if err != nil { + t.Fatal(err) + } + + // Verify it's removed + if s.Exists(searchPath) { + t.Fatal("expected search to be removed") + } + + // Removing again should not error + err = s.Remove(searchPath) + if err != nil { + t.Fatal(err) + } +} + +func TestSearchStore_Exists(t *testing.T) { + globalStore, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + s, err := NewSearchStore(globalStore.ChangeStore, globalStore.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + + searchPath := t.TempDir() + + // Should not exist initially + if s.Exists(searchPath) { + t.Fatal("expected search to not exist initially") + } + + err = s.Add(searchPath) + if err != nil { + t.Fatal(err) + } + + // Should exist after adding + if !s.Exists(searchPath) { + t.Fatal("expected search to exist after adding") + } +} + +func TestSearchStore_AddIfNotExists(t *testing.T) { + globalStore, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + s, err := NewSearchStore(globalStore.ChangeStore, globalStore.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + + searchPath := t.TempDir() + + // Should add if not exists + err = s.AddIfNotExists(searchPath) + if err != nil { + t.Fatal(err) + } + + if !s.Exists(searchPath) { + t.Fatal("expected search to exist after AddIfNotExists") + } + + // Should not error if already exists + err = s.AddIfNotExists(searchPath) + if err != nil { + 
t.Fatal(err) + } +} + +func TestSearchStore_UpdateMetadata(t *testing.T) { + globalStore, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + s, err := NewSearchStore(globalStore.ChangeStore, globalStore.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + + tmpDir := t.TempDir() + + metadata := &tfsearch.Meta{ + Lists: map[string]tfsearch.List{ + "my_list": {}, + }, + Variables: map[string]tfsearch.Variable{ + "my_var": {}, + }, + Filenames: []string{"test.tfquery.hcl"}, + ProviderReferences: map[tfsearch.ProviderRef]tfaddr.Provider{ + {LocalName: "aws"}: tfaddr.MustParseProviderSource("hashicorp/aws"), + }, + ProviderRequirements: map[tfaddr.Provider]version.Constraints{ + tfaddr.MustParseProviderSource("hashicorp/aws"): testConstraint(t, "~> 5.7.0"), + }, + CoreRequirements: testConstraint(t, ">= 1.0"), + } + + err = s.Add(tmpDir) + if err != nil { + t.Fatal(err) + } + + err = s.UpdateMetadata(tmpDir, metadata, nil) + if err != nil { + t.Fatal(err) + } + + record, err := s.SearchRecordByPath(tmpDir) + if err != nil { + t.Fatal(err) + } + + expectedRecord := &SearchRecord{ + path: tmpDir, + PreloadEmbeddedSchemaState: operation.OpStateUnknown, + RefOriginsState: operation.OpStateUnknown, + RefTargetsState: operation.OpStateUnknown, + Meta: SearchMetadata{ + Lists: map[string]tfsearch.List{ + "my_list": {}, + }, + Variables: map[string]tfsearch.Variable{ + "my_var": {}, + }, + Filenames: []string{"test.tfquery.hcl"}, + ProviderReferences: map[tfsearch.ProviderRef]tfaddr.Provider{ + {LocalName: "aws"}: tfaddr.MustParseProviderSource("hashicorp/aws"), + }, + ProviderRequirements: map[tfaddr.Provider]version.Constraints{ + tfaddr.MustParseProviderSource("hashicorp/aws"): testConstraint(t, "~> 5.7.0"), + }, + CoreRequirements: testConstraint(t, ">= 1.0"), + }, + MetaState: operation.OpStateLoaded, + DiagnosticsState: globalAst.DiagnosticSourceState{ + globalAst.HCLParsingSource: operation.OpStateUnknown, + globalAst.SchemaValidationSource: operation.OpStateUnknown, + globalAst.ReferenceValidationSource: operation.OpStateUnknown, + globalAst.TerraformValidateSource: operation.OpStateUnknown, + }, + } + + if diff := cmp.Diff(expectedRecord, record, cmpOpts); diff != "" { + t.Fatalf("unexpected record data: %s", diff) + } +} + +func TestSearchStore_UpdateParsedFiles(t *testing.T) { + globalStore, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + s, err := NewSearchStore(globalStore.ChangeStore, globalStore.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + + tmpDir := t.TempDir() + err = s.Add(tmpDir) + if err != nil { + t.Fatal(err) + } + + p := hclparse.NewParser() + testFile, diags := p.ParseHCL([]byte(` +variable "test_var" { + type = string +} + +list "resource_search" "main" { + limit = 10 + include_resource = var.test_var +} +`), "test.tfquery.hcl") + if len(diags) > 0 { + t.Fatal(diags) + } + + err = s.UpdateParsedFiles(tmpDir, ast.Files{ + ast.SearchFilename("test.tfquery.hcl"): testFile, + }, nil) + if err != nil { + t.Fatal(err) + } + + record, err := s.SearchRecordByPath(tmpDir) + if err != nil { + t.Fatal(err) + } + + expectedParsedFiles := ast.Files{ + ast.SearchFilename("test.tfquery.hcl"): testFile, + } + if diff := cmp.Diff(expectedParsedFiles, record.ParsedFiles, cmpOpts); diff != "" { + t.Fatalf("unexpected parsed files: %s", diff) + } +} + +func TestSearchStore_UpdateDiagnostics(t *testing.T) { + globalStore, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + s, err := 
NewSearchStore(globalStore.ChangeStore, globalStore.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + + tmpDir := t.TempDir() + err = s.Add(tmpDir) + if err != nil { + t.Fatal(err) + } + + p := hclparse.NewParser() + _, diags := p.ParseHCL([]byte(` +variable "test_var" { + type = string +`), "test.tfquery.hcl") + + err = s.UpdateDiagnostics(tmpDir, globalAst.HCLParsingSource, ast.DiagnosticsFromMap(map[string]hcl.Diagnostics{ + "test.tfquery.hcl": diags, + })) + if err != nil { + t.Fatal(err) + } + + record, err := s.SearchRecordByPath(tmpDir) + if err != nil { + t.Fatal(err) + } + + expectedDiags := ast.SourceDiagnostics{ + globalAst.HCLParsingSource: ast.DiagnosticsFromMap(map[string]hcl.Diagnostics{ + "test.tfquery.hcl": { + { + Severity: hcl.DiagError, + Summary: "Unclosed configuration block", + Detail: "There is no closing brace for this block before the end of the file. This may be caused by incorrect brace nesting elsewhere in this file.", + Subject: &hcl.Range{ + Filename: "test.tfquery.hcl", + Start: hcl.Pos{ + Line: 2, + Column: 21, + Byte: 21, + }, + End: hcl.Pos{ + Line: 2, + Column: 22, + Byte: 22, + }, + }, + }, + }, + }), + } + if diff := cmp.Diff(expectedDiags, record.Diagnostics, cmpOpts); diff != "" { + t.Fatalf("unexpected diagnostics: %s", diff) + } +} + +func TestSearchStore_SetDiagnosticsState(t *testing.T) { + globalStore, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + s, err := NewSearchStore(globalStore.ChangeStore, globalStore.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + + tmpDir := t.TempDir() + err = s.Add(tmpDir) + if err != nil { + t.Fatal(err) + } + + err = s.SetDiagnosticsState(tmpDir, globalAst.HCLParsingSource, operation.OpStateLoaded) + if err != nil { + t.Fatal(err) + } + + record, err := s.SearchRecordByPath(tmpDir) + if err != nil { + t.Fatal(err) + } + + if record.DiagnosticsState[globalAst.HCLParsingSource] != operation.OpStateLoaded { + t.Fatalf("expected HCLParsingSource state to be OpStateLoaded, got %v", record.DiagnosticsState[globalAst.HCLParsingSource]) + } +} + +func TestSearchStore_SetMetaState(t *testing.T) { + globalStore, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + s, err := NewSearchStore(globalStore.ChangeStore, globalStore.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + + tmpDir := t.TempDir() + err = s.Add(tmpDir) + if err != nil { + t.Fatal(err) + } + + err = s.SetMetaState(tmpDir, operation.OpStateLoaded) + if err != nil { + t.Fatal(err) + } + + record, err := s.SearchRecordByPath(tmpDir) + if err != nil { + t.Fatal(err) + } + + if record.MetaState != operation.OpStateLoaded { + t.Fatalf("expected MetaState to be OpStateLoaded, got %v", record.MetaState) + } +} + +func TestSearchStore_SetPreloadEmbeddedSchemaState(t *testing.T) { + globalStore, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + s, err := NewSearchStore(globalStore.ChangeStore, globalStore.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + + tmpDir := t.TempDir() + err = s.Add(tmpDir) + if err != nil { + t.Fatal(err) + } + + err = s.SetPreloadEmbeddedSchemaState(tmpDir, operation.OpStateLoaded) + if err != nil { + t.Fatal(err) + } + + record, err := s.SearchRecordByPath(tmpDir) + if err != nil { + t.Fatal(err) + } + + if record.PreloadEmbeddedSchemaState != operation.OpStateLoaded { + t.Fatalf("expected PreloadEmbeddedSchemaState to be OpStateLoaded, got %v", record.PreloadEmbeddedSchemaState) + } +} + +func TestSearchStore_SetReferenceTargetsState(t 
*testing.T) { + globalStore, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + s, err := NewSearchStore(globalStore.ChangeStore, globalStore.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + + tmpDir := t.TempDir() + err = s.Add(tmpDir) + if err != nil { + t.Fatal(err) + } + + err = s.SetReferenceTargetsState(tmpDir, operation.OpStateLoaded) + if err != nil { + t.Fatal(err) + } + + record, err := s.SearchRecordByPath(tmpDir) + if err != nil { + t.Fatal(err) + } + + if record.RefTargetsState != operation.OpStateLoaded { + t.Fatalf("expected RefTargetsState to be OpStateLoaded, got %v", record.RefTargetsState) + } +} + +func TestSearchStore_UpdateReferenceTargets(t *testing.T) { + globalStore, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + s, err := NewSearchStore(globalStore.ChangeStore, globalStore.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + + tmpDir := t.TempDir() + err = s.Add(tmpDir) + if err != nil { + t.Fatal(err) + } + + // Test with empty targets - the actual structure would depend on the reference package + targets := make(reference.Targets, 0) + + err = s.UpdateReferenceTargets(tmpDir, targets, nil) + if err != nil { + t.Fatal(err) + } + + record, err := s.SearchRecordByPath(tmpDir) + if err != nil { + t.Fatal(err) + } + + if len(record.RefTargets) != 0 { + t.Fatalf("expected 0 reference targets, got %d", len(record.RefTargets)) + } + + if record.RefTargetsState != operation.OpStateLoaded { + t.Fatalf("expected RefTargetsState to be OpStateLoaded, got %v", record.RefTargetsState) + } +} + +func TestSearchStore_SetReferenceOriginsState(t *testing.T) { + globalStore, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + s, err := NewSearchStore(globalStore.ChangeStore, globalStore.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + + tmpDir := t.TempDir() + err = s.Add(tmpDir) + if err != nil { + t.Fatal(err) + } + + err = s.SetReferenceOriginsState(tmpDir, operation.OpStateLoaded) + if err != nil { + t.Fatal(err) + } + + record, err := s.SearchRecordByPath(tmpDir) + if err != nil { + t.Fatal(err) + } + + if record.RefOriginsState != operation.OpStateLoaded { + t.Fatalf("expected RefOriginsState to be OpStateLoaded, got %v", record.RefOriginsState) + } +} + +func TestSearchStore_UpdateReferenceOrigins(t *testing.T) { + globalStore, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + s, err := NewSearchStore(globalStore.ChangeStore, globalStore.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + + tmpDir := t.TempDir() + err = s.Add(tmpDir) + if err != nil { + t.Fatal(err) + } + + // Test with empty origins - the actual structure would depend on the reference package + origins := make(reference.Origins, 0) + + err = s.UpdateReferenceOrigins(tmpDir, origins, nil) + if err != nil { + t.Fatal(err) + } + + record, err := s.SearchRecordByPath(tmpDir) + if err != nil { + t.Fatal(err) + } + + if len(record.RefOrigins) != 0 { + t.Fatalf("expected 0 reference origins, got %d", len(record.RefOrigins)) + } + + if record.RefOriginsState != operation.OpStateLoaded { + t.Fatalf("expected RefOriginsState to be OpStateLoaded, got %v", record.RefOriginsState) + } +} + +func TestSearchStore_ProviderRequirementsForModule(t *testing.T) { + globalStore, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + s, err := NewSearchStore(globalStore.ChangeStore, globalStore.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + + tmpDir := t.TempDir() + err = s.Add(tmpDir) 
+ if err != nil { + t.Fatal(err) + } + + // Update metadata with provider requirements + metadata := &tfsearch.Meta{ + ProviderRequirements: map[tfaddr.Provider]version.Constraints{ + tfaddr.MustParseProviderSource("hashicorp/aws"): testConstraint(t, "~> 5.7.0"), + tfaddr.MustParseProviderSource("hashicorp/random"): testConstraint(t, "~> 3.5.1"), + }, + } + + err = s.UpdateMetadata(tmpDir, metadata, nil) + if err != nil { + t.Fatal(err) + } + + requirements, err := s.ProviderRequirementsForModule(tmpDir) + if err != nil { + t.Fatal(err) + } + + expectedRequirements := tfsearch.ProviderRequirements{ + tfaddr.MustParseProviderSource("hashicorp/aws"): testConstraint(t, "~> 5.7.0"), + tfaddr.MustParseProviderSource("hashicorp/random"): testConstraint(t, "~> 3.5.1"), + } + + if diff := cmp.Diff(expectedRequirements, requirements, cmpOpts); diff != "" { + t.Fatalf("unexpected provider requirements: %s", diff) + } +} + +func TestSearchStore_ProviderRequirementsForModule_NotFound(t *testing.T) { + globalStore, err := globalState.NewStateStore() + if err != nil { + t.Fatal(err) + } + s, err := NewSearchStore(globalStore.ChangeStore, globalStore.ProviderSchemas) + if err != nil { + t.Fatal(err) + } + + // Don't add the module to the store + tmpDir := t.TempDir() + + requirements, err := s.ProviderRequirementsForModule(tmpDir) + if err != nil { + t.Fatal(err) + } + + // Should return empty requirements when module is not found + if len(requirements) != 0 { + t.Fatalf("expected empty provider requirements for non-existent module, got %d", len(requirements)) + } +} + +func testConstraint(t *testing.T, v string) version.Constraints { + constraints, err := version.NewConstraint(v) + if err != nil { + t.Fatal(err) + } + return constraints +} diff --git a/internal/langserver/handlers/did_change_watched_files_test.go b/internal/langserver/handlers/did_change_watched_files_test.go index 74b22f0d..a549756b 100644 --- a/internal/langserver/handlers/did_change_watched_files_test.go +++ b/internal/langserver/handlers/did_change_watched_files_test.go @@ -69,6 +69,8 @@ func TestLangServer_DidChangeWatchedFiles_change_file(t *testing.T) { defer features.Stacks.Stop() features.Tests.Start(ctx) defer features.Tests.Stop() + features.Search.Start(ctx) + defer features.Search.Stop() wc := walker.NewWalkerCollector() @@ -251,6 +253,8 @@ func TestLangServer_DidChangeWatchedFiles_create_file(t *testing.T) { defer features.Stacks.Stop() features.Tests.Start(ctx) defer features.Tests.Stop() + features.Search.Start(ctx) + defer features.Search.Stop() wc := walker.NewWalkerCollector() ls := langserver.NewLangServerMock(t, NewMockSession(&MockSessionInput{ @@ -392,6 +396,8 @@ func TestLangServer_DidChangeWatchedFiles_delete_file(t *testing.T) { defer features.Stacks.Stop() features.Tests.Start(ctx) defer features.Tests.Stop() + features.Search.Start(ctx) + defer features.Search.Stop() wc := walker.NewWalkerCollector() ls := langserver.NewLangServerMock(t, NewMockSession(&MockSessionInput{ @@ -528,6 +534,8 @@ func TestLangServer_DidChangeWatchedFiles_change_dir(t *testing.T) { defer features.Stacks.Stop() features.Tests.Start(ctx) defer features.Tests.Stop() + features.Search.Start(ctx) + defer features.Search.Stop() wc := walker.NewWalkerCollector() ls := langserver.NewLangServerMock(t, NewMockSession(&MockSessionInput{ @@ -671,6 +679,8 @@ func TestLangServer_DidChangeWatchedFiles_create_dir(t *testing.T) { defer features.Stacks.Stop() features.Tests.Start(ctx) defer features.Tests.Stop() + features.Search.Start(ctx) + defer 
diff --git a/internal/langserver/handlers/did_change_watched_files_test.go b/internal/langserver/handlers/did_change_watched_files_test.go
index 74b22f0d..a549756b 100644
--- a/internal/langserver/handlers/did_change_watched_files_test.go
+++ b/internal/langserver/handlers/did_change_watched_files_test.go
@@ -69,6 +69,8 @@ func TestLangServer_DidChangeWatchedFiles_change_file(t *testing.T) {
 	defer features.Stacks.Stop()
 	features.Tests.Start(ctx)
 	defer features.Tests.Stop()
+	features.Search.Start(ctx)
+	defer features.Search.Stop()
 
 	wc := walker.NewWalkerCollector()
 
@@ -251,6 +253,8 @@ func TestLangServer_DidChangeWatchedFiles_create_file(t *testing.T) {
 	defer features.Stacks.Stop()
 	features.Tests.Start(ctx)
 	defer features.Tests.Stop()
+	features.Search.Start(ctx)
+	defer features.Search.Stop()
 
 	wc := walker.NewWalkerCollector()
 	ls := langserver.NewLangServerMock(t, NewMockSession(&MockSessionInput{
@@ -392,6 +396,8 @@ func TestLangServer_DidChangeWatchedFiles_delete_file(t *testing.T) {
 	defer features.Stacks.Stop()
 	features.Tests.Start(ctx)
 	defer features.Tests.Stop()
+	features.Search.Start(ctx)
+	defer features.Search.Stop()
 
 	wc := walker.NewWalkerCollector()
 	ls := langserver.NewLangServerMock(t, NewMockSession(&MockSessionInput{
@@ -528,6 +534,8 @@ func TestLangServer_DidChangeWatchedFiles_change_dir(t *testing.T) {
 	defer features.Stacks.Stop()
 	features.Tests.Start(ctx)
 	defer features.Tests.Stop()
+	features.Search.Start(ctx)
+	defer features.Search.Stop()
 
 	wc := walker.NewWalkerCollector()
 	ls := langserver.NewLangServerMock(t, NewMockSession(&MockSessionInput{
@@ -671,6 +679,8 @@ func TestLangServer_DidChangeWatchedFiles_create_dir(t *testing.T) {
 	defer features.Stacks.Stop()
 	features.Tests.Start(ctx)
 	defer features.Tests.Stop()
+	features.Search.Start(ctx)
+	defer features.Search.Stop()
 
 	wc := walker.NewWalkerCollector()
 	ls := langserver.NewLangServerMock(t, NewMockSession(&MockSessionInput{
@@ -811,6 +821,8 @@ func TestLangServer_DidChangeWatchedFiles_delete_dir(t *testing.T) {
 	defer features.Stacks.Stop()
 	features.Tests.Start(ctx)
 	defer features.Tests.Stop()
+	features.Search.Start(ctx)
+	defer features.Search.Stop()
 
 	wc := walker.NewWalkerCollector()
 	ls := langserver.NewLangServerMock(t, NewMockSession(&MockSessionInput{
@@ -982,6 +994,9 @@ func TestLangServer_DidChangeWatchedFiles_pluginChange(t *testing.T) {
 	defer features.Stacks.Stop()
 	features.Tests.Start(ctx)
 	defer features.Tests.Stop()
+	features.Search.Start(ctx)
+	defer features.Search.Stop()
+
 	wc := walker.NewWalkerCollector()
 	ls := langserver.NewLangServerMock(t, NewMockSession(&MockSessionInput{
@@ -1088,6 +1103,9 @@ func TestLangServer_DidChangeWatchedFiles_moduleInstalled(t *testing.T) {
 	defer features.Stacks.Stop()
 	features.Tests.Start(ctx)
 	defer features.Tests.Stop()
+	features.Search.Start(ctx)
+	defer features.Search.Stop()
+
 	wc := walker.NewWalkerCollector()
 	ls := langserver.NewLangServerMock(t, NewMockSession(&MockSessionInput{
diff --git a/internal/langserver/handlers/hooks_module.go b/internal/langserver/handlers/hooks_module.go
index a0c5e98d..e157fde0 100644
--- a/internal/langserver/handlers/hooks_module.go
+++ b/internal/langserver/handlers/hooks_module.go
@@ -38,12 +38,16 @@ func sendModuleTelemetry(features *Features, telemetrySender telemetry.Sender) n
 	properties := features.Modules.Telemetry(path)
 	rootTelemetry := features.RootModules.Telemetry(path)
 	stacksTelemetry := features.Stacks.Telemetry(path)
+	searchTelemetry := features.Search.Telemetry(path)
 
 	for property, value := range rootTelemetry {
 		properties[property] = value
 	}
 	for property, value := range stacksTelemetry {
 		properties[property] = value
 	}
+	for property, value := range searchTelemetry {
+		properties[property] = value
+	}
 
 	telemetrySender.SendEvent(ctx, "moduleData", properties)
@@ -66,6 +70,7 @@ func updateDiagnostics(features *Features, dNotifier *diagnostics.Notifier) noti
 	diags.Extend(features.Variables.Diagnostics(path))
 	diags.Extend(features.Stacks.Diagnostics(path))
 	diags.Extend(features.Tests.Diagnostics(path))
+	diags.Extend(features.Search.Diagnostics(path))
 
 	dNotifier.PublishHCLDiags(ctx, path, diags)
 }
diff --git a/internal/langserver/handlers/initialize_test.go b/internal/langserver/handlers/initialize_test.go
index 262ff9d5..a346c848 100644
--- a/internal/langserver/handlers/initialize_test.go
+++ b/internal/langserver/handlers/initialize_test.go
@@ -528,6 +528,8 @@ func TestInitialize_differentWorkspaceLayouts(t *testing.T) {
 	defer features.Stacks.Stop()
 	features.Tests.Start(ctx)
 	defer features.Tests.Stop()
+	features.Search.Start(ctx)
+	defer features.Search.Stop()
 
 	wc := walker.NewWalkerCollector()
diff --git a/internal/langserver/handlers/service.go b/internal/langserver/handlers/service.go
index b74f0f0a..b1ccad8b 100644
--- a/internal/langserver/handlers/service.go
+++ b/internal/langserver/handlers/service.go
@@ -21,6 +21,7 @@ import (
 	"github.com/hashicorp/terraform-ls/internal/eventbus"
 	fmodules "github.com/hashicorp/terraform-ls/internal/features/modules"
 	frootmodules "github.com/hashicorp/terraform-ls/internal/features/rootmodules"
+	"github.com/hashicorp/terraform-ls/internal/features/search"
 	"github.com/hashicorp/terraform-ls/internal/features/stacks"
 	ftests "github.com/hashicorp/terraform-ls/internal/features/tests"
 	fvariables "github.com/hashicorp/terraform-ls/internal/features/variables"
@@ -52,6 +53,7 @@ type Features struct {
 	Variables   *fvariables.VariablesFeature
 	Stacks      *stacks.StacksFeature
 	Tests       *ftests.TestsFeature
+	Search      *search.SearchFeature
 }
 
 type service struct {
@@ -552,12 +554,20 @@ func (svc *service) configureSessionDependencies(ctx context.Context, cfgOpts *s
 	testsFeature.SetLogger(svc.logger)
 	testsFeature.Start(svc.sessCtx)
 
+	searchFeature, err := search.NewSearchFeature(svc.eventBus, svc.stateStore, svc.fs, modulesFeature, rootModulesFeature)
+	if err != nil {
+		return err
+	}
+	searchFeature.SetLogger(svc.logger)
+	searchFeature.Start(svc.sessCtx)
+
 	svc.features = &Features{
 		Modules:     modulesFeature,
 		RootModules: rootModulesFeature,
 		Variables:   variablesFeature,
 		Stacks:      stacksFeature,
 		Tests:       testsFeature,
+		Search:      searchFeature,
 	}
 }
@@ -569,6 +579,7 @@ func (svc *service) configureSessionDependencies(ctx context.Context, cfgOpts *s
 		"terraform-deploy": svc.features.Stacks,
 		"terraform-test":   svc.features.Tests,
 		"terraform-mock":   svc.features.Tests,
+		"terraform-search": svc.features.Search,
 	},
 })
 decoderContext := idecoder.DecoderContext(ctx)
@@ -664,6 +675,9 @@ func (svc *service) shutdown() {
 	if svc.features.Tests != nil {
 		svc.features.Tests.Stop()
 	}
+	if svc.features.Search != nil {
+		svc.features.Search.Stop()
+	}
 }
}
diff --git a/internal/langserver/handlers/session_mock_test.go b/internal/langserver/handlers/session_mock_test.go
index 27366e1b..758b790f 100644
--- a/internal/langserver/handlers/session_mock_test.go
+++ b/internal/langserver/handlers/session_mock_test.go
@@ -17,6 +17,7 @@ import (
 	"github.com/hashicorp/terraform-ls/internal/eventbus"
 	fmodules "github.com/hashicorp/terraform-ls/internal/features/modules"
 	frootmodules "github.com/hashicorp/terraform-ls/internal/features/rootmodules"
+	fsearch "github.com/hashicorp/terraform-ls/internal/features/search"
 	fstacks "github.com/hashicorp/terraform-ls/internal/features/stacks"
 	ftests "github.com/hashicorp/terraform-ls/internal/features/tests"
 	fvariables "github.com/hashicorp/terraform-ls/internal/features/variables"
@@ -173,11 +174,17 @@ func NewTestFeatures(eventBus *eventbus.EventBus, s *state.StateStore, fs *files
 		return nil, err
 	}
 
+	searchFeature, err := fsearch.NewSearchFeature(eventBus, s, fs, modulesFeature, rootModulesFeature)
+	if err != nil {
+		return nil, err
+	}
+
 	return &Features{
 		Modules:     modulesFeature,
 		RootModules: rootModulesFeature,
 		Variables:   variablesFeature,
 		Stacks:      stacksFeature,
 		Tests:       testsFeature,
+		Search:      searchFeature,
 	}, nil
 }
diff --git a/internal/lsp/language_id.go b/internal/lsp/language_id.go
index 2d82deb1..4935152f 100644
--- a/internal/lsp/language_id.go
+++ b/internal/lsp/language_id.go
@@ -14,6 +14,7 @@ const (
 	Deploy LanguageID = "terraform-deploy"
 	Test   LanguageID = "terraform-test"
 	Mock   LanguageID = "terraform-mock"
+	Search LanguageID = "terraform-search"
 )
 
 func (l LanguageID) String() string {
diff --git a/internal/terraform/module/operation/op_type_string.go b/internal/terraform/module/operation/op_type_string.go
index 6c1964ce..87e819bd 100644
--- a/internal/terraform/module/operation/op_type_string.go
+++ b/internal/terraform/module/operation/op_type_string.go
@@ -24,26 +24,30 @@ func _() {
 	_ = x[OpTypeParseProviderVersions-13]
 	_ = x[OpTypePreloadEmbeddedSchema-14]
 	_ = x[OpTypeStacksPreloadEmbeddedSchema-15]
-	_ = x[OpTypeSchemaModuleValidation-16]
-	_ = x[OpTypeSchemaStackValidation-17]
-	_ = x[OpTypeSchemaVarsValidation-18]
-	_ = x[OpTypeReferenceValidation-19]
-	_ = x[OpTypeReferenceStackValidation-20]
-	_ = x[OpTypeTerraformValidate-21]
-	_ = x[OpTypeParseStackConfiguration-22]
-	_ = x[OpTypeLoadStackMetadata-23]
-	_ = x[OpTypeLoadStackRequiredTerraformVersion-24]
-	_ = x[OpTypeParseTestConfiguration-25]
-	_ = x[OpTypeLoadTestMetadata-26]
-	_ = x[OpTypeDecodeTestReferenceTargets-27]
-	_ = x[OpTypeDecodeTestReferenceOrigins-28]
-	_ = x[OpTypeDecodeWriteOnlyAttributes-29]
-	_ = x[OpTypeSchemaTestValidation-30]
+	_ = x[OpTypeSearchPreloadEmbeddedSchema-16]
+	_ = x[OpTypeSchemaModuleValidation-17]
+	_ = x[OpTypeSchemaStackValidation-18]
+	_ = x[OpTypeSchemaSearchValidation-19]
+	_ = x[OpTypeSchemaVarsValidation-20]
+	_ = x[OpTypeReferenceValidation-21]
+	_ = x[OpTypeReferenceStackValidation-22]
+	_ = x[OpTypeTerraformValidate-23]
+	_ = x[OpTypeParseStackConfiguration-24]
+	_ = x[OpTypeParseSearchConfiguration-25]
+	_ = x[OpTypeLoadStackMetadata-26]
+	_ = x[OpTypeLoadSearchMetadata-27]
+	_ = x[OpTypeLoadStackRequiredTerraformVersion-28]
+	_ = x[OpTypeParseTestConfiguration-29]
+	_ = x[OpTypeLoadTestMetadata-30]
+	_ = x[OpTypeDecodeTestReferenceTargets-31]
+	_ = x[OpTypeDecodeTestReferenceOrigins-32]
+	_ = x[OpTypeDecodeWriteOnlyAttributes-33]
+	_ = x[OpTypeSchemaTestValidation-34]
 }
 
-const _OpType_name = "OpTypeUnknownOpTypeGetTerraformVersionOpTypeGetInstalledTerraformVersionOpTypeObtainSchemaOpTypeParseModuleConfigurationOpTypeParseVariablesOpTypeParseModuleManifestOpTypeParseTerraformSourcesOpTypeLoadModuleMetadataOpTypeDecodeReferenceTargetsOpTypeDecodeReferenceOriginsOpTypeDecodeVarsReferencesOpTypeGetModuleDataFromRegistryOpTypeParseProviderVersionsOpTypePreloadEmbeddedSchemaOpTypeStacksPreloadEmbeddedSchemaOpTypeSchemaModuleValidationOpTypeSchemaStackValidationOpTypeSchemaVarsValidationOpTypeReferenceValidationOpTypeReferenceStackValidationOpTypeTerraformValidateOpTypeParseStackConfigurationOpTypeLoadStackMetadataOpTypeLoadStackRequiredTerraformVersionOpTypeParseTestConfigurationOpTypeLoadTestMetadataOpTypeDecodeTestReferenceTargetsOpTypeDecodeTestReferenceOriginsOpTypeDecodeWriteOnlyAttributesOpTypeSchemaTestValidation"
+const _OpType_name = "OpTypeUnknownOpTypeGetTerraformVersionOpTypeGetInstalledTerraformVersionOpTypeObtainSchemaOpTypeParseModuleConfigurationOpTypeParseVariablesOpTypeParseModuleManifestOpTypeParseTerraformSourcesOpTypeLoadModuleMetadataOpTypeDecodeReferenceTargetsOpTypeDecodeReferenceOriginsOpTypeDecodeVarsReferencesOpTypeGetModuleDataFromRegistryOpTypeParseProviderVersionsOpTypePreloadEmbeddedSchemaOpTypeStacksPreloadEmbeddedSchemaOpTypeSearchPreloadEmbeddedSchemaOpTypeSchemaModuleValidationOpTypeSchemaStackValidationOpTypeSchemaSearchValidationOpTypeSchemaVarsValidationOpTypeReferenceValidationOpTypeReferenceStackValidationOpTypeTerraformValidateOpTypeParseStackConfigurationOpTypeParseSearchConfigurationOpTypeLoadStackMetadataOpTypeLoadSearchMetadataOpTypeLoadStackRequiredTerraformVersionOpTypeParseTestConfigurationOpTypeLoadTestMetadataOpTypeDecodeTestReferenceTargetsOpTypeDecodeTestReferenceOriginsOpTypeDecodeWriteOnlyAttributesOpTypeSchemaTestValidation"
 
-var _OpType_index = [...]uint16{0, 13, 38, 72, 90, 120, 140, 165, 192, 216, 244, 272, 298, 329, 356, 383, 416, 444, 471, 497, 522, 552, 575, 604, 627, 666, 694, 716, 748, 780, 811, 837}
+var _OpType_index = [...]uint16{0, 13, 38, 72, 90, 120, 140, 165, 192, 216, 244, 272, 298, 329, 356, 383, 416, 449, 477, 504, 532, 558, 583, 613, 636, 665, 695, 718, 742, 781, 809, 831, 863, 895, 926, 952}
 
 func (i OpType) String() string {
 	if i >= OpType(len(_OpType_index)-1) {
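`op_type_string.go` is `stringer`-generated code: `_OpType_name` packs every constant name into one string and `_OpType_index` records the cumulative byte offsets, so `String()` is a single slice expression. Inserting the four search operations shifts every later constant and offset, which is why the whole name string and index line are rewritten. A self-contained sketch of the technique with a two-value enum (the offsets 13 and 38 are the real first two values from the table above):

```go
package main

import "fmt"

type OpType int

const (
	OpTypeUnknown OpType = iota
	OpTypeGetTerraformVersion
)

// One packed name string plus cumulative offsets into it, laid out the same
// way as the generated file above.
const _OpType_name = "OpTypeUnknownOpTypeGetTerraformVersion"

var _OpType_index = [...]uint16{0, 13, 38}

func (i OpType) String() string {
	if i < 0 || i >= OpType(len(_OpType_index)-1) {
		return fmt.Sprintf("OpType(%d)", i)
	}
	// Each name is the slice of the packed string between two offsets.
	return _OpType_name[_OpType_index[i]:_OpType_index[i+1]]
}

func main() {
	fmt.Println(OpTypeUnknown)             // OpTypeUnknown
	fmt.Println(OpTypeGetTerraformVersion) // OpTypeGetTerraformVersion
}
```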
diff --git a/internal/terraform/module/operation/operation.go b/internal/terraform/module/operation/operation.go
index 85f1dcdc..a90d8365 100644
--- a/internal/terraform/module/operation/operation.go
+++ b/internal/terraform/module/operation/operation.go
@@ -33,14 +33,18 @@ const (
 	OpTypeParseProviderVersions
 	OpTypePreloadEmbeddedSchema
 	OpTypeStacksPreloadEmbeddedSchema
+	OpTypeSearchPreloadEmbeddedSchema
 	OpTypeSchemaModuleValidation
 	OpTypeSchemaStackValidation
+	OpTypeSchemaSearchValidation
 	OpTypeSchemaVarsValidation
 	OpTypeReferenceValidation
 	OpTypeReferenceStackValidation
 	OpTypeTerraformValidate
 	OpTypeParseStackConfiguration
+	OpTypeParseSearchConfiguration
 	OpTypeLoadStackMetadata
+	OpTypeLoadSearchMetadata
 	OpTypeLoadStackRequiredTerraformVersion
 	OpTypeParseTestConfiguration
 	OpTypeLoadTestMetadata
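The `iota` enum above and the generated `op_type_string.go` before it must change in lockstep, since the new search constants shift every subsequent value. The usual arrangement (assumed here; the exact directive and flags in this repository may differ) is a `go:generate` marker next to the type, so the stringer output is rebuilt rather than hand-edited:

```go
// Hypothetical directive illustrating how op_type_string.go would be kept in
// sync with this enum; the real file's directive may differ.
//go:generate go run golang.org/x/tools/cmd/stringer -type=OpType

type OpType uint
```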