* … tests (#1779)
* [DAPS-1775] - fix: core, foxx, add missing {}, foxx query_router add params object schema to routes. (#1781)
* [DAPS-1777] - fix: foxx, user_router fix regression in missing response. (#1778)
* [DAPS-1786] - refactor: web tests, add test for hitting password reset. (#1787)
* [DAPS-1277] - fix: mock, core, common, PROXY_BASIC_ZMQ and PROXY_CUSTOM correctly defined
* [DAPS-1790] - fix: common, core, repo, zmq assertion failure during EXCEPT call due to calling zmq_msg with invalid state after closing it.
* [DAPS-1791] - fix: build, python client, requirements.txt was being moved to a folder named requirements.txt during cmake configure script.
* refactor: only print subset of user properties.
* chore: Auto-format JavaScript files with Prettier
* [DAPS-1770] - release: v4.0.0
* [DAPS-1585] - update: dependencies, upgrade ssl dependency. 3.2.5 (#1646)
* [DAPS-1605] - fix: scripts, install_foxx.sh by splitting ssl_args (#1623)
* [DAPS-1651] - refactor: scripts, compose, unify treatment of env variables in compose env generator (#1656) (#1658)
* [DAPS-1675] - feature: foxx, adding the logger functions for future PRs (#1675)
* [DAPS-1659] - refactor: scripts, remove dependencies install scripts (#1660)
* [DAPS-1670] - feature: common, core, repo, python_client, web, allow passing repo types in protobuf messages (#1670)
* [DAPS-1671] - feature: foxx, add repository and execution strategy types (#1672)
* [DAPS-1661] - refactor: compose, scripts, remove remaining occurrences of zeromq system secret. (#1661)
* [DAPS-1522] - refactor: foxx, user router logging improvements, remove non-helpful logs from tasks.js (#1629)
* [DAPS-1691] - refactor: foxx, adjust validation.js swap g_lib with error_code require rem… (#1691)
* [DAPS-1692] - tests: ci, End-to-end web tests, fix flaky test (#1693)
* [DAPS-1694] - refactor: foxx, move permissions functions from support.js to lib/permissions (#1695)
* [DAPS-1685] - feature: compose, enable arangodb ssl (#1687)
* [DAPS-1700] - fix: ci, limit arangodb job output to last 3 hours. (#1701)
* [DAPS-1676] - feature: foxx, arango add factory for repositories for metadata and globus (#1697)
* [DAPS-1718] - feature: web, core, python client, Protobuf ExecutionMethod enum, add RepoAllocationCreateResponse (#1719)
* [DAPS-1713] - refactor: core, web, python client, protobuf, allow optional fields when creating repo to support metadat… (#1714)
* [DAPS-1715] - refactor: core, make path, pub_key, address, endpoint optional in repoCreateRequest (#1716)
* [DAPS-1705] - feature: foxx, integrate metadata globus factory repo router create (#1706)
* [DAPS-1688] - update: dependencies, core, repo, authz, gcs, Crypto libssl switched to version 3, globus_sdk version pinned (#1689)
* [DAPS-1729] - fix: ci, downstream datafed dependencies pipelines are building the container image from incorrect sha (#1732)
* [DAPS-1711] - refactor: foxx, standardize repo response schema (#1712)
* [DAPS-1725] - refactor: remove confusing apache conf file. (#1728)
* [DAPS-1707] - update: dependencies, web, update web dependencies before install (#1709)
* [DAPS-1522] - refactor: foxx, task router logging improvements (#1648)
* [DAPS-1522] - refactor: foxx, query router logging improvements (#1627)
* [DAPS-1735] - bug: foxx, remove duplicate user_router test (#1736)
* [DAPS-1731] - feature: scripts, compose, add scripts to generate globus credentials for web service (#1731)
* [DAPS-1725] - refactor: tests, mock core server centralized (#1726)
* [DAPS-1741] - update: scripts, native client id in intialize_globus_endpoint and globus_clea… (#1741)
* [DAPS-1745] - fix: scripts, account for nested client credentials. (#1746)
* [DAPS-1725-2] - fix: tests, centralized mock core service libraries fixed (part 2) (#1747)
* [DAPS-1742] - refactor: script, replace os.path.join with urllib.parse.urljoin (#1744)
* [DAPS-1749] - refactor: cmake, set cmake policy to silence noisy warning. (#1750)
* [DAPS-1522] - refactor: foxx, feature tag router logging improvements (#1734)
* [DAPS-1378] - fix: web, mapping of multiple globus accounts. (#1753)
* [DAPS-1756] - fix: scripts, foxx, add retries and connection check to install_foxx.sh script. (#1757)
* [DAPS-1522] - refactor: foxx, Version Router Logging Improvements (#1758)
* [DAPS-1737] - refactor: compose, cleanup arango ssl env variables (#1765)
* [DAPS-1766] - fix: ci, python client provisioning job
* [DAPS-1663] - feature: core Service, adding Correlation ID to Logging (#1704)

Co-authored-by: Aaron Perez <perezam@ornl.gov>
Co-authored-by: AronPerez <aperez0295@gmail.com>
Co-authored-by: Blake Nedved <nedvedba@ornl.gov>
Co-authored-by: sourcery-ai[bot] <58596630+sourcery-ai[bot]@users.noreply.github.com>
Co-authored-by: nedvedba <145805866+nedvedba@users.noreply.github.com>
Co-authored-by: Austin Hampton <amh107@latech.edu>
Co-authored-by: Austin Hampton <44103380+megatnt1122@users.noreply.github.com>
Co-authored-by: Blake Nedved <blakeanedved@gmail.com>
Co-authored-by: Polina Shpilker <infinite.loopholes@gmail.com>

* [DAPS-1777] - Foxx, fix: user_router fix regression in missing response. (#1778)
* [DAPS-1786] - Web tests, refactor: add test for hitting password reset. (#1787)
* [DAPS-1774] - Core, Python, Database, Foxx, Test add query end-to-end tests (#1779)
* [DAPS-1775] - fix: core, foxx, add missing {}, foxx query_router add params object schema to routes. (#1781)
* [DAPS-1777] - fix: foxx, user_router fix regression in missing response. (#1778)
* [DAPS-1786] - refactor: web tests, add test for hitting password reset. (#1787)
* [DAPS-1277] - fix: mock, core, common, PROXY_BASIC_ZMQ and PROXY_CUSTOM correctly defined
* [DAPS-1790] - fix: common, core, repo, zmq assertion failure during EXCEPT call due to calling zmq_msg with invalid state after closing it.
* [DAPS-1791] - fix: build, python client, requirements.txt was being moved to a folder named requirements.txt during cmake configure script.
* chore: Auto-format JavaScript files with Prettier
* [DAPS-1774] - continuation of fix (#1798)
* [DAPS-1774] - continuation of fix (#1798)
* release: version 4.0.1 (#1807)
* refactor: only print subset of user properties. (#1804)
* refactor: only print subset of user properties.
* chore: Auto-format JavaScript files with Prettier
* [DAPS-1806] - refactor: foxx, user tokens expiring route. (#1806)

---------

Co-authored-by: Aaron Perez <perezam@ornl.gov>
Co-authored-by: AronPerez <aperez0295@gmail.com>
Co-authored-by: Blake Nedved <nedvedba@ornl.gov>
Co-authored-by: sourcery-ai[bot] <58596630+sourcery-ai[bot]@users.noreply.github.com>
Co-authored-by: nedvedba <145805866+nedvedba@users.noreply.github.com>
Co-authored-by: Austin Hampton <amh107@latech.edu>
Co-authored-by: Austin Hampton <44103380+megatnt1122@users.noreply.github.com>
Co-authored-by: Blake Nedved <blakeanedved@gmail.com>
Co-authored-by: Polina Shpilker <infinite.loopholes@gmail.com>

* fix: prevent defaults being set to undefined, and interpret numbers a… (#1861)
* fix: prevent defaults being set to undefined, and interpret numbers and enums as strings.
* chore: Auto-format JavaScript files with Prettier
* fix: version numbers from proto3 messages follow camel case. (#1868)

---------

Co-authored-by: Joshua S Brown <joshbro42867@yahoo.com>
Master staging merge
…ity bugs, log bugs, removed proj search
## Reviewer's Guide

This PR delivers a staging release that upgrades protocol and component versions, tightens security and logging across Foxx APIs, refactors query and record path handling, removes the project AQL search surface, and adds end-to-end tests (including query CRUD and Playwright web tests) along with minor web/server wiring adjustments.

### Sequence diagram for saved query update and execution across Web, Core, and Foxx

```mermaid
sequenceDiagram
    actor User
    participant WebServer
    participant CoreServer
    participant FoxxQueryRouter
    participant ArangoDB
    User->>WebServer: POST /api/query/update?id=Q123&replaceQuery=true
    Note over WebServer: HTTP body contains updated query JSON (params as object)
    WebServer->>CoreServer: QueryUpdateRequest(id=Q123, replace_query=true, query)
    CoreServer->>CoreServer: queryUpdate(request)
    CoreServer->>CoreServer: parseSearchRequest(query, out qry_begin, qry_end, qry_filter, params)
    CoreServer->>CoreServer: MessageToJsonString(query, options)
    CoreServer->>FoxxQueryRouter: POST /qry/update
    Note over CoreServer,FoxxQueryRouter: payload.qry_begin, payload.qry_end, payload.qry_filter, payload.params (JSON object), payload.limit
    FoxxQueryRouter->>ArangoDB: _executeTransaction(read:["u","uuid","accn","admin"], write:["q"])
    FoxxQueryRouter->>ArangoDB: q.document(req.body.id)
    FoxxQueryRouter->>ArangoDB: _update(q._id, merged_query, {mergeObjects:false, returnNew:true})
    FoxxQueryRouter->>FoxxQueryRouter: delete qry.qry_begin/qry_end/qry_filter/params/limit
    FoxxQueryRouter->>CoreServer: reply with updated query metadata
    CoreServer->>CoreServer: setQueryData(a_reply, result)
    CoreServer->>WebServer: QueryDataReply
    WebServer->>User: HTTP 200 with updated query metadata
    User->>WebServer: GET /api/query/exec?id=Q123
    WebServer->>CoreServer: QueryExecRequest(id=Q123)
    CoreServer->>FoxxQueryRouter: GET /qry/exec?id=Q123
    FoxxQueryRouter->>ArangoDB: q.document(id)
    FoxxQueryRouter->>FoxxQueryRouter: if typeof params == "string" then JSON.parse(params)
    FoxxQueryRouter->>FoxxQueryRouter: execQuery(client, mode, published, query)
    FoxxQueryRouter->>ArangoDB: _query(built_AQL, query.params)
    FoxxQueryRouter->>CoreServer: results[]
    CoreServer->>WebServer: ExecReply(results)
    WebServer->>User: HTTP 200 with results[]
```
### Updated class diagram for Record and Repo path handling and AuthZ integration

```mermaid
classDiagram
    class Record {
        -string #key
        -object #loc
        -object #repo
        -number #error
        -string #err_msg
        +boolean isManaged()
        +boolean isPathConsistent(a_path string)
        -string _pathToRecord(uid string, basePath string)
    }
    class Repo {
        -string id_value
        -object doc
        +Repo(id string)
        +string id()
        +PathType pathType(file_path string)
        +static Repo resolveFromPath(file_path string)
    }
    class PathType {
        <<enumeration>>
        UNKNOWN
        REPO
        USER
        PROJECT
    }
    class PosixPath {
        <<module>>
        +string normalizePOSIXPath(a_posix_path string)
        +string joinPOSIX(...segments)
        +string dirnamePOSIX(a_posix_path string)
        +string basenamePOSIX(a_posix_path string)
        +string~[]~ splitPOSIX(a_posix_path string)
    }
    class AuthzRouter {
        <<module>>
        +get_file_access(client string, repo string, file string)
    }
    Record --> "1" Repo : uses
    Repo --> PathType : returns
    AuthzRouter --> Repo : resolveFromPath(file)
    AuthzRouter --> PosixPath : normalizePOSIXPath(file)
    PosixPath ..> Record : shared path normalization convention
```
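The class diagram only names the PosixPath interface. As a rough illustration of the shared normalization convention it implies, a minimal `normalizePOSIXPath` could look like the sketch below. This is a hypothetical implementation for illustration only, not the module's actual code; the real helper may handle edge cases differently.

```javascript
// Hypothetical sketch of normalizePOSIXPath from the PosixPath module:
// collapse repeated "/", drop "." segments, and resolve "..".
function normalizePOSIXPath(p) {
    const absolute = p.startsWith("/");
    const out = [];
    for (const seg of p.split("/")) {
        if (seg === "" || seg === ".") continue; // skip empty and "." segments
        if (seg === "..") {
            // Pop the previous segment when possible; keep leading ".."
            // only for relative paths.
            if (out.length && out[out.length - 1] !== "..") out.pop();
            else if (!absolute) out.push("..");
        } else {
            out.push(seg);
        }
    }
    return (absolute ? "/" : "") + out.join("/") || (absolute ? "/" : ".");
}

console.log(normalizePOSIXPath("/data//user/./proj/../file")); // "/data/user/file"
```

A consistent normalizer like this is what lets `AuthzRouter` and `Record` agree on whether a Globus file path falls inside a repository root.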
Hey - I've found 2 security issues, 5 other issues, and left some high-level feedback:
Security issues:
- Detected user input used to manually construct a SQL string. This is usually bad practice because manual construction could accidentally result in a SQL injection. An attacker could use a SQL injection to steal or modify contents of the database. Instead, use a parameterized query which is available by default in most database engines. Alternatively, consider using an object-relational mapper (ORM) such as Sequelize which will protect your queries. (link)
- Detected user input used to manually construct a SQL string. This is usually bad practice because manual construction could accidentally result in a SQL injection. An attacker could use a SQL injection to steal or modify contents of the database. Instead, use a parameterized query which is available by default in most database engines. Alternatively, consider using an object-relational mapper (ORM) such as Sequelize which will protect your queries. (link)
General comments:
- In `execQuery` (query_router.js), the change from `if (!query.params.cols)` to `if (!query.params.cols.length)` will throw if `query.params.cols` is undefined; consider guarding with `!query.params.cols || !query.params.cols.length` (or defaulting it to an empty array) before accessing `.length`.
- In `data_router.js`'s `/update/batch` error logging, the description template uses `displayedIds` while the variable constructed earlier is `displayIds`, which will cause a `ReferenceError` at runtime; these names should be made consistent.
- The `message("ENABLE END TO END!!!!!!!")` line in `tests/end-to-end/web-UI/CMakeLists.txt` looks like leftover debug output and will fire on every configure; consider removing or gating it behind a developer-only option.
Prompt for AI Agents
Please address the comments from this code review:
## Overall Comments
- In `execQuery` (query_router.js), the change from `if (!query.params.cols)` to `if (!query.params.cols.length)` will throw if `query.params.cols` is undefined; consider guarding with `!query.params.cols || !query.params.cols.length` (or defaulting it to an empty array) before accessing `.length`.
- In `data_router.js`'s `/update/batch` error logging, the description template uses `displayedIds` while the variable constructed earlier is `displayIds`, which will cause a `ReferenceError` at runtime; these names should be made consistent.
- The `message("ENABLE END TO END!!!!!!!")` line in `tests/end-to-end/web-UI/CMakeLists.txt` looks like leftover debug output and will fire on every configure; consider removing or gating it behind a developer-only option.
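The first guard suggested above can be sketched as a small helper. The function name `hasCols` is assumed for illustration; the point is only that `.length` must never be read off a possibly undefined `cols`.

```javascript
// Hedged sketch of the suggested guard: default `cols` to an empty
// array before checking its length, so legacy queries without the
// field fall through to the default-column branch instead of throwing.
function hasCols(query) {
    const cols = (query.params && query.params.cols) || [];
    return cols.length > 0;
}

console.log(hasCols({ params: {} }));                  // false, no throw
console.log(hasCols({ params: { cols: ["id"] } }));    // true
```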
## Individual Comments
### Comment 1
<location path="core/database/foxx/api/data_router.js" line_range="711-712" />
<code_context>
}
- if (record.deps != undefined && (record.deps_add != undefined || record.deps_rem != undefined))
+ if (record.deps != undefined && (record.dep_add != undefined || record.dep_rem != undefined))
throw [error.ERR_INVALID_PARAM, "Cannot use both dependency set and add/remove."];
var dep,
</code_context>
<issue_to_address>
**issue (bug_risk):** Inconsistent dependency field names likely break validation of mixed dep operations.
Elsewhere in the codebase these fields are consistently named `deps_add` / `deps_rem`, but this guard uses `dep_add` / `dep_rem`. As a result, requests using `deps` with `deps_add` / `deps_rem` will skip this check instead of throwing. Please either align the field names to `deps_add` / `deps_rem` here, or explicitly handle both name variants if legacy payloads must be supported.
</issue_to_address>
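For clarity, the corrected guard from Comment 1 might read as follows. The error constant is a stand-in here (the router uses its own `error.ERR_INVALID_PARAM`); only the field-name alignment is the point.

```javascript
// Sketch of the aligned check: use `deps_add` / `deps_rem`, matching
// the rest of the codebase, instead of the typo'd `dep_add` / `dep_rem`
// that silently let mixed dependency operations through.
const ERR_INVALID_PARAM = 1; // stand-in for the router's error constant

function validateDeps(record) {
    if (
        record.deps !== undefined &&
        (record.deps_add !== undefined || record.deps_rem !== undefined)
    ) {
        throw [ERR_INVALID_PARAM, "Cannot use both dependency set and add/remove."];
    }
}
```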
### Comment 2
<location path="core/database/foxx/api/data_router.js" line_range="1052-1053" />
<code_context>
httpVerb: "POST",
routePath: basePath + "/update/batch",
status: "Failure",
- description: `Update a batch of existing data record. RecordIDs: ${displayIds}`,
+ description: `Update a batch of existing data record. RecordIDs: ${displayedIds}`,
extra: {
count: totalCount,
</code_context>
<issue_to_address>
**issue (bug_risk):** `displayedIds` is undefined in the error logger and will mask the original error.
The template literal now references `displayedIds`, but the variable defined earlier is `displayIds`. This will throw a `ReferenceError` when logging failures and hide the original exception. Please update the description to use the correct variable name (or rename the variable consistently).
</issue_to_address>
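A self-contained reproduction of why this masks the original error (variable names taken from the review; the surrounding logger is simplified away):

```javascript
// `displayedIds` was never declared, so evaluating the template literal
// itself throws a ReferenceError -- this replaces whatever failure the
// logger was supposed to describe.
function describeBatch(displayIds) {
    return `Update a batch of existing data record. RecordIDs: ${displayedIds}`;
}

let caught = null;
try {
    describeBatch(["d/123", "d/456"]);
} catch (e) {
    caught = e;
}
console.log(caught instanceof ReferenceError); // true
```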
### Comment 3
<location path="core/database/foxx/api/group_router.js" line_range="442-443" />
<code_context>
});
} catch (e) {
logger.logRequestFailure({
- client: "N/A",
</code_context>
<issue_to_address>
**issue (bug_risk):** Error path in group listing no longer sends a response on failure.
Previously this catch block returned `groups` to the client; now no response is sent at all, so any exception will leave the HTTP request hanging until it times out. If you no longer want to return partial data, please still send an error response here (e.g., via `g_lib.handleException(e, res)` or another appropriate error handler).
</issue_to_address>
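The pattern being asked for can be sketched in a self-contained way. `handleException` below is a stand-in for whatever error handler the codebase provides (the review mentions `g_lib.handleException`), and the route body is invented for illustration; the invariant is that the catch block always sends *some* response.

```javascript
// Stand-in for the project's exception handler: always produce a reply.
function handleException(e, res) {
    res.status = 500;
    res.body = { errorMessage: String(e) };
}

// Hypothetical group-listing route: success and failure paths both
// terminate the request, so the client never hangs until timeout.
function listGroupsRoute(req, res) {
    try {
        if (!req.queryParams.proj) throw new Error("missing proj");
        res.status = 200;
        res.body = [{ gid: "g1" }];
    } catch (e) {
        // Log the failure, then still send an error response.
        handleException(e, res);
    }
}

const res = {};
listGroupsRoute({ queryParams: {} }, res);
console.log(res.status); // 500
```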
### Comment 4
<location path="core/database/foxx/api/query_router.js" line_range="669-670" />
<code_context>
+ // (joi.any()) accepted both. New documents are stored as objects
+ // (joi.object()), but old records remain until migrated.
+ // TODO: Remove after backfilling existing queries in ArangoDB.
+ if (typeof qry.params === "string") {
+ qry.params = JSON.parse(qry.params);
+ }
+
</code_context>
<issue_to_address>
**suggestion (bug_risk):** Bare `JSON.parse` on legacy `params` string can throw and may deserve guarding.
For legacy documents you’re now calling `JSON.parse(qry.params)` without handling failures. If any stored `params` string is malformed (e.g. corrupted or from an older buggy version), this will throw a `SyntaxError` and surface as a generic error. To be more robust with historical data, consider wrapping this in a try/catch, logging the offending value (or its ID), and returning a controlled `ERR_INVALID_PARAM` (or similar) instead of letting it crash execution.
Suggested implementation:
```javascript
// Legacy query documents may have `params` stored as a JSON string
// rather than an object, because the original schema validation
// (joi.any()) accepted both. New documents are stored as objects
// (joi.object()), but old records remain until migrated.
// TODO: Remove after backfilling existing queries in ArangoDB.
if (typeof qry.params === "string") {
try {
qry.params = JSON.parse(qry.params);
} catch (e) {
// Guard against malformed legacy JSON in stored queries.
// Log enough context to investigate without leaking full params.
const queryId = req && req.queryParams && req.queryParams.id;
const paramsSample =
typeof qry.params === "string"
? qry.params.slice(0, 200)
: undefined;
console.warn("Failed to parse legacy query params JSON", {
queryId,
paramsSample,
error: String(e),
});
const err = new Error("ERR_INVALID_PARAM: invalid legacy query params JSON");
err.errorNum = g_lib.ERR_INVALID_PARAM;
err.code = "ERR_INVALID_PARAM";
throw err;
}
}
```
If `g_lib.ERR_INVALID_PARAM` or the `err.errorNum`/`err.code` convention does not exist in your codebase, you should adapt the thrown error to your existing error-handling pattern. For example, if you have a helper like `g_lib.createError(code, message)` or a custom error class, replace the manual `Error` construction with that helper so the error surfaces as a standard validation/parameter error in your API responses.
</issue_to_address>
### Comment 5
<location path="CMakeLists.txt" line_range="37" />
<code_context>
libraries must be shared libraries for DataFed to be interoperable. If this
setting is turned on DataFed will build it's libraries as shared and try to
link to shared libraries." OFF)
-OPTION(ENABLE_END_TO_END_API_TESTS "Enable end-to-end API testing" FALSE)
</code_context>
<issue_to_address>
**issue (typo):** Use "its" (possessive) instead of "it's" in this description.
In this line, replace "it's" with the possessive "its" ("DataFed will build its libraries as shared...").
```suggestion
setting is turned on DataFed will build its libraries as shared and try to
```
</issue_to_address>
### Comment 6
<location path="core/database/foxx/api/data_router.js" line_range="2400" />
<code_context>
description: `Attempting to delete a total of: ${req.body.ids.length}`,
</code_context>
<issue_to_address>
**security (javascript.express.security.injection.tainted-sql-string):** Detected user input used to manually construct a SQL string. This is usually bad practice because manual construction could accidentally result in a SQL injection. An attacker could use a SQL injection to steal or modify contents of the database. Instead, use a parameterized query which is available by default in most database engines. Alternatively, consider using an object-relational mapper (ORM) such as Sequelize which will protect your queries.
*Source: opengrep*
</issue_to_address>
### Comment 7
<location path="core/database/foxx/api/data_router.js" line_range="2464" />
<code_context>
description: `Attempting to delete a total of: ${req.body.ids.length}`,
</code_context>
<issue_to_address>
**security (javascript.express.security.injection.tainted-sql-string):** Detected user input used to manually construct a SQL string. This is usually bad practice because manual construction could accidentally result in a SQL injection. An attacker could use a SQL injection to steal or modify contents of the database. Instead, use a parameterized query which is available by default in most database engines. Alternatively, consider using an object-relational mapper (ORM) such as Sequelize which will protect your queries.
*Source: opengrep*
</issue_to_address>
Summary by Sourcery
Update DataFed core, Foxx services, and web server for the 3.0 release, tightening security around paths and queries, improving logging and error handling, and adding new end-to-end tests for API queries and Playwright-based web UI.