2 changes: 1 addition & 1 deletion docs/administration/harper-studio/create-account.md
@@ -12,7 +12,7 @@ Start at the [Harper Studio sign up page](https://studio.harperdb.io/sign-up).
- Email Address
- Subdomain

_Part of the URL that will be used to identify your Harper Cloud Instances. For example, with subdomain demo and instance name “c1” the instance URL would be: [https://c1-demo.harperdbcloud.com](https://c1-demo.harperdbcloud.com)._
_Part of the URL that will be used to identify your Harper Cloud Instances. For example, with subdomain "demo" and instance name "c1" the instance URL would be: [https://c1-demo.harperdbcloud.com](https://c1-demo.harperdbcloud.com)._

- Coupon Code (optional)

2 changes: 1 addition & 1 deletion docs/administration/harper-studio/instances.md
@@ -26,7 +26,7 @@ A summary view of all instances within an organization can be viewed by clicking
1. Fill out Instance Info.
1. Enter Instance Name

_This will be used to build your instance URL. For example, with subdomain demo and instance name “c1” the instance URL would be: [https://c1-demo.harperdbcloud.com](https://c1-demo.harperdbcloud.com). The Instance URL will be previewed below._
_This will be used to build your instance URL. For example, with subdomain "demo" and instance name "c1" the instance URL would be: [https://c1-demo.harperdbcloud.com](https://c1-demo.harperdbcloud.com). The Instance URL will be previewed below._

1. Enter Instance Username

2 changes: 1 addition & 1 deletion docs/administration/harper-studio/organizations.md
@@ -29,7 +29,7 @@ A new organization can be created as follows:
- Enter Organization Name
_This is used for descriptive purposes only._
- Enter Organization Subdomain
_Part of the URL that will be used to identify your Harper Cloud Instances. For example, with subdomain demo and instance name “c1” the instance URL would be: [https://c1-demo.harperdbcloud.com](https://c1-demo.harperdbcloud.com)._
_Part of the URL that will be used to identify your Harper Cloud Instances. For example, with subdomain "demo" and instance name "c1" the instance URL would be: [https://c1-demo.harperdbcloud.com](https://c1-demo.harperdbcloud.com)._
1. Click Create Organization.

## Delete an Organization
2 changes: 1 addition & 1 deletion docs/administration/harper-studio/query-instance-data.md
@@ -13,7 +13,7 @@ SQL queries can be executed directly through the Harper Studio with the followin
1. Enter your SQL query in the SQL query window.
1. Click **Execute**.

_Please note, the Studio will execute the query exactly as entered. For example, if you attempt to `SELECT _` from a table with millions of rows, you will most likely crash your browser.\*
_Please note, the Studio will execute the query exactly as entered. For example, if you attempt to `SELECT *` from a table with millions of rows, you will most likely crash your browser._
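As a side note on keeping result sets manageable, here is a minimal sketch of running a bounded query against the operations API; the endpoint, credentials, and `dev.dog` table are placeholders rather than part of the Studio workflow.

```javascript
// Minimal sketch (assumes a reachable operations API and a dev.dog table).
// Bounding the query with LIMIT avoids pulling millions of rows into the client at once.
const response = await fetch('https://your-instance:9925', {
	method: 'POST',
	headers: {
		'Content-Type': 'application/json',
		// Basic auth credentials for your instance user (placeholder values)
		Authorization: 'Basic ' + Buffer.from('HDB_ADMIN:password').toString('base64'),
	},
	body: JSON.stringify({
		operation: 'sql',
		sql: 'SELECT * FROM dev.dog ORDER BY id LIMIT 100',
	}),
});

const rows = await response.json();
console.log(`returned ${rows.length} rows`);
```

The SQL window in the Studio behaves the same way, so the same `LIMIT` discipline applies there.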

## Browse Query Results Set

20 changes: 10 additions & 10 deletions docs/administration/logging/standard-logging.md
@@ -22,15 +22,15 @@ For example, a typical log entry looks like:

The components of a log entry are:

- timestamp - This is the date/time stamp when the event occurred
- level - This is an associated log level that gives a rough guide to the importance and urgency of the message. The available log levels in order of least urgent (and more verbose) are: `trace`, `debug`, `info`, `warn`, `error`, `fatal`, and `notify`.
- thread/ID - This reports the name of the thread and the thread ID that the event was reported on. Note that NATS logs are recorded by their process name and there is no thread id for them since they are a separate process. Key threads are:
- main - This is the thread that is responsible for managing all other threads and routes incoming requests to the other threads
- http - These are the worker threads that handle the primary workload of incoming HTTP requests to the operations API and custom functions.
- Clustering\* - These are threads and processes that handle replication.
- job - These are job threads that have been started to handle operations that are executed in a separate job thread.
- tags - Logging from a custom function will include a "custom-function" tag in the log entry. Most logs will not have any additional tags.
- message - This is the main message that was reported.
- `timestamp` - This is the date/time stamp when the event occurred
- `level` - This is an associated log level that gives a rough guide to the importance and urgency of the message. The available log levels, in order from least urgent (and most verbose) to most urgent, are: `trace`, `debug`, `info`, `warn`, `error`, `fatal`, and `notify`.
- `thread/ID` - This reports the name of the thread and the thread ID that the event was reported on. Note that NATS logs are recorded by their process name and there is no thread ID for them since they are a separate process. Key threads are:
- `main` - This is the thread that is responsible for managing all other threads and routes incoming requests to the other threads
- `http` - These are the worker threads that handle the primary workload of incoming HTTP requests to the operations API and custom functions.
- `Clustering` - These are threads and processes that handle replication.
- `job` - These are job threads that have been started to handle operations that are executed in a separate job thread.
- `tags` - Logging from a custom function will include a "custom-function" tag in the log entry. Most logs will not have any additional tags (see the sketch after this list).
- `message` - This is the main message that was reported.
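To make the `level` and `tags` behavior concrete, here is a hedged sketch of a custom function route file that writes to the log; the route path and message are illustrative, and the `logger` helper is the one passed to route handlers in the custom functions examples.

```javascript
// Sketch only: logging from a custom function route file. Entries written through
// the provided logger appear in the standard log with a "custom-function" tag.
export default async (server, { hdbCore, logger }) => {
	server.route({
		url: '/health',
		method: 'GET',
		handler: async () => {
			// error-level messages are visible under the default `error` log level
			logger.error('health check requested');
			// more verbose calls (e.g. logger.debug) only show up once the configured
			// log level has been lowered to a more verbose setting
			return { status: 'ok' };
		},
	});
};
```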

We try to keep logging to a minimum by default; to do this, the default log level is `error`. If you require more information from the logs, lowering the log level to a more verbose setting (such as `info`, `debug`, or `trace`) will provide that.

@@ -46,7 +46,7 @@ Harper logs can optionally be streamed to standard streams. Logging to standard

## Logging Rotation

Log rotation allows for managing log files, such as compressing rotated log files, archiving old log files, determining when to rotate, and the like. This will allow for organized storage and efficient use of disk space. For more information see logging in our [config docs](../../deployments/configuration).
Log rotation allows for managing log files, for example by compressing rotated log files, archiving old log files, and determining when to rotate. This will allow for organized storage and efficient use of disk space. For more information see "logging" in our [config docs](../../deployments/configuration).

## Read Logs via the API

2 changes: 1 addition & 1 deletion docs/custom-functions/restarting-server.md
@@ -4,7 +4,7 @@ title: Restarting the Server

# Restarting the Server

One way to manage Custom Functions is through [Harper Studio](../harper-studio/). It performs all the necessary operations automatically. To get started, navigate to your instance in Harper Studio and click the subnav link for functions. If you have not yet enabled Custom Functions, it will walk you through the process. Once configuration is complete, you can manage and deploy Custom Functions in minutes.
One way to manage Custom Functions is through [Harper Studio](../harper-studio/). It performs all the necessary operations automatically. To get started, navigate to your instance in Harper Studio and click the subnav link for "functions". If you have not yet enabled Custom Functions, it will walk you through the process. Once configuration is complete, you can manage and deploy Custom Functions in minutes.

For any changes made to your routes, helpers, or projects, you’ll need to restart the Custom Functions server to see them take effect. Harper Studio does this automatically whenever you create or delete a project, or add, edit, or delete a route or helper. If you need to restart the Custom Functions server yourself, you can use the following operation to do so:

2 changes: 1 addition & 1 deletion docs/deployments/install-harper/linux.md
@@ -20,7 +20,7 @@ These instructions assume that the following has already been completed:

While you will need to access Harper through port 9925 for administration through the operations API, and port 9932 for clustering, for a higher level of security you may want to consider keeping both of these ports restricted to a VPN or VPC, and only have the application interface (9926 by default) exposed to the public Internet.

For this example, we will use an AWS Ubuntu Server 22.04 LTS m5.large EC2 Instance with an additional General Purpose SSD EBS volume and the default ubuntu user account.
For this example, we will use an AWS Ubuntu Server 22.04 LTS m5.large EC2 Instance with an additional General Purpose SSD EBS volume and the default "ubuntu" user account.

---

6 changes: 3 additions & 3 deletions docs/developers/applications/caching.md
@@ -22,9 +22,9 @@ While you can provide a single expiration time, there are actually several expir

You can provide a single expiration and it defines the behavior for all three. You can also provide three settings for expiration through table directives (see the sketch after this list):

- expiration - The amount of time until a record goes stale.
- eviction - The amount of time after expiration before a record can be evicted (defaults to zero).
- scanInterval - The interval for scanning for expired records (defaults to one quarter of the total of expiration and eviction).
- `expiration` - The amount of time until a record goes stale.
- `eviction` - The amount of time after expiration before a record can be evicted (defaults to zero).
- `scanInterval` - The interval for scanning for expired records (defaults to one quarter of the total of expiration and eviction).
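A minimal sketch of where these settings plug in is shown below, assuming a `MyCache` table and the `Resource`/`sourcedFrom` pattern described in the rest of this caching guide; passing `eviction` and `scanInterval` here alongside `expiration` is an assumption for illustration, and the values correspond one-to-one with the table directives above.

```javascript
// Sketch only: wiring a caching table to an external source with explicit
// expiration settings. `tables` and `Resource` are assumed to be provided to
// application modules by the Harper runtime; the API URL is a placeholder.
const { MyCache } = tables;

class ThirdPartyAPI extends Resource {
	async get() {
		// fetch the origin record for this cache entry on a miss or stale read
		const response = await fetch(`https://api.example.com/items/${this.getId()}`);
		return response.json();
	}
}

MyCache.sourcedFrom(ThirdPartyAPI, {
	expiration: 3600, // seconds until a cached record goes stale
	eviction: 600, // extra time after expiration before eviction (assumed option name)
	scanInterval: 1050, // matches the default of one quarter of expiration + eviction (assumed option name)
});
```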

## Define External Data Source

4 changes: 2 additions & 2 deletions docs/developers/applications/define-routes.md
@@ -22,7 +22,7 @@ However, you can specify the path to be `/` if you wish to have your routes hand

- The route below, using the default config, within the **dogs** project, with a route of **breeds** would be available at **http://localhost:9926/dogs/breeds**.

In effect, this route is just a pass-through to Harper. The same result could have been achieved by hitting the core Harper API, since it uses **hdbCore.preValidation** and **hdbCore.request**, which are defined in the helper methods section, below.
In effect, this route is just a pass-through to Harper. The same result could have been achieved by hitting the core Harper API, since it uses **hdbCore.preValidation** and **hdbCore.request**, which are defined in the "helper methods" section, below.

```javascript
export default async (server, { hdbCore, logger }) => {
@@ -39,7 +39,7 @@ export default async (server, { hdbCore, logger }) => {

For endpoints where you want to execute multiple operations against Harper, or perform additional processing (like an ML classification, or an aggregation, or a call to a 3rd party API), you can define your own logic in the handler. The function below will execute a query against the dogs table, and filter the results to only return those dogs over 4 years of age.

**IMPORTANT: This route has NO preValidation and uses hdbCore.requestWithoutAuthentication, which- as the name implies- bypasses all user authentication. See the security concerns and mitigations in the helper methods section, below.**
**IMPORTANT: This route has NO preValidation and uses hdbCore.requestWithoutAuthentication, which, as the name implies, bypasses all user authentication. See the security concerns and mitigations in the "helper methods" section, below.**

```javascript
export default async (server, { hdbCore, logger }) => {
10 changes: 5 additions & 5 deletions docs/developers/clustering/index.md
@@ -22,10 +22,10 @@ A common use case is an edge application collecting and analyzing sensor data th

Harper simplifies the architecture of such an application with its bi-directional, table-level replication:

- The edge instance subscribes to a thresholds table on the cloud instance, so the application only makes localhost calls to get the thresholds.
- The application continually pushes sensor data into a sensor_data table via the localhost API, comparing it to the threshold values as it does so.
- When a threshold violation occurs, the application adds a record to the alerts table.
- The application appends to that record array sensor_data entries for the 60 seconds (or minutes, or days) leading up to the threshold violation.
- The edge instance publishes the alerts table up to the cloud instance.
- The edge instance subscribes to a "thresholds" table on the cloud instance, so the application only makes localhost calls to get the thresholds.
- The application continually pushes sensor data into a "sensor_data" table via the localhost API, comparing it to the threshold values as it does so.
- When a threshold violation occurs, the application adds a record to the "alerts" table.
- The application appends to that record array "sensor_data" entries for the 60 seconds (or minutes, or days) leading up to the threshold violation.
- The edge instance publishes the "alerts" table up to the cloud instance (see the subscription sketch below).

By letting Harper focus on the fault-tolerant logistics of transporting your data, you get to write less code. By moving data only when and where it’s needed, you lower storage and bandwidth costs. And by restricting your app to only making local calls to Harper, you reduce the overall exposure of your application to outside forces.
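To make the table-level replication concrete, here is a hedged sketch of how the edge instance's publish/subscribe choices for this example might be expressed when registering the cloud node; the schema, node name, and exact body shape are illustrative assumptions, so defer to the clustering operations documentation for the canonical form.

```javascript
// Sketch only: the edge instance's subscriptions for the sensor-alerting example.
// Schema, table names, and node_name are placeholders.
const edgeSubscriptions = [
	// pull threshold updates down from the cloud instance
	{ schema: 'data', table: 'thresholds', subscribe: true, publish: false },
	// push alert records (with their embedded sensor_data history) up to the cloud
	{ schema: 'data', table: 'alerts', subscribe: false, publish: true },
	// sensor_data itself stays on the edge, so it is not replicated at all
];

const addCloudNode = {
	operation: 'add_node', // clustering operation that registers the remote node
	node_name: 'cloud-primary',
	subscriptions: edgeSubscriptions,
};

console.log(JSON.stringify(addCloudNode, null, 2));
```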
6 changes: 3 additions & 3 deletions docs/developers/miscellaneous/google-data-studio.md
@@ -19,9 +19,9 @@ Get started by selecting the Harper connector from the [Google Data Studio Partn
1. Log in to https://datastudio.google.com/.
1. Add a new Data Source using the Harper connector. The current release version can be added as a data source by following this link: [Harper Google Data Studio Connector](https://datastudio.google.com/datasources/create?connectorId=AKfycbxBKgF8FI5R42WVxO-QCOq7dmUys0HJrUJMkBQRoGnCasY60_VJeO3BhHJPvdd20-S76g).
1. Authorize the connector to access other servers on your behalf (this allows the connector to contact your database).
1. Enter the Web URL to access your database (preferably with HTTPS), as well as the Basic Auth key you use to access the database. Just include the key, not the word Basic at the start of it.
1. Check the box for Secure Connections Only if you want to always use HTTPS connections for this data source; entering a Web URL that starts with https:// will do the same thing, if you prefer.
1. Check the box for Allow Bad Certs if your Harper instance does not have a valid SSL certificate. [Harper Cloud](../../deployments/harper-cloud/) always has valid certificates, and so will never require this to be checked. Instances you set up yourself may require this, if you are using self-signed certs. If you are using [Harper Cloud](../../deployments/harper-cloud/) or another instance you know should always have valid SSL certificates, do not check this box.
1. Enter the Web URL to access your database (preferably with HTTPS), as well as the Basic Auth key you use to access the database. Just include the key, not the word "Basic" at the start of it.
1. Check the box for "Secure Connections Only" if you want to always use HTTPS connections for this data source; entering a Web URL that starts with https:// will do the same thing, if you prefer.
1. Check the box for "Allow Bad Certs" if your Harper instance does not have a valid SSL certificate. [Harper Cloud](../../deployments/harper-cloud/) always has valid certificates, and so will never require this to be checked. Instances you set up yourself may require this, if you are using self-signed certs. If you are using [Harper Cloud](../../deployments/harper-cloud/) or another instance you know should always have valid SSL certificates, do not check this box.
1. Choose your Query Type. This determines what information the configuration will ask for after pressing the Next button.
- Table will ask you for a Schema and a Table, and will return all fields of that table using `SELECT *`.
- SQL will ask you for the SQL query you’re using to retrieve fields from the database. You may `JOIN` multiple tables together, and use Harper-specific SQL functions, along with the usual power that SQL grants.
20 changes: 10 additions & 10 deletions docs/developers/operations-api/analytics.md
@@ -8,12 +8,12 @@ title: Analytics Operations

Retrieves analytics data from the server (a request sketch follows the parameter list below).

- operation _(required)_ - must always be `get_analytics`
- metric _(required)_ - any value returned by `list_metrics`
- start*time *(optional)\_ - Unix timestamp in seconds
- end*time *(optional)\_ - Unix timestamp in seconds
- get*attributes *(optional)\_ - array of attribute names to retrieve
- conditions _(optional)_ - array of conditions to filter results (see [search_by_conditions docs](developers/operations-api/nosql-operations) for details)
- `operation` _(required)_ - must always be `get_analytics`
- `metric` _(required)_ - any value returned by `list_metrics`
- `start_time` _(optional)_ - Unix timestamp in seconds
- `end_time` _(optional)_ - Unix timestamp in seconds
- `get_attributes` _(optional)_ - array of attribute names to retrieve
- `conditions` _(optional)_ - array of conditions to filter results (see [search_by_conditions docs](developers/operations-api/nosql-operations) for details)
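As a hedged sketch, the body below assembles a `get_analytics` request for the last hour using the parameters above; the metric and attribute names are placeholders, so use values returned by `list_metrics` and `describe_metric`.

```javascript
// Sketch only: a get_analytics request body covering the last hour. Post it to
// the operations API like any other operation; metric/attribute names are placeholders.
const now = Math.floor(Date.now() / 1000); // Unix timestamps are in seconds

const getAnalyticsBody = {
	operation: 'get_analytics',
	metric: 'resource-usage', // placeholder - use a value returned by list_metrics
	start_time: now - 3600, // one hour ago
	end_time: now,
	get_attributes: ['id', 'metric', 'node'], // placeholder attribute names
};

console.log(JSON.stringify(getAnalyticsBody, null, 2));
```

The same pattern applies to `list_metrics` and `describe_metric` below; only the operation name and its parameters change.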

### Body

@@ -57,8 +57,8 @@

Returns a list of available metrics that can be queried.

- operation _(required)_ - must always be `list_metrics`
- metric*types *(optional)\_ - array of metric types to filter results; one or both of `custom` and `builtin`; default is `builtin`
- `operation` _(required)_ - must always be `list_metrics`
- `metric_types` _(optional)_ - array of metric types to filter results; one or both of `custom` and `builtin`; default is `builtin`

### Body

@@ -79,8 +79,8 @@

Provides detailed information about a specific metric, including its structure and available parameters.

- operation _(required)_ - must always be `describe_metric`
- metric _(required)_ - name of the metric to describe
- `operation` _(required)_ - must always be `describe_metric`
- `metric` _(required)_ - name of the metric to describe

### Body
