Update logging with new abstraction/configuration #7271

Open · wants to merge 11 commits into `main`
196 changes: 135 additions & 61 deletions 16/umbraco-cms/fundamentals/backoffice/logviewer.md

If you frequently use a custom query, you can save it for quick access. Type your query in the search box and click the heart icon to save it with a friendly name. Saved queries are stored in the `umbracoLogViewerQuery` table in the database.
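
As an illustration, a saved query pairs a friendly name with a filter expression. The expression below is a hypothetical example; the exact syntax depends on the Serilog filter expressions accepted by the Log Viewer search box.

```
StartsWith(SourceContext, 'Umbraco.Core')
```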

## Implementing Your Own Log Viewer Source

Umbraco allows you to implement a custom `ILogViewerRepository` to fetch logs from alternative sources, such as **Azure Table Storage**.

### Creating a Custom Log Viewer Repository

To fetch logs from Azure Table Storage, extend the `LogViewerRepositoryBase` class from `Umbraco.Cms.Infrastructure.Services.Implement`.

{% hint style="info" %}
This implementation requires the `Azure.Data.Tables` NuGet package.
{% endhint %}

```csharp
using Azure;
using Azure.Data.Tables;
using Serilog.Events;
using Serilog.Formatting.Compact.Reader;
using Umbraco.Cms.Core.Logging.Viewer;
using Umbraco.Cms.Core.Serialization;
using Umbraco.Cms.Core.Services;
using Umbraco.Cms.Infrastructure.Logging.Serilog;
using Umbraco.Cms.Infrastructure.Services.Implement;

namespace My.Website;

public class AzureTableLogsRepository : LogViewerRepositoryBase
{
    private readonly IJsonSerializer _jsonSerializer;

    public AzureTableLogsRepository(UmbracoFileConfiguration umbracoFileConfig, IJsonSerializer jsonSerializer)
        : base(umbracoFileConfig)
    {
        _jsonSerializer = jsonSerializer;
    }

    protected override IEnumerable<ILogEntry> GetLogs(LogTimePeriod logTimePeriod, ILogFilter logFilter)
    {
        // This example uses a connection string compatible with the Azurite emulator
        // https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azurite
        var client =
            new TableClient(
                "UseDevelopmentStorage=true",
                "LogEventEntity");

        // Filter by timestamp to avoid retrieving all logs from the table, preventing memory and performance issues
        IEnumerable<AzureTableLogEntity> results = client.Query<AzureTableLogEntity>(
            entity => entity.Timestamp >= logTimePeriod.StartTime.Date &&
                      entity.Timestamp <= logTimePeriod.EndTime.Date.AddDays(1).AddSeconds(-1));

        // Read the data and apply the log filters
        IEnumerable<LogEvent> filteredData = results
            .Select(x => LogEventReader.ReadFromString(x.Data))
            .Where(logFilter.TakeLogEvent);

        return filteredData.Select(x => new LogEntry
        {
            Timestamp = x.Timestamp,
            Level = Enum.Parse<Core.Logging.LogLevel>(x.Level.ToString()),
            MessageTemplateText = x.MessageTemplate.Text,
            Exception = x.Exception?.ToString(),
            Properties = MapLogMessageProperties(x.Properties),
            RenderedMessage = x.RenderMessage(),
        });
    }

    private IReadOnlyDictionary<string, string?> MapLogMessageProperties(
        IReadOnlyDictionary<string, LogEventPropertyValue>? properties)
    {
        var result = new Dictionary<string, string?>();

        if (properties is not null)
        {
            foreach (KeyValuePair<string, LogEventPropertyValue> property in properties)
            {
                string? value;

                if (property.Value is ScalarValue scalarValue)
                {
                    value = scalarValue.Value?.ToString();
                }
                else if (property.Value is StructureValue structureValue)
                {
                    var textWriter = new StringWriter();
                    structureValue.Render(textWriter);
                    value = textWriter.ToString();
                }
                else
                {
                    value = _jsonSerializer.Serialize(property.Value);
                }

                result.Add(property.Key, value);
            }
        }

        return result;
    }

    public class AzureTableLogEntity : ITableEntity
    {
        public required string Data { get; set; }

        public required string PartitionKey { get; set; }

        public required string RowKey { get; set; }

        public DateTimeOffset? Timestamp { get; set; }

        public ETag ETag { get; set; }
    }
}
```

### Creating a Custom Log Viewer Service

Next, create a new service. Among other things, this service determines whether a given log query is allowed to run.

```csharp
public class AzureTableLogsService : LogViewerServiceBase
{
    private readonly ILogViewerRepository _logViewerRepository;

    public AzureTableLogsService(
        ILogViewerQueryRepository logViewerQueryRepository,
        ICoreScopeProvider provider,
        ILogViewerRepository logViewerRepository)
        : base(logViewerQueryRepository, provider, logViewerRepository)
    {
        _logViewerRepository = logViewerRepository;
    }

    // Change this to what you think is sensible.
    // As an example, we check whether more than 5 days of logs are requested.
    public override Task<Attempt<bool, LogViewerOperationStatus>> CanViewLogsAsync(LogTimePeriod logTimePeriod)
    {
        return logTimePeriod.EndTime - logTimePeriod.StartTime < TimeSpan.FromDays(5)
            ? Task.FromResult(Attempt.SucceedWithStatus(LogViewerOperationStatus.Success, true))
            : Task.FromResult(Attempt.FailWithStatus(LogViewerOperationStatus.CancelledByLogsSizeValidation, false));
    }

    public override ReadOnlyDictionary<string, LogLevel> GetLogLevelsFromSinks()
    {
        var configuredLogLevels = new Dictionary<string, LogLevel>
        {
            { "Global", GetGlobalMinLogLevel() },
            { "AzureTableStorage", _logViewerRepository.RestrictedToMinimumLevel() },
        };

        return configuredLogLevels.AsReadOnly();
    }
}
```

Azure Table Storage requires entities to implement the `ITableEntity` interface. Since Umbraco’s default log entity does not implement this, a custom entity (`AzureTableLogEntity`) must be created to ensure logs are correctly fetched and stored.

### Register Implementation

Umbraco needs to be made aware that there are new implementations of `ILogViewerRepository` and `ILogViewerService` to register. These replace the default JSON Log Viewer shipped in the core of Umbraco.

```csharp
using Umbraco.Cms.Core.Composing;
using Umbraco.Cms.Core.Services;

namespace My.Website;

public class AzureTableLogsComposer : IComposer
{
    public void Compose(IUmbracoBuilder builder)
    {
        builder.Services.AddUnique<ILogViewerRepository, AzureTableLogsRepository>();
        builder.Services.AddUnique<ILogViewerService, AzureTableLogsService>();
    }
}
```

### Configuring Logging to Azure Table Storage

With the above classes, the setup is in place to view logs from an Azure Table. However, logs are not yet persisted into the Azure Table Storage account. To enable persistence, configure the Serilog logging pipeline to store logs in Azure Table Storage.

- Install `Serilog.Sinks.AzureTableStorage` from NuGet.
- Add a new sink to `appsettings.json` with credentials to persist logs to Azure.

The following sink needs to be added to the [`Serilog:WriteTo`](https://github.com/serilog/serilog-sinks-azuretablestorage#json-configuration) array.

```json
{
  "Name": "AzureTableStorage",
  "Args": {
    "storageTableName": "LogEventEntity",
    "formatter": "Serilog.Formatting.Compact.CompactJsonFormatter, Serilog.Formatting.Compact",
    "connectionString": "DefaultEndpointsProtocol=https;AccountName=ACCOUNT_NAME;AccountKey=KEY;EndpointSuffix=core.windows.net"
  }
}
```
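
For context, the entry above lives inside the `Serilog` section of `appsettings.json`. The sketch below shows one possible placement, assuming the standard configuration shape read by `Serilog.Settings.Configuration`; adjust it to match your existing `WriteTo` entries.

```json
{
  "Serilog": {
    "WriteTo": [
      {
        "Name": "AzureTableStorage",
        "Args": {
          "storageTableName": "LogEventEntity",
          "formatter": "Serilog.Formatting.Compact.CompactJsonFormatter, Serilog.Formatting.Compact",
          "connectionString": "DefaultEndpointsProtocol=https;AccountName=ACCOUNT_NAME;AccountKey=KEY;EndpointSuffix=core.windows.net"
        }
      }
    ]
  }
}
```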

For more in-depth information about logging and how to configure it, see the [Logging](../code/debugging/logging.md) article.

### Compact Log Viewer - Desktop App

[Compact Log Viewer](https://www.microsoft.com/store/apps/9N8RV8LKTXRJ?cid=storebadge&ocid=badge) is a desktop tool for viewing and querying JSON log files in the same way as the built-in Log Viewer in Umbraco.
4 changes: 4 additions & 0 deletions 16/umbraco-cms/fundamentals/code/debugging/logging.md
Serilog uses levels as the primary means for assigning importance to log events.

Serilog can be configured and extended by using the .NET configuration sources, such as the `appsettings.json` files or environment variables. For more information, see the [Serilog config](../../../reference/configuration/serilog.md) article.
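
As a minimal sketch (assuming the standard `Serilog` configuration section), the global minimum level could be raised in `appsettings.json` like this; the same setting can be supplied through an environment variable such as `Serilog__MinimumLevel__Default`.

```json
{
  "Serilog": {
    "MinimumLevel": {
      "Default": "Information",
      "Override": {
        "Microsoft": "Warning"
      }
    }
  }
}
```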

## The UmbracoFile Sink

Serilog uses the concept of sinks to send log messages to different destinations. Umbraco ships with a custom sink configuration called `UmbracoFile`, which uses the [Serilog.Sinks.File](https://github.com/serilog/serilog-sinks-file) sink to save logs to a rolling file on disk. You can disable this sink by setting its `Enabled` configuration flag to `false`; see [Serilog config](../../../reference/configuration/serilog.md) for more information.
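
For example, a minimal sketch of disabling the sink in `appsettings.json`, assuming the `Enabled` argument shown in the Serilog configuration reference:

```json
{
  "Serilog": {
    "WriteTo": [
      {
        "Name": "UmbracoFile",
        "Args": {
          "Enabled": "False"
        }
      }
    ]
  }
}
```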

## The logviewer dashboard

Learn more about the [logviewer dashboard](../../backoffice/logviewer.md) in the backoffice and how it can be extended.
3 changes: 3 additions & 0 deletions 16/umbraco-cms/reference/configuration/serilog.md
By default, Umbraco uses a special Serilog 'sink' that is optimized for performance.

```json
{
"Name": "UmbracoFile",
"Args": {
"Enabled": "True",
"RestrictedToMinimumLevel": "Warning",
"FileSizeLimitBytes": 1073741824,
"RollingInterval" : "Day",
  ...
}
```

You can also disable this sink by setting `Enabled` to `False` if you do not wish to write log files to disk.

## Adding a custom log property to all log items

You may wish to add a log property to all log messages. A good example could be a log property for the `environment` to determine if the log message came from `development` or `production`.
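
A minimal sketch, assuming the `Properties` section supported by Serilog's JSON configuration, could look like this:

```json
{
  "Serilog": {
    "Properties": {
      "environment": "development"
    }
  }
}
```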