| title | category | order | keywords |
|---|---|---|---|
| Logging Troubleshooting Guide | reference | 85 | |
Common logging issues and their solutions, plus guidance on using logs to debug AOT and runtime issues.
Symptoms:
- Logger is passed to table constructor
- No log output in console/file/sink
- Application runs normally
Diagnosis:

```csharp
// Check if logger is enabled
var logger = loggerFactory.CreateLogger<ProductsTable>().ToDynamoDbLogger();
Console.WriteLine($"Debug enabled: {logger.IsEnabled(LogLevel.Debug)}");
Console.WriteLine($"Info enabled: {logger.IsEnabled(LogLevel.Information)}");
```

Solutions:
- Check the minimum log level:

```csharp
// Too high - won't see Debug/Trace logs
builder.Services.AddLogging(logging =>
{
    logging.SetMinimumLevel(LogLevel.Warning); // Change to Debug or Information
});
```

- Check category filters:

```csharp
// May be filtering out DynamoDB logs
builder.Services.AddLogging(logging =>
{
    logging.AddFilter("Microsoft", LogLevel.Warning); // Doesn't affect DynamoDB
    logging.AddFilter("ProductsTable", LogLevel.Debug); // Add this
});
```

- Check sink configuration:

```csharp
// Serilog - ensure a sink is configured
Log.Logger = new LoggerConfiguration()
    .WriteTo.Console() // Add this
    .CreateLogger();
```

- Verify the logger is configured in options:

```csharp
// Wrong - logger not configured
var table = new ProductsTable(client, "products");

// Correct - use FluentDynamoDbOptions
var options = new FluentDynamoDbOptions()
    .WithLogger(logger);
var table = new ProductsTable(client, "products", options);
```

Symptoms:
- Logs worked in Debug build
- No logs in Release build
- No errors or warnings
Diagnosis:

```shell
# Check whether DISABLE_DYNAMODB_LOGGING is defined
dotnet build -c Release -v detailed | grep DISABLE_DYNAMODB_LOGGING
```

Solution:
```xml
<!-- Remove or comment out in .csproj -->
<PropertyGroup Condition="'$(Configuration)' == 'Release'">
  <!-- <DefineConstants>$(DefineConstants);DISABLE_DYNAMODB_LOGGING</DefineConstants> -->
</PropertyGroup>
```

See Conditional Compilation for details.
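This also explains why no runtime setting can restore the missing logs: the symbol removes logging at compile time, not at configuration time. A minimal sketch of the mechanism (an assumption about how the generated code is guarded, not the library's actual output):

```csharp
#define DISABLE_DYNAMODB_LOGGING // simulates the Release-build constant

using System;

// Assumption (sketch of the mechanism): logging calls are wrapped in
// #if !DISABLE_DYNAMODB_LOGGING, so defining the symbol removes them at
// compile time - no runtime log-level setting can bring them back.
#if !DISABLE_DYNAMODB_LOGGING
var status = "logging calls compiled in";
#else
var status = "logging calls compiled out";
#endif
Console.WriteLine(status);
```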
Symptoms:
- Thousands of log entries per second
- Performance degradation
- Storage costs increasing
- Difficult to find relevant logs
Solutions:
- Increase the minimum log level:

```csharp
// Production - only Information and above
builder.Services.AddLogging(logging =>
{
    logging.SetMinimumLevel(LogLevel.Information);
});
```

- Filter by event ID:
```csharp
// Only log DynamoDB operations, not mapping details
builder.Services.AddLogging(logging =>
{
    logging.AddFilter((category, level, eventId) =>
        (eventId.Id >= 3000 && eventId.Id < 4000) || // Operations
        eventId.Id >= 9000);                         // Errors
});
```

- Filter by category:
```csharp
// Different levels for different tables
builder.Services.AddLogging(logging =>
{
    logging.AddFilter("ProductsTable", LogLevel.Debug);
    logging.AddFilter("OrdersTable", LogLevel.Information);
});
```

- Sample high-volume logs:
```csharp
// Sample 10% of Debug logs
builder.Services.AddLogging(logging =>
{
    logging.AddFilter((category, level, eventId) =>
    {
        if (level == LogLevel.Debug)
            return Random.Shared.Next(100) < 10;
        return true;
    });
});
```

- Use conditional compilation:
```xml
<!-- Disable logging in production -->
<PropertyGroup Condition="'$(Configuration)' == 'Release'">
  <DefineConstants>$(DefineConstants);DISABLE_DYNAMODB_LOGGING</DefineConstants>
</PropertyGroup>
```

Symptoms:
- Custom logger implemented
- IsEnabled returns true
- Log methods never called
Diagnosis:

```csharp
public class TestLogger : IDynamoDbLogger
{
    public bool IsEnabled(LogLevel logLevel)
    {
        Console.WriteLine($"IsEnabled called: {logLevel}");
        return true;
    }

    public void LogDebug(int eventId, string message, params object[] args)
    {
        Console.WriteLine($"LogDebug called: {eventId}");
    }

    // Implement other methods...
}
```

Solutions:
- Implement all interface methods:

```csharp
// Must implement ALL interface methods
public class CustomLogger : IDynamoDbLogger
{
    public bool IsEnabled(LogLevel logLevel) => true;
    public void LogTrace(int eventId, string message, params object[] args) { }
    public void LogDebug(int eventId, string message, params object[] args) { }
    public void LogInformation(int eventId, string message, params object[] args) { }
    public void LogWarning(int eventId, string message, params object[] args) { }
    public void LogError(int eventId, string message, params object[] args) { }
    public void LogError(int eventId, Exception exception, string message, params object[] args) { }
    public void LogCritical(int eventId, Exception exception, string message, params object[] args) { }
}
```

- Check the IsEnabled implementation:
```csharp
// Wrong - always returns false, so no log methods are ever called
public bool IsEnabled(LogLevel logLevel) => false;

// Correct
public bool IsEnabled(LogLevel logLevel) => logLevel >= LogLevel.Debug;
```

- Verify the logger is passed to operations:
```csharp
// Generated methods need the logger parameter
var item = Product.ToDynamoDb(entity, logger);             // Pass logger
var product = Product.FromDynamoDb<Product>(item, logger); // Pass logger
```

Use logging to identify the issue:
- Enable detailed logging:

```csharp
builder.Services.AddLogging(logging =>
{
    logging.SetMinimumLevel(LogLevel.Trace);
});
```

- Run the operation that fails:
```csharp
try
{
    var product = Product.FromDynamoDb<Product>(item, logger);
}
catch (Exception ex)
{
    // Check the logs for the last successful operation
}
```

- Analyze the logs:
```text
[Trace] Starting FromDynamoDb mapping for Product with 8 attributes
[Debug] Mapping property Id from String
[Debug] Mapping property Name from String
[Error] Failed to map DynamoDB item to Product
Exception: System.InvalidCastException: Unable to cast object...
```
The error shows which property failed, helping identify AOT issues.
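If you capture the log output programmatically, the last property reached can also be extracted mechanically. A small self-contained sketch (the log lines are the sample from above, not live output, and the parsing assumes the message format shown in this guide):

```csharp
using System;
using System.Linq;

// The last "Mapping property" entry before the [Error] line names the
// property whose conversion was attempted when the failure occurred.
var lines = new[]
{
    "[Trace] Starting FromDynamoDb mapping for Product with 8 attributes",
    "[Debug] Mapping property Id from String",
    "[Debug] Mapping property Name from String",
    "[Error] Failed to map DynamoDB item to Product",
};

string? lastMapped = lines
    .TakeWhile(l => !l.StartsWith("[Error]"))
    .Where(l => l.Contains("Mapping property"))
    .Select(l => l.Split(' ')[3]) // token after "Mapping property"
    .LastOrDefault();

Console.WriteLine($"Last property reached: {lastMapped}"); // Name
```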
Symptoms:
- Works in Debug
- Fails in AOT-compiled build
- Generic type errors
Debugging with logs:

```text
[Debug] Mapping property Metadata from Map
[Error] Failed to convert Metadata to Map. PropertyType: Dictionary<string, string>
Exception: System.MissingMethodException: Cannot create instance...
```
Solution: Use source-generated code instead of reflection:

```csharp
// Generated code is AOT-safe
[DynamoDbAttribute("metadata")]
public Dictionary<string, string> Metadata { get; set; }
```

Symptoms:
- JsonBlob property fails in AOT
- Works with Newtonsoft.Json in non-AOT
Debugging with logs:

```text
[Debug] Converting JsonBlob property Data
[Error] JSON serialization failed for property Data
Exception: System.Text.Json.JsonException: The JSON value could not be converted...
```
Solution: Use System.Text.Json with source generation:

```csharp
[JsonBlob(JsonSerializerType.SystemTextJson)]
[DynamoDbAttribute("data")]
public MyData Data { get; set; }

// Add a JsonSerializerContext
[JsonSerializable(typeof(MyData))]
public partial class MyJsonContext : JsonSerializerContext { }
```

Symptoms:
- Property value is null after FromDynamoDb
- No error thrown
Debugging:

- Enable Debug logging:

```csharp
logging.SetMinimumLevel(LogLevel.Debug);
```

- Check the logs:
```text
[Trace] Starting FromDynamoDb mapping for Product with 8 attributes
[Debug] Mapping property Id from String
[Debug] Mapping property Name from String
[Debug] Skipping empty collection Tags
[Trace] Completed FromDynamoDb mapping for Product
```
- Look for:
  - Property not mentioned → not in the DynamoDB item
  - A "Skipping" message → the value is null/empty
  - No error → mapping succeeded but the value is null
Solutions:

```csharp
// Check whether the attribute exists in the item
if (item.ContainsKey("tags"))
{
    // Attribute exists
}
else
{
    // Attribute missing - check the DynamoDB table
}

// Check the attribute type (the conditional needs parentheses
// inside an interpolation hole)
if (item.TryGetValue("tags", out var value))
{
    Console.WriteLine($"Type: {(value.SS != null ? "SS" : "Unknown")}");
}
```

Symptoms:
- DynamoDbMappingException thrown
- Property type mismatch
Debugging:

```text
[Debug] Converting Tags from String Set with 3 elements
[Error] Failed to convert Tags to Set. PropertyType: HashSet<string>, ElementCount: 3
Exception: InvalidCastException...
```
Solutions:

- Check that the property type matches the DynamoDB type:

```csharp
// DynamoDB: String Set (SS)
// C#: must be HashSet<string> or ISet<string>
[DynamoDbAttribute("tags")]
public HashSet<string> Tags { get; set; }
```

- Check for null values:

```csharp
// DynamoDB doesn't support null in sets
// Ensure no null values in the collection
public HashSet<string> Tags { get; set; } = new();
```

Symptoms:
- Slow mapping
- High memory usage
- Timeout errors
Debugging:

```text
[Debug] Converting Tags from String Set with 10000 elements
[Warning] Large collection detected: Tags has 10000 elements
```
Solutions:

- Use pagination:

```csharp
// Don't load the entire collection at once
await table.Query
    .Where("pk = :pk")
    .WithValue(":pk", "product-123")
    .Take(100) // Limit results
    .ExecuteAsync();
```

- Use projection:

```csharp
// Only load the needed properties
await table.Query
    .Where("pk = :pk")
    .WithValue(":pk", "product-123")
    .WithProjection("pk, sk, name") // Exclude large collections
    .ExecuteAsync();
```

- Store large data externally:

```csharp
// Use blob storage for large collections
[BlobReference(BlobProvider.S3)]
[DynamoDbAttribute("tags_ref")]
public HashSet<string> Tags { get; set; }
```

Debugging:
```text
[Information] Executing Query on table products. KeyCondition: pk = :pk
[Debug] Query parameters: 1 values
[Information] Query completed. ItemCount: 0, ConsumedCapacity: 1.0
```
Analysis:
- Query executed successfully (no error)
- Consumed capacity > 0 (query ran)
- ItemCount: 0 (no matching items)
Solutions:

- Check the key condition:

```csharp
// The log shows the actual condition used
// Verify it matches your data
```

- Check parameter values:

```csharp
// Enable Trace logging to see parameter values
logging.SetMinimumLevel(LogLevel.Trace);
```

- Verify the data exists:

```shell
# Use the AWS Console or CLI to verify the data
aws dynamodb query --table-name products \
  --key-condition-expression "pk = :pk" \
  --expression-attribute-values '{":pk":{"S":"product-123"}}'
```

Debugging:
```text
[Information] Query completed. ItemCount: 5, ConsumedCapacity: 25.0
[Warning] High capacity consumption detected
```
Analysis:
- 5 items returned
- 25 capacity units consumed
- ~5 units per item (very high)
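The per-item figure can be sanity-checked against DynamoDB's standard capacity model, which is not library-specific: a strongly consistent read costs 1 RCU per 4 KB of item size, rounded up, and an eventually consistent read costs half that. A quick calculation:

```csharp
using System;

// 1 RCU covers a strongly consistent read of up to 4 KB (rounded up);
// eventually consistent reads cost half.
static double ReadCapacityUnits(int itemSizeBytes, bool stronglyConsistent = true)
{
    double units = Math.Ceiling(itemSizeBytes / 4096.0);
    return stronglyConsistent ? units : units / 2;
}

// ~5 RCUs per item implies items of roughly 16-20 KB each.
Console.WriteLine(ReadCapacityUnits(20_000));        // 5
Console.WriteLine(ReadCapacityUnits(20_000, false)); // 2.5
```

So 5 units per item suggests items approaching 20 KB, which is why the projection and consistency solutions below reduce cost.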
Solutions:

- Use projection:

```csharp
await table.Query
    .Where("pk = :pk")
    .WithValue(":pk", "product-123")
    .WithProjection("pk, sk, name") // Only needed attributes
    .ExecuteAsync();
```

- Check item sizes:

```text
[Debug] Mapping property Data from Binary
[Warning] Large property detected: Data is 400KB
```

- Use eventually consistent reads:

```csharp
await table.Query
    .Where("pk = :pk")
    .WithValue(":pk", "product-123")
    .WithConsistentRead(false) // Halves the capacity cost
    .ExecuteAsync();
```

Debugging:
```text
[Error] Query failed on table products
Exception: ProvisionedThroughputExceededException: Rate exceeded
```
Solutions:

- Implement exponential backoff:

```csharp
// Retry with exponential backoff; after three failed retries the
// exception filter stops matching and the exception propagates
var retryCount = 0;
while (true)
{
    try
    {
        return await table.Query().ToListAsync();
    }
    catch (ProvisionedThroughputExceededException) when (retryCount < 3)
    {
        await Task.Delay(TimeSpan.FromMilliseconds(Math.Pow(2, retryCount) * 100));
        retryCount++;
    }
}
```

- Monitor capacity consumption:
```csharp
// Track capacity in logs
logging.AddFilter((category, level, eventId) =>
    eventId.Id == 3110); // ConsumedCapacity event
```

- Use batch operations:

```csharp
// More efficient than individual operations
await table.BatchGet
    .WithKeys(keys)
    .ExecuteAsync();
```

```shell
# Grep logs
grep "EntityType.*Product" logs.txt | grep "Error"
```

```kusto
// Application Insights
traces
| where customDimensions.EntityType == "Product"
| where customDimensions.EventId >= 9000
```

```shell
# Grep logs
grep "ConsumedCapacity" logs.txt
```

```kusto
// Application Insights
traces
| where customDimensions.EventId == 3110
| summarize TotalCapacity = sum(todouble(customDimensions.ConsumedCapacity)) by bin(timestamp, 1h)
```

```shell
# Find operations taking > 1 second
grep "Operation completed" logs.txt | awk '{print $1, $NF}' | awk '$NF > 1000'
```

```kusto
// Application Insights
traces
| where customDimensions.EventId == 3100
| extend Duration = datetime_diff('millisecond', timestamp, prev(timestamp))
| where Duration > 1000
```

```shell
# Grep logs
grep "EventId.*9000" logs.txt
```

```kusto
// Application Insights
traces
| where customDimensions.EventId == 9000
| project timestamp, customDimensions.EntityType, customDimensions.PropertyName, message
```

- Start with Information level - See high-level operations
- Enable Debug for specific issues - Get property-level details
- Use Trace sparingly - Only for deep debugging
- Filter by event ID - Focus on specific operation types
- Check structured properties - Query by EntityType, PropertyName, etc.
- Compare Debug vs Release - Ensure consistent behavior
- Test with logging disabled - Verify no dependencies on logging
- Use log scopes - Add request/user context
- Monitor capacity consumption - Track event ID 3110
- Alert on errors - Event IDs >= 9000
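For the log-scope tip above, here is a sketch using the standard Microsoft.Extensions.Logging API (`AddSimpleConsole` comes from the console logging package; the category name and scope keys are illustrative):

```csharp
using System;
using System.Collections.Generic;
using Microsoft.Extensions.Logging;

// BeginScope attaches request/user context to every entry logged inside
// the block, including entries emitted by operations sharing this logger.
using var factory = LoggerFactory.Create(b =>
    b.AddSimpleConsole(o => o.IncludeScopes = true));
var logger = factory.CreateLogger("ProductsTable");

using (logger.BeginScope(new Dictionary<string, object?>
{
    ["RequestId"] = "req-123", // hypothetical request id
    ["UserId"] = "user-42",
}))
{
    logger.LogInformation("Query completed. ItemCount: {ItemCount}", 5);
}
```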
If you're still stuck:
- Collect logs - Enable Debug level and capture full logs
- Identify pattern - When does it fail? Which entities?
- Minimal reproduction - Create smallest example that fails
- Check documentation - Review relevant guides
- Search issues - Check GitHub issues for similar problems
- Ask for help - Provide logs, code, and context
See Also:
- Logging Configuration - Configure loggers
- Log Levels and Event IDs - Understand event IDs
- Structured Logging - Query logs effectively
- Error Handling - Handle exceptions properly