
Commit 400125f

beta3 release changes (#767)
* beta3 release changes
* small readme fixes
* review changes
1 parent eee0de8 commit 400125f

15 files changed: 55 additions & 43 deletions

CHANGELOG.md

Lines changed: 17 additions & 11 deletions

```diff
@@ -3,21 +3,23 @@
 ## New Features
 
 - Revamped producer and consumer serialization functionality.
-  - All producer functionality except the public methods used to produce messages is now provided by `ProducerBase`.
-  - There are two producer classes deriving from `ProducerBase`: `Producer` and `Producer<TKey, TValue>`.
-  - `Producer` is specialized for the case of producing messages with `byte[]` keys and values.
-  - `Producer<TKey, TValue>` provides flexible integration with serialization functionality.
-  - On the consumer side, there are analogous classes: `ConsumerBase`, `Consumer` and `Consumer<TKey, TValue>`.
-  - There are two types of serializer and deserializer: `ISerializer<T>` / `IAsyncSerializer<T>` and `IDeserializer<T>` / `IAsyncDeserializer<T>`.
+  - There are now two types of serializer and deserializer: `ISerializer<T>` / `IAsyncSerializer<T>` and `IDeserializer<T>` / `IAsyncDeserializer<T>`.
     - `ISerializer<T>`/`IDeserializer<T>` are appropriate for most use cases.
-    - `IAsyncSerializer<T>`/`IAsyncDeserializer<T>` are more general, but less performant (they return `Task`s).
-  - The generic producer and consumer can be used with both types of serializer.
+    - `IAsyncSerializer<T>`/`IAsyncDeserializer<T>` are async friendly, but less performant (they return `Task`s).
 - Changed the name of `Confluent.Kafka.Avro` to `Confluent.SchemaRegistry.Serdes` (Schema Registry may support other serialization formats in the future).
-- Added a example demonstrating working with protobuf serialized data.
+- Added an example demonstrating working with protobuf serialized data.
+- `Consumer`s, `Producer`s and `AdminClient`s are now constructed using builder classes.
+  - This is more verbose, but provides a sufficiently flexible and future-proof API for specifying serdes and other configuration information.
+  - All `event`s on the client classes have been replaced with corresponding `Set...Handler` methods on the builder classes.
+    - This ensures (enforces) that handlers are set at librdkafka initialization, which is important for some handlers, particularly the log handler.
+    - `event`s allow more than one handler to be set, but this is often not appropriate (e.g. `OnPartitionsAssigned`) and never necessary. This is no longer possible.
+    - `event`s are also not async friendly (handlers can't return `Task`). The `Set...Handler` approach can be extended in such a way that it is.
 - Avro serdes no longer make blocking calls to `ICachedSchemaRegistryClient` - everything is `await`ed.
-- References librdkafka.redist [1.0.0-RC5](https://github.com/edenhill/librdkafka/releases/tag/v1.0.0-RC5)
+  - Note: The `Consumer` implementation still calls async deserializers synchronously because the `Consumer` API is otherwise fully synchronous.
+- References librdkafka.redist [1.0.0-RC7](https://github.com/edenhill/librdkafka/releases/tag/v1.0.0-RC7)
+  - Notable features: idempotent producer, sparse connections, KIP-62 (max.poll.interval.ms).
   - Note: End of partition notification is now disabled by default (enable using the `EnablePartitionEof` config property).
-- Removed `Consumer.OnPartitionEOF` in favor of `ConsumeResult.IsPartitionEOF`.
+- Removed the `Consumer.OnPartitionEOF` event in favor of notifying of partition EOF via `ConsumeResult.IsPartitionEOF`.
 - Removed `ErrorEvent` class and added `IsFatal` to `Error` class.
   - The `IsFatal` flag is now set appropriately for all errors (previously it was always set to `false`).
 - Added `PersistenceStatus` property to `DeliveryResult`, which provides information on the persistence status of the message.
```

```diff
@@ -26,6 +28,10 @@
 
 - Added `Close` method to `IConsumer` interface.
 - Changed the name of `ProduceException.DeliveryReport` to `ProduceException.DeliveryResult`.
+- Fixed bug where enum config property couldn't be read after setting it.
+- Added `SchemaRegistryBasicAuthCredentialsSource` back into `SchemaRegistryConfig` (#679).
+- Fixed schema registry client failover connection issue (#737).
+- Improvements to librdkafka dependency discovery (#743).
 
 
 # 1.0.0-beta2
```
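
To make the builder-related changelog entries above concrete, here is a minimal sketch of constructing a producer with the new `ProducerBuilder` class and registering a handler via a `Set...Handler` method instead of an `event`. It is illustrative only and assumes the API shape the library settled on for 1.0; details such as the `SetErrorHandler` delegate signature and the `Serializers.Utf8` helper may differ slightly in beta3, and `my-topic` is a placeholder.

```csharp
using System;
using System.Threading.Tasks;
using Confluent.Kafka;

class BuilderSketch
{
    public static async Task Main(string[] args)
    {
        var config = new ProducerConfig { BootstrapServers = "localhost:9092" };

        // Handlers are supplied to the builder rather than attached as events,
        // so they are in place when librdkafka is initialized and at most one
        // handler of each kind can be registered.
        using (var p = new ProducerBuilder<Null, string>(config)
            .SetValueSerializer(Serializers.Utf8)   // explicit serde; optional for string (assumed 1.0-style helper)
            .SetErrorHandler((_, e) =>
                Console.WriteLine($"Error: {e.Reason} (fatal: {e.IsFatal})"))
            .Build())
        {
            var dr = await p.ProduceAsync(
                "my-topic", new Message<Null, string> { Value = "hello" });

            // The delivery result also carries the persistence status of the
            // message (see the PersistenceStatus entry in the changelog above).
            Console.WriteLine($"Delivered to {dr.TopicPartitionOffset}");
        }
    }
}
```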

README.md

Lines changed: 24 additions & 18 deletions

````diff
@@ -42,16 +42,16 @@ confluent-kafka-dotnet is distributed via NuGet. We provide three packages:
 To install Confluent.Kafka from within Visual Studio, search for Confluent.Kafka in the NuGet Package Manager UI, or run the following command in the Package Manager Console:
 
 ```
-Install-Package Confluent.Kafka -Version 1.0-beta2
+Install-Package Confluent.Kafka -Version 1.0.0-beta3
 ```
 
 To add a reference to a dotnet core project, execute the following at the command line:
 
 ```
-dotnet add package -v 1.0-beta2 Confluent.Kafka
+dotnet add package -v 1.0.0-beta3 Confluent.Kafka
 ```
 
-**Note:** We recommend using the `1.0-beta2` version of Confluent.Kafka for new projects in preference to the most recent stable release (0.11.5).
+**Note:** We recommend using the `1.0.0-beta3` version of Confluent.Kafka for new projects in preference to the most recent stable release (0.11.6).
 The 1.0 API provides more features, is considerably improved and is more performant than 0.11.x releases. In choosing the label 'beta',
 we are signaling that we do not anticipate making any high impact changes to the API before the 1.0 release, however be warned that some
 breaking changes are still planned. You can track progress and provide feedback on the new 1.0 API
````

```diff
@@ -90,10 +90,10 @@ class Program
     {
         var config = new ProducerConfig { BootstrapServers = "localhost:9092" };
 
-        // If serializers are not specified as constructor arguments, default
-        // serializers from `Confluent.Kafka.Serializers` will be automatically
-        // used where available. Note: by default strings are encoded as UTF8.
-        using (var p = new Producer<Null, string>(config))
+        // If serializers are not specified, default serializers from
+        // `Confluent.Kafka.Serializers` will be automatically used where
+        // available. Note: by default strings are encoded as UTF8.
+        using (var p = new ProducerBuilder<Null, string>(config).Build())
         {
             try
             {
```

```diff
@@ -125,12 +125,12 @@ class Program
     {
         var conf = new ProducerConfig { BootstrapServers = "localhost:9092" };
 
-        Action<DeliveryReportResult<Null, string>> handler = r =>
+        Action<DeliveryReport<Null, string>> handler = r =>
             Console.WriteLine(!r.Error.IsError
                 ? $"Delivered message to {r.TopicPartitionOffset}"
                 : $"Delivery Error: {r.Error.Reason}");
 
-        using (var p = new Producer<Null, string>(conf))
+        using (var p = new ProducerBuilder<Null, string>(conf).Build())
         {
             for (int i=0; i<100; ++i)
             {
```

````diff
@@ -148,6 +148,7 @@ class Program
 
 ```csharp
 using System;
+using System.Threading;
 using Confluent.Kafka;
 
 class Program
````

```diff
@@ -163,29 +164,34 @@ class Program
             // topic/partitions of interest. By default, offsets are committed
             // automatically, so in this example, consumption will only start from the
             // earliest message in the topic 'my-topic' the first time you run the program.
-            AutoOffsetReset = AutoOffsetResetType.Earliest
+            AutoOffsetReset = AutoOffsetReset.Earliest
         };
 
-        using (var c = new Consumer<Ignore, string>(conf))
+        using (var c = new ConsumerBuilder<Ignore, string>(conf).Build())
         {
             c.Subscribe("my-topic");
 
-            bool consuming = true;
-            // The client will automatically recover from non-fatal errors. You typically
-            // don't need to take any action unless an error is marked as fatal.
-            c.OnError += (_, e) => consuming = !e.IsFatal;
+            CancellationTokenSource cts = new CancellationTokenSource();
+            Console.CancelKeyPress += (_, e) => {
+                e.Cancel = true; // prevent the process from terminating.
+                cts.Cancel();
+            };
 
-            while (consuming)
+            while (!cts.IsCancellationRequested)
             {
                 try
                 {
-                    var cr = c.Consume();
+                    var cr = c.Consume(cts.Token);
                     Console.WriteLine($"Consumed message '{cr.Value}' at: '{cr.TopicPartitionOffset}'.");
                 }
                 catch (ConsumeException e)
                 {
                     Console.WriteLine($"Error occurred: {e.Error.Reason}");
                 }
+                catch (OperationCanceledException)
+                {
+                    break;
+                }
             }
 
             // Ensure the consumer leaves the group cleanly and final offsets are committed.
```

```diff
@@ -219,7 +225,7 @@ For more information about working with Avro in .NET, refer to the blog post
 
 ### Error Handling
 
-Errors raised via a client's `OnError` event should be considered informational except when the `IsFatal` flag
+Errors delivered to a client's error handler should be considered informational except when the `IsFatal` flag
 is set to `true`, indicating that the client is in an un-recoverable state. Currently, this can only happen on
 the producer, and only when `enable.idempotence` has been set to `true`. In all other scenarios, clients are
 able to recover from all errors automatically.
```
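
As a companion to the error-handling note above, the sketch below shows one way an application might react to errors delivered to its error handler: non-fatal errors are logged and the client is left to recover on its own, while a fatal error (currently only possible on an idempotent producer) ends the consume loop. This is a sketch against the builder-style API described in the changelog; the exact `SetErrorHandler` signature and the `Consume(TimeSpan)` overload are assumptions for beta3, and the group id and topic are placeholders.

```csharp
using System;
using Confluent.Kafka;

class ErrorHandlingSketch
{
    public static void Main(string[] args)
    {
        var config = new ConsumerConfig
        {
            GroupId = "error-handling-example",   // hypothetical group id
            BootstrapServers = "localhost:9092"
        };

        bool fatalErrorSeen = false;

        using (var c = new ConsumerBuilder<Ignore, string>(config)
            .SetErrorHandler((_, e) =>
            {
                // Non-fatal errors are informational only; the client recovers
                // from them automatically and consumption simply continues.
                Console.WriteLine($"Kafka error: {e.Reason} (fatal: {e.IsFatal})");

                // A fatal error means the client is in an unrecoverable state.
                if (e.IsFatal) { fatalErrorSeen = true; }
            })
            .Build())
        {
            c.Subscribe("my-topic");

            while (!fatalErrorSeen)   // Ctrl-C handling omitted for brevity.
            {
                var cr = c.Consume(TimeSpan.FromSeconds(1));
                if (cr != null)
                {
                    Console.WriteLine($"Consumed '{cr.Value}' at {cr.TopicPartitionOffset}");
                }
            }

            // Leave the group cleanly and commit final offsets.
            c.Close();
        }
    }
}
```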

examples/AdminClient/AdminClient.csproj

Lines changed: 1 addition & 1 deletion

```diff
@@ -8,7 +8,7 @@
   </PropertyGroup>
 
   <ItemGroup>
-    <!-- nuget package reference: <PackageReference Include="Confluent.Kafka" Version="1.0-beta2" /> -->
+    <!-- nuget package reference: <PackageReference Include="Confluent.Kafka" Version="1.0.0-beta3" /> -->
     <ProjectReference Include="../../src/Confluent.Kafka/Confluent.Kafka.csproj" />
   </ItemGroup>
 
```

examples/AvroBlogExamples/AvroBlogExamples.csproj

Lines changed: 1 addition & 1 deletion

```diff
@@ -8,7 +8,7 @@
   </PropertyGroup>
 
   <ItemGroup>
-    <!-- nuget package reference: <PackageReference Include="Confluent.SchemaRegistry.Serdes" Version="1.0-beta2" /> -->
+    <!-- nuget package reference: <PackageReference Include="Confluent.SchemaRegistry.Serdes" Version="1.0.0-beta3" /> -->
     <ProjectReference Include="../../src/Confluent.SchemaRegistry.Serdes/Confluent.SchemaRegistry.Serdes.csproj" />
   </ItemGroup>
 
```

examples/AvroGeneric/AvroGeneric.csproj

Lines changed: 1 addition & 1 deletion

```diff
@@ -9,7 +9,7 @@
   </PropertyGroup>
 
   <ItemGroup>
-    <!-- nuget package reference: <PackageReference Include="Confluent.SchemaRegistry.Serdes" Version="1.0-beta2" /> -->
+    <!-- nuget package reference: <PackageReference Include="Confluent.SchemaRegistry.Serdes" Version="1.0.0-beta3" /> -->
     <ProjectReference Include="../../src/Confluent.SchemaRegistry.Serdes/Confluent.SchemaRegistry.Serdes.csproj" />
   </ItemGroup>
 
```

examples/AvroSpecific/AvroSpecific.csproj

Lines changed: 1 addition & 1 deletion

```diff
@@ -9,7 +9,7 @@
   </PropertyGroup>
 
   <ItemGroup>
-    <!-- nuget package reference: <PackageReference Include="Confluent.SchemaRegistry.Serdes" Version="1.0-beta2" /> -->
+    <!-- nuget package reference: <PackageReference Include="Confluent.SchemaRegistry.Serdes" Version="1.0.0-beta3" /> -->
     <ProjectReference Include="../../src/Confluent.SchemaRegistry.Serdes/Confluent.SchemaRegistry.Serdes.csproj" />
   </ItemGroup>
 
```

examples/ConfluentCloud/ConfluentCloud.csproj

Lines changed: 1 addition & 1 deletion

```diff
@@ -7,7 +7,7 @@
   </PropertyGroup>
 
   <ItemGroup>
-    <!-- nuget package reference: <PackageReference Include="Confluent.Kafka" Version="1.0-beta2" /> -->
+    <!-- nuget package reference: <PackageReference Include="Confluent.Kafka" Version="1.0.0-beta3" /> -->
     <ProjectReference Include="../../src/Confluent.Kafka/Confluent.Kafka.csproj" />
   </ItemGroup>
 
```

examples/Consumer/Consumer.csproj

Lines changed: 1 addition & 1 deletion

```diff
@@ -8,7 +8,7 @@
   </PropertyGroup>
 
   <ItemGroup>
-    <!-- nuget package reference: <PackageReference Include="Confluent.Kafka" Version="1.0-beta2" /> -->
+    <!-- nuget package reference: <PackageReference Include="Confluent.Kafka" Version="1.0.0-beta3" /> -->
     <ProjectReference Include="../../src/Confluent.Kafka/Confluent.Kafka.csproj" />
   </ItemGroup>
 
```

examples/MultiProducer/MultiProducer.csproj

Lines changed: 1 addition & 1 deletion

```diff
@@ -8,7 +8,7 @@
   </PropertyGroup>
 
   <ItemGroup>
-    <!-- nuget package reference: <PackageReference Include="Confluent.Kafka" Version="1.0-beta2" /> -->
+    <!-- nuget package reference: <PackageReference Include="Confluent.Kafka" Version="1.0.0-beta3" /> -->
     <ProjectReference Include="../../src/Confluent.Kafka/Confluent.Kafka.csproj" />
   </ItemGroup>
 
```

examples/Producer/Producer.csproj

Lines changed: 1 addition & 1 deletion

```diff
@@ -9,7 +9,7 @@
   </PropertyGroup>
 
   <ItemGroup>
-    <!-- nuget package reference: <PackageReference Include="Confluent.Kafka" Version="1.0-beta2" /> -->
+    <!-- nuget package reference: <PackageReference Include="Confluent.Kafka" Version="1.0.0-beta3" /> -->
     <ProjectReference Include="../../src/Confluent.Kafka/Confluent.Kafka.csproj" />
   </ItemGroup>
 
```
