Commit d1aeb00
Switch up coordinates and README
1 parent d3a74b5 commit d1aeb00

File tree

3 files changed: +14 -155 lines


.gitignore

Lines changed: 5 additions & 0 deletions

````diff
@@ -13,6 +13,11 @@ project/plugins/project/
 *.class
 *.log
 
+### Metals ###
+.metals/
+.bsp/
+.bloop/
+
 ### IntelliJ ###
 .idea
 
````

README.md

Lines changed: 7 additions & 152 deletions
````diff
@@ -6,164 +6,19 @@
 
 
 Provides FS2 Kafka `Serializer`s and `Deserializer`s that provide integration with Confluent Schema Registry for JSON messages with JSON Schemas.
-This library also provides an enrichment to fs2 kafka's vulcan `SchemaRegistryClientSettings` which is needed to enable additional JSON validation support
-inside the Schema Registry client.
+
+__Note:__ _This library only works with Scala 3.3.x and above._ For Scala 2.x, see [here](https://github.com/kaizen-solutions/fs2-kafka-jsonschema-support).
 
 This functionality is backed by the following libraries:
-- [scala-jsonschema](https://github.com/andyglow/scala-jsonschema) which is used to derive JSON Schemas for almost any Scala data-type
-- [circe-jackson](https://github.com/circe/circe-jackson) which is used to derive JSON Encoders and Decoders for any Scala data-type and is further used to interop with Confluent's + Jackson's Schema validation mechanisms
-- [fs2-kafka & fs2-kafka-vulcan](https://github.com/fd4s/fs2-kafka) which provides the serializers and deserializers interfaces that we implement along with the Schema Registry client that we enrich
-- [confluent-schema-registry](https://github.com/confluentinc/schema-registry) is used as a basis for implementation and small portions are used for JSON Schema validation
+- [Tapir's JSON Pickler](https://tapir.softwaremill.com/en/latest/endpoint/pickler.html)
+- [Tapir's JSON Schema](https://tapir.softwaremill.com/en/latest/docs/json-schema.html)
+- [FS2 kafka](https://github.com/fd4s/fs2-kafka)
+- [Confluent Schema Registry](https://github.com/confluentinc/schema-registry)
 
 ### Usage ###
 
 Add the following to your `build.sbt`
 ```sbt
 resolvers ++= Seq("confluent" at "https://packages.confluent.io/maven")
-
-libraryDependencies += "io.kaizen-solutions" %% "fs2-kafka-jsonschema-support" % "<latest-version>"
-```
-
-1. Define your data-types
-```scala
-object Book {}
-final case class Book(
-  name: String,
-  isbn: Int
-)
-
-object Person {}
-final case class PersonV1(
-  name: String,
-  age: Int,
-  books: List[Book]
-)
-```
-
-2. Derive JSON Schemas for your case classes and add extra JSON Schema information using `scala-jsonschema`
-```scala
-import json.schema.description
-import json.{Json, Schema}
-
-object Book {
-  implicit val bookJsonSchema: Schema[Book] = Json.schema[Book]
-}
-final case class Book(
-  @description("name of the book") name: String,
-  @description("international standard book number") isbn: Int
-)
-
-object Person {
-  implicit val personJsonSchema: Schema[Person] = Json.schema[Person]
-}
-final case class Person(
-  @description("name of the person") name: String,
-  @description("age of the person") age: Int,
-  @description("A list of books that the person has read") books: List[Book]
-)
+libraryDependencies += "io.kaizen-solutions" %% "fs2-kafka-jsonschema" % "<latest-version>"
 ```
-
-3. Use `circe` to derive Encoders & Decoders (or Codecs) for your data-types:
-```scala
-import io.circe.generic.semiauto._
-import io.circe.Codec
-import json.schema.description
-import json.{Json, Schema}
-
-object Book {
-  implicit val bookJsonSchema: Schema[Book] = Json.schema[Book]
-  implicit val bookCodec: Codec[Book] = deriveCodec[Book]
-}
-final case class Book(
-  @description("name of the book") name: String,
-  @description("international standard book number") isbn: Int
-)
-
-object Person {
-  implicit val personJsonSchema: Schema[Person] = Json.schema[Person]
-  implicit val personCodec: Codec[Person] = deriveCodec[Person]
-}
-final case class Person(
-  @description("name of the person") name: String,
-  @description("age of the person") age: Int,
-  @description("A list of books that the person has read") books: List[Book]
-)
-```
-
-4. Instantiate and configure the Schema Registry
-```scala
-import cats.effect._
-import io.kaizensolutions.jsonschema._
-
-def schemaRegistry[F[_]: Sync]: F[SchemaRegistryClient] =
-  SchemaRegistryClientSettings("http://localhost:8081")
-    .withJsonSchemaSupport
-    .createSchemaRegistryClient
-```
-
-5. Configure your FS2 Kafka Producers and Consumers to pull Serializers (or do this process manually)
-```scala
-import cats.effect._
-import fs2.Stream
-import fs2.kafka._
-
-def kafkaProducer[F[_]: Async, K, V](implicit
-  keySerializer: Serializer[F, K],
-  valueSerializer: Serializer[F, V]
-): Stream[F, KafkaProducer[F, K, V]] = {
-  val settings: ProducerSettings[F, K, V] =
-    ProducerSettings[F, K, V].withBootstrapServers("localhost:9092")
-  KafkaProducer.stream(settings)
-}
-
-def kafkaConsumer[F[_]: Async, K, V](groupId: String)(implicit
-  keyDeserializer: Deserializer[F, K],
-  valueDeserializer: Deserializer[F, V]
-): Stream[F, KafkaConsumer[F, K, V]] = {
-  val settings = ConsumerSettings[F, K, V]
-    .withBootstrapServers("localhost:9092")
-    .withGroupId(groupId)
-    .withAutoOffsetReset(AutoOffsetReset.Earliest)
-  KafkaConsumer.stream(settings)
-}
-```
-**Note:** In some cases you will need to adjust the Decoder to account for missing data
-
-6. Produce data to Kafka with automatic Confluent Schema Registry support:
-```scala
-import cats.effect._
-import fs2._
-import fs2.kafka._
-import json._
-import io.circe._
-import io.kaizensolutions.jsonschema._
-import scala.reflect.ClassTag
-
-def jsonSchemaProducer[F[_]: Async, A: Encoder: json.Schema: ClassTag](
-  settings: JsonSchemaSerializerSettings
-): Stream[F, KafkaProducer[F, String, A]] =
-  Stream
-    .eval[F, SchemaRegistryClient](schemaRegistry[F])
-    .evalMap(schemaRegistryClient => JsonSchemaSerializer[F, A](settings, schemaRegistryClient))
-    .evalMap(_.forValue)
-    .flatMap(implicit serializer => kafkaProducer[F, String, A])
-
-
-def jsonSchemaConsumer[F[_]: Async, A: Decoder: json.Schema: ClassTag](
-  settings: JsonSchemaDeserializerSettings,
-  groupId: String
-): Stream[F, KafkaConsumer[F, String, A]] =
-  Stream
-    .eval(schemaRegistry[F])
-    .evalMap(client => JsonSchemaDeserializer[F, A](settings, client))
-    .flatMap(implicit des => kafkaConsumer[F, String, A](groupId))
-```
-
-### Settings ###
-There are a number of settings that control a number of behaviors when it comes to serialization and deserialization of data.
-Please check `JsonSchemaDeserializerSettings` and `JsonSchemaSerializerSettings` for more information. The `default` settings
-work great unless you need fine-grained control
-
-### Notes ###
-- Please note that this is only an initial design to prove the functionality, and I'm very happy to integrate this back into FS2 Kafka (and other Kafka libraries) so please submit an issue and we can take it from there
-- This library provides additional validation checks for the Deserialization side on top of what Confluent provides in their Java JSON Schema Deserializer
````
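The rewritten README removes the step-by-step guide without yet documenting the new Tapir-based API. Based on the libraries the new README lists, model types would presumably derive a Tapir `Pickler`, which bundles a JSON codec together with a Tapir `Schema` (replacing the separate `scala-jsonschema` and circe derivations deleted above). A minimal sketch, assuming Tapir's `tapir-json-pickler` module on Scala 3; how `fs2-kafka-jsonschema` consumes the `Pickler` is not shown in this commit and is an assumption:

```scala
// Sketch only: requires the tapir-json-pickler dependency (Scala 3).
import sttp.tapir.json.pickler.Pickler

final case class Book(name: String, isbn: Int)

object Book:
  // Pickler.derived yields both a JSON codec and a Tapir Schema for Book;
  // the library's serializers are assumed to pick this up implicitly.
  given Pickler[Book] = Pickler.derived
```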

build.sbt

Lines changed: 2 additions & 3 deletions
````diff
@@ -39,8 +39,7 @@ inThisBuild {
   )
 ),
 developers := List(
-  Developer("calvinlfer", "Calvin Fernandes", "[email protected]", url("https://www.kaizen-solutions.io")),
-  Developer("anakos", "Alex Nakos", "[email protected]", url("https://github.com"))
+  Developer("calvinlfer", "Calvin Fernandes", "[email protected]", url("https://www.kaizen-solutions.io"))
 ),
 licenses := List("MIT" -> url("https://opensource.org/licenses/MIT")),
 organization := "io.kaizen-solutions",
@@ -59,7 +58,7 @@ lazy val root =
 project
   .in(file("."))
   .settings(
-    name := "fs2-kafka-jsonschema-support",
+    name := "fs2-kafka-jsonschema",
     libraryDependencies ++= {
       val circe = "io.circe"
       val fd4s = "com.github.fd4s"
````
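The "switch up coordinates" in the commit message lands here as an artifact rename, so downstream builds would make the matching one-line change (the version string is a placeholder, as in the README):

```sbt
// Before this commit:
// libraryDependencies += "io.kaizen-solutions" %% "fs2-kafka-jsonschema-support" % "<latest-version>"

// After this commit:
libraryDependencies += "io.kaizen-solutions" %% "fs2-kafka-jsonschema" % "<latest-version>"
```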
