
Commit 32b8bcb

Introduce @flatten annotation (#642)
closes #623

This PR introduces the `@flatten` annotation.

## Usage

The `@flatten` annotation can only be applied to:

`case class`es: flatten the fields of a nested case class into the parent structure.

```scala
case class A(i: Int, @flatten b: B)
case class B(msg: String)
implicit val rwB: ReadWriter[B] = macroRW
implicit val rwA: ReadWriter[A] = macroRW

write(A(1, B("Hello"))) // {"i":1,"msg":"Hello"}
```

`Iterable`: flatten the key-value pairs of an `Iterable[(String, _)]` into the parent structure.

```scala
case class A(i: Int, @flatten map: Map[String, String])
implicit val rw: ReadWriter[A] = macroRW

val map = Map("a" -> "1", "b" -> "2")
write(A(1, map)) // {"i":1,"a":"1","b":"2"}
```

Nested flattening allows you to apply the `@flatten` annotation recursively to fields within nested case classes.

```scala
case class Outer(msg: String, @flatten inner: Inner)
case class Inner(@flatten inner2: Inner2)
case class Inner2(i: Int)
implicit val rwInner2: ReadWriter[Inner2] = macroRW
implicit val rwInner: ReadWriter[Inner] = macroRW
implicit val rwOuter: ReadWriter[Outer] = macroRW

write(Outer("abc", Inner(Inner2(7)))) // {"msg":"abc","i":7}
```

The Reader also recognizes the `@flatten` annotation.

```scala
case class A(i: Int, @flatten b: B)
case class B(msg: String)
implicit val rwB: ReadWriter[B] = macroRW
implicit val rwA: ReadWriter[A] = macroRW

read[A]("""{"i": 1, "msg": "Hello"}""")
// The top-level field "msg": "Hello" is correctly mapped to the field in B.
```

For collections, during deserialization every key-value pair in the JSON that does not map directly to a field of the case class is stored in the collection.
```scala
case class A(i: Int, @flatten map: Map[String, String])
implicit val rw: ReadWriter[A] = macroRW

read[A]("""{"i": 1, "a": "1", "b": "2"}""")
// Output: A(1, Map("a" -> "1", "b" -> "2"))
```

If there are no keys in the JSON that can be stored in the collection, it is treated as an empty collection.

```scala
read[A]("""{"i": 1}""")
// Output: A(1, Map.empty)
```

If a key's value in the JSON cannot be converted to the Map's value type (e.g. `String`), deserialization fails.

```scala
read[A]("""{"i": 1, "a": {"name": "foo"}}""")
// Error: the value for "a" is not a String, as required by Map[String, String].
```

## Limitations

1. Flattening multiple collections to the same level is not supported. It would be awkward to support because, when deriving a Reader, it becomes unclear which collection the leftover data should be stored in.
2. Type parameters do not seem to be properly resolved in the following scenario:

   ```scala
   case class Param[T](@flatten t: T)
   object Param {
     // compile error when this generic definition is used to derive an instance
     implicit def rw[T: RW]: RW[Param[T]] = upickle.default.macroRW

     // works
     implicit val rwSomeClass: RW[Param[SomeClass]] = upickle.default.macroRW
   }
   ```

3. When using the `@flatten` annotation on an `Iterable`, the key type must be `String`.

## Implementation Strategy

### Writer Derivation

From my understanding, deriving a Writer for a case class involves implementing a dispatch function that iterates through each field of the case class. In the existing implementation, the Writer is generated by processing the information of each field of the type `T` being derived.
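A rough plain-Scala analogue of this per-field dispatch might look like the following (a sketch only; `writeFields`, `Data`, and the JSON-as-`Map` representation are invented for illustration and are not uPickle's actual API):

```scala
// Illustrative model only: a derived Writer conceptually visits each field
// of the case class in order, pairing the (possibly remapped) key with the
// value serialized by that field's own Writer.
case class Nested(integerValue: Int)
case class Data(i: Int, n: Nested)

def writeFields(v: Data): Map[String, String] =
  Map(
    "i" -> v.i.toString,                               // field i, via its Int writer
    "n" -> s"""{"integerValue":${v.n.integerValue}}""" // field n, via its Nested writer
  )
```

Without flattening, the nested field `n` is emitted as one object under its own key; flattening changes exactly this step.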
```scala
this.writeSnippetMappedName[R, Int](
  ctx,
  upickle.default.objectAttributeKeyWriteMap("i"),
  implicitly[upickle.default.Writer[Int]],
  v.i)
this.writeSnippetMappedName[R, upickle.Nested](
  ctx,
  upickle.default.objectAttributeKeyWriteMap("n"),
  implicitly[upickle.default.Writer[upickle.Nested]],
  v.n)
ctx.visitEnd(-1)
```

The snippet above shows how, when deriving a Writer, the visit function is invoked for each field, passing the corresponding field's Writer as an argument. If a field is a `Map` or a case class and needs to be flattened, additional processing is required:

1. For case classes, instead of delegating to the Writer of the nested case class, the visit function is called directly for each field of the nested case class. For example:

   ```scala
   this.writeSnippetMappedName[R, Int](
     ctx,
     upickle.default.objectAttributeKeyWriteMap("integerValue"),
     implicitly[upickle.default.Writer[Int]],
     v.n.integerValue)
   ```

2. For `Map`s, iterate through all key-value pairs in the Map, calling visit for each pair:

   ```scala
   mapField.foreach { case (key, value) =>
     this.writeSnippetMappedName[R, $valueType](
       ctx,
       key.toString,
       implicitly[${c.prefix}.Writer[$valueType]],
       value
     )
   }
   ```

### Reader Derivation

Deriving a Reader is more complex, especially with support for `Map`, as it introduces several edge cases. A Reader is essentially a Visitor for JSON, as I understand it. The process involves three main steps:

1. Mapping read keys to indices.
2. Storing the read values in variables corresponding to those indices.
3. Constructing the case class instance from the read values after traversal.

To support flattening, additional steps were required in each of these stages:

1. Mapping keys to indices: flattened fields must be accounted for in the key-to-index mapping. For example:

   ```scala
   case class A(@flatten b: B)
   case class B(i: Int, d: Double)

   // Without flattening
   key match {
     case "b" => 0
     case _   => -1
   }

   // With flattening
   key match {
     case "i" => 0
     case "d" => 1
     case _   => -1
   }
   ```

2.
Allocate storage for flattened fields, similar to step 1, but extended to account for flattened structures.
3. Modify the class construction logic to recursively handle flattened fields.

Special case for `Map`s: since a flattened `Map` must capture all key-value pairs that do not correspond to a specific field, extra logic is needed:

- If a key's index is -1 and the type `T` being derived has a `@flatten` annotation on a `Map` field, the key-value pair is stored in a `ListBuffer` inside the Reader.
- These pairs are used during class construction to populate the `Map`.
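The three reader steps, extended for flattening, can be sketched in plain Scala as follows. This is an illustrative hand-written model, not the generated macro code; `keyIndex`, `readFlattened`, and the case classes are invented names, and JSON is simplified to a `Map[String, String]`:

```scala
// Outer flattens both a case class (Inner) and a Map at the same level.
case class Inner(i: Int, d: Double)
case class Outer(inner: Inner, rest: Map[String, String])

// Step 1: key-to-index mapping, with Inner's fields surfaced at the top level.
def keyIndex(key: String): Int = key match {
  case "i" => 0
  case "d" => 1
  case _   => -1 // unknown keys go to the flattened Map
}

def readFlattened(json: Map[String, String]): Outer = {
  // Step 2: storage slots for indexed fields, plus a ListBuffer for
  // key-value pairs whose index is -1.
  val slots = new Array[String](2)
  val leftovers = scala.collection.mutable.ListBuffer.empty[(String, String)]
  for ((k, v) <- json) {
    keyIndex(k) match {
      case -1  => leftovers.append(k -> v)
      case idx => slots(idx) = v
    }
  }
  // Step 3: recursively construct the instance; leftovers populate the Map.
  Outer(Inner(slots(0).toInt, slots(1).toDouble), leftovers.toMap)
}
```

If no leftover keys are seen, `leftovers.toMap` is simply empty, matching the empty-collection behavior described in the Usage section.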
1 parent fe53cf3 commit 32b8bcb

File tree

11 files changed: +1358 −149 lines changed

upickle/implicits/src-2/upickle/implicits/MacroImplicits.scala

Lines changed: 2 additions & 2 deletions

```diff
@@ -43,7 +43,7 @@ trait MacroImplicits extends MacrosCommon { this: upickle.core.Types =>
   def macroW[T]: Writer[T] = macro MacroImplicits.applyW[T]
   def macroRW[T]: ReadWriter[T] = macro MacroImplicits.applyRW[ReadWriter[T]]
 
-  def macroR0[T, M[_]]: Reader[T] = macro internal.Macros.macroRImpl[T, M]
-  def macroW0[T, M[_]]: Writer[T] = macro internal.Macros.macroWImpl[T, M]
+  def macroR0[T, M[_]]: Reader[T] = macro internal.Macros2.macroRImpl[T, M]
+  def macroW0[T, M[_]]: Writer[T] = macro internal.Macros2.macroWImpl[T, M]
 }
```
upickle/implicits/src-2/upickle/implicits/internal/Macros.scala

Lines changed: 8 additions & 0 deletions

```diff
@@ -10,6 +10,11 @@ import upickle.implicits.{MacrosCommon, key}
 import language.higherKinds
 import language.existentials
 
+/**
+ * This file is deprecated and remained here for binary compatibility.
+ * Please use upickle/implicits/src-2/upickle/implicits/internal/Macros2.scala instead.
+ */
+
 /**
  * Implementation of macros used by uPickle to serialize and deserialize
  * case classes automatically. You probably shouldn't need to use these
@@ -466,6 +471,8 @@ object Macros {
     q"${c.prefix}.Writer.merge[$targetType](..$subtree)"
   }
 }
+
+@deprecated("Use Macros2 instead")
 def macroRImpl[T, R[_]](c0: scala.reflect.macros.blackbox.Context)
                        (implicit e1: c0.WeakTypeTag[T], e2: c0.WeakTypeTag[R[_]]): c0.Expr[R[T]] = {
   import c0.universe._
@@ -477,6 +484,7 @@ object Macros {
   c0.Expr[R[T]](res)
 }
 
+@deprecated("Use Macros2 instead")
 def macroWImpl[T, W[_]](c0: scala.reflect.macros.blackbox.Context)
                        (implicit e1: c0.WeakTypeTag[T], e2: c0.WeakTypeTag[W[_]]): c0.Expr[W[T]] = {
   import c0.universe._
```