My understanding of these two functions is that they both work in terms of the Unicode scalars of a string, so that List.length(decodeUcharInString(s)) == countInString(s) should always hold (but let me know if that isn't the intended behavior of countInString).
countInString("✨") returns a value of 3, but I expected it to return 1. That seems to be the case for most 3-byte utf8 characters, while 1-, 2-, and 4-byte characters return 1.