Conversation
JaroslavTulach left a comment
Hashset existence is surprising to me...
| "Hashset.from_vector " + self.to_vector.pretty | ||
|
|
||
| ## Convert from a Vector to a Hashset | ||
| Hashset.from (that:Vector) (error_on_duplicates:Boolean=False) = |
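For readers unfamiliar with the signature above, here is a minimal illustrative sketch of the same idea in Python (the `from_list` helper is a hypothetical analogue, not the Enso implementation):

```python
def from_list(items, error_on_duplicates=False):
    """Build a set from a list, optionally rejecting duplicates.

    Mirrors the shape of `Hashset.from (that:Vector) (error_on_duplicates=False)`;
    this Python helper is purely illustrative.
    """
    result = set()
    for item in items:
        if error_on_duplicates and item in result:
            raise ValueError(f"Duplicate element: {item!r}")
        result.add(item)
    return result

print(from_list([1, 2, 2, 3]))  # duplicates silently dropped by default
```

With `error_on_duplicates=True` the same input raises instead of silently deduplicating.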
- The name `Hashset` is strange knowing we have `Dictionary` and not `Hashmap`.
- Shouldn't it be just a `Set`?
It was named a long time ago to avoid some naming clashes at the time: `Set` and `Map` had too many meanings within the Enso context. `Map` was hence chosen to be `Dictionary`, and `Set` was called `Hashset` (allowing the technical details to leak, as it's a more internal class).
```
icon: convert
---
Converts the given value to a JSON serialized value.
Any.json_stringify self -> Text = self.to_json
```
Why do we have `to_json` and `json_stringify`?
For compatibility with `Table`. `Table` needs a row-based JSON conversion function, while `to_json` is the standard API we use for serialising objects to JSON. `Table` is serialised by `to_json` and uses `json_stringify` for the row-based operation.
Having the same method on `Any` is for compatibility.
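The distinction between the two serialisations can be sketched as follows, here in Python for illustration (the toy `Table` class and both method bodies are hypothetical analogues of the Enso APIs discussed above, not the real implementation):

```python
import json

class Table:
    """Toy stand-in for a columnar table; purely illustrative."""

    def __init__(self, columns):
        # columns: dict of column name -> list of cell values
        self.columns = columns

    def to_json(self):
        # Standard serialisation: column-oriented representation of the object.
        return json.dumps({"type": "Table", "columns": self.columns})

    def json_stringify(self):
        # Row-based serialisation: a JSON array with one object per row.
        names = list(self.columns)
        rows = [dict(zip(names, vals)) for vals in zip(*self.columns.values())]
        return json.dumps(rows)

t = Table({"a": [1, 2], "b": ["x", "y"]})
print(t.json_stringify())  # [{"a": 1, "b": "x"}, {"a": 2, "b": "y"}]
```

The column-oriented form round-trips the object; the row-based form is what a row-per-record consumer (such as a visualization) wants.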
Pull Request Description
- `Hashset`: `to_text`, `to_display_text`, `pretty` and Table visualization now work.
- `Dictionary`: `to_display_text` now shows the first 40 entries (same as `Vector`).
- Conversion from `Vector` for `Hashset` and `Dictionary`.
- `JsonOperation`: moved the `TableVizOperation` code to its own class.
- Added `json_stringify` to `Table`, `Column` and `Any`.
- Added `parse_json` to `Table` and `Column`.
- Conversion from `XML_Document` to `XML_Element`.
- Added a `get_xpath_text` method to `XML_Element` and `XML_Document`, reading the text value of the first matching node or attribute.
- Added `to_js_object` on `XML_Element` and `XML_Document`.

Important Notes
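The idea behind reading the text of the first matching XPath node can be sketched in Python using the standard library's limited XPath support (the `get_xpath_text` function here is an illustrative analogue only; the Enso method also covers attributes, which this sketch does not):

```python
import xml.etree.ElementTree as ET

def get_xpath_text(xml_text, path):
    """Return the text of the first node matching `path`, or None.

    Illustrative analogue of the get_xpath_text method described above,
    built on ElementTree's limited XPath subset.
    """
    root = ET.fromstring(xml_text)
    node = root.find(path)
    return None if node is None else node.text

doc = "<book><title>Dune</title><title>Hyperion</title></book>"
print(get_xpath_text(doc, "title"))   # Dune
print(get_xpath_text(doc, "author"))  # None
```

Returning `None` (rather than raising) when nothing matches keeps the call safe to chain.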
Checklist
Please ensure that the following checklist has been satisfied before submitting the PR:
- All code follows the Scala, Java, TypeScript, and Rust style guides. In case you are using a language not listed above, follow the Rust style guide.
- …or the Snowflake database integration, a run of the Extra Tests has been scheduled.