@@ -25,15 +25,15 @@ You can link against this library in your program at the following coordinates:
```
groupId: com.databricks
artifactId: spark-xml_2.11
-version: 0.9.0
+version: 0.10.0
```

### Scala 2.12

```
groupId: com.databricks
artifactId: spark-xml_2.12
-version: 0.9.0
+version: 0.10.0
```

## Using with Spark shell
@@ -42,12 +42,12 @@ This package can be added to Spark using the `--packages` command line option. F

### Spark compiled with Scala 2.11
```
-$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-xml_2.11:0.9.0
+$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-xml_2.11:0.10.0
```

### Spark compiled with Scala 2.12
```
-$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-xml_2.12:0.9.0
+$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-xml_2.12:0.10.0
```

## Features
@@ -400,7 +400,7 @@ Automatically infer schema (data types)
``` R
library(SparkR)

-sparkR.session("local[4]", sparkPackages = c("com.databricks:spark-xml_2.11:0.9.0"))
+sparkR.session("local[4]", sparkPackages = c("com.databricks:spark-xml_2.11:0.10.0"))

df <- read.df("books.xml", source = "xml", rowTag = "book")

@@ -412,7 +412,7 @@ You can manually specify schema:
``` R
library(SparkR)

-sparkR.session("local[4]", sparkPackages = c("com.databricks:spark-xml_2.11:0.9.0"))
+sparkR.session("local[4]", sparkPackages = c("com.databricks:spark-xml_2.11:0.10.0"))
customSchema <- structType(
  structField("_id", "string"),
  structField("author", "string"),