@@ -26,15 +26,15 @@ You can link against this library in your program at the following coordinates:
```
groupId: com.databricks
artifactId: spark-xml_2.11
-version: 0.10.0
+version: 0.11.0
```

### Scala 2.12

```
groupId: com.databricks
artifactId: spark-xml_2.12
-version: 0.10.0
+version: 0.11.0
```

## Using with Spark shell
@@ -43,12 +43,12 @@ This package can be added to Spark using the `--packages` command line option. F

### Spark compiled with Scala 2.11
```
-$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-xml_2.11:0.10.0
+$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-xml_2.11:0.11.0
```

### Spark compiled with Scala 2.12
```
-$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-xml_2.12:0.10.0
+$SPARK_HOME/bin/spark-shell --packages com.databricks:spark-xml_2.12:0.11.0
```

## Features
@@ -409,7 +409,7 @@ Automatically infer schema (data types)
``` R
library(SparkR)

-sparkR.session("local[4]", sparkPackages = c("com.databricks:spark-xml_2.11:0.10.0"))
+sparkR.session("local[4]", sparkPackages = c("com.databricks:spark-xml_2.11:0.11.0"))

df <- read.df("books.xml", source = "xml", rowTag = "book")

@@ -421,7 +421,7 @@ You can manually specify schema:
``` R
library(SparkR)

-sparkR.session("local[4]", sparkPackages = c("com.databricks:spark-xml_2.11:0.10.0"))
+sparkR.session("local[4]", sparkPackages = c("com.databricks:spark-xml_2.11:0.11.0"))
customSchema <- structType(
  structField("_id", "string"),
  structField("author", "string"),
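
For reference, the Maven coordinates bumped in the first hunk above can also be declared directly in a build file. A minimal sketch, assuming an sbt build (the sbt syntax is an assumption; the README itself only lists raw coordinates). With `%%`, sbt appends the project's Scala binary version suffix, so this resolves to `spark-xml_2.11` or `spark-xml_2.12` to match the Scala version Spark was compiled with:

```
// sbt: %% appends the Scala binary version suffix (e.g. _2.12) to the artifact id
libraryDependencies += "com.databricks" %% "spark-xml" % "0.11.0"
```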