
Commit 29e31d3

Update Docs to Point to Working Artifact Repos
Looks like Spark Packages is now dead, so let's point to Maven Central instead.
1 parent 92be24c commit 29e31d3

File tree: 3 files changed (+7, -8 lines)


doc/0_quick_start.md

Lines changed: 5 additions & 6 deletions
@@ -13,17 +13,16 @@ Install and launch a Cassandra cluster and a Spark cluster.
 
 Configure a new Scala project with the Apache Spark and dependency.
 
-The dependencies are easily retrieved via the spark-packages.org website. For example, if you're using `sbt`, your build.sbt should include something like this:
+The dependencies are easily retrieved via Maven Central. For example, if you're using `sbt`, your build.sbt should include something like this:
 
-    resolvers += "Spark Packages Repo" at "https://dl.bintray.com/spark-packages/maven"
-    libraryDependencies += "datastax" % "spark-cassandra-connector" % "2.4.1-s_2.11"
+    libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.4.1"
 
 The spark-packages libraries can also be used with spark-submit and spark shell, these
 commands will place the connector and all of its dependencies on the path of the
 Spark Driver and all Spark Executors.
 
-    $SPARK_HOME/bin/spark-shell --packages datastax:spark-cassandra-connector:2.4.1-s_2.11
-    $SPARK_HOME/bin/spark-submit --packages datastax:spark-cassandra-connector:2.4.1-s_2.11
+    $SPARK_HOME/bin/spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.11:2.4.1
+    $SPARK_HOME/bin/spark-submit --packages com.datastax.spark:spark-cassandra-connector_2.11:2.4.1
 
 For the list of available versions, see:
 - https://spark-packages.org/package/datastax/spark-cassandra-connector
@@ -59,7 +58,7 @@ Run the `spark-shell` with the packages line for your version. To configure
 the default Spark Configuration pass key value pairs with `--conf`
 
     $SPARK_HOME/bin/spark-shell --conf spark.cassandra.connection.host=127.0.0.1 \
-    --packages datastax:spark-cassandra-connector:2.4.1-s_2.11
+    --packages com.datastax.spark:spark-cassandra-connector_2.11:2.4.1
 
 This command would set the Spark Cassandra Connector parameter
 `spark.cassandra.connection.host` to `127.0.0.1`. Change this
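
For readers following the quick start, a complete minimal build.sbt wired to the Maven Central coordinates above might look like the sketch below. The project name, Scala patch version, and the provided spark-sql dependency are illustrative assumptions, not part of this commit.

```scala
// Minimal build.sbt sketch for pulling the connector from Maven Central.
// Project name, Scala patch version, and the spark-sql dependency/version
// are illustrative assumptions, not part of this commit.
name := "cassandra-connector-demo"   // hypothetical project name
scalaVersion := "2.11.12"            // must match the _2.11 artifact suffix

libraryDependencies ++= Seq(
  // Spark itself is usually "provided" by the cluster at runtime
  "org.apache.spark" %% "spark-sql" % "2.4.1" % "provided",
  // %% appends the Scala suffix, resolving to spark-cassandra-connector_2.11:2.4.1
  "com.datastax.spark" %% "spark-cassandra-connector" % "2.4.1"
)
```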

doc/13_spark_shell.md

Lines changed: 1 addition & 1 deletion
@@ -18,7 +18,7 @@ Find additional versions at [Spark Packages](https://spark-packages.org/package/
 ```bash
 cd spark/install/dir
 #Include the --master if you want to run against a spark cluster and not local mode
-./bin/spark-shell [--master sparkMasterAddress] --jars yourAssemblyJar --packages datastax:spark-cassandra-connector:2.4.1-s_2.11 --conf spark.cassandra.connection.host=yourCassandraClusterIp
+./bin/spark-shell [--master sparkMasterAddress] --jars yourAssemblyJar --packages com.datastax.spark:spark-cassandra-connector_2.11:2.4.1 --conf spark.cassandra.connection.host=yourCassandraClusterIp
 ```
 
 By default spark will log everything to the console and this may be a bit of an overload. To change this copy and modify the `log4j.properties` template file
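
Once the shell comes up with that `--packages` coordinate, one quick way to confirm the connector resolved is to read a table through its Data Source format. The sketch below assumes a hypothetical `test.kv` keyspace/table; `spark` is the SparkSession that spark-shell already provides.

```scala
// Inside spark-shell: read a Cassandra table via the connector's Data Source.
// The keyspace "test" and table "kv" are hypothetical placeholders.
val df = spark.read
  .format("org.apache.spark.sql.cassandra")
  .options(Map("keyspace" -> "test", "table" -> "kv"))
  .load()

df.printSchema()  // columns appear only if the package resolved correctly
df.show(10)
```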

doc/15_python.md

Lines changed: 1 addition & 1 deletion
@@ -16,7 +16,7 @@ https://spark-packages.org/package/datastax/spark-cassandra-connector
 
 ```bash
 ./bin/pyspark \
-  --packages com.datastax.spark:spark-cassandra-connector_2.11:2.3.2
+  --packages com.datastax.spark:spark-cassandra-connector_2.11:2.4.1
 ```
 
 ### Loading a DataFrame in Python
