Commit 7b36b6e (2 parents: ae825f6 + 9de088a)

Merge pull request #1276 from datastax/SPARKC-593-2.5: SPARKC-593 2.5 doc fixes

3 files changed: 13 additions, 12 deletions

README.md (5 additions, 4 deletions)

```diff
@@ -64,7 +64,7 @@ development for the next connector release in progress.
 API documentation for the Scala and Java interfaces are available online:
 
 ### 2.5.1
-* [Spark-Cassandra-Connector](https://datastax.github.io/spark-cassandra-connector/ApiDocs/2.5.1/#package)
+* [Spark-Cassandra-Connector](https://datastax.github.io/spark-cassandra-connector/ApiDocs/2.5.1/connector/#package)
 
 ### 2.4.2
 * [Spark-Cassandra-Connector](http://datastax.github.io/spark-cassandra-connector/ApiDocs/2.4.2/spark-cassandra-connector/)
@@ -75,10 +75,8 @@ API documentation for the Scala and Java interfaces are available online:
 * [Embedded-Cassandra](http://datastax.github.io/spark-cassandra-connector/ApiDocs/2.3.2/spark-cassandra-connector-embedded/)
 
 ## Download
-This project is available on Spark Packages; this is the easiest way to start using the connector:
-https://spark-packages.org/package/datastax/spark-cassandra-connector
 
-This project has also been published to the Maven Central Repository.
+This project is available on the Maven Central Repository.
 For SBT to download the connector binaries, sources and javadoc, put this in your project
 SBT config:
 
@@ -162,6 +160,9 @@ To run unit and integration tests:
 ./sbt/sbt test
 ./sbt/sbt it:test
 
+Note that the integration tests require [CCM](https://github.com/riptano/ccm) to be installed on your machine.
+See [Tips for Developing the Spark Cassandra Connector](doc/developers.md) for details.
+
 By default, integration tests start up a separate, single Cassandra instance and run Spark in local mode.
 It is possible to run integration tests with your own Cassandra and/or Spark cluster.
 First, prepare a jar with testing code:
```
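With the download section now pointing at Maven Central, the connector can also be pulled straight into a `spark-shell` session via `--packages`. A minimal sketch of building the Maven coordinate, assuming the usual `com.datastax.spark` group id and a Scala 2.11 build of the 2.5.1 release (the artifact suffix and version are example values; adjust them for your Spark/Scala setup):

```shell
# Assemble the group:artifact:version coordinate used by --packages.
# All three values are illustrative examples, not the only valid ones.
GROUP="com.datastax.spark"
ARTIFACT="spark-cassandra-connector_2.11"
VERSION="2.5.1"
COORD="$GROUP:$ARTIFACT:$VERSION"
echo "$COORD"
# Then: spark-shell --packages "$COORD"
```

This prints `com.datastax.spark:spark-cassandra-connector_2.11:2.5.1`, the coordinate format Spark's `--packages` flag expects.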

doc/12_building_and_artifacts.md (1 addition, 1 deletion)

```diff
@@ -36,7 +36,7 @@ A fat jar with `assembly` suffix will be generated to:
 
     spark-cassandra-connector/connector/target/scala-{binary.version}/
 
-The jar contains Spark Cassandra Connector and its dependencies. Some of the dependencies are shaded to avoid
+The jar contains the Spark Cassandra Connector and its dependencies. Some of the dependencies are shaded to avoid
 classpath conflicts.
 It is recommended to use the main artifact when possible.
```
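The assembly jar path above is parameterized by the Scala binary version. A quick sketch of how the `{binary.version}` placeholder expands (2.12 is an example value; use whatever Scala binary version you built against):

```shell
# Expand the {binary.version} placeholder from the doc into a concrete path.
BINARY_VERSION="2.12"   # example value
JAR_DIR="spark-cassandra-connector/connector/target/scala-${BINARY_VERSION}/"
echo "$JAR_DIR"
```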

doc/developers.md (7 additions, 7 deletions)

```diff
@@ -13,25 +13,25 @@ Help](https://help.github.com/articles/cloning-a-repository/).
 
 Once in the sbt shell you will be able to build and run tests for the
 connector without any Spark or Cassandra nodes running. The integration tests
-require [CCM](https://github.com/riptano/ccm) installed on your machine.
-This can be accomplished with `pip install ccm`
+require a valid path to java home set in the `JAVA_HOME` env variable and
+[CCM](https://github.com/riptano/ccm) to be installed on your machine.
+This can be accomplished with `pip install ccm`.
 
 The most common commands to use when developing the connector are
 
 1. `test` - Runs the the unit tests for the project.
-2. `it:test` - Runs the integration tests with embedded Cassandra and Spark
+2. `it:test` - Runs the integration tests with Cassandra (started by CCM) and Spark
 3. `package` - Builds the project and produces a runtime jar
 4. `publishM2` - Publishes a snapshot of the project to your local maven repository allowing for usage with --packages in the spark-shell
 
 The integration tests located in `connector/src/it` should
 probably be the first place to look for anyone considering adding code.
 There are many examples of executing a feature of the connector with
-the embedded Cassandra and Spark nodes and are the core of our test
-coverage.
+Cassandra and Spark nodes and are the core of our test coverage.
 
 ### Merge Path
 
-b2.5 => b3.0 => Master
+b2.5 => Master
 
 New features can be considered for 2.5 as long as they do not break apis
 In general 3.0 should be the target for new features
@@ -68,7 +68,7 @@ listed in build.yaml. In addition the test-support module supports Cassandra
 or other CCM Compatible installations.
 
 If using SBT you can set
-`CCM_CASSANDRA_VERSION` to propagate a version for CCM to use during tests
+`CCM_CASSANDRA_VERSION` to propagate a version for CCM to use during tests.
 
 If you are running tests through IntelliJ or through an alternative framework (jUnit)
 set the system property `ccm.version` to the version you like.
```
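The two version knobs described in the developers doc can be sketched as follows. The Cassandra version is an arbitrary example, and the sbt invocation is commented out because it needs a full repository checkout:

```shell
# sbt route: export the env variable the test framework reads, then run the suite.
export CCM_CASSANDRA_VERSION="3.11.6"   # example version; any CCM-supported release works
echo "CCM will launch Cassandra ${CCM_CASSANDRA_VERSION}"
# ./sbt/sbt it:test

# IntelliJ / JUnit route: pass the equivalent JVM system property instead, e.g.
# -Dccm.version=3.11.6
```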
