[#9622] improvement(docs): Add guide for Lance REST integration with Spark and Ray #9623
---
title: "Lance REST integration with Spark and Ray"
slug: /lance-rest-spark-ray-integration
keywords:
  - lance
  - lance-rest
  - spark
  - ray
  - integration
license: "This software is licensed under the Apache License version 2."
---
## Overview

This guide shows how to use the Lance REST service from Apache Gravitino with the Lance Spark connector (`lance-spark`) and the Lance Ray connector (`lance-ray`). It builds on the Lance REST service setup described in [Lance REST service](./lance-rest-service).
## Compatibility matrix

| Gravitino version (Lance REST) | Supported lance-spark versions | Supported lance-ray versions |
|--------------------------------|--------------------------------|------------------------------|
| 1.1.1                          | 0.0.10 – 0.0.15                | 0.0.6 – 0.0.8                |

:::note
- This matrix will be updated as newer Gravitino versions (for example, 1.2.0) are released.
- Align connector versions with the Lance REST service bundled in the target release.
:::
## Prerequisites

- A Gravitino server running with the Lance REST service enabled (default endpoint: `http://localhost:9101/lance`).
- A Lance catalog created in Gravitino, for example `lance_catalog`.
- A downloaded `lance-spark` bundle JAR that matches your Spark version (use its absolute path in the examples below).
- Python environments with the required packages:
  - Spark: `pyspark`
  - Ray: `ray`, `lance-namespace`, `lance-ray`
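The Python dependencies can be installed with `pip`. The version pins below are illustrative, taken from the compatibility matrix above and from the Ray version recommended in the troubleshooting section; adjust them for your target release:

```shell
# Spark environment: PySpark only (the lance-spark connector ships as a JAR).
pip install pyspark

# Ray environment: Ray plus the Lance namespace and Ray connectors.
pip install 'ray==2.40.0' lance-namespace 'lance-ray==0.0.8'
```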
## Using Lance REST with Spark

The example below starts a local PySpark session that talks to the Lance REST service and creates a table through Spark SQL.
```python
from pyspark.sql import SparkSession
import os
import logging

logging.basicConfig(level=logging.INFO)

# Point to your downloaded lance-spark bundle.
os.environ["PYSPARK_SUBMIT_ARGS"] = (
    "--jars /path/to/lance-spark-bundle-3.5_2.12-0.0.15.jar "
    '--conf "spark.driver.extraJavaOptions=--add-opens=java.base/sun.nio.ch=ALL-UNNAMED" '
    '--conf "spark.executor.extraJavaOptions=--add-opens=java.base/sun.nio.ch=ALL-UNNAMED" '
    "--master local[1] pyspark-shell"
)

# Create the Lance catalog named "lance_catalog" in Gravitino beforehand.
spark = SparkSession.builder \
    .appName("lance_rest_example") \
    .config("spark.sql.catalog.lance", "com.lancedb.lance.spark.LanceNamespaceSparkCatalog") \
    .config("spark.sql.catalog.lance.impl", "rest") \
    .config("spark.sql.catalog.lance.uri", "http://localhost:9101/lance") \
    .config("spark.sql.catalog.lance.parent", "lance_catalog") \
    .config("spark.sql.defaultCatalog", "lance") \
    .getOrCreate()

spark.sparkContext.setLogLevel("DEBUG")

# Create a schema and a table, write a row, then read it back.
spark.sql("CREATE DATABASE schema")
spark.sql("""
    CREATE TABLE schema.sample (id INT, score FLOAT)
    USING lance
    LOCATION '/tmp/schema/sample.lance/'
    TBLPROPERTIES ('format' = 'lance')
""")
spark.sql("INSERT INTO schema.sample VALUES (1, 1.1)")
spark.sql("SELECT * FROM schema.sample").show()
```
:::note
- Keep the Lance REST service reachable from the Spark executors.
- Replace the JAR path with the actual location on your machine or cluster.
- Add your own JVM debugging flags only when needed.
:::
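The same catalog settings can also be passed on the command line when submitting a job, instead of through `PYSPARK_SUBMIT_ARGS`. A sketch using `spark-submit`, where the JAR path and `your_job.py` are placeholders:

```shell
# Equivalent configuration passed directly to spark-submit.
# Replace the JAR path and your_job.py with your own files.
spark-submit \
  --jars /path/to/lance-spark-bundle-3.5_2.12-0.0.15.jar \
  --conf "spark.driver.extraJavaOptions=--add-opens=java.base/sun.nio.ch=ALL-UNNAMED" \
  --conf "spark.executor.extraJavaOptions=--add-opens=java.base/sun.nio.ch=ALL-UNNAMED" \
  --conf spark.sql.catalog.lance=com.lancedb.lance.spark.LanceNamespaceSparkCatalog \
  --conf spark.sql.catalog.lance.impl=rest \
  --conf spark.sql.catalog.lance.uri=http://localhost:9101/lance \
  --conf spark.sql.catalog.lance.parent=lance_catalog \
  --conf spark.sql.defaultCatalog=lance \
  your_job.py
```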
## Using Lance REST with Ray

The snippet below writes and reads a Lance dataset through the Lance REST namespace.
```python
import ray
import lance_namespace as ln
from lance_ray import read_lance, write_lance

ray.init()

# Connect to the Lance REST namespace exposed by Gravitino.
namespace = ln.connect("rest", {"uri": "http://localhost:9101/lance"})

# Build a small dataset: each row's value is twice its id.
data = ray.data.range(1000).map(lambda row: {"id": row["id"], "value": row["id"] * 2})

# Write to, then read back from, the table my_table under lance_catalog.schema.
write_lance(data, namespace=namespace, table_id=["lance_catalog", "schema", "my_table"])
ray_dataset = read_lance(namespace=namespace, table_id=["lance_catalog", "schema", "my_table"])

result = ray_dataset.filter(lambda row: row["value"] < 100).count()
print(f"Filtered count: {result}")
```
:::note
- Ensure the target Lance catalog (`lance_catalog`) and schema (`schema`) already exist in Gravitino.
- The table path is represented as `["catalog", "schema", "table"]` when using the Lance Ray helpers.
:::
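As a sanity check on the example above: every row has `value = 2 * id`, so `value < 100` holds exactly for `id` 0 through 49, and the final count should be 50. The same arithmetic can be confirmed in plain Python, without a running Ray cluster:

```python
# Reproduce the map and filter from the Ray example with plain lists,
# to confirm the expected filtered count.
rows = [{"id": i, "value": i * 2} for i in range(1000)]
filtered = [row for row in rows if row["value"] < 100]
print(f"Filtered count: {len(filtered)}")  # Filtered count: 50
```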
## Troubleshooting

- **TypeError from `_BaseLanceDatasink.on_write_start()` when using Ray 2.53.3**: downgrade Ray to `2.40.0` (for example, `pip install 'ray==2.40.0'`).
- **ValueError "too many values to unpack" from the `lance_ray` datasink**: this is a known issue in `lance-ray`; track progress at [lance-format/lance-ray#68](https://github.com/lance-format/lance-ray/issues/68).
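When diagnosing either issue, it helps to confirm which versions are actually installed in the active environment. A quick check using the standard-library `importlib.metadata` (the PyPI distribution names `lance-ray` and `lance-namespace` are assumed here):

```python
from importlib.metadata import version, PackageNotFoundError

# Print the installed version of each relevant package, or flag it as missing.
for pkg in ["ray", "lance-ray", "lance-namespace", "pyspark"]:
    try:
        print(f"{pkg}: {version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
```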