2 changes: 1 addition & 1 deletion README.md
@@ -39,7 +39,7 @@
<tbody align="center">
<tr>
<td >2.3.*</td>
- <td rowspan=6><a href="https://github.com/dotnet/spark/releases/tag/v0.11.0">v0.11.0</a></td>
+ <td rowspan=6><a href="https://github.com/dotnet/spark/releases/tag/v0.12.0">v0.12.0</a></td>
</tr>
<tr>
<td>2.4.0</td>
2 changes: 1 addition & 1 deletion benchmark/scala/pom.xml
@@ -3,7 +3,7 @@
<modelVersion>4.0.0</modelVersion>
<groupId>com.microsoft.spark</groupId>
<artifactId>microsoft-spark-benchmark</artifactId>
- <version>0.11.0</version>
+ <version>0.12.0</version>
<inceptionYear>2019</inceptionYear>
<properties>
<encoding>UTF-8</encoding>
115 changes: 115 additions & 0 deletions docs/release-notes/0.12/release-0.12.md
@@ -0,0 +1,115 @@
# .NET for Apache Spark 0.12 Release Notes

### New Features/Improvements and Bug Fixes

* Expose `DataStreamWriter.ForeachBatch` API ([#549](https://github.com/dotnet/spark/pull/549)); a usage sketch follows this list
* Support for [dotnet-interactive](https://github.com/dotnet/interactive) ([#515](https://github.com/dotnet/spark/pull/515)) ([#517](https://github.com/dotnet/spark/pull/517)) ([#554](https://github.com/dotnet/spark/pull/554))
* Support for [Hyperspace v0.1.0](https://github.com/microsoft/hyperspace) APIs ([#555](https://github.com/dotnet/spark/pull/555)); a usage sketch follows this list
* Support for Spark 2.4.6 ([#547](https://github.com/dotnet/spark/pull/547))
* Bug fixes:
  * UDF bug caused by `BroadcastVariablesRegistry` ([#551](https://github.com/dotnet/spark/pull/551))
  * Null checks for `TimestampType` and `DateType` ([#530](https://github.com/dotnet/spark/pull/530))
* Update `Microsoft.Data.Analysis` to `v0.4.0` ([#528](https://github.com/dotnet/spark/pull/528))
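
The following are hedged C# sketches of the two new API surfaces called out above; they are illustrations only, and the data sources, host/port, paths, column names, and index name are placeholder assumptions rather than anything shipped in this release.

```csharp
// Minimal DataStreamWriter.ForeachBatch sketch. The socket source on
// localhost:9999 and the Parquet output path are illustrative assumptions.
using Microsoft.Spark.Sql;
using Microsoft.Spark.Sql.Streaming;

namespace ForeachBatchSketch
{
    class Program
    {
        static void Main(string[] args)
        {
            SparkSession spark = SparkSession
                .Builder()
                .AppName("ForeachBatchSketch")
                .GetOrCreate();

            DataFrame lines = spark
                .ReadStream()
                .Format("socket")
                .Option("host", "localhost")
                .Option("port", 9999)
                .Load();

            // ForeachBatch hands each micro-batch to the callback as a regular
            // DataFrame together with its batch id, so ordinary batch APIs
            // (here, a Parquet write) can be applied per batch.
            StreamingQuery query = lines
                .WriteStream()
                .ForeachBatch((df, batchId) =>
                    df.Write().Mode(SaveMode.Append).Parquet($"output/batch-{batchId}"))
                .Start();

            query.AwaitTermination();
        }
    }
}
```

And a similarly hedged sketch of the Hyperspace v0.1.0 bindings; the namespaces and `IndexConfig` signature shown here are assumptions to be checked against the published `Microsoft.Spark.Extensions.Hyperspace` package.

```csharp
// Hedged Hyperspace sketch: the source path, column names, and index name
// are assumptions for illustration only.
using Microsoft.Spark.Extensions.Hyperspace;
using Microsoft.Spark.Extensions.Hyperspace.Index;
using Microsoft.Spark.Sql;

namespace HyperspaceSketch
{
    class Program
    {
        static void Main(string[] args)
        {
            SparkSession spark = SparkSession
                .Builder()
                .AppName("HyperspaceSketch")
                .GetOrCreate();

            DataFrame df = spark.Read().Parquet("data/records.parquet");

            // Build an index over an assumed "id" column, including "name".
            var hyperspace = new Hyperspace(spark);
            hyperspace.CreateIndex(
                df, new IndexConfig("recordIndex", new[] { "id" }, new[] { "name" }));

            // Opt the session in so the optimizer can pick up Hyperspace indexes.
            spark.EnableHyperspace();

            hyperspace.Indexes().Show();
        }
    }
}
```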

### Infrastructure / Documentation / Etc.

* Improve build pipeline ([#510](https://github.com/dotnet/spark/pull/510)) ([#511](https://github.com/dotnet/spark/pull/511)) ([#512](https://github.com/dotnet/spark/pull/512)) ([#513](https://github.com/dotnet/spark/pull/513)) ([#524](https://github.com/dotnet/spark/pull/524))
* Update AppName for the C# Spark Examples ([#548](https://github.com/dotnet/spark/pull/548))
* Update maven links in build documentation ([#558](https://github.com/dotnet/spark/pull/558)) ([#560](https://github.com/dotnet/spark/pull/560))

### Breaking Changes

* None

> **Contributor:** Can you add a known issue section and add the issue where broadcast variables do not work with dotnet-interactive and reference #561?
>
> **Contributor:** Also if #530 goes in before 9am tomorrow, let's add it. I will update this thread.
>
> **Contributor:** Can we add #530 now?

### Known Issues

* Broadcast variables do not work with [dotnet-interactive](https://github.com/dotnet/interactive) ([#561](https://github.com/dotnet/spark/pull/561))

### Compatibility

#### Backward compatibility

The following table describes the oldest Microsoft.Spark.Worker version that the current release is compatible with, along with the new features that are incompatible with that older worker.

<table>
<thead>
<tr>
<th>Oldest compatible Microsoft.Spark.Worker version</th>
<th>Incompatible features</th>
</tr>
</thead>
<tbody align="center">
<tr>
<td rowspan=4>v0.9.0</td>
<td>DataFrame with Grouped Map UDF <a href="https://github.com/dotnet/spark/pull/277">(#277)</a></td>
</tr>
<tr>
<td>DataFrame with Vector UDF <a href="https://github.com/dotnet/spark/pull/277">(#277)</a></td>
</tr>
<tr>
<td>Support for Broadcast Variables <a href="https://github.com/dotnet/spark/pull/414">(#414)</a></td>
</tr>
<tr>
<td>Support for TimestampType <a href="https://github.com/dotnet/spark/pull/428">(#428)</a></td>
</tr>
</tbody>
</table>

#### Forward compatibility

The following table describes the oldest .NET for Apache Spark release that the current Microsoft.Spark.Worker is compatible with.

<table>
<thead>
<tr>
<th>Oldest compatible .NET for Apache Spark release version</th>
</tr>
</thead>
<tbody align="center">
<tr>
<td>v0.9.0</td>
</tr>
</tbody>
</table>

### Supported Spark Versions

The following table outlines the supported Spark versions along with the microsoft-spark JAR to use with each; a sample spark-submit invocation follows the table:

<table>
<thead>
<tr>
<th>Spark Version</th>
<th>microsoft-spark JAR</th>
</tr>
</thead>
<tbody align="center">
<tr>
<td>2.3.*</td>
<td>microsoft-spark-2.3.x-0.12.0.jar</td>
</tr>
<tr>
<td>2.4.0</td>
<td rowspan=6>microsoft-spark-2.4.x-0.12.0.jar</td>
</tr>
<tr>
<td>2.4.1</td>
</tr>
<tr>
<td>2.4.3</td>
</tr>
<tr>
<td>2.4.4</td>
</tr>
<tr>
<td>2.4.5</td>
</tr>
<tr>
<td>2.4.6</td>
</tr>
<tr>
<td>2.4.2</td>
<td><a href="https://github.com/dotnet/spark/issues/60">Not supported</a></td>
</tr>
</tbody>
</table>
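
For reference, a hedged example of how one of these JARs is typically passed to `spark-submit`; the application binary (`MyApp.dll`) and the `local[*]` master are placeholders, not part of this release:

```bash
# Run from the app's publish directory, with the matching microsoft-spark JAR
# available locally; MyApp.dll is a placeholder application name.
spark-submit \
  --class org.apache.spark.deploy.dotnet.DotnetRunner \
  --master local[*] \
  microsoft-spark-2.4.x-0.12.0.jar \
  dotnet MyApp.dll
```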
2 changes: 1 addition & 1 deletion eng/Versions.props
@@ -1,7 +1,7 @@
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
- <VersionPrefix>0.11.0</VersionPrefix>
+ <VersionPrefix>0.12.0</VersionPrefix>
<PreReleaseVersionLabel>prerelease</PreReleaseVersionLabel>
<RestoreSources>
$(RestoreSources);
2 changes: 1 addition & 1 deletion src/scala/pom.xml
@@ -7,7 +7,7 @@
<version>${microsoft-spark.version}</version>
<properties>
<encoding>UTF-8</encoding>
- <microsoft-spark.version>0.11.0</microsoft-spark.version>
+ <microsoft-spark.version>0.12.0</microsoft-spark.version>
</properties>

<modules>