
Commit f092160

Update librdkafka from 1.6.1 -> 1.7.0 (#915)
1 parent 57d6fd7 commit f092160

File tree

8 files changed: +75 additions, −25 deletions

CONTRIBUTING.md

Lines changed: 30 additions & 0 deletions
````diff
@@ -27,6 +27,8 @@ so if you feel something is missing feel free to send a pull request.
 * [Debugging](#debugging)
   * [Debugging C++](#debugging-c)
 
+* [Updating librdkafka version](#updating-librdkafka-version)
+
 ## What should I know before I get started?
 
 ### Contributor Agreement
@@ -190,3 +192,31 @@ gdb node
 ```
 
 You can add breakpoints and so on after that.
+
+## Updating librdkafka version
+
+librdkafka should be periodically updated to the latest release in https://github.com/edenhill/librdkafka/releases
+
+Steps to update:
+1. Update the `librdkafka` property in [`package.json`](https://github.com/Blizzard/node-rdkafka/blob/master/package.json) to the desired version.
+
+1. Update the librdkafka git submodule to that version's release commit (example below)
+
+   ```bash
+   cd deps/librdkafka
+   git checkout 77a013b7a2611f7bdc091afa1e56b1a46d1c52f5 # for version 1.7.0
+   ```
+
+1. Update [`config.d.ts`](https://github.com/Blizzard/node-rdkafka/blob/master/config.d.ts) and [`errors.d.ts`](https://github.com/Blizzard/node-rdkafka/blob/master/errors.d.ts) TypeScript definitions by running:
+   ```bash
+   node ci/librdkafka-defs-generator.js
+   ```
+   Note: this is run automatically during CI flows, but it's good to run it during the version upgrade pull request.
+
+1. Run `npm install` to build with the new version and fix any build errors that occur.
+
+1. Run unit tests: `npm run test`
+
+1. Run end-to-end tests: `npm run test:e2e`. This requires Kafka and ZooKeeper running locally.
+
+1. Update the version numbers referenced in the [`README.md`](https://github.com/Blizzard/node-rdkafka/blob/master/README.md) file to the new version.
````
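The steps above leave the same version string in several files (`package.json`, the generated headers in `config.d.ts` and `errors.d.ts`, and `README.md`). A minimal sketch of a consistency check one could run before opening the upgrade pull request; this helper is hypothetical, not part of the repository, and assumes it is executed from the repository root:

```typescript
import * as fs from 'fs';
import * as path from 'path';

// Hypothetical post-upgrade check (not part of this repo): verify that the
// "librdkafka" field in package.json matches the version stamped into the
// header that ci/librdkafka-defs-generator.js writes at the top of config.d.ts.
const root = process.cwd(); // assumes execution from the repository root
const pkg = JSON.parse(fs.readFileSync(path.join(root, 'package.json'), 'utf8'));
const header = fs.readFileSync(path.join(root, 'config.d.ts'), 'utf8').split(/\r?\n/)[0];
const stamped = header.match(/librdkafka (\d+\.\d+\.\d+)/)?.[1];

if (stamped !== pkg.librdkafka) {
  throw new Error(`Version mismatch: package.json has ${pkg.librdkafka}, config.d.ts has ${stamped}`);
}
console.log(`librdkafka ${pkg.librdkafka} is stamped consistently.`);
```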

README.md

Lines changed: 6 additions & 6 deletions
````diff
@@ -5,7 +5,7 @@ Copyright (c) 2016 Blizzard Entertainment.
 
 [https://github.com/blizzard/node-rdkafka](https://github.com/blizzard/node-rdkafka)
 
-[![Build Status](https://travis-ci.com/Blizzard/node-rdkafka.svg?branch=master)](https://travis-ci.com/Blizzard/node-rdkafka)
+[![Build Status](https://app.travis-ci.com/Blizzard/node-rdkafka.svg?branch=master)](https://app.travis-ci.com/github/Blizzard/node-rdkafka)
 [![npm version](https://badge.fury.io/js/node-rdkafka.svg)](https://badge.fury.io/js/node-rdkafka)
 
 # Looking for Collaborators!
@@ -16,7 +16,7 @@ I am looking for *your* help to make this project even better! If you're interes
 
 The `node-rdkafka` library is a high-performance NodeJS client for [Apache Kafka](http://kafka.apache.org/) that wraps the native [librdkafka](https://github.com/edenhill/librdkafka) library. All the complexity of balancing writes across partitions and managing (possibly ever-changing) brokers should be encapsulated in the library.
 
-__This library currently uses `librdkafka` version `1.6.1`.__
+__This library currently uses `librdkafka` version `1.7.0`.__
 
 ## Reference Docs
 
@@ -59,7 +59,7 @@ Using Alpine Linux? Check out the [docs](https://github.com/Blizzard/node-rdkafk
 
 ### Windows
 
-Windows build **is not** compiled from `librdkafka` source but it is rather linked against the appropriate version of [NuGet librdkafka.redist](https://www.nuget.org/packages/librdkafka.redist/) static binary that gets downloaded from `https://globalcdn.nuget.org/packages/librdkafka.redist.1.6.1.nupkg` during installation. This download link can be changed using the environment variable `NODE_RDKAFKA_NUGET_BASE_URL` that defaults to `https://globalcdn.nuget.org/packages/` when it's no set.
+The Windows build **is not** compiled from `librdkafka` source; it is instead linked against the appropriate version of the [NuGet librdkafka.redist](https://www.nuget.org/packages/librdkafka.redist/) static binary, which gets downloaded from `https://globalcdn.nuget.org/packages/librdkafka.redist.1.7.0.nupkg` during installation. This download link can be changed using the environment variable `NODE_RDKAFKA_NUGET_BASE_URL`, which defaults to `https://globalcdn.nuget.org/packages/` when it's not set.
 
 Requirements:
 * [node-gyp for Windows](https://github.com/nodejs/node-gyp#on-windows) (the easiest way to get it: `npm install --global --production windows-build-tools`; if your node version is 6.x or below, please use `npm install --global --production [email protected]`)
@@ -96,7 +96,7 @@ var Kafka = require('node-rdkafka');
 
 ## Configuration
 
-You can pass many configuration options to `librdkafka`. A full list can be found in `librdkafka`'s [Configuration.md](https://github.com/edenhill/librdkafka/blob/v1.6.1/CONFIGURATION.md)
+You can pass many configuration options to `librdkafka`. A full list can be found in `librdkafka`'s [Configuration.md](https://github.com/edenhill/librdkafka/blob/v1.7.0/CONFIGURATION.md)
 
 Configuration keys that have the suffix `_cb` are designated as callbacks. Some
 of these keys are informational and you can choose to opt-in (for example, `dr_cb`). Others are callbacks designed to
@@ -131,7 +131,7 @@ You can also get the version of `librdkafka`
 const Kafka = require('node-rdkafka');
 console.log(Kafka.librdkafkaVersion);
 
-// #=> 1.6.1
+// #=> 1.7.0
 ```
 
 ## Sending Messages
@@ -144,7 +144,7 @@ var producer = new Kafka.Producer({
 });
 ```
 
-A `Producer` requires only `metadata.broker.list` (the Kafka brokers) to be created. The values in this list are separated by commas. For other configuration options, see the [Configuration.md](https://github.com/edenhill/librdkafka/blob/v1.6.1/CONFIGURATION.md) file described previously.
+A `Producer` requires only `metadata.broker.list` (the Kafka brokers) to be created. The values in this list are separated by commas. For other configuration options, see the [Configuration.md](https://github.com/edenhill/librdkafka/blob/v1.7.0/CONFIGURATION.md) file described previously.
 
 The following example illustrates a list with several `librdkafka` options set.
 
````
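Since the README's version references are maintained by hand (per the contributing guide above), a startup assertion against the `Kafka.librdkafkaVersion` export shown in this diff can catch a stale native build early. A minimal sketch; the expected string `'1.7.0'` reflects this commit:

```typescript
import * as Kafka from 'node-rdkafka';

// Sketch: warn if the native addon was built against a different librdkafka
// than the one this release documents (1.7.0 after this commit).
const expected = '1.7.0';
if (Kafka.librdkafkaVersion !== expected) {
  console.warn(
    `node-rdkafka built against librdkafka ${Kafka.librdkafkaVersion}, expected ${expected}`
  );
}
```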

ci/librdkafka-defs-generator.js

Lines changed: 2 additions & 2 deletions
```diff
@@ -172,7 +172,7 @@ function updateErrorDefinitions(file) {
   // validate body
   const emptyCheck = body
     .replace(/(( \/\*)|( ?\*)).*/g, '')
-    .replace(/ ERR_\w+: -?\d+,?\n/g, '')
+    .replace(/ ERR_\w+: -?\d+,?\r?\n/g, '')
     .trim()
   if (emptyCheck !== '') {
     throw new Error(`Fail to parse ${file}. It contains these extra details:\n${emptyCheck}`);
@@ -184,7 +184,7 @@ function updateErrorDefinitions(file) {
     .replace(/(\/\/.*\n)?LibrdKafkaError.codes = {[^}]+/g, `${getHeader(file)}\nLibrdKafkaError.codes = {\n${body}`)
 
   fs.writeFileSync(error_js_file, error_js);
-  fs.writeFileSync(path.resolve(__dirname, '../errors.d.ts'), `${getHeader(file)}\nexport const CODES: { ERRORS: {${body.replace(/[ \.]*(\*\/\n \w+: )(-?\d+),?/g, ' (**$2**) $1number,')}}}`)
+  fs.writeFileSync(path.resolve(__dirname, '../errors.d.ts'), `${getHeader(file)}\nexport const CODES: { ERRORS: {${body.replace(/[ \.]*(\*\/\r?\n \w+: )(-?\d+),?/g, ' (**$2**) $1number,')}}}`)
 }
 
 (async function updateTypeDefs() {
```

config.d.ts

Lines changed: 28 additions & 13 deletions
```diff
@@ -1,4 +1,4 @@
-// ====== Generated from librdkafka 1.6.1 file CONFIGURATION.md ======
+// ====== Generated from librdkafka 1.7.0 file CONFIGURATION.md ======
 // Code that generated this is a derivative work of the code from Nam Nguyen
 // https://gist.github.com/ntgn81/066c2c8ec5b4238f85d1e9168a04e3fb
 
@@ -62,13 +62,6 @@ export interface GlobalConfig {
    */
   "max.in.flight"?: number;
 
-  /**
-   * Non-topic request timeout in milliseconds. This is for metadata requests, etc.
-   *
-   * @default 60000
-   */
-  "metadata.request.timeout.ms"?: number;
-
   /**
    * Period of time in milliseconds at which topic and broker metadata is refreshed in order to proactively discover any new brokers, topics, partitions or partition leader changes. Use -1 to disable the intervalled refresh (not recommended). If there are no locally referenced topics (no topic objects created, no messages produced, no subscription or no assignment) then only the broker list will be refreshed every interval but no more often than every 10s.
    *
@@ -184,6 +177,13 @@ export interface GlobalConfig {
    */
   "broker.address.family"?: 'any' | 'v4' | 'v6';
 
+  /**
+   * Close broker connections after the specified time of inactivity. Disable with 0. If this property is left at its default value some heuristics are performed to determine a suitable default value, this is currently limited to identifying brokers on Azure (see librdkafka issue #3109 for more info).
+   *
+   * @default 0
+   */
+  "connections.max.idle.ms"?: number;
+
   /**
    * **DEPRECATED** No longer used. See `reconnect.backoff.ms` and `reconnect.backoff.max.ms`.
    *
@@ -403,8 +403,6 @@ export interface GlobalConfig {
 
   /**
    * File or directory path to CA certificate(s) for verifying the broker's key. Defaults: On Windows the system's CA certificates are automatically looked up in the Windows Root certificate store. On Mac OSX this configuration defaults to `probe`. It is recommended to install openssl using Homebrew, to provide CA certificates. On Linux install the distribution's ca-certificates package. If OpenSSL is statically linked or `ssl.ca.location` is set to `probe` a list of standard paths will be probed and the first one found will be used as the default CA certificate location path. If OpenSSL is dynamically linked the OpenSSL library's default path will be used (see `OPENSSLDIR` in `openssl version -a`).
-   *
-   * @default probe
    */
   "ssl.ca.location"?: string;
 
@@ -435,6 +433,23 @@ export interface GlobalConfig {
    */
   "ssl.keystore.password"?: string;
 
+  /**
+   * Path to OpenSSL engine library. OpenSSL >= 1.1.0 required.
+   */
+  "ssl.engine.location"?: string;
+
+  /**
+   * OpenSSL engine id is the name used for loading engine.
+   *
+   * @default dynamic
+   */
+  "ssl.engine.id"?: string;
+
+  /**
+   * OpenSSL engine callback data (set with rd_kafka_conf_set_engine_callback_data()).
+   */
+  "ssl_engine_callback_data"?: any;
+
   /**
    * Enable OpenSSL's builtin broker (server) certificate verification. This verification can be extended by the application by implementing a certificate_verify_cb.
    *
@@ -708,7 +723,7 @@ export interface ConsumerGlobalConfig extends GlobalConfig {
   /**
    * Client group session and failure detection timeout. The consumer sends periodic heartbeats (heartbeat.interval.ms) to indicate its liveness to the broker. If no heartbeats are received by the broker for a group member within the session timeout, the broker will remove the consumer from the group and trigger a rebalance. The allowed range is configured with the **broker** configuration properties `group.min.session.timeout.ms` and `group.max.session.timeout.ms`. Also see `max.poll.interval.ms`.
    *
-   * @default 10000
+   * @default 45000
    */
   "session.timeout.ms"?: number;
 
@@ -966,14 +981,14 @@ export interface ProducerTopicConfig extends TopicConfig {
 
 export interface ConsumerTopicConfig extends TopicConfig {
   /**
-   * **DEPRECATED** [**LEGACY PROPERTY:** This property is used by the simple legacy consumer only. When using the high-level KafkaConsumer, the global `enable.auto.commit` property must be used instead]. If true, periodically commit offset of the last message handed to the application. This committed offset will be used when the process restarts to pick up where it left off. If false, the application will have to call `rd_kafka_offset_store()` to store an offset (optional). **NOTE:** There is currently no zookeeper integration, offsets will be written to broker or local file according to offset.store.method.
+   * **DEPRECATED** [**LEGACY PROPERTY:** This property is used by the simple legacy consumer only. When using the high-level KafkaConsumer, the global `enable.auto.commit` property must be used instead]. If true, periodically commit offset of the last message handed to the application. This committed offset will be used when the process restarts to pick up where it left off. If false, the application will have to call `rd_kafka_offset_store()` to store an offset (optional). Offsets will be written to broker or local file according to offset.store.method.
    *
    * @default true
    */
   "auto.commit.enable"?: boolean;
 
   /**
-   * **DEPRECATED** Alias for `auto.commit.enable`: [**LEGACY PROPERTY:** This property is used by the simple legacy consumer only. When using the high-level KafkaConsumer, the global `enable.auto.commit` property must be used instead]. If true, periodically commit offset of the last message handed to the application. This committed offset will be used when the process restarts to pick up where it left off. If false, the application will have to call `rd_kafka_offset_store()` to store an offset (optional). **NOTE:** There is currently no zookeeper integration, offsets will be written to broker or local file according to offset.store.method.
+   * **DEPRECATED** Alias for `auto.commit.enable`: [**LEGACY PROPERTY:** This property is used by the simple legacy consumer only. When using the high-level KafkaConsumer, the global `enable.auto.commit` property must be used instead]. If true, periodically commit offset of the last message handed to the application. This committed offset will be used when the process restarts to pick up where it left off. If false, the application will have to call `rd_kafka_offset_store()` to store an offset (optional). Offsets will be written to broker or local file according to offset.store.method.
    *
    * @default true
    */
```
deps/librdkafka

Submodule pointer updated to the librdkafka 1.7.0 release commit.

errors.d.ts

Lines changed: 3 additions & 1 deletion
```diff
@@ -1,4 +1,4 @@
-// ====== Generated from librdkafka 1.6.1 file src-cpp/rdkafkacpp.h ======
+// ====== Generated from librdkafka 1.7.0 file src-cpp/rdkafkacpp.h ======
 export const CODES: { ERRORS: {
   /* Internal errors to rdkafka: */
   /** Begin internal error codes (**-200**) */
@@ -126,8 +126,10 @@ export const CODES: { ERRORS: {
   ERR__NOOP: number,
   /** No offset to automatically reset to (**-140**) */
   ERR__AUTO_OFFSET_RESET: number,
+
   /** End internal error codes (**-100**) */
   ERR__END: number,
+
   /* Kafka broker errors: */
   /** Unknown broker error (**-1**) */
   ERR_UNKNOWN: number,
```

lib/error.js

Lines changed: 3 additions & 1 deletion
```diff
@@ -27,7 +27,7 @@ LibrdKafkaError.wrap = errorWrap;
  * @enum {number}
  * @constant
  */
-// ====== Generated from librdkafka 1.6.1 file src-cpp/rdkafkacpp.h ======
+// ====== Generated from librdkafka 1.7.0 file src-cpp/rdkafkacpp.h ======
 LibrdKafkaError.codes = {
 
   /* Internal errors to rdkafka: */
@@ -156,8 +156,10 @@ LibrdKafkaError.codes = {
   ERR__NOOP: -141,
   /** No offset to automatically reset to */
   ERR__AUTO_OFFSET_RESET: -140,
+
   /** End internal error codes */
   ERR__END: -100,
+
   /* Kafka broker errors: */
   /** Unknown broker error */
   ERR_UNKNOWN: -1,
```
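These numeric codes reach applications through the `CODES.ERRORS` map declared in `errors.d.ts`, so handlers can match on names instead of raw numbers. A brief sketch, assuming the library's standard `event.error` emitter and placeholder connection details:

```typescript
import * as Kafka from 'node-rdkafka';

const consumer = new Kafka.KafkaConsumer(
  { 'group.id': 'example-group', 'metadata.broker.list': 'localhost:9092' }, // placeholders
  {}
);

// Sketch: compare an emitted error's code against the generated map rather
// than hard-coding -140.
consumer.on('event.error', (err) => {
  if (err.code === Kafka.CODES.ERRORS.ERR__AUTO_OFFSET_RESET) {
    console.error('No offset to automatically reset to:', err.message);
  }
});
```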

package.json

Lines changed: 2 additions & 1 deletion
```diff
@@ -2,12 +2,13 @@
   "name": "node-rdkafka",
   "version": "v2.11.0",
   "description": "Node.js bindings for librdkafka",
-  "librdkafka": "1.6.1",
+  "librdkafka": "1.7.0",
   "main": "lib/index.js",
   "scripts": {
     "configure": "node-gyp configure",
     "build": "node-gyp build",
     "test": "make test",
+    "test:e2e": "make e2e",
     "install": "node-gyp rebuild",
     "prepack": "node ./ci/prepublish.js"
   },
```
