
kafka-observable

kafka-observable is the easiest way to exchange messages through Kafka with Node.js.

Using the solid no-kafka library as its default client, kafka-observable creates RxJS observables that can be manipulated as if you were using Kafka Streams, but with an interface familiar to JavaScript developers.

Why observables?

Think of observables as collections whose elements arrive over time. You can iterate over them, which means you can also apply filter, map or reduce.

Many of the operations provided by observables are very similar to the capabilities of Kafka Streams, including the ability to use window operators (which accumulate values over a period of time), as shown in the sketch below.
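
A minimal sketch, assuming the full set of RxJS operators is available on the returned observable: it counts the messages arriving on a hypothetical page_views topic in consecutive 10-second windows using bufferTime (broker address, groupId and topic name are illustrative).

const opts = { brokers: 'kafka://127.0.0.1:9092', groupId: 'analytics' };
const KafkaObservable = require('kafka-observable')(opts);

// accumulate the messages received every 10 seconds, then count them
const countsPerWindow = KafkaObservable.fromTopic('page_views')
    .bufferTime(10000)
    .map(messages => messages.length);

countsPerWindow.subscribe(count => console.info(`messages in the last 10s: ${count}`));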

Installation

npm install --save kafka-observable

Example usage

Imagine your customers can subscribe to out-of-stock products in your online store to receive an email when a product is back in stock. Your stock management system publishes updates to a Kafka topic called inventory_updates and your mailer consumes from a notifications topic.

const opts = { 
    brokers: 'kafka://kafka-broker.example.com:9092', 
    groupId: 'inventory-notifications' 
};

const KafkaObservable = require('kafka-observable')(opts);
// assumes getWatchers executes a network request and returns an observable
const getWatchers = require('./lib/observables/watchers');

const subscription = KafkaObservable.fromTopic('inventory_updates')
    // parses the message as JSON
    .let(KafkaObservable.JSONMessage())
    // keeps only products that just came back in stock
    .filter(({inventory}) => inventory.previous === 0 && inventory.current > 0)
    // fetches the watchers, pairs them with the product and flattens the inner observable
    .concatMap(product =>
        getWatchers(product.id)
            .map(watchers => ({ watchers, product })))
    // publishes the formatted message to a new topic and flattens the inner observable
    .concatMap(message => KafkaObservable.toTopic('notifications', message));

subscription.subscribe(success => console.log(success), err => console.error(err));    

Methods

fromTopic(topic, options, adapterFactory = defaultAdapterFactory)

Creates an observable that will consume from a Kafka topic.

Parameters

  • topic (String) - topic name
  • options (Object) - client options
  • adapterFactory (Object) - client adapter factory (defaults to the no-kafka adapter)

Examples

KafkaObservable as a function:

const opts = { brokers: 'kafka://127.0.0.1:9092', groupId: 'test' };

const KafkaObservable = require('kafka-observable')(opts);
const consumer = KafkaObservable.fromTopic('my_topic')
    .map(({message}) => message.value.toString('utf8'));

consumer.subscribe(message => console.info(message));

Passing options parameter to fromTopic:

const opts = { brokers: 'kafka://127.0.0.1:9092', groupId: 'test' };

const KafkaObservable = require('kafka-observable');
const consumer = KafkaObservable.fromTopic('my_topic', opts)
    .map(({message}) => message.value.toString('utf8'));

consumer.subscribe(message => console.info(message));

Options

Below are the main consumer options. For more consumer options, please refer to the no-kafka documentation (if you use the provided default adapter).

| Option | Required | Type | Default | Description |
| --- | --- | --- | --- | --- |
| brokers | yes | Array/String | - | list of Kafka brokers |
| groupId | yes | String | - | consumer group id |
| autoCommit | no | boolean | true | commits the message offset automatically if no exception is thrown |
| strategy | no | String | Default | name of the assignment strategy for the consumer (Default/Consistent/WeightedRoundRobin) |
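
For illustration, a sketch of a consumer configured with the options above (broker address, group id and topic name are illustrative; autoCommit and strategy are passed through to the underlying adapter):

const opts = {
    brokers: ['kafka://127.0.0.1:9092'],   // Array or String
    groupId: 'inventory-notifications',
    autoCommit: false,                     // disable automatic offset commits
    strategy: 'WeightedRoundRobin'         // assignment strategy name
};

const KafkaObservable = require('kafka-observable')(opts);
const consumer = KafkaObservable.fromTopic('inventory_updates');

consumer.subscribe(message => console.info(message));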

toTopic(topic, messages, options, adapterFactory = defaultAdapterFactory)

Creates an observable that publishes messages to a Kafka topic.

Parameters

  • topic (String) - topic name
  • messages (String|Array|Observable) - messages to be published to the Kafka topic
  • options (Object) - client options
  • adapterFactory (Object) - client adapter factory (defaults to the no-kafka adapter)

Examples

KafkaObservable as a function:

const opts = { brokers: 'kafka://127.0.0.1:9092' };
const messages = [{key: 'value1'}, {key: 'value2'}];

const KafkaObservable = require('kafka-observable')(opts);
const producer = KafkaObservable.toTopic('my_topic', messages);

producer.subscribe(message => console.info(message));

Passing options parameter to toTopic:

const { Observable } = require('rxjs');
const opts = { brokers: 'kafka://127.0.0.1:9092' };
const messages = Observable.from([{key: 'value1'}, {key: 'value2'}]);

const KafkaObservable = require('kafka-observable');
const producer = KafkaObservable.toTopic('my_topic', messages, opts);

producer.subscribe(message => console.info(message));

Options

Below are the main producer options. For more producer options, please refer to the no-kafka documentation (if you use the provided default adapter).

| Option | Required | Type | Default | Description |
| --- | --- | --- | --- | --- |
| brokers | yes | Array/String | - | list of Kafka brokers |
| partitioner | no | prototype/String | Default | name (Default/HashCRC32) or prototype (instance of Kafka.DefaultPartitioner) to use as producer partitioner |
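
A minimal sketch of a producer using a named partitioner (broker address, topic and messages are illustrative):

const opts = {
    brokers: 'kafka://127.0.0.1:9092',
    partitioner: 'HashCRC32'   // named partitioner; a Kafka.DefaultPartitioner instance also works
};

const KafkaObservable = require('kafka-observable')(opts);
const producer = KafkaObservable.toTopic('my_topic', [{key: 'value1'}, {key: 'value2'}]);

producer.subscribe(message => console.info(message));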

TextMessage(mapper = (x) => x)

Convenience operator which converts a Buffer message value into a UTF-8 string.

Parameters

  • mapper (Function) - mapper function

Example

const opts = { brokers: 'kafka://127.0.0.1:9092', groupId: 'test' };

const KafkaObservable = require('kafka-observable');
const consumer = KafkaObservable.fromTopic('my_topic', opts)
    .let(KafkaObservable.TextMessage());

consumer.subscribe(message => console.info(message));
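
The optional mapper is applied to each message; in this sketch it is assumed to receive the decoded UTF-8 string and is used to normalize it:

const normalized = KafkaObservable.fromTopic('my_topic', opts)
    .let(KafkaObservable.TextMessage(text => text.trim().toLowerCase()));

normalized.subscribe(message => console.info(message));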

JSONMessage(mapper = (x) => x)

Convenience operator provided to deserialize an object from a JSON message.

Parameters

  • mapper (Function) - mapper function

Example

const opts = { brokers: 'kafka://127.0.0.1:9092', groupId: 'test' };

const KafkaObservable = require('kafka-observable');
const consumer = KafkaObservable.fromTopic('my_topic', opts)
    .let(KafkaObservable.JSONMessage());

consumer.subscribe(json => console.info(json.key));
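
Similarly, the optional mapper can project each deserialized object; here it is assumed to receive the parsed JSON and extracts a single field:

const keys = KafkaObservable.fromTopic('my_topic', opts)
    .let(KafkaObservable.JSONMessage(json => json.key));

keys.subscribe(key => console.info(key));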

Custom Kafka Adapter

If you don't want to use no-kafka, you can write an adapter for your client that respects the interface established by the code in lib/client.

Why an adapter?

I currently use an internal Kafka client at Netflix with an interface very similar to this adapter, and I wanted it to work out of the box.

Development

Unit tests

npm install
npm run unit-test

Integration tests

Requires Docker to be installed and accessible through the docker command.

npm install
docker pull spotify/kafka
npm run unit-test

Test coverage

based on unit tests

npm install
npm run coverage
open coverage/lcov-report/index.html 

Documentation

npm install
npm run gen-docs
open out/index.html 

License: MIT