RabbitMQ Stream tutorial - "Hello World!"
Introduction
Prerequisites
This tutorial assumes RabbitMQ is installed, running on localhost, and that the stream plugin is enabled. The standard stream port is 5552. If you use a different host, port, or credentials, the connection settings will require adjusting.
Using Docker
If you don't have RabbitMQ installed, you can run it in a Docker container:
docker run -it --rm --name rabbitmq -p 5552:5552 -p 15672:15672 -p 5672:5672 \
-e RABBITMQ_SERVER_ADDITIONAL_ERL_ARGS='-rabbitmq_stream advertised_host localhost' \
rabbitmq:3.13
Wait for the server to start and then enable the stream and stream management plugins:
docker exec rabbitmq rabbitmq-plugins enable rabbitmq_stream rabbitmq_stream_management
Where to get help
If you're having trouble going through this tutorial, you can contact us through the mailing list or the Discord community server.
RabbitMQ Streams was introduced in RabbitMQ 3.9. More information is available here.
"Hello World"
(using the Node.js Stream Client)
In this part of the tutorial we'll write two programs in JavaScript: a producer that sends a single message, and a consumer that receives messages and prints them out. We'll gloss over some of the details in the JavaScript client API, concentrating on this very simple thing just to get started. It's the "Hello World" of RabbitMQ Streams.
The Node.js stream client library
RabbitMQ speaks multiple protocols. This tutorial uses the RabbitMQ stream protocol, a dedicated protocol for RabbitMQ streams. There are a number of clients for RabbitMQ in many different languages; see the stream client libraries for each language. We'll use the Node.js stream client built and supported by Coders51.
The client supports Node.js >= 16.x. This tutorial uses version 0.3.1 of the Node.js stream client. Versions 0.3.1 and later are distributed via npm.
This tutorial assumes you are using PowerShell on Windows. On macOS and Linux nearly any shell will work.
Setup
First let's verify that you have the Node.js toolchain in PATH:
npm --help
Running that command should produce a help message.
Now let's create a project:
npm init
then install the client:
npm install rabbitmq-stream-js-client
This is what package.json should look like:
{
  "name": "rabbitmq-stream-node-tutorial",
  "version": "1.0.0",
  "description": "Tutorial for the nodejs RabbitMQ stream client",
  "scripts": {
    "send": "node send.js",
    "receive": "node receive.js"
  },
  "dependencies": {
    "rabbitmq-stream-js-client": "^0.3.1"
  }
}
Now create new files named receive.js and send.js.
Now that we have the Node.js project set up, we can write some code.
Sending
We'll call our message producer (sender) send.js and our message consumer (receiver) receive.js. The producer will connect to RabbitMQ, send a single message, then exit.
In send.js, we need to add the client:
const rabbit = require("rabbitmq-stream-js-client")
then we can create a connection to the server:
const client = await rabbit.connect({
  hostname: "localhost",
  port: 5552,
  username: "guest",
  password: "guest",
  vhost: "/",
})
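Note that the snippets in this tutorial use await at the top level. When they are run with CommonJS require, as here, they need to sit inside an async function; below is a minimal sketch of such a wrapper (the main function name and log messages are just an illustration, not part of the client API):

// Wrap the tutorial snippets in an async function so that `await` can be used with CommonJS modules.
async function main() {
  // ... connection, stream, publisher and send code from this tutorial goes here ...
}

main()
  .then(() => console.log("done!"))
  .catch((err) => console.error(err))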
The entry point of the client is the Client class. It is used for stream management and the creation of publisher instances. It abstracts the socket connection, and takes care of protocol version negotiation, authentication, and so on for us.
This tutorial assumes that the stream publisher and consumer connect to a RabbitMQ node running locally, that is, on localhost. To connect to a node on a different machine, simply specify the target hostname or IP address in the Client connection parameters.
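For example, connecting to a remote node only changes those parameters; here is a sketch with placeholder values (the hostname and credentials below are not real, and note that the default guest user is only allowed to connect from localhost):

// Connect to a RabbitMQ node on another machine.
// The hostname and credentials below are placeholders.
const client = await rabbit.connect({
  hostname: "rabbitmq.example.com",
  port: 5552,
  username: "my-user",
  password: "my-password",
  vhost: "/",
})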
Next let's create a producer.
The producer will also declare a stream it will publish messages to and then publish a message:
const streamName = "hello-nodejs-stream";
console.log("Connecting...");
const client = await rabbit.connect({
  vhost: "/",
  port: 5552,
  hostname: "localhost",
  username: "guest",
  password: "guest",
});
console.log("Making sure the stream exists...");
const streamSizeRetention = 5 * 1e9;
await client.createStream({ stream: streamName, arguments: { "max-length-bytes": streamSizeRetention } });
const publisher = await client.declarePublisher({ stream: streamName });
console.log("Sending a message...");
await publisher.send(Buffer.from("Test message"));
The stream declaration operation is idempotent: the stream will only be created if it doesn't exist already.
A stream is an append-only log abstraction that allows for repeated consumption of messages until they expire. It is good practice to always define the retention policy. In the example above, the stream is limited to about 5 GB in size.
The message content is a byte array. Applications can encode the data they need to transfer using any appropriate format such as JSON, MessagePack, and so on.
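For instance, instead of a plain string the producer could serialize a JavaScript object to JSON before sending it (the payload shape below is just an illustration):

// Serialize an application object to JSON and send it as a byte array.
const payload = { greeting: "Hello World!", sentAt: new Date().toISOString() };
await publisher.send(Buffer.from(JSON.stringify(payload)));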
When the producer code finishes running, the producer connection and the stream-system connection will be closed. That's it for our producer.
Each time the producer is run, it will send a single message to the server and the message will be appended to the stream.
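To let the Node.js process exit promptly, the producer can close the client explicitly once the message has been sent; a minimal sketch, assuming the close method shown in the client's published examples (check the version you installed):

// Close the client connection, and with it the publisher, so the process can exit cleanly.
await client.close();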
The complete send.js file can be found on GitHub.
Sending does not work
If this is your first time using RabbitMQ and the message doesn't seem to arrive, you may be left scratching your head wondering what could be wrong. Maybe the broker was started without enough free disk space (by default it needs at least 50 MB free) and is therefore refusing to accept messages. Check the broker log file to see if there is a resource alarm logged and reduce the free disk space threshold if necessary. The Configuration guide will show you how to set disk_free_limit.
Another reason may be that the program exits before the message makes it to the broker. Sending is asynchronous in some client libraries: the function returns immediately while the message is enqueued in the I/O layer before going over the wire. The stream protocol provides a confirm mechanism to make sure the broker receives outbound messages, but this tutorial does not use this mechanism for simplicity's sake.
Receiving
The other part of this tutorial, the consumer, will connect to a RabbitMQ node and wait for messages to be pushed to it. Unlike the producer, which in this tutorial publishes a single message and stops, the consumer keeps running, consuming the messages RabbitMQ pushes to it and printing the received payloads out.
Similarly to send.js, receive.js will need to use the client:
const rabbit = require("rabbitmq-stream-js-client")
When it comes to the initial setup, the consumer part is very similar to the producer one; we use the default connection settings and declare the stream from which the consumer will consume.
const streamName = "hello-nodejs-stream";
const client = await rabbit.connect({
  hostname: "localhost",
  port: 5552,
  username: "guest",
  password: "guest",
  vhost: "/",
})
const streamSizeRetention = 5 * 1e9;
await client.createStream({ stream: streamName, arguments: { "max-length-bytes": streamSizeRetention } });
Note that the consumer part also declares the stream. This is to allow either part to be started first, be it the producer or the consumer.
We use the declareConsumer method to create the consumer. We provide a callback to process delivered messages. The offset parameter defines the starting point of the consumer. In this case, the consumer starts from the very first message available in the stream.
await client.declareConsumer({ stream: streamName, offset: rabbit.Offset.first() }, (message) => {
  console.log(`Received message ${message.content.toString()}`)
})
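Other starting points are possible as well. For example, a consumer can attach to the end of the stream and only receive messages published after it starts; a sketch assuming the client's Offset.next() helper behaves as documented:

// Start from the next message written to the stream instead of the very first one.
await client.declareConsumer({ stream: streamName, offset: rabbit.Offset.next() }, (message) => {
  console.log(`Received message ${message.content.toString()}`)
})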
The complete receive.js file can be found on GitHub.
Putting It All Together
In order to run both examples, open two terminal (shell) tabs.
Both parts of this tutorial can be run in any order, as they both declare the stream. Let's run the consumer first so that, when the producer is started, the consumer prints the message it publishes:
npm run receive
Then run the producer:
npm run send
The consumer will print the message it gets from the publisher via RabbitMQ. The consumer will keep running, waiting for new deliveries. Try re-running the publisher several times to observe that.
Streams are different from queues in that they are append-only logs of messages that can be consumed repeatedly. When multiple consumers consume from a stream, they will start from the first available message.