Ingest data from Red Hat AMQ Streams
You can ingest data from Red Hat AMQ Streams into RisingWave by using the Kafka source connector.
AMQ Streams offers a distributed backbone that allows microservices and other applications to share data with extremely high throughput and extremely low latency. It is based on the Strimzi and Apache Kafka projects. AMQ Streams features a wide range of capabilities, including publish and subscribe functionality, long-term data retention, advanced queueing, replayable events, and partitioned messages for scalability.
Prerequisites
Before ingesting data from Red Hat AMQ Streams into RisingWave, please ensure the following:
- The AMQ Streams cluster is running and accessible from your RisingWave cluster.
- If authentication is required for the AMQ Streams cluster, ensure you have the client username and password.
- Create the AMQ Streams topic from which you want to ingest data.
- Ensure that your RisingWave cluster is running.
For example, we create a topic named financial-transactions containing sample financial transaction data, formatted as JSON. Each message represents a unique transaction with distinct transaction IDs, sender and receiver accounts, amounts, currencies, and timestamps. Because AMQ Streams is compatible with Apache Kafka, RisingWave can consume the topic through its Kafka source connector. For more information, refer to Apache Kafka.
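For illustration, a few hypothetical messages in the financial-transactions topic (field names and values are examples only, chosen to match the fields described above) might look like this:

```json
{"transaction_id": "TX1001", "sender_account": "ACC-2001", "receiver_account": "ACC-3005", "amount": 250.75, "currency": "USD", "timestamp": "2023-11-05T09:15:32Z"}
{"transaction_id": "TX1002", "sender_account": "ACC-2002", "receiver_account": "ACC-3001", "amount": 1800.00, "currency": "EUR", "timestamp": "2023-11-05T09:16:10Z"}
{"transaction_id": "TX1003", "sender_account": "ACC-2003", "receiver_account": "ACC-3002", "amount": 99.99, "currency": "GBP", "timestamp": "2023-11-05T09:17:45Z"}
```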
Ingest data into RisingWave
Create a table
In RisingWave, create a table named financial-transactions to connect RisingWave to the AMQ Streams topic.
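Below is a minimal sketch of the table definition, assuming the hypothetical JSON schema shown earlier and a bootstrap server reachable at broker:9092. Adjust the bootstrap address, startup mode, and any authentication settings to match your AMQ Streams cluster; see the Kafka connector documentation for the full list of supported options.

```sql
CREATE TABLE IF NOT EXISTS "financial-transactions" ( -- hyphenated names must be double-quoted in SQL
    transaction_id   VARCHAR,
    sender_account   VARCHAR,
    receiver_account VARCHAR,
    amount           DOUBLE PRECISION,
    currency         VARCHAR,
    "timestamp"      TIMESTAMPTZ
)
WITH (
    connector = 'kafka',
    topic = 'financial-transactions',
    properties.bootstrap.server = 'broker:9092', -- replace with your AMQ Streams bootstrap address
    scan.startup.mode = 'earliest'
    -- add the connector's SASL/TLS properties here if your cluster requires authentication
) FORMAT PLAIN ENCODE JSON;
```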
Query the table
Let’s retrieve data from the created table:
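For example, using the quoted table name from the sketch above:

```sql
SELECT * FROM "financial-transactions" LIMIT 3;
```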
Expected result:
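Assuming the hypothetical sample messages shown earlier, the output would look roughly like this:

```
 transaction_id | sender_account | receiver_account | amount  | currency |        timestamp
----------------+----------------+------------------+---------+----------+---------------------------
 TX1001         | ACC-2001       | ACC-3005         |  250.75 | USD      | 2023-11-05 09:15:32+00:00
 TX1002         | ACC-2002       | ACC-3001         |    1800 | EUR      | 2023-11-05 09:16:10+00:00
 TX1003         | ACC-2003       | ACC-3002         |   99.99 | GBP      | 2023-11-05 09:17:45+00:00
(3 rows)
```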
You have consumed data from an AMQ Streams topic into RisingWave, created a table, and then queried it.