Commit 377132d

committed
Ready for publishing
1 parent 8137813 commit 377132d

File tree

1 file changed: 24 additions, 13 deletions

docs/modules/ROOT/pages/sql-gen-data.adoc
@@ -6,11 +6,11 @@ Make sure to rename this file to the name of your repository and add the filenam
 = Generating Streaming data using SQL
 // Add required variables
 :page-layout: tutorial
-:page-product: cloud // Required: Define the product filter for this tutorial. Add one of the following: platform, imdg, cloud, operator
-:page-categories: Stream Processing, Get Started, SQL // Optional: Define the categories for this tutorial. Check the current categories on the tutorials homepage (https://docs.hazelcast.com/tutorials/). Add one or more of the existing categories or add new ones as a comma-separated list. Make sure that you use title case for all categories.
-:page-lang: sql // Optional: Define what Hazelcast client languages are supported by this tutorial. Leave blank or add one or more of: java, go, python, cplus, node, csharp.
-:page-enterprise: // Required: Define whether this tutorial requires an Enterprise license (true or blank)
-:page-est-time: 10 mins // Required: Define the estimated number of time required to complete the tutorial in minutes. For example, 10 mins
+:page-product: cloud
+:page-categories: Stream Processing, Get Started, SQL
+:page-lang: sql
+:page-enterprise:
+:page-est-time: 10 mins
 :description: Use SQL on Hazelcast to generate randomized streaming data for demo/POC purposes.
 // Required: Summarize what this tutorial is about in a sentence or two. What you put here is reused as the tutorial's first paragraph and included in HTML description tags. Start the sentence with an action verb such as 'Deploy' or 'Connect'.

@@ -38,9 +38,9 @@ In an upcoming release of Hazelcast, the Kafka Connect connector will expose SQL

 Before starting this tutorial, make sure that you meet the following prerequisites:

-* Running cluster of Hazelcast
+* A running Hazelcast cluster, either Viridian or Platform
 * Connection to SQL command line, either through CLC or through Management Center
-* (Optional) a Kafka instance accessible by your Hazelcast cluster
+* (For Step 2) a Kafka instance accessible by your Hazelcast cluster


 == Step 1. Generating Data
@@ -55,8 +55,9 @@ You can choose one of these approaches to write your tutorial part:
 Whatever option you choose when designing your tutorial should be carried through in subsequent parts.
 ////

-The following code is from the link:https://docs.hazelcast.com/tutorials/SQL-Basics-on-Viridian[SQL Basics on Viridian (Stock Ticker Demo) tutorial]. The comments break down what each part of the code is doing.
+. Look through the following SQL code from the link:https://docs.hazelcast.com/tutorials/SQL-Basics-on-Viridian[SQL Basics on Viridian (Stock Ticker Demo)] tutorial. The comments explain what the code is doing.

++
 ```sql
 CREATE OR REPLACE VIEW trades AS
 SELECT id,
@@ -99,7 +100,9 @@ FROM
 <4> We seed the timestamp with a base value that equates to 21 Feb 2022. You can change this to any reasonable Unix timestamp.
 <5> The `generate_stream` function is what makes this all work. In this example, we're generating 100 events per second.

-Once this view is created, you can access it via SQL. Because you're looking at streaming data, you'll need to use CTRL-C to stop each query.
+. Paste the above code into your SQL interface.
++
+. Verify that the data is being generated using SQL queries. Because you're looking at streaming data, you'll need to use CTRL-C to stop each query.

 ```sql
 SELECT * from trades;
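As background to the hunk above: `generate_stream` can also be queried directly, without first creating a view. A minimal sketch, assuming only that `generate_stream(rate)` exposes a single `BIGINT` column named `v`; the aliases here are illustrative, not the tutorial's:

```sql
-- Emit 10 synthetic events per second; stop with CTRL-C as usual.
-- generate_stream(10) yields one BIGINT column, v.
SELECT v AS id,
       ROUND(RAND() * 100, 2) AS price
FROM TABLE(generate_stream(10));
```

This is a quick way to confirm the function works on your cluster before wiring it into a view or a Kafka sink.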
@@ -109,7 +112,6 @@ FROM trades;
 ```


-
 == Step 2. Inserting into Kafka

 ////
@@ -118,7 +120,7 @@ Continue the design approach you chose in the previous part and continue it thro

 You can send generated data to Kafka. Kafka will store and replay it as it would data from any other streaming source. Instead of creating a view local to Hazelcast, you'll create a mapping within SQL for the Kafka topic, then use the `INSERT` function to send generated data to that topic.

-. First, create a mapping for the data. Include all the fields that you'll generate with SQL. This creates a topic in Kafka as well as m
+. First, create a mapping for the data. Include all the fields that you'll generate with SQL. This statement creates the `trades` topic in Kafka.
 +
 ```sql
 CREATE or REPLACE MAPPING trades (
@@ -133,7 +135,8 @@ OPTIONS (
 'bootstrap.servers' = 'broker:9092'
 );
 ```
-. Next, use the `INSERT` function to send the data to the `trades` topic you just created.
+. Next, use the `INSERT` function to send the data to the `trades` topic you just created. The code to generate the data is exactly the same; the only difference is that we're sending it to Kafka instead of creating a local view.
++
 ```sql
 INSERT INTO trades
 SELECT id,
@@ -167,7 +170,15 @@ FROM
 ROUND(RAND()*100, 0) as amount
 FROM TABLE(generate_stream(100))); <5>
 ```
-The code to generate the data is exactly the same; the only difference is that we're sending it to Kafka instead of creating a local view.
+
+. You can now query this data with SQL, as above.
++
+You can also access this streaming data with the Pipeline API using the following call. (For details on setting up the Kafka source's properties, see the link:https://docs.hazelcast.com/hazelcast/5.3/integrate/kafka-connector[Apache Kafka Connector] section of the documentation.)
++
+```java
+Pipeline p = Pipeline.create();
+p.readFrom(KafkaSources.kafka(properties, "trades"))
+```
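The two added Java lines end mid-pipeline. A fuller sketch of what a complete job could look like, assuming the Hazelcast 5.x Pipeline API, with a logger sink standing in for real processing and illustrative Kafka client settings in `properties` (this needs a running cluster to submit to, so it is not runnable in isolation):

```java
import java.util.Properties;

import com.hazelcast.jet.kafka.KafkaSources;
import com.hazelcast.jet.pipeline.Pipeline;
import com.hazelcast.jet.pipeline.Sinks;

public class ReadTrades {
    public static void main(String[] args) {
        // Illustrative Kafka client settings; adjust the broker address
        // and deserializers to match your topic's key/value types.
        Properties properties = new Properties();
        properties.setProperty("bootstrap.servers", "broker:9092");
        properties.setProperty("key.deserializer",
                "org.apache.kafka.common.serialization.LongDeserializer");
        properties.setProperty("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        Pipeline p = Pipeline.create();
        p.readFrom(KafkaSources.kafka(properties, "trades"))
         .withoutTimestamps()      // skip event-time handling in this sketch
         .writeTo(Sinks.logger()); // log each record; swap in your own sink
        // Submit the job to a running member, for example:
        // hazelcastInstance.getJet().newJob(p);
    }
}
```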

 == Summary
