
Commit 6a39ef5

chetan-thote and kesmit13 authored
Modified Notebooks according to feedback (#106)
* Create load-CSV-data-S3
* Added notebooks for Load data sections of UI
* Modified with suggested changes
* Modified with suggested changes
* Remove extra header
* Modified with suggested changes and changed Kai Credentials
* Modified with suggested changes and add JSON notebook
* Update notebook.ipynb
* Update notebook.ipynb
* Modified with pre-commit checks

Co-authored-by: chetan thote <[email protected]>
Co-authored-by: Kevin D Smith <[email protected]>
1 parent 5df9689 commit 6a39ef5

File tree

5 files changed: +577 −26 lines


notebooks/load-csv-data-s3/notebook.ipynb

Lines changed: 52 additions & 11 deletions
@@ -81,7 +81,7 @@
     "id": "2d22fd53-2c18-40e5-bb38-6d8ebc06f1b8",
     "metadata": {},
     "source": [
-    "## Create a database\n",
+    "## Create a database (You can skip this Step if you are using Free Starter Tier)\n",
     "\n",
     "We need to create a database to work with in the following examples."
    ]
@@ -161,6 +161,15 @@
     "START PIPELINE SalesData_Pipeline;"
    ]
   },
+  {
+   "attachments": {},
+   "cell_type": "markdown",
+   "id": "a402a924-5e09-4213-88f6-2723b39ee2aa",
+   "metadata": {},
+   "source": [
+    "### It might take around 1 min to load data from S3 to SingleStore table"
+   ]
+  },
   {
    "cell_type": "code",
    "execution_count": 4,
@@ -169,7 +178,7 @@
    "outputs": [],
    "source": [
     "%%sql\n",
-    "SELECT * FROM SalesData LIMIT 10"
+    "SELECT count(*) FROM SalesData"
    ]
   },
   {
@@ -296,28 +305,60 @@
    "source": [
     "## Conclusion\n",
     "\n",
-    "<div class=\"alert alert-block alert-warning\">\n",
-    " <b class=\"fa fa-solid fa-exclamation-circle\"></b>\n",
-    " <div>\n",
-    " <p><b>Action Required</b></p>\n",
-    " <p> If you created a new database in your Standard or Premium Workspace, you can drop the database by running the cell below. Note: this will not drop your database for Free Starter Workspaces. To drop a Free Starter Workspace, terminate the Workspace using the UI. </p>\n",
-    " </div>\n",
-    "</div>\n",
-    "\n",
     "We have shown how to insert data from a Amazon S3 using `Pipelines` to SingleStoreDB. These techniques should enable you to\n",
     "integrate your Amazon S3 with SingleStoreDB."
    ]
   },
+  {
+   "attachments": {},
+   "cell_type": "markdown",
+   "id": "83b2d1e6-58b8-493e-a698-2fd46e2ac5a1",
+   "metadata": {},
+   "source": [
+    "## Clean up"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "f028e26e-66c0-44dc-9024-221687334301",
+   "metadata": {},
+   "source": [
+    "#### Drop Pipeline"
+   ]
+  },
   {
    "cell_type": "code",
    "execution_count": 10,
+   "id": "f1f7b94f-2018-464e-9a28-b71cb89d65e3",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "%%sql\n",
+    "STOP PIPELINE SalesData_Pipeline;\n",
+    "\n",
+    "DROP PIPELINE SalesData_Pipeline;"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "33a246bd-36a3-4027-b44d-8c46768ff96d",
+   "metadata": {},
+   "source": [
+    "#### Drop Data"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 11,
    "id": "d5053a52-5579-4fea-9594-5250f6fcc289",
    "metadata": {},
    "outputs": [],
    "source": [
     "shared_tier_check = %sql show variables like 'is_shared_tier'\n",
     "if not shared_tier_check or shared_tier_check[0][1] == 'OFF':\n",
-    "    %sql DROP DATABASE IF EXISTS SalesAnalysis;"
+    "    %sql DROP DATABASE IF EXISTS SalesAnalysis;\n",
+    "else:\n",
+    "    %sql DROP TABLE SalesData;"
    ]
   },
   {
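The clean-up cell this commit adds branches on the result of `SHOW VARIABLES LIKE 'is_shared_tier'`: on a Standard or Premium workspace it drops the whole `SalesAnalysis` database, while on a Free Starter (shared-tier) workspace only the `SalesData` table can be dropped. A minimal sketch of that branch in plain Python — the variable name and SQL statements come from the diff; modeling the `%sql` result as a list of `(name, value)` rows is an assumption:

```python
# Hypothetical sketch of the clean-up branch added to the notebook.
# The real notebook issues these statements through the %sql magic;
# here the branch is a pure function so the logic can be checked directly.

def cleanup_statement(shared_tier_check):
    """Pick the clean-up SQL based on SHOW VARIABLES LIKE 'is_shared_tier'.

    `shared_tier_check` mimics the %sql result: a list of (name, value)
    rows, or an empty/falsy result on engines without the variable.
    """
    if not shared_tier_check or shared_tier_check[0][1] == 'OFF':
        # Dedicated (Standard/Premium) workspace: drop the whole database.
        return "DROP DATABASE IF EXISTS SalesAnalysis;"
    # Free Starter (shared-tier) workspace: only the table can be dropped.
    return "DROP TABLE SalesData;"

print(cleanup_statement([]))                          # variable absent: drop database
print(cleanup_statement([('is_shared_tier', 'ON')]))  # shared tier: drop table
```

This mirrors why the commit changes the single `DROP DATABASE` line into an `if`/`else`: the shared tier does not allow dropping the workspace database from SQL.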

notebooks/load-data-json/meta.toml

Lines changed: 12 additions & 0 deletions
@@ -0,0 +1,12 @@
+[meta]
+authors=["chetan-thote"]
+title="Employee Data Analysis JSON Dataset"
+description="""\
+Employee Data Analysis use case illustrates how to leverage Singlestore's capabilities to process and analyze JSON data from a Amazon S3 data source.
+"""
+difficulty="beginner"
+tags=["starter", "loaddata", "json"]
+lesson_areas=["Ingest"]
+icon="database"
+destinations=["spaces"]
+minimum_tier="free-shared"

0 commit comments
