Continuous Data: Ingesting Continuous Data in Snowflake
Snowflake | Intermediate
- 7 videos | 48m 20s
- Includes Assessment
- Earns a Badge
Data is generally processed using a batch or stream methodology, depending on how much time between data generation and processing is acceptable. The Snowflake Snowpipe feature processes data in micro-batches, which fall between these two scenarios. In this course, you will cover the implementation of Snowpipe when data is sourced from an internal Snowflake stage. You will kick things off by looking at data ingestion options in Snowflake from a theoretical standpoint, including the differences between bulk data loading and Snowpipe. Then, you will get hands-on to set up the infrastructure for data ingestion: an internal stage for CSV data, a destination table for a data load, and a pipe to carry out the load in micro-batches. Next, you will ingest the data into the destination table and explore how this process can be monitored by tracking the pipe status. Finally, you will implement a Snowflake task to trigger a Snowpipe at regular time intervals.
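As a rough sketch of the setup described above, the following Python script uses the snowflake-connector-python package to create a CSV file format, an internal stage, a destination table, and a pipe. All object names (csv_format, orders_stage, orders, orders_pipe) and connection parameters are illustrative assumptions, not details taken from the course.

    # Sketch only: object names and credentials below are assumptions.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",      # assumption: your account identifier
        user="my_user",
        password="my_password",
        warehouse="compute_wh",
        database="demo_db",
        schema="public",
    )

    statements = [
        # File format describing the CSV files to be ingested
        """CREATE OR REPLACE FILE FORMAT csv_format
             TYPE = 'CSV' FIELD_DELIMITER = ',' SKIP_HEADER = 1""",
        # Internal named stage that will hold the CSV files
        "CREATE OR REPLACE STAGE orders_stage FILE_FORMAT = csv_format",
        # Destination table for the micro-batch loads
        """CREATE OR REPLACE TABLE orders (
             order_id INT, customer VARCHAR, amount NUMBER(10,2))""",
        # Pipe wrapping the COPY INTO statement run for each micro-batch
        """CREATE OR REPLACE PIPE orders_pipe AS
             COPY INTO orders
             FROM @orders_stage
             FILE_FORMAT = (FORMAT_NAME = 'csv_format')""",
    ]

    cur = conn.cursor()
    for stmt in statements:
        cur.execute(stmt)
    cur.close()
    conn.close()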
WHAT YOU WILL LEARN
- Discover the key concepts covered in this course
- Outline how batch, stream, and micro-batch processing works
- Recognize how the Snowpipe service works in Snowflake
- Set up the stage, file format, and data for loading continuous data
- Create a Snowpipe and view its metadata
- Load data into a table automatically using a Snowflake pipe (sketched below)
- Summarize the key concepts covered in this course
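The hands-on objectives above, loading staged data through a pipe and monitoring the load, might look roughly like the following sketch. Because an internal stage has no cloud-event notifications, the pipe is refreshed explicitly, and a scheduled task repeats that refresh; the object names reuse the illustrative assumptions from the previous sketch, and the local file path is also an assumption.

    # Sketch only: reuses the assumed names from the setup sketch above.
    import snowflake.connector

    conn = snowflake.connector.connect(account="my_account", user="my_user",
                                       password="my_password",
                                       warehouse="compute_wh",
                                       database="demo_db", schema="public")
    cur = conn.cursor()

    # Upload a local CSV file to the internal stage, then ask the pipe to
    # load any staged files it has not yet processed.
    cur.execute("PUT file:///tmp/orders.csv @orders_stage AUTO_COMPRESS = TRUE")
    cur.execute("ALTER PIPE orders_pipe REFRESH")

    # Track the pipe status while the micro-batch load runs.
    cur.execute("SELECT SYSTEM$PIPE_STATUS('orders_pipe')")
    print(cur.fetchone()[0])

    # Schedule a task that re-triggers the pipe at regular intervals.
    cur.execute("""
        CREATE OR REPLACE TASK refresh_orders_pipe
          WAREHOUSE = compute_wh
          SCHEDULE = '5 MINUTE'
        AS
          ALTER PIPE orders_pipe REFRESH
    """)
    cur.execute("ALTER TASK refresh_orders_pipe RESUME")

    cur.close()
    conn.close()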
IN THIS COURSE
- 7 videos: 2m 10s, 7m 50s, 9m 5s, 11m 3s, 6m 39s, 9m 47s, and 1m 47s
EARN A DIGITAL BADGE WHEN YOU COMPLETE THIS COURSE
Skillsoft is providing you the opportunity to earn a digital badge upon successful completion of some of our courses, which can be shared on any social network or business platform.
Digital badges are yours to keep, forever.