Snowflake 2022 Expert
Explore Snowflake, a global cloud platform for data storage, processing, and analytics, with built-in data collaboration.
GETTING STARTED
- Getting Started with Snowflake: Using the Snowflake Data Platform
- Data Loading in Snowflake: Fundamentals of Stages
- Advanced Analytics: Performing Analytics Using Snowflake
- Data Transformation Using the Snowpark API
- Snowflake Performance: Scaling and Autoscaling Warehouses
COURSES INCLUDED
Getting Started with Snowflake: Using the Snowflake Data Platform
Snowflake is a cloud-native, managed data platform for big data storage and processing that does not require any hardware or software installation, maintenance, or upgrades and can be connected to in a variety of ways. In this course, learn how to set up a Snowflake free trial account, log into it, and navigate the classic Snowflake UI. Next, practice creating and monitoring virtual warehouses and executing DDL operations. Finally, discover how to create databases and tables, load data into tables, instantiate file format objects, and query tables. Upon completion, you'll be able to list features and use cases of Snowflake, differentiate Snowflake editions, and utilize virtual warehouses and databases to separate data and compute.
16 videos | 1h 58m
Assessment
Badge
Getting Started with Snowflake: Queries, Dashboards, & Tables
Scanning an entire table to return output for a single query is computationally intensive, so many technologies rely on partitioning tables to improve query performance for big data. Snowflake implements a form of partitioning known as micro-partitioning where all tables are automatically divided into micro-partitions that users don't need to manage. In this course, explore and execute queries using Snowflake's web interface, Snowsight. Next, learn how to create dashboards in Snowsight and add various charts. Finally, discover how permanent, temporary, and transient tables in Snowflake differ and how such tables can be created, used, and cloned. Upon completion, you'll be able to view database objects in Snowsight, run queries and build dashboards, and differentiate the types of Snowflake tables.
13 videos | 1h 25m
Assessment
Badge
Getting Started with Snowflake: Using Time Travel & the SnowSQL CLI
Data in Snowflake goes through a three-phase lifecycle: current data storage, where the data is stored in regular database tables; Time Travel, where you can access data from a specific time; and Fail-safe, which preserves historical data for a configurable retention period. In this course, learn how to use Time Travel to access historical data using query ID filters, relative time offsets, and absolute timestamp values. Next, practice cloning historical copies of tables into current tables and restoring dropped tables with the UNDROP command. Finally, discover how to install SnowSQL, define and use variables, and spool query results to files. Upon completion, you'll be able to utilize Time Travel and Fail-safe in Snowflake, configure SnowSQL, define and use variables, and utilize SnowSQL features.
13 videos | 1h 25m
Assessment
Badge
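The Time Travel access patterns this course describes (querying a table as of an earlier timestamp) can be sketched locally. The class below is a toy illustration only, not Snowflake's implementation or syntax; in Snowflake itself you would use `SELECT ... AT(TIMESTAMP => ...)` or `UNDROP TABLE`.

```python
class VersionedTable:
    """Toy sketch of Time-Travel-style versioned storage: every write
    appends a timestamped snapshot, and reads can ask for the state
    'as of' an earlier time. Illustrative only, not Snowflake internals."""

    def __init__(self):
        self._versions = []  # list of (timestamp, rows) snapshots

    def write(self, rows, ts):
        self._versions.append((ts, list(rows)))

    def read(self, at_ts=None):
        """Return the latest snapshot, or the snapshot current at `at_ts`."""
        live = [v for v in self._versions if at_ts is None or v[0] <= at_ts]
        return live[-1][1] if live else []

t = VersionedTable()
t.write([("alice", 10)], ts=100)
t.write([("alice", 10), ("bob", 20)], ts=200)
print(t.read())           # current state: two rows
print(t.read(at_ts=150))  # historical state: one row
```

The real feature additionally bounds how far back you can travel via a configurable retention period, after which data moves to Fail-safe.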
COURSES INCLUDED
Data Loading in Snowflake: Fundamentals of Stages
Snowflake is a modern cloud data platform that uses an abstraction known as a stage to support the bulk loading of data for analysis, resulting in faster query processing. Begin this course by exploring the internal and external stages in Snowflake. Then, you will load data using Snowflake's classic user interface (UI). Next, examine erroneous data and discover how to query the data and download the results. You will also load data from the stage into an actual table using the SnowSQL command-line interface. Finally, you will create and use table stages, as well as named stages. When you have completed this course, you will be able to differentiate between internal and external stages, list types and features of internal stages, and load data using the classic Snowflake UI.
17 videos | 2h 9m
Assessment
Badge
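The stage-then-load pattern the course teaches (PUT a file to a stage, then COPY INTO a table) can be mimicked with the standard library. The snippet below is a local stand-in using an in-memory "stage" and sqlite3 in place of Snowflake; the stage path, table, and columns are invented for illustration.

```python
import csv
import io
import sqlite3

# Toy sketch of the stage-then-load pattern: a dict plays the role of an
# internal stage, and sqlite3 plays the role of the Snowflake table.
stage = {}  # stage path -> file contents

def put(stage_path, contents):
    stage[stage_path] = contents  # analogous to: PUT file://... @my_stage

def copy_into(conn, table, stage_path):
    # analogous to: COPY INTO table FROM @stage/path
    rows = list(csv.reader(io.StringIO(stage[stage_path])))
    conn.executemany(f"INSERT INTO {table} VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id TEXT, amount TEXT)")
put("@my_stage/orders.csv", "1,9.99\n2,24.50\n")
copy_into(conn, "orders", "@my_stage/orders.csv")
print(conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0])
```

The two-step design matters in practice: staged files can be validated, queried, or re-loaded without re-uploading.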
Data Loading in Snowflake: Using External Stages
Snowflake can load data from the Google Cloud Platform (GCP), Microsoft Azure, and Amazon Simple Storage Service (S3) using external stages. Each cloud platform has one or more techniques for integrating with Snowflake. One method of integration that is supported across all three of the major cloud platforms is storage integration objects. Begin this course by loading data from Google Cloud Storage buckets. Create a Snowflake storage integration object and a GCP cloud storage service account for your Snowflake account. Then, load data from Azure Blob Storage using both the storage integration method and the SAS token method. Finally, integrate with Amazon S3 via storage integrations and using an access key and access key ID credentials. When you have completed this course, you will be able to easily integrate Snowflake with Google Cloud Storage, Azure Blob Storage, and Amazon S3.
15 videos | 1h 33m
Assessment
Badge
Data Loading in Snowflake: Unloading Data
Snowflake supports the bulk unloading, or export, of batch data using both internal and external stages. Begin this course by unloading data to all three types of internal stages - user, table, and named. Then, unload data to external stages on the Google Cloud Platform, Microsoft Azure, and Amazon Simple Storage Service (S3). Finally, query data directly from staged files, focusing on the syntax and restrictions of querying data in this manner. When you have completed this course, you will be able to unload data from Snowflake to all three types of internal storage, as well as externally to Google Cloud Storage buckets, Amazon S3 buckets, and Azure Blob Storage.
12 videos | 1h 13m
Assessment
Badge
Managing Snowflake: Administering a Snowflake Account
Snowflake is a powerful enterprise Software as a Service (SaaS) offering, and like all SaaS technologies, it must be administered correctly, both from the perspective of security and access control and from that of monitoring and attributing resource usage. Otherwise, costs and security vulnerabilities can quickly pile up as users, roles, and accounts multiply. In this course, discover how to manage the accounts and users within an organization. Then, explore how Snowflake allows users to configure various properties on objects using parameters. Next, use resource monitors to impose limits on the number of credits consumed by Snowflake and examine what happens if that limit is breached. Learn how to configure single sign-on using Okta and implement federated authentication. Finally, perform data masking and investigate how Snowflake allows you to redact and censor data at the column level using masking policies.
14 videos | 1h 27m
Assessment
Badge
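The column-level masking the course ends with can be sketched as a small function: given a value and the querying role, return either the clear value or a redacted one. This mirrors the idea behind Snowflake's masking policies, not the `CREATE MASKING POLICY` syntax, and the role names are illustrative.

```python
# Toy sketch of a column masking policy: a per-role decision about whether
# a column value is returned in the clear or redacted. Role names here are
# invented for illustration.
def email_mask(value, role):
    if role in {"SECURITY_ADMIN", "SUPPORT"}:
        return value                      # privileged roles see raw data
    local, _, domain = value.partition("@")
    return "****@" + domain               # everyone else sees a redaction

row = {"user": "alice", "email": "alice@example.com"}
print(email_mask(row["email"], role="SUPPORT"))   # clear
print(email_mask(row["email"], role="ANALYST"))   # redacted
```

In Snowflake the policy is attached to the column once and applied transparently to every query, which is what makes it safer than redacting in application code.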
COURSES INCLUDED
Advanced Analytics: Performing Analytics Using Snowflake
Data analytics is the systematic, computational analysis of data or statistics used to discover and communicate meaningful patterns. In business, analytics can be used to extract insights for a business strategy or identify new business opportunities. Snowflake is a managed data platform for big data storage, processing, and analytics that allows for common SQL operations and additional operations. In this course, explore various types of Snowflake join operations and data sampling. Next, learn how to use common table expressions (CTEs) and construct queries. Finally, work with functions related to partitioning, windowing, and ranks. Upon completion, you'll be able to use joins, perform row-based and block-based sampling, construct CTEs, and perform windowing and partitioning operations in Snowflake.
16 videos | 1h 49m
Assessment
Badge
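The common table expressions and rank/window functions this course covers are standard SQL, so the same query shapes can be tried locally with the stdlib sqlite3 module (SQLite 3.25 or later). The table and data below are made up for illustration; only the query shape carries over to Snowflake.

```python
import sqlite3

# Local demo of a CTE plus a windowed RANK, the query shapes discussed in
# the course. Requires SQLite >= 3.25 for window function support.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("east", 100), ("east", 300), ("west", 200)])

query = """
WITH regional AS (                         -- common table expression
    SELECT region, amount FROM sales
)
SELECT region, amount,
       RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
FROM regional
ORDER BY region, rnk
"""
for row in conn.execute(query):
    print(row)
```

Snowflake adds operations beyond this standard core, such as row-based and block-based SAMPLE clauses, which have no sqlite equivalent.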
Queries in Snowflake: Getting Started with Performance Optimizations
Partitioning data based on a column is a common technique for performance query optimization in many database technologies. Snowflake is a highly-performant, big-data technology that uses its own form of data partitioning called micro-partitioning that stores data in columnar format. In this course, discover the advantages of Snowflake micro-partitions over regular, static partitioning. Next, examine how caching works in Snowflake. Finally, learn how to perform the clustering of Snowflake tables and how clustering helps the performance of filters, point-lookups, and range queries on the clustering key columns. Upon completion, you'll be able to leverage micro-partitioning in Snowflake, differentiate between retrieval optimization caching and local disk caching, and implement clustering with the correct clustering key.
16 videos | 1h 51m
Assessment
Badge
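Why micro-partitions speed up point lookups can be shown in a few lines: each partition keeps min/max metadata per column, so a filter can skip partitions whose range cannot contain the value. The layout below is a toy sketch of that pruning idea, not Snowflake's columnar storage format.

```python
# Toy sketch of partition pruning: each "micro-partition" records min/max
# metadata for a column, letting a point lookup skip partitions whose range
# cannot contain the target value. Data layout invented for illustration.
partitions = [
    {"min": 1,   "max": 100, "rows": list(range(1, 101))},
    {"min": 101, "max": 200, "rows": list(range(101, 201))},
    {"min": 201, "max": 300, "rows": list(range(201, 301))},
]

def point_lookup(value):
    scanned = 0
    hits = []
    for p in partitions:
        if not (p["min"] <= value <= p["max"]):
            continue                      # pruned via metadata alone
        scanned += 1                      # only now do we touch row data
        hits += [r for r in p["rows"] if r == value]
    return hits, scanned

hits, scanned = point_lookup(150)
print(hits, "partitions scanned:", scanned)
```

Clustering on a key tightens these min/max ranges so more partitions can be pruned, which is exactly why it helps filters, point lookups, and range queries on the clustering columns.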
Queries in Snowflake: Search Optimization, External Table Partitions, & Views
Search optimization service is a Snowflake feature used to create an auxiliary data structure optimized as a search access path to improve selective point lookup query performance. While search optimization is an extremely handy feature, it can also run up substantial storage and compute costs. In this course, learn how to implement search optimizations by measuring the cost of search optimizations for your table and adding the service to it. Next, differentiate between search optimizations and clustering and discover how to implement partitioning for external tables. Finally, practice working with views in Snowflake and differentiate between non-materialized, materialized, and secure views. Upon completion, you'll be able to leverage search optimization in Snowflake, implement external table partitioning, and use and analyze views.
15 videos | 1h 32m
Assessment
Badge
Continuous Data: Ingesting Continuous Data in Snowflake
Data is generally processed using a batch or stream methodology, depending on how much time between data generation and processing is acceptable. Snowflake's Snowpipe feature processes data in micro-batches, which falls between these two scenarios. In this course, you will cover the implementation of Snowpipes when data is sourced from an internal Snowflake stage. You will kick things off by looking at data ingestion options in Snowflake from a theoretical standpoint, including the differences between bulk data loading and Snowpipes. Then, you will get hands-on to set up the infrastructure for data ingestion: an internal stage for CSV data, a destination table for a data load, and a pipe to carry out the load in micro-batches. Next, you will ingest the data into the destination table and explore how this process can be monitored by tracking the pipe status. Finally, you will implement a Snowflake task to trigger a Snowpipe at regular time intervals.
7 videos | 48m
Assessment
Badge
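Micro-batching, the middle ground between one big bulk load and per-record streaming, can be sketched as a queue of staged files flushed into a table a few at a time. This is an illustration of the idea only, not Snowpipe's internals; the batch size and file contents are invented.

```python
from collections import deque

# Toy sketch of micro-batch ingestion: staged files queue up and a flush
# loads at most `batch_size` files into the destination table per call.
queue = deque()
table = []

def stage_file(rows):
    queue.append(rows)                 # a file lands on the stage

def flush_micro_batch(batch_size=2):
    loaded = 0
    while queue and loaded < batch_size:
        table.extend(queue.popleft())  # load one staged file
        loaded += 1
    return loaded

stage_file([("a", 1)])
stage_file([("b", 2)])
stage_file([("c", 3)])
print(flush_micro_batch())             # loads two files, leaves one queued
print(len(table), len(queue))
```

A scheduled task calling the flush at a fixed interval, as in the course's final lesson, completes the pipeline.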
Continuous Data: Automating Data Ingestion from Cloud Storage into Snowflake
The Snowpipe feature allows Snowflake to ingest micro-batches of data as they become available, generally within minutes of the data being added to a stage and submitted for ingestion. In this course, you will implement the auto-ingestion of CSV files from external Snowflake stages located on the AWS and Azure cloud platforms. You will begin by setting up a continuous data ingestion pipeline where the data source is located in an Azure Storage Container. This pipeline will include several components, such as queues, enterprise applications, and storage integrations, as well as the permissions required to get these pieces to talk to one another. You will then implement something similar with an Amazon S3 bucket as the source of data. This setup will involve AWS services such as IAM roles and SNS topics, as well as Snowflake objects such as notification integrations and pipes.
11 videos | 1h 27m
Assessment
Badge
Data Sharing in Snowflake: Implementing Secure Data Sharing
In the traditional approach to data sharing, consumers receive a copy of the provider data, which can allow for the creation of multiple copies of data. Secure Data Sharing in Snowflake enables the secure sharing of specific database objects without data duplication between accounts. In this course, explore the motivations behind Secure Data Sharing and its implementation, as well as how to create and share a Snowflake share. Next, learn how to use Snowflake Marketplace data, the specific permissions required to create and manage shares, and how to set permissions using queries. Finally, practice implementing cross-region data replication to enable Secure Data Sharing across cloud and geographical boundaries. Upon completion, you'll be able to securely share Snowflake objects with other Snowflake users.
14 videos | 1h 48m
Assessment
Badge
COURSES INCLUDED
Data Transformation Using the Snowpark API
The Snowpark API is a framework for writing code in Python, Java, or Scala to work with Snowflake. The Snowpark libraries make it very easy to programmatically implement complex data transformations on Snowflake data using DataFrames. In this course, learn how to use Snowpark with Snowflake, build and execute Snowpark handlers, create and query Snowflake tables, perform data transformations, and use external libraries in Snowpark handlers. Next, discover how to connect to Snowflake from a Jupyter Notebook, create and query tables with Snowpark APIs, handle DataFrames in Snowpark, and implement the commands on DataFrame objects. Finally, explore how to perform DataFrame joins and set operations, leverage views in Snowpark with Snowpark APIs, work with semi-structured Snowpark data, and gain insights by creating and querying tables with semi-structured JSON data. Upon course completion, you will be able to use the Snowpark API with Snowflake.
12 videos | 1h 49m
Assessment
Badge
Snowpark pandas and User-defined Functions
DataFrames are the core Snowpark table abstraction. Snowpark supports the Snowpark pandas API and its DataFrames, as well as the rich functionality related to user-defined functions (UDFs). In this course, learn how to work with the Snowpark pandas API, create Snowflake Notebooks, use Snowpark pandas via the Modin plugin, and convert between Snowpark pandas and Snowpark DataFrame objects. Next, explore Snowflake UDFs, UDAFs, UDTFs, and stored procedures. Finally, discover how to register and invoke permanent and anonymous UDFs in Snowflake and register UDFs from SQL and Python files. After completing this course, you will be able to use Snowpark pandas DataFrames and register and invoke user-defined functions (UDFs).
9 videos | 1h 20m
Assessment
Badge
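The register-then-invoke-by-name pattern behind UDFs can be shown without a Snowflake account. Snowpark's real registration API runs against a live session, so the local registry below is only a stand-in for the pattern; the UDF name and function body are invented.

```python
# Toy sketch of the UDF lifecycle: register a function under a name, then
# invoke it by that name, as queries do with registered Snowflake UDFs.
# A stand-in only: real Snowpark registration needs a live session.
_udfs = {}

def register_udf(name, func):
    _udfs[name] = func         # analogous to registering a permanent UDF

def call_udf(name, *args):
    return _udfs[name](*args)  # analogous to SELECT my_udf(col) in SQL

register_udf("double_it", lambda x: x * 2)
print(call_udf("double_it", 21))
```

The same separation of definition from invocation is what lets Snowflake UDFs be registered once from a Python or SQL file and then reused across queries.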
Snowpark UDTFs, UDAFs, and Stored Procedures
Snowpark offers powerful tools for developers to write custom code in the form of UDFs, UDTFs, UDAFs, and stored procedures, each of which is implemented using extremely powerful handlers. In this course, learn about Snowflake UDTFs and partitioning, register and invoke UDTFs, construct a UDTF to normalize denormalized JSON data, and implement stateful processing using the end_partition and init functions. Next, discover how to partition rows to sort within a partition using UDTFs, explore Snowflake UDAFs and UDAF handler class methods, perform aggregation operations, and implement UDAFs that use Python objects and user-defined classes. Finally, examine Snowflake stored procedures and differentiate them from UDFs, UDTFs, and UDAFs, as well as register and invoke stored procedures and write Python functions using the Snowpark APIs. Upon completion of this course, you will be able to outline and use Snowpark UDTFs, UDAFs, and stored procedures.
13 videos | 1h 51m
Assessment
Badge
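A UDAF handler is essentially a class with methods that accumulate rows, merge partial aggregates, and produce a final result. The class below is a local stand-in mirroring that accumulate/merge/finish shape so it can run here without Snowflake; it is not the exact handler contract.

```python
# Local stand-in mirroring the shape of a Python UDAF handler class:
# accumulate per row, merge partial aggregates, finish to a final value.
class SumHandler:
    def __init__(self):
        self.total = 0

    def accumulate(self, value):   # called once per input row
        self.total += value

    def merge(self, other):        # combines partial aggregates (parallelism)
        self.total += other.total

    def finish(self):              # produces the final aggregate result
        return self.total

left, right = SumHandler(), SumHandler()
for v in (1, 2, 3):
    left.accumulate(v)
right.accumulate(10)
left.merge(right)                  # two partial sums combined
print(left.finish())
```

The merge step is what lets the engine run partial aggregates in parallel across partitions before combining them, the same reason UDTF handlers get per-partition hooks like end_partition.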
Snowpark ML APIs and the Model Registry
Snowflake has several powerful AI/ML features. These are available under two broad categories: Snowflake Cortex for LLM-related activities and Snowflake ML for more traditional ML model-building. In this course, explore how Snowflake integrates AI/ML capabilities across its platform, how Snowpark ML APIs support model training with popular libraries, and how to perform hyperparameter tuning to optimize model performance. Next, learn how to configure Python and Jupyter for Snowflake ML and set up a virtual environment to run a Jupyter Notebook that leverages Snowflake ML APIs. Finally, discover how to connect to Snowflake using the Snowpark API, work with the Snowflake Model Registry, and manage models. Upon course completion, you will be able to outline Snowpark ML APIs and the Snowflake Model Registry.
10 videos | 1h 20m
Assessment
Badge
Snowflake Feature Store and Datasets
Snowflake ML has now introduced Snowflake Feature Store, which can improve collaboration, break down data silos, and facilitate feature reuse. Datasets are another great new feature, offering data versioning to drive model result reproducibility. In this course, learn how to use Snowflake Datasets for various data operations, including materializing DataFrames into datasets and managing versions, building a Snowflake ML pipeline for logistic regression, and creating and applying model tags. Next, discover how to utilize Snowpark-optimized warehouses for hyperparameter tuning, register tuned models with the Snowflake Model Registry, and create feature stores and entities using Snowpark APIs. Finally, explore how to build managed feature views and the workflow of feature stores. After course completion, you will be able to use Snowflake Feature Store and datasets.
14 videos | 1h 58m
Assessment
Badge
Using Streamlit with Snowflake
Streamlit is an open-source library used to build interactive, visual-heavy web applications that work with a variety of DataFrames, including Snowpark DataFrames. For that reason, it is a natural fit with Snowflake. In this course, learn how to use the Streamlit library to create interactive web applications within Snowflake, build basic Streamlit apps directly in Snowsight, and enhance your Streamlit apps by adding visualizations using seaborn and Matplotlib. Next, discover how to implement various UI controls, build a UI where the user selects their ideal model type, and access the model registry within a Streamlit app. Finally, explore how to share your completed Streamlit app with other users in view-only mode. Upon course completion, you will be able to use Streamlit with Snowflake.
8 videos | 1h 3m
Assessment
Badge
Anomaly Detection with Snowflake ML Functions
Snowflake ML functions offer powerful SQL functionality for several common use cases, including anomaly detection, time series forecasting, and classification. In this course, learn about the types of models available in Snowflake ML functions, when to use functions for different machine learning Snowflake tasks, and the required data formats for input into anomaly detection and forecasting models. Next, examine how to use Snowflake ML functions to implement anomaly detection, interpret the output of the anomaly detection model, and tune model sensitivity and save model results. Finally, discover how to add exogenous variables for anomaly detection model enhancement and extend a model to work with multi-series data. After completing this course, you will be able to implement anomaly detection with Snowflake ML functions.
12 videos | 1h 36m
Assessment
Badge
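The core idea of anomaly detection on a series, flagging points far from the expected level, can be shown with a minimal z-score sketch. Snowflake's ML functions build far more elaborate models; this only illustrates the flagging concept, and the 2-standard-deviation threshold is a made-up choice.

```python
import statistics

# Minimal z-score sketch of anomaly detection: flag points more than two
# sample standard deviations from the series mean. Illustrative only;
# Snowflake's anomaly detection models are much more sophisticated.
series = [10, 11, 9, 10, 12, 10, 55, 11]
mean = statistics.mean(series)
stdev = statistics.stdev(series)

anomalies = [x for x in series if abs(x - mean) / stdev > 2]
print(anomalies)
```

Tuning model sensitivity, as the course covers, corresponds to moving this threshold: a lower cutoff flags more points, a higher one fewer.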
Snowflake Forecasting Models and the AI & ML Studio
Snowflake ML functions support forecasting models that share many characteristics with anomaly detection models. In addition, the new AI & ML Studio provides a point-and-click interface for building models for forecasting, classification, and anomaly detection. In this course, learn how to create and use Snowflake forecasting models, build a time series forecasting model with ML functions, and enhance your time series forecasting models. Next, examine how to handle multiple time series concurrently, use Snowflake AI & ML Studio for forecasting, and generate SQL code to build and execute a time series forecasting model. Finally, discover how to use Snowflake AI & ML Studio for classification tasks, generate and execute SQL code to invoke ML classification functions, and evaluate classification model output.
12 videos | 1h 28m
Assessment
Badge
Snowflake Cortex for LLMs, RAG, and Search
Snowflake Cortex is a suite of AI-related features that facilitate working with large language models (LLMs). These include Snowflake Copilot, AI-powered Universal Search, and Cortex Search for Retrieval Augmented Generation (RAG). In this course, learn how to use Snowflake Cortex to work with LLMs, adjust hyperparameters to control LLM output, and use the COMPLETE Cortex LLM function. Next, discover how to use Cortex LLM functions directly from SQL and Python and utilize advanced features like Snowflake Copilot, Universal Search, and Document AI for user experience enhancement through AI-driven search and document processing. Finally, explore how to customize LLMs through Cortex Fine-Tuning and implement Retrieval Augmented Generation using Cortex Search. Upon course completion, you will be able to work with LLMs in Snowflake Cortex.
12 videos | 1h 36m
Assessment
Badge
COURSES INCLUDED
Snowflake Performance: Scaling and Autoscaling Warehouses
By understanding Snowflake's warehouse configurations and scaling, you can optimize performance and ensure efficient resource utilization in Snowflake. In this course, you will discover the features and architecture of Snowflake, gaining a foundational understanding of how it operates as a cloud data platform. Then you will explore the different editions of Snowflake and their billing structures, helping you make informed decisions about which version best fits your organizational needs. Next, you will learn how to choose between resizing warehouses and multi-cluster warehouses within Snowflake. You will focus on creating warehouses and selecting appropriate configurations to optimize performance and cost. You will also find out how to scale up warehouses to handle increasing workloads, ensuring efficient data processing. You will investigate advanced topics such as multi-cluster warehouses and their modes of operation and examine how to dynamically adjust resources based on workload demands, leading to cost efficiency and improved performance. Finally, you will use resource monitors to effectively track and manage warehouse usage, ensuring optimal utilization without unnecessary costs.
11 videos | 1h 39m
Assessment
Badge
Snowflake Performance: Query Acceleration and Caching
Optimizing data queries in Snowflake involves partitioning and clustering large datasets to ensure quicker access and improved performance. Mastering query acceleration and caching techniques is essential for faster response times and accurate, up-to-date query results. In this course, you will learn about the Snowflake data model, focusing on how data is structured and managed within the platform. Then you will explore the importance of partitions and clustering, essential techniques for optimizing data queries by dividing large datasets into smaller, manageable pieces for quicker access and improved performance. Next, you will investigate techniques to enhance query performance, such as query acceleration and caching. You will enable query acceleration for warehouses for faster response times in complex queries and assess which queries benefit most from these enhancements. Additionally, you will use SnowSQL for efficient data loading. Finally, you will discover the intricacies of query result caching, see how your query structure affects caching, and manage caching effectively, including turning it off to ensure accurate and up-to-date query results.
10 videos | 1h 23m
Assessment
Badge
Snowflake Performance: Clustering and Search Optimization
Clustering and search optimization in Snowflake are crucial for enhancing query performance, reducing data retrieval times, and effectively managing large datasets. These techniques streamline data access, ensuring scalable and efficient data handling. In this course, you will explore how clustering helps improve the performance of point lookup and range queries. You will investigate the importance of choosing the appropriate clustering key and examine various approaches to implementing clustering, focusing on performance and scalability. Next, you will discover different methods for evaluating your clustering strategies and see how clustering can make your data retrieval queries more performant. You will also be introduced to search optimization in Snowflake to improve point lookup queries by building an auxiliary data structure to help quickly access data. Then you will compare search optimization and clustering to understand their effective use cases and gain insights on refining searches with complex predicates using AND and OR clauses and optimizing searches on specific columns. Finally, you will work with the VARIANT, OBJECT, and ARRAY data types for versatile data management and improve queries with semi-structured data.
15 videos | 2h 16m
Assessment
Badge
Snowflake Performance: Iceberg Tables, External Tables, and Views
Understanding the concept of views in Snowflake is vital for creating and querying various types of views. Views allow us to use role-based access control to manage permissions and data security effectively and are essential for organizing and simplifying data and ensuring that only authorized users can access sensitive information. In this course, you will discover how to create and query standard, materialized, and secure views in Snowflake. Then you will configure role-based access control to allow users to access specific views and you will use materialized views to improve the performance of your queries and secure views to control access to the details of the underlying table. Next, you will learn how to query data stored in external cloud locations using Snowflake, integrate Snowflake with Google Cloud Storage buckets, and create an external table to access and query data stored on the Google Cloud using Snowflake. Finally, you will create and configure Iceberg tables in Snowflake for a modern, high-performance format that aids in managing large-scale datasets, providing features like schema evolution, partitioning, and time travel for enhanced data management.
12 videos | 1h 50m
Assessment
Badge
Continuous Data Pipelines and Dynamic Tables in Snowflake
Snowflake offers powerful support for the construction of complex data pipelines. This support includes constructs such as dynamic tables, streams, and tasks. In this course, learn about continuous data pipelines in Snowflake, including Snowflake's support for continuous data loading and transformation, change data capture, and recurring operations, and configure dynamic tables to manage and automate these processes. Next, discover how to create and configure dynamic tables, set properties to control their behavior, and verify the change tracking property of base tables. Finally, explore how to connect and manage dependencies between dynamic tables, create dynamic tables that depend on other dynamic tables, ensure their refresh modes are compatible, and configure dynamic tables to refresh on demand. After completing this course, you will be able to describe continuous data pipelines and use and configure dynamic tables in Snowflake.
8 videos | 1h 3m
Assessment
Badge
Streams and Change Data Capture in Snowflake
Streams are Snowflake's construct for change data capture (CDC) and process only changes in an underlying table or view. Used with dynamic tables and tasks, streams are an important and powerful building block of pipelines in Snowflake. In this course, learn about the usage and internal workings of streams for change data capture (CDC), stream types, and standard stream contents during insert, update, and delete operations. Next, discover how to create and read standard streams, combine stream contents with the target table for inserts and updates, and the effects of insert, update, and delete operations on standard stream contents. Finally, explore append-only streams, the relationship between streams and transactions, repeatable read isolations in streams, stream behavior within transactions, and how to implement streams on views. Upon course completion, you will be able to outline streams and change data capture in Snowflake.
11 videos | 1h 27m
Assessment
Badge
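A stream can be thought of as the set of row changes between two states of a table, with updates represented as a delete plus an insert. The function below is a toy sketch of that idea in the spirit of Snowflake's METADATA$ACTION column; real streams track a stored offset rather than diffing full snapshots, and the table contents are invented.

```python
# Toy sketch of change data capture: compute a "stream" as the diff between
# two snapshots of a keyed table, tagging each change the way Snowflake's
# METADATA$ACTION column does. Updates surface as a DELETE plus an INSERT.
def stream_contents(before, after):
    changes = []
    for key, row in after.items():
        if key not in before:
            changes.append(("INSERT", key, row))
        elif before[key] != row:
            changes.append(("DELETE", key, before[key]))  # update = delete...
            changes.append(("INSERT", key, row))          # ...plus insert
    for key, row in before.items():
        if key not in after:
            changes.append(("DELETE", key, row))
    return changes

before = {1: "alice", 2: "bob"}
after = {1: "alice", 2: "bobby", 3: "carol"}
print(stream_contents(before, after))
```

Note that the unchanged row for key 1 produces nothing, which is the point: downstream consumers process only the delta, not the whole table.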
Using Tasks and Architecting Snowflake Data Pipelines
Tasks are Snowflake constructs that can execute code on a fixed schedule or when a stream has data to consume. Tasks are similar to cron jobs but are more complex because they can be chained into complex dependency networks called task graphs. In this course, learn about continuous data processing tasks, create and execute scheduled serverless and user-managed scheduled tasks, and implement task graphs and child tasks. Next, use dummy root nodes to bypass Snowflake's root node restrictions, create and use triggered tasks, and construct an architecture that utilizes streams, tasks, stages, and dynamic tables to feed into a dynamic dashboard. Finally, discover how to implement data pipelines with a stage, scheduled task, and table, add dynamic pipelines and triggered tasks to data pipelines, and create dashboards in Snowflake to consume data from different tables. Upon completion of this course, you will be able to use tasks and architect Snowflake data pipelines.
14 videos | 1h 49m
Assessment
Badge
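A task graph is a dependency network executed in topological order: each task runs only after its predecessors. The sketch below uses the stdlib graphlib module (Python 3.9+) to show that ordering; the task names and "bodies" are invented for illustration and nothing here touches Snowflake.

```python
from graphlib import TopologicalSorter

# Toy sketch of a task graph: each task names its predecessors, and
# execution proceeds in dependency order, as in a Snowflake task graph
# rooted at a scheduled task. Task names are illustrative.
ran = []
tasks = {
    "load_raw": [],                        # root: runs on a schedule
    "transform": ["load_raw"],             # child tasks run after parents
    "refresh_dashboard": ["transform"],
    "audit_log": ["load_raw"],
}

for name in TopologicalSorter(tasks).static_order():
    ran.append(name)                       # stand-in for running the task body

print(ran)
```

The single-root restriction the course works around with a dummy root corresponds here to requiring exactly one node with no predecessors.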
Introduction to Snowflake
Why should you learn Snowflake? Over the last few years, Snowflake has become increasingly popular with data-led organizations and data enthusiasts. Snowflake is a data warehousing and data analysis platform built for the cloud from scratch to take full advantage of cloud features. It provides several novel features that change how you work with data platforms. Snowflake does not have a steep learning curve because of its simplicity and its use of SQL as the primary language. Once you have grasped the foundations of Snowflake architecture, the rest of the learning naturally flows. Snowflake has a range of certifications available. SnowPro Core certification is at the foundation level, while other role-specific advanced certifications are also available for focused areas such as data engineering, administration, or data science on the Snowflake platform. The SnowPro Core certification is a prerequisite for the advanced certifications; therefore, it's a good way to kick-start your Snowflake journey. In this course, you'll learn about the history of Snowflake and an overview of the Snowflake journey. You'll also learn about the various Snowflake certifications with a focus on the SnowPro Core certification.
5 videos | 21m
Assessment
Badge
Snowflake Architecture
Snowflake employs a hybrid architecture to solve the massively parallel processing problem. This architecture uses a shared-disk approach for storage but uses multiple independent compute clusters to process the data. In this course, you'll learn about traditional database architectures, Snowflake hybrid architecture, and the three layers of Snowflake architecture. You'll explore micro-partitioning and how Snowflake stores data. Finally, you'll also learn about the purpose of virtual warehouses and configuring a virtual warehouse.
8 videos | 27m
Assessment
Badge
Snowflake: Data Loading and Stages
Data loading and processing is a crucial activity for a data analytics system. Snowflake provides a variety of methods for loading data, including bulk data loading and processing data in a continuous manner. In this course, you'll learn about the concepts of data loading and stages in Snowflake. You'll explore data loading using the Table, User, and Named internal stages; data ingestion using the named external stage; and data loading using the Snowflake web UI. Finally, you'll also learn about basic transformations of data while ingesting the data.
8 videos |
29m
Assessment
Badge
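The stage-based loading flow covered in this course can be sketched in a few statements. This is a hedged example, assuming a local CSV file and illustrative object names (`my_csv_stage`, `sales`):

```sql
-- Create a named internal stage with a CSV file format
CREATE STAGE IF NOT EXISTS my_csv_stage
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

-- Upload a local file to the stage (PUT runs from SnowSQL, not web worksheets)
PUT file:///tmp/sales.csv @my_csv_stage;

-- Bulk-copy the staged file into a table, with a basic inline transformation
COPY INTO sales (region, amount)
  FROM (SELECT t.$1, t.$2::NUMBER(10,2) FROM @my_csv_stage t);
```

The `SELECT` form of `COPY INTO` is what enables the basic transformations during ingestion that the course mentions, such as casting or reordering columns.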
EARN A DIGITAL BADGE WHEN YOU COMPLETE THESE COURSES
Skillsoft is providing you the opportunity to earn a digital badge upon successful completion of some of our courses, which can be shared on any social network or business platform.
Digital badges are yours to keep, forever.
BOOKS INCLUDED
Book
Snowflake Essentials: Getting Started with Big Data in the Cloud, 1st Edition
This book covers how Snowflake's architecture is different from prior on-premises and cloud databases.
3h 45m
By Bhaskar B. Joshi, Bjorn Lindstrom, Frank Bell, Raj Chirumamilla, Ruchi Soni, Sameer Videkar
Book
Jumpstart Snowflake: A Step-by-Step Guide to Modern Cloud Analytics
Snowflake was built specifically for the cloud and it is a true game changer for the analytics market. This book will help onboard you to Snowflake, present best practices to deploy, and use the Snowflake data warehouse.
2h 19m
By Dmitry Anoshin, Dmitry Shirokov, Donna Strok
Book
Mastering Snowflake Solutions: Supporting Analytics and Data Sharing
This book also helps you protect your most valuable data assets using built-in security features such as end-to-end encryption for data at rest and in transit.
2h 50m
By Adam Morton
Book
Snowflake Access Control: Mastering the Features for Data Privacy and Regulatory Compliance
This book shows how to use Snowflake's wide range of features that support access control, making it easier to protect data access from the data origination point all the way to the presentation and visualization layer.
3h 41m
By Jessica Megan Larson
Book
Snowflake Security: Securing Your Snowflake Data Cloud
This book is for data engineers, data privacy professionals, and security teams with either security knowledge (preferably some data security knowledge) or data engineering knowledge; in other words, either "Snowflake people" or "data people" who want to get security right, or "security people" who want to make sure that Snowflake is handled right in terms of security.
3h 2m
By Ben Herzberg, Yoav Cohen
Book
Building the Snowflake Data Cloud: Monetizing and Democratizing Your Data
This book helps you succeed by delivering faster than you can with legacy products and techniques. You will learn how to leverage what you already know, and what you don't, all applied in a Snowflake Data Cloud context.
5h 14m
By Andrew Carruthers
Book
Snowflake Architecture and SQL
In this book, readers will learn the Snowflake architecture with an excellent level of detail. Readers will also learn the clever tricks and tips that make Snowflake unique, including failsafe, time travel, and min-max pruning.
3h 3m
By Tom Coffing
Book
Maturing the Snowflake Data Cloud: A Templated Approach to Delivering and Governing Snowflake in Large Enterprises
The book provides a proven pathway to success by equipping you with the skill, knowledge, and expertise to accelerate Snowflake adoption within your organization. The patterns delivered within this book are used for production deployment and are proven in real-world use.
4h 38m
By Andrew Carruthers, Sahir Ahmed
BOOKS INCLUDED
Book
Mastering Snowflake Solutions: Supporting Analytics and Data Sharing
This book also helps you protect your most valuable data assets using built-in security features such as end-to-end encryption for data at rest and in transit.
2h 50m
By Adam Morton
Book
Snowflake Essentials: Getting Started with Big Data in the Cloud, 1st Edition
This book covers how Snowflake's architecture is different from prior on-premises and cloud databases.
3h 45m
By Bhaskar B. Joshi, Bjorn Lindstrom, Frank Bell, Raj Chirumamilla, Ruchi Soni, Sameer Videkar
Book
Jumpstart Snowflake: A Step-by-Step Guide to Modern Cloud Analytics
Snowflake was built specifically for the cloud and it is a true game changer for the analytics market. This book will help onboard you to Snowflake, present best practices to deploy, and use the Snowflake data warehouse.
2h 19m
By Dmitry Anoshin, Dmitry Shirokov, Donna Strok
Book
Mastering Snowflake Platform: Generate, fetch, and automate Snowflake data as a skilled data practitioner
This book is for data practitioners, data engineers, data architects, and every data enthusiast keen on learning Snowflake. No prior experience is needed, though a basic understanding of cloud computing, data concepts, and basic programming skills is beneficial.
5h 13m
By Pooja Kelgaonkar
BOOKS INCLUDED
Book
Snowflake Architecture and SQL
In this book, readers will learn the Snowflake architecture with an excellent level of detail. Readers will also learn the clever tricks and tips that make Snowflake unique, including failsafe, time travel, and min-max pruning.
3h 3m
By Tom Coffing
Book
Mastering Snowflake Solutions: Supporting Analytics and Data Sharing
This book also helps you protect your most valuable data assets using built-in security features such as end-to-end encryption for data at rest and in transit.
2h 50m
By Adam Morton
Book
Snowflake Access Control: Mastering the Features for Data Privacy and Regulatory Compliance
This book shows how to use Snowflake's wide range of features that support access control, making it easier to protect data access from the data origination point all the way to the presentation and visualization layer.
3h 41m
By Jessica Megan Larson
Book
Jumpstart Snowflake: A Step-by-Step Guide to Modern Cloud Analytics
Snowflake was built specifically for the cloud and it is a true game changer for the analytics market. This book will help onboard you to Snowflake, present best practices to deploy, and use the Snowflake data warehouse.
2h 19m
By Dmitry Anoshin, Dmitry Shirokov, Donna Strok
Book
Snowflake Security: Securing Your Snowflake Data Cloud
This book is for data engineers, data privacy professionals, and security teams with either security knowledge (preferably some data security knowledge) or data engineering knowledge; in other words, either "Snowflake people" or "data people" who want to get security right, or "security people" who want to make sure that Snowflake is handled right in terms of security.
3h 2m
By Ben Herzberg, Yoav Cohen
Book
Maturing the Snowflake Data Cloud: A Templated Approach to Delivering and Governing Snowflake in Large Enterprises
The book provides a proven pathway to success by equipping you with the skill, knowledge, and expertise to accelerate Snowflake adoption within your organization. The patterns delivered within this book are used for production deployment and are proven in real-world use.
4h 38m
By Andrew Carruthers, Sahir Ahmed
SKILL BENCHMARKS INCLUDED
Data Infrastructures with Snowflake Literacy (Beginner Level)
The Data Infrastructures with Snowflake Literacy (Beginner Level) benchmark measures your ability to recall, recognize, and list features and use cases of Snowflake and differentiate the Snowflake editions. You will be evaluated on your skills in utilizing virtual warehouses and databases to separate data and compute, and in identifying and viewing database objects in Snowsight. You will also be evaluated on your ability to run queries using Snowsight worksheets, build dashboards in Snowsight, and list use cases and attributes of permanent, temporary, and transient tables. A learner who scores high on this benchmark demonstrates that they have the knowledge required to start learning about and working on Snowflake with supervision.
16m
| 16 questions
SKILL BENCHMARKS INCLUDED
Data Infrastructures with Snowflake Proficiency (Advanced Level)
The Data Infrastructures with Snowflake Proficiency (Advanced Level) benchmark measures your ability to work with continuous data, perform advanced analytics in Snowflake, and work with semi-structured data in Snowflake. You will be evaluated on your skills in setting up a table and internal Snowflake stage for data ingestion, integrating Snowflake with Azure and AWS, executing various join operations, and loading and querying JSON and XML data in Snowflake. A learner who scores high on this benchmark demonstrates that they have the skills to work on advanced analytics using Snowflake without any supervision.
19m
| 19 questions
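The semi-structured data skills this benchmark evaluates rest on Snowflake's VARIANT type. As an illustrative sketch (table and column names are assumptions): JSON is stored as VARIANT, traversed with colon notation, and arrays are expanded with FLATTEN.

```sql
-- Store raw JSON in a VARIANT column
CREATE TABLE IF NOT EXISTS raw_events (payload VARIANT);

-- VARIANT values are inserted via a SELECT with PARSE_JSON
INSERT INTO raw_events
  SELECT PARSE_JSON('{"user": "ada", "tags": ["a", "b"]}');

-- Colon notation traverses the document; FLATTEN explodes arrays into rows
SELECT e.payload:user::STRING AS user_name,
       f.value::STRING AS tag
FROM raw_events e,
     LATERAL FLATTEN(input => e.payload:tags) f;
```

The `::STRING` casts matter in practice: without them, VARIANT values are returned with JSON quoting intact.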
Data Infrastructures with Snowflake Mastery (Expert Level)
The Data Infrastructures with Snowflake Mastery (Expert Level) benchmark measures your ability to work with continuous and semi-structured data, perform advanced analytics in Snowflake, and manage a Snowflake account. You will be evaluated on your ability to perform load operations for JSON data, configure parameters and security in Snowflake, and set up a secure share of Snowflake objects between two accounts. A learner who scores high on this benchmark demonstrates that they have the skills to work on advanced analytics projects using Snowflake without any supervision.
24m
| 24 questions
SKILL BENCHMARKS INCLUDED
Data Infrastructures with Snowflake Competency (Intermediate Level)
The Data Infrastructures with Snowflake Competency (Intermediate Level) benchmark measures your ability to recall and recognize how to utilize Time Travel and Fail-safe in Snowflake, install and configure SnowSQL, define and use variables, and utilize SnowSQL features. You will be evaluated on your ability to differentiate internal and external stages, load data using the classic Snowflake UI, and integrate Snowflake with Google Cloud Storage, Azure Blob Storage, and S3. You will also be evaluated on your skills in unloading data from Snowflake to different types of internal storage, querying staged files directly into Snowflake tables, and optimizing query performance. A learner who scores high on this benchmark demonstrates that they have the ability to work on Snowflake with minimal supervision.
25m
| 25 questions
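The Time Travel skills this benchmark covers boil down to a small amount of SQL. A hedged sketch, assuming a table named `orders` that is still within its data retention period:

```sql
-- Read the table as it existed five minutes ago
SELECT * FROM orders AT (OFFSET => -60 * 5);

-- Restore a table dropped within its retention period
DROP TABLE orders;
UNDROP TABLE orders;
```

After the retention period expires, data moves into Fail-safe, which is recoverable only by Snowflake support rather than through SQL.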