Final Exam: Advanced Analytics and Machine Learning in Snowflake

Intermediate
  • 1 video | 32s
  • Includes Assessment
  • Earns a Badge
Final Exam: Advanced Analytics and Machine Learning in Snowflake will test your knowledge and application of the topics presented throughout the Advanced Analytics and Machine Learning track.

WHAT YOU WILL LEARN

  • Identify Snowflake AI/ML offerings across both Snowflake Cortex and Snowflake ML and the functionality available across the platform
  • Outline support in Snowpark ML for model training using scikit-learn, XGBoost, and LightGBM, as well as for hyperparameter tuning
  • Connect to Snowflake from Jupyter and use the Snowpark API from Python
  • Utilize the Snowflake ML APIs to compute correlation matrices, construct pipelines, and fit models
  • Recognize the process and benefits of registering a model version
  • Register models and versions, view model artifacts, delete model versions, and invoke model methods dynamically
  • Provide an overview of the intuition behind clustering depth and the number of overlapping partitions
  • Create a Snowflake ML pipeline for logistic regression
  • Create tags and associate metrics and a tag with a model
  • Utilize Snowpark-optimized warehouses and grid search for hyperparameter tuning
  • Register a model with tuned hyperparameter values with the model registry
  • Identify the need for the Snowflake Feature Store and analyze how feature views and entities work
  • Create a feature store and entity using Snowpark APIs
  • Create a managed feature view and analyze the implementation as a dynamic table
  • Create an external feature view, join two feature views, and query for all feature views associated with a specific entity
  • Create a basic Streamlit app to display data from a table when a checkbox is selected
  • Add heatmaps, scatter plots, and other Seaborn and Matplotlib visualizations to a Streamlit app
  • Implement sliders, selection boxes, radio buttons, and other UI controls in Streamlit
  • Access the model registry and display a dropdown of models and model version numbers in a Streamlit app
  • Share the completed Streamlit app with a different user in view-only mode
  • Outline the steps for training an anomaly detection model and how to use it for prediction
  • Recognize how to analyze each column in the output of the anomaly detection model
  • Use Snowflake ML functions to train a single-series unsupervised anomaly detection model
  • Invoke the anomaly detection function, interpret the results, and save them to a table
  • Create an anomaly detection model for single-series data with no exogenous variables using Snowflake ML functions and a filtered query on a multi-series dataset
  • Extend the single-series anomaly detection model by adding exogenous variables and observe changes in model sensitivity and feature importance scores
  • Save model results to a table using the SQLID and the RESULT_SCAN function, then calibrate model sensitivity using prediction_interval
  • Extend the anomaly detection model to work with multi-series data and verify that model feature scores are now reported for each series
  • Invoke EXPLAIN_FEATURE_IMPORTANCE and SHOW_EVALUATION_METRICS on a time series forecasting model and analyze the results
  • Extend time series forecasting models by providing exogenous explanatory variables
  • Utilize Snowpark ML functions for multi-time-series forecasting
  • Generate SQL code that builds and invokes a time series forecasting model using the AI & ML Studio wizard for forecasting
  • Analyze and execute the SQL code generated by the AI & ML Studio from the forecasting workflow
  • Generate SQL code that invokes ML classification functions using the AI & ML Studio wizard for classification
  • Analyze and execute the SQL code generated by the AI & ML Studio from the classification workflow
  • Analyze evaluation metrics, global evaluation metrics, and feature importance scores from the output of models created by Snowflake AI & ML Studio
  • Recognize how to use the temperature and top_p hyperparameters to determine the predictability of LLM output
  • Use the COMPLETE Cortex LLM function with different types of roles
  • Utilize the temperature, top_p, and guardrails properties to control the attributes of responses from the COMPLETE Cortex LLM function
  • Invoke the EXTRACT_ANSWER, SUMMARIZE, SENTIMENT, and TRANSLATE functions from SQL
  • Invoke the complete, extract_answer, summarize, sentiment, and translate functions from Python
  • Recognize the functionality of Snowflake Copilot, Universal Search, and Document AI
  • Implement search optimizations for individual columns for both equality and substring matches
  • Outline retrieval-augmented generation (RAG) and the use of Cortex Search for RAG
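To build intuition for the temperature and top_p objectives above, the sketch below implements toy temperature scaling and nucleus (top_p) sampling in plain Python. This is an illustrative stand-in for how such knobs shape LLM output, not Snowflake's Cortex implementation; the function name `sample_token` is hypothetical.

```python
import math
import random

def sample_token(logits, temperature=1.0, top_p=1.0, rng=None):
    """Pick a token index from raw logits using temperature scaling and
    nucleus (top_p) filtering. Illustrative only: it mimics the knobs the
    Cortex COMPLETE function exposes, not its actual implementation."""
    rng = rng or random.Random()
    # Temperature rescales logits before the softmax: values < 1 sharpen
    # the distribution (more predictable output), values > 1 flatten it.
    scaled = [l / max(temperature, 1e-6) for l in logits]
    peak = max(scaled)
    exps = [math.exp(s - peak) for s in scaled]  # numerically stable softmax
    total = sum(exps)
    probs = [e / total for e in exps]
    # top_p keeps only the smallest set of highest-probability tokens whose
    # cumulative probability reaches top_p, then samples from that set.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, cumulative = [], 0.0
    for i in order:
        kept.append(i)
        cumulative += probs[i]
        if cumulative >= top_p:
            break
    mass = sum(probs[i] for i in kept)
    draw = rng.random() * mass
    for i in kept:
        draw -= probs[i]
        if draw <= 0:
            return i
    return kept[-1]

# Near-zero temperature makes the highest logit win essentially always.
greedy = sample_token([2.0, 1.0, 0.1], temperature=0.01)
# A small top_p restricts sampling to the single most likely token.
nucleus = sample_token([2.0, 1.0, 0.1], top_p=0.1)
```

In both calls the first token (index 0) is selected deterministically, which is why low temperature or low top_p values make responses more repeatable, while higher values admit more of the probability mass and make output more varied.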

EARN A DIGITAL BADGE WHEN YOU COMPLETE THIS COURSE

Skillsoft is providing you the opportunity to earn a digital badge upon successful completion of some of our courses, which can be shared on any social network or business platform.

Digital badges are yours to keep, forever.
