Linear Models & Gradient Descent: Managing Linear Models

Intermediate
  • 11 videos | 47m 46s
  • Includes Assessment
  • Earns a Badge
Rating 3.8 (8 users)
Explore the concept of machine learning linear models, the classifications of linear models, and prominent statistical approaches used to implement them. This 11-video course also explores the concepts of bias, variance, and regularization. Key concepts covered include the classifications of linear models used in predictive analytics; the statistical approaches used to implement linear models, including single regression, multiple regression, and analysis of variance (ANOVA); and the essential components of a generalized linear model: the random component, the linear predictor, and the link function. Next, discover the differences between the ANOVA and analysis of covariance (ANCOVA) approaches to statistical testing; learn to implement linear regression models using Scikit-learn; and examine how bias, variance, and regularization are used to evaluate predictive models. Learners also explore ensemble techniques, seeing how bagging and boosting algorithms are used to manage predictions, and implement bagging with the random forest approach using Scikit-learn. Finally, observe how to implement boosting ensemble algorithms using the AdaBoost classifier in Python.

WHAT YOU WILL LEARN

  • Discover the key concepts covered in this course
  • Define linear models and the various classifications of linear models used in predictive analytics
  • Recognize the different statistical approaches used to implement linear models (single regression, multiple regression, and ANOVA)
  • Define the generalized linear model and its essential components (random component, linear predictor, and link function)
  • Compare the differences between the ANOVA and ANCOVA approaches to statistical testing
  • Demonstrate the implementation of linear regression models using Scikit-learn
  • Describe the concepts of bias, variance, and regularization and their uses in evaluating predictive models
  • Define the concept of ensemble techniques and illustrate how bagging and boosting algorithms are used to manage predictions
  • Implement bagging algorithms with the random forest approach using Scikit-learn
  • Implement boosting ensemble algorithms using the AdaBoost classifier in Python
  • List the classifications of linear models, recall the essential components of generalized linear models, and implement a boosting algorithm using the AdaBoost classifier

IN THIS COURSE

  • 53s
  • 7m 9s
    Find out how to define a linear model and the various classifications of linear models that are used in predictive analytics. FREE ACCESS
  • Locked
    3.  Linear Modeling Approach
    4m 24s
    After completing this video, you will be able to recognize the different statistical approaches that are used to implement linear models (single regression, multiple regression, and ANOVA). FREE ACCESS
  • Locked
    4.  Generalized Linear Model
    2m 55s
    In this video, learn how to define a generalized linear model and its essential components (random component, linear predictor, and link function). FREE ACCESS
  • Locked
    5.  ANOVA and ANCOVA
    3m 39s
    In this video, learn how to compare the differences between the ANOVA and ANCOVA approaches to statistical testing. FREE ACCESS
  • Locked
    6.  Linear Model Implementation
    3m 49s
    Learn about the implementation of linear regression models using Scikit-learn (a minimal sketch follows this course outline). FREE ACCESS
  • Locked
    7.  Bias, Variance and Regularization
    6m 51s
    Upon completion of this video, you will be able to describe the concepts of bias, variance, and regularization and their uses in evaluating predictive models (an illustrative sketch follows this course outline). FREE ACCESS
  • Locked
    8.  Ensemble Techniques
    7m 28s
    In this video, you will define the concept of ensemble techniques and illustrate how bagging and boosting algorithms can be used to improve predictions. FREE ACCESS
  • Locked
    9.  Bagging Implementation
    3m 35s
    During this video, you will learn how to implement bagging algorithms with the random forest approach using Scikit-learn (a minimal sketch follows this course outline). FREE ACCESS
  • Locked
    10.  Implementing Boosting Algorithm
    4m 10s
    In this video, you will learn how to implement boosting ensemble algorithms using the AdaBoost classifier in Python (a minimal sketch follows this course outline). FREE ACCESS
  • Locked
    11.  Exercise: Linear Models and Ensemble
    2m 55s
    Upon completion of this video, you will be able to list the classifications of linear models, recall the essential components of generalized linear models, and implement a boosting algorithm using the AdaBoost classifier. FREE ACCESS
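
ILLUSTRATIVE CODE SKETCHES

The course does not publish its example code here, so the sketches below are minimal, hypothetical illustrations of the implementations the videos describe. Dataset choices, variable names, and hyperparameters are illustrative assumptions, not taken from the course.

Linear regression with Scikit-learn (video 6). A minimal sketch of fitting and evaluating a linear regression model, assuming synthetic data in place of whatever dataset the course uses:

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import r2_score
    from sklearn.model_selection import train_test_split

    # Synthetic data: three features with known coefficients plus noise.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=200)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = LinearRegression()
    model.fit(X_train, y_train)          # learns coef_ and intercept_

    print("coefficients:", model.coef_)
    print("intercept:", model.intercept_)
    print("held-out R^2:", r2_score(y_test, model.predict(X_test)))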
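
Bias, variance, and regularization (video 7). One common way to trade bias against variance is a penalized linear model; this sketch uses Scikit-learn's Ridge regressor with illustrative alpha values (the course does not specify which regularizer it demonstrates):

    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import cross_val_score

    # Synthetic data where only the first feature matters.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(100, 10))
    y = 3.0 * X[:, 0] + rng.normal(scale=0.5, size=100)

    # Larger alpha = stronger coefficient penalty = more bias, less variance.
    for alpha in (0.01, 1.0, 100.0):
        scores = cross_val_score(Ridge(alpha=alpha), X, y, cv=5, scoring="r2")
        print(f"alpha={alpha:>6}: mean cross-validated R^2 = {scores.mean():.3f}")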
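
Bagging with random forest (video 9). A minimal sketch of a bagging-style ensemble using Scikit-learn's RandomForestClassifier; the built-in iris dataset stands in for the course's data:

    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Each tree is trained on a bootstrap sample of the training data (bagging),
    # and class predictions are aggregated across the trees.
    forest = RandomForestClassifier(n_estimators=100, random_state=0)
    forest.fit(X_train, y_train)

    print("test accuracy:", accuracy_score(y_test, forest.predict(X_test)))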
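
Boosting with AdaBoost (video 10). A minimal sketch of a boosting ensemble using Scikit-learn's AdaBoostClassifier; the breast-cancer dataset and estimator count are illustrative assumptions:

    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # AdaBoost fits a sequence of weak learners (decision stumps by default),
    # upweighting the samples misclassified in earlier rounds.
    booster = AdaBoostClassifier(n_estimators=100, random_state=0)
    booster.fit(X_train, y_train)

    print("test accuracy:", accuracy_score(y_test, booster.predict(X_test)))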

EARN A DIGITAL BADGE WHEN YOU COMPLETE THIS COURSE

Skillsoft is providing you the opportunity to earn a digital badge upon successful completion of some of our courses, which can be shared on any social network or business platform.

Digital badges are yours to keep, forever.
