Microsoft Fabric: Beginner
Explore Microsoft Fabric, an AI-powered end-to-end analytics and data platform designed for enterprises that require a unified solution.
GETTING STARTED
Microsoft Fabric: Working with Data Lakehouses
- 2m 9s
- 9m 6s
COURSES INCLUDED
Microsoft Fabric: Working with Data Lakehouses
The Microsoft Fabric lakehouse, built on OneLake, combines some of the most attractive features of data warehouses and data lakes. Heavily leveraging Delta tables and Apache Spark, it also adds the power of Power BI and Power Query. In this course, you'll explore data warehouses, comparing them with Azure Data Lake and Amazon S3. You'll build your first data lakehouse in Microsoft Fabric, running T-SQL queries and exploring Parquet files and the Delta log. You'll write SQL queries, construct visual queries, and work with Spark notebooks, using PySpark and SparkSQL. After that, you'll study the Delta Lake architecture and its implementation as Delta tables. You'll explore data ingestion, including data pipelines, dataflows, and Spark notebooks. Finally, you'll create visual queries from the SQL analytics endpoint, learn about query folding, work with M-language expressions, and create and customize Power BI reports. This course is part of a series that prepares learners for Exam DP-600: Implementing Analytics Solutions Using Microsoft Fabric.
17 videos | 2h 12m
Assessment
Badge
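To make the notebook work described in this course concrete, here is a minimal sketch of querying a lakehouse Delta table with PySpark and Spark SQL from a Fabric notebook. The table name (sales) and its columns (region, amount) are illustrative assumptions, not artifacts from the course.

```python
# Minimal sketch: querying a lakehouse Delta table with PySpark and Spark SQL.
# Assumes the notebook has a default lakehouse attached containing a Delta
# table named "sales" with illustrative columns (region, amount).
from pyspark.sql import SparkSession, functions as F

# Fabric notebooks pre-create the Spark session; getOrCreate simply returns it.
spark = SparkSession.builder.getOrCreate()

df = spark.read.table("sales")

# DataFrame API: total sales per region, largest first.
(df.groupBy("region")
   .agg(F.sum("amount").alias("total_amount"))
   .orderBy(F.desc("total_amount"))
   .show())

# The same aggregation expressed in Spark SQL.
spark.sql("""
    SELECT region, SUM(amount) AS total_amount
    FROM sales
    GROUP BY region
    ORDER BY total_amount DESC
""").show()
```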
Microsoft Fabric: Spark & the Capacity Metrics App for Lakehouses
Spark is a key technology in Fabric lakehouses, and a fundamental part of the DP-600 exam curriculum. In this course, you'll learn how Apache Spark integrates with Microsoft Fabric to handle large-scale data processing through distributed computing. First, learn about Spark pools and study the role of the T-SQL endpoint. Create Fabric shortcuts, set up storage accounts, enable hierarchical namespaces, and use Shared Access Signatures (SAS) to link these sources and build Delta tables from the connected data. Next, create notebooks with Apache Spark in Microsoft Fabric, run PySpark and SparkSQL commands, monitor resource usage, and learn how to associate lakehouses with notebooks. Finally, explore the Microsoft Fabric Capacity Metrics App, tracking capacity units (CUs), managing SKUs, and handling overages and throttling. Complete the course by installing the app, entering your Fabric capacity ID, and using charts to analyze utilization metrics. This course is part of a series that prepares learners for Exam DP-600: Implementing Analytics Solutions Using Microsoft Fabric.
15 videos | 1h 57m
Assessment
Badge
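As a companion to the shortcut workflow above, here is a minimal sketch that assumes a OneLake shortcut named external_sales (for example, one backed by a SAS-secured ADLS Gen2 container) already exists under the lakehouse Files area and holds CSV files; all names are placeholders.

```python
# Minimal sketch: turning data surfaced by a OneLake shortcut into a Delta table.
# Assumes a shortcut named "external_sales" under the lakehouse Files section
# (e.g. pointing at a SAS-secured ADLS Gen2 container) that contains CSV files.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Shortcuts appear under the lakehouse file system like ordinary folders.
raw = (spark.read
            .option("header", "true")
            .option("inferSchema", "true")
            .csv("Files/external_sales/"))

# Persist the ingested rows as a managed Delta table in the lakehouse.
raw.write.format("delta").mode("overwrite").saveAsTable("external_sales_delta")

print(spark.table("external_sales_delta").count())
```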
Microsoft Fabric: Spark Configuration & Delta Tables
Spark is essential both in Microsoft Fabric and for the DP-600 certification test. In this course, you'll learn how to create Python scripts for Spark batch jobs in Fabric, writing ETL transformations and modifying them for batch execution. You'll configure and monitor Spark batch jobs, analyze job logs, use the Spark History Server, and learn how to schedule jobs with retry policies. Next, you'll configure starter pools and explore high concurrency sessions. You'll customize Spark settings in Fabric, and create custom Spark pools and environments, linking them to notebooks. After that, you'll focus on Delta tables, working with version history and table contents. You'll use the DESCRIBE HISTORY command to analyze table versions and explore time travel by viewing and restoring data to specific versions or timestamps with SparkSQL and PySpark. Finally, you'll explore the differences between managed and external Delta tables. This course is part of a series that prepares learners for Exam DP-600: Implementing Analytics Solutions Using Microsoft Fabric.
12 videos | 1h 47m
Assessment
Badge
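The version history and time travel features covered above can be sketched in a few lines. This assumes a Delta table named sales in the attached lakehouse; the version number and timestamp are illustrative.

```python
# Minimal sketch: Delta table history, time travel, and restore.
# Assumes a Delta table named "sales" in the attached lakehouse; the version
# number and timestamp below are illustrative.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Every write adds a new version, recorded in the delta log.
spark.sql("DESCRIBE HISTORY sales") \
     .select("version", "timestamp", "operation") \
     .show(truncate=False)

# Time travel in PySpark: read the table as of version 2.
spark.read.option("versionAsOf", 2).table("sales").show(5)

# Time travel in SparkSQL: query by timestamp instead of version.
spark.sql("SELECT COUNT(*) FROM sales TIMESTAMP AS OF '2024-01-01'").show()

# Restore the live table to an earlier version.
spark.sql("RESTORE TABLE sales TO VERSION AS OF 2")
```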
Microsoft Fabric: The Medallion Architecture with a Star Schema
The medallion architecture, which features bronze, silver, and gold layers, is a data organization approach and an important part of Microsoft Fabric and the DP-600 certification test. In this course, learn how the bronze layer stores raw data, the silver layer holds cleaned and transformed data, and the gold layer serves business-ready reporting and analytics. Begin by implementing the bronze layer. Create code for incremental data processing, perform data cleaning, and use merge and upsert operations with Spark Delta tables to load data into the silver layer. Next, prepare the gold layer using fact and dimension tables within a star schema. Transform silver layer data into the gold layer and create a semantic model that defines relationships between tables for use in Power BI reports. Finally, explore partitioning and table maintenance techniques and study the use of features like optimize, V-Order, and vacuum. This course is part of a series that prepares learners for Exam DP-600: Implementing Analytics Solutions Using Microsoft Fabric.
10 videos | 1h 23m
Assessment
Badge
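A minimal sketch of the silver-layer upsert and table maintenance described above, assuming illustrative table names (bronze_orders, silver_orders) and an order_id key:

```python
# Minimal sketch: upserting incremental bronze data into a silver Delta table,
# then running basic maintenance. Table and column names (bronze_orders,
# silver_orders, order_id) are illustrative assumptions.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

# Incremental batch of cleaned rows coming from the bronze layer.
updates = spark.table("bronze_orders").dropDuplicates(["order_id"])

silver = DeltaTable.forName(spark, "silver_orders")

# Merge (upsert): update matching rows, insert the rest.
(silver.alias("t")
       .merge(updates.alias("s"), "t.order_id = s.order_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())

# Table maintenance mentioned in the course: compact small files and remove
# old, unreferenced ones (VACUUM keeps 7 days of history by default).
spark.sql("OPTIMIZE silver_orders")
spark.sql("VACUUM silver_orders")
```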
Microsoft Fabric: Data Transformations Using Dataflow Gen2
Microsoft Fabric Data Factory is a powerful tool for integrating and orchestrating data workflows, enabling organizations to manage data efficiently. Dataflow Gen2 is an important component of Fabric Data Factory that focuses on transforming and managing data within those workflows, providing advanced capabilities for data cleaning and transformation. In this course, learn about Data Factory and Dataflow Gen2, beginning with building dataflows using Power Query, cleaning data, replacing values, and applying essential mathematical transformations. Next, examine the Fast Copy feature in Dataflow Gen2, including when and why you would want to use it. Finally, explore connectors for Dataflow Gen2 and incremental refresh techniques. This course is part of a series that prepares learners for Exam DP-600: Implementing Analytics Solutions Using Microsoft Fabric.
18 videos | 1h 54m
Assessment
Badge
Microsoft Fabric: Creating and Using Simple Data Pipelines
Data pipelines are a core component of Microsoft Fabric Data Factory, crucial for automating and managing data workflows. In this course, learn the essentials of creating and using simple data pipelines and how various activities can be integrated within these pipelines. Next, discover how to enhance pipeline flexibility to build robust data workflows and configure a complete data pipeline by integrating components like notebooks, dataflows, and email notifications. Finally, explore how to run the pipeline, monitor its performance, and set up alerts for successful completion and failures, ensuring efficient data management and responsiveness. This course is part of a series that prepares learners for Exam DP-600: Implementing Analytics Solutions Using Microsoft Fabric.
11 videos | 1h 10m
Assessment
Badge
Microsoft Fabric: Working with Complex Data Pipelines
Complex data pipelines are essential for automating, managing, and optimizing data workflows, making data-driven decision-making more efficient and scalable. In this course, learn how to work with complex data pipelines to enhance data processing, including configuring pipelines with activities, setting up conditional flows, and utilizing the ForEach activity. Next, discover how to set up source and watermark tables to track changes, configure watermark lookup activities, set up conditions to load new data, and schedule pipelines to maintain up-to-date datasets without full data reloads. Finally, explore how to use the Copy assistant to load data into partitioned tables and compare data pipelines with Dataflow Gen2. This course is part of a series that prepares learners for Exam DP-600: Implementing Analytics Solutions Using Microsoft Fabric.
1h 43m
Assessment
Badge
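The course builds the watermark pattern with pipeline Lookup and If Condition activities; purely for illustration, the same high-water-mark logic is sketched below in PySpark, with assumed table and column names (watermark, source_orders, target_orders, modified_at, last_loaded).

```python
# Illustrative PySpark sketch of the high-water-mark (watermark) pattern that
# the course implements with pipeline Lookup and If Condition activities.
# Table and column names are assumptions for the sketch.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# 1. Look up the last watermark recorded for the target table.
last_loaded = (spark.table("watermark")
                    .filter(F.col("table_name") == "target_orders")
                    .agg(F.max("last_loaded"))
                    .collect()[0][0])

# 2. Pick up only rows changed since the watermark -- no full reload.
new_rows = spark.table("source_orders").filter(F.col("modified_at") > F.lit(last_loaded))

if new_rows.isEmpty():
    print("Nothing new to load")
else:
    # 3. Append the delta and advance the watermark to the newest change seen.
    new_rows.write.format("delta").mode("append").saveAsTable("target_orders")
    new_high = new_rows.agg(F.max("modified_at")).collect()[0][0]
    spark.sql(f"""
        UPDATE watermark
        SET last_loaded = '{new_high}'
        WHERE table_name = 'target_orders'
    """)
```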
COURSES INCLUDED
Microsoft Fabric: Introducing Data Warehouses
What are data warehouses and how do they compare to data lakehouses? What are their key differences and when should you choose each of these storage technologies? In this course, you'll learn the answers to these questions while building hands-on experience. First, learn how warehouses handle structured data through T-SQL queries. Then, get hands-on with data ingestion to a warehouse. Upload data to Azure Data Lake Storage Gen2, ingest it into a Fabric data warehouse using the COPY INTO command, and implement strategies to manage potential errors and maintain data integrity. Finally, dive into advanced data management, running T-SQL queries to perform DDL and DML operations in warehouses. Learn about database mirroring in Fabric, which continuously replicates data from external database systems into Fabric, and gain a comprehensive understanding of secure and resilient data workflows in Fabric warehouses. This course is part of a series that prepares learners for Exam DP-600: Implementing Analytics Solutions Using Microsoft Fabric.
1h 49m
Assessment
Badge
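For a concrete feel for COPY INTO ingestion, here is a minimal sketch that submits the T-SQL from Python over ODBC. The endpoint, warehouse, storage URL, and table names are placeholders, and the storage location is assumed to be accessible to the warehouse (public data or a credential added to the WITH clause).

```python
# Minimal sketch: submitting a COPY INTO statement to a Fabric warehouse from
# Python over ODBC. Server, warehouse, storage URL, and table names are
# placeholders; requires pyodbc and the Microsoft ODBC driver.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-endpoint>.datawarehouse.fabric.microsoft.com;"
    "Database=<your_warehouse>;"
    "Authentication=ActiveDirectoryInteractive;",
    autocommit=True,
)

copy_sql = """
COPY INTO dbo.Sales
FROM 'https://<account>.blob.core.windows.net/<container>/sales/*.csv'
WITH (
    FILE_TYPE = 'CSV',
    FIRSTROW = 2   -- skip the header row
);
"""

cursor = conn.cursor()
cursor.execute(copy_sql)
print("COPY INTO completed")
cursor.close()
conn.close()
```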
Microsoft Fabric: Getting Started and the DP-600 Exam
Microsoft Fabric is an all-in-one analytics platform that seamlessly integrates data storage, processing, and analysis tools to empower businesses with unified data insights. In this course, you will be introduced to Microsoft Fabric, a powerful platform that unifies analytics and data management. You will explore its key components, including OneLake, Delta Lake, and the integration of tools like Spark, T-SQL, and KQL to streamline data processing and analysis. Next, you will gain insights into the DP-600 Fabric Analytics Engineer Associate certification and the core skills required to succeed, such as preparing data, managing analytics environments, and designing semantic models within Microsoft Fabric. By the end of the course, you will have a solid foundation in Microsoft Fabric, preparing you to apply its features effectively in real-world scenarios.
23m
Assessment
Badge
COURSES INCLUDED
Microsoft Fabric: Monitoring Fabric Warehouses
The Fabric Capacity Metrics App provides the key tools and features needed to monitor and optimize warehouse performance. In this course, learn all about the app, gain insights into capacity utilization and find out how to proactively address potential bottlenecks. First, use query activity to identify long-running and frequently run queries without writing any SQL code. Then, work with dynamic management views (DMVs) to analyze connections, sessions, and requests, and see how you can combine data from various DMVs to gather comprehensive system metrics. Finally, dive into query insights, which enable detailed visibility into query behavior and learn how to identify inefficiencies and analyze performance trends over time. This course is part of a series that prepares learners for Exam DP-600: Implementing Analytics Solutions Using Microsoft Fabric.
57m
Assessment
Badge
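A minimal sketch of the DMV analysis described above, run from Python over ODBC against the warehouse; connection details are placeholders and the selected columns are a starting point rather than a definitive list, since Fabric exposes a subset of the SQL Server DMV columns.

```python
# Minimal sketch: combining warehouse DMVs to see active sessions and their
# requests. Connection details are placeholders.
import pyodbc
import pandas as pd

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-endpoint>.datawarehouse.fabric.microsoft.com;"
    "Database=<your_warehouse>;"
    "Authentication=ActiveDirectoryInteractive;"
)

dmv_sql = """
SELECT s.session_id,
       s.login_name,
       r.command,
       r.start_time,
       r.total_elapsed_time
FROM sys.dm_exec_sessions AS s
JOIN sys.dm_exec_requests AS r
  ON s.session_id = r.session_id
ORDER BY r.total_elapsed_time DESC;
"""

# Longest-running active requests first.
print(pd.read_sql(dmv_sql, conn))
conn.close()
```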
Microsoft Fabric: Implementing Security in Fabric Warehouses
Implementing security in Microsoft Fabric warehouses is crucial for protecting sensitive data from unauthorized access, data breaches, and cyberattacks. Additionally, a secure system fosters trust among stakeholders, ensures regulatory compliance, and safeguards the organization's reputation. In this course, learn essential security practices for safeguarding data in Fabric warehouses, beginning with assigning workspace roles, managing object-level permissions, and applying the principle of least privilege. Next, discover how to configure targeted security controls, implement column-level and row-level security, and apply dynamic data masking to protect sensitive information. Finally, explore advanced query constructs and how to choose the appropriate query construct for different use cases. This course is part of a series that prepares learners for Exam DP-600: Implementing Analytics Solutions Using Microsoft Fabric.
1h 54m
Assessment
Badge
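To illustrate two of the controls above, here is a minimal sketch that applies row-level security and dynamic data masking with T-SQL submitted from Python; the connection details, schema, table, column, and account names are illustrative placeholders.

```python
# Minimal sketch: applying row-level security and dynamic data masking to a
# Fabric warehouse with T-SQL submitted from Python. All names are placeholders.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<your-endpoint>.datawarehouse.fabric.microsoft.com;"
    "Database=<your_warehouse>;"
    "Authentication=ActiveDirectoryInteractive;",
    autocommit=True,
)
cur = conn.cursor()

# Row-level security: a predicate function plus a security policy so each
# sales rep sees only their own rows in dbo.Sales.
cur.execute("""
CREATE FUNCTION dbo.fn_salesrep_filter (@sales_rep AS VARCHAR(128))
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN SELECT 1 AS allowed
       WHERE @sales_rep = USER_NAME() OR USER_NAME() = 'manager@contoso.com';
""")
cur.execute("""
CREATE SECURITY POLICY dbo.SalesFilter
ADD FILTER PREDICATE dbo.fn_salesrep_filter(sales_rep) ON dbo.Sales
WITH (STATE = ON);
""")

# Dynamic data masking: hide most of the email column from non-privileged users.
cur.execute("""
ALTER TABLE dbo.Customers
ALTER COLUMN email ADD MASKED WITH (FUNCTION = 'email()');
""")

cur.close()
conn.close()
```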
Microsoft Fabric: Getting Started with Semantic Models
Semantic models in Fabric empower organizations to organize, analyze, and derive meaningful insights from complex data. They enable efficient data structuring, seamless integration, and optimized performance. In this course, learn the fundamentals of Fabric semantic models, including how to work with them, their structure and components, and the different storage modes. Next, discover how relationships in semantic models connect data and enable meaningful insights, how to store data using normalized or denormalized schemas, and star and snowflake schemas. Finally, discover how to create fact and dimension tables, model a star schema structure in Fabric, manage slowly changing dimensions to track historical changes, and create hierarchies. This course is part of a series that prepares learners for Exam DP-600: Implementing Analytics Solutions Using Microsoft Fabric.
2h 1m
Assessment
Badge
Microsoft Fabric: Semantic Models and Semantic Link
Semantic modeling supports secure, optimized performance and empowers users to build accurate, dependable models for enterprise decision-making. In this course, learn how to create semantic models using lakehouse tables and explore how storage modes like Direct Lake and DirectQuery optimize performance while enabling real-time access to data. Next, explore how large semantic models help handle vast datasets for enterprise analytics and how security measures help ensure compliance with organizational and regulatory requirements. Finally, discover how to use semantic link in Fabric to identify relationships, detect violations, and analyze dependencies within models, and how to evaluate measures and understand calculation logic. This course is part of a series that prepares learners for Exam DP-600: Implementing Analytics Solutions Using Microsoft Fabric.
8 videos | 53m
Assessment
Badge
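A minimal sketch of semantic link in a Fabric notebook, using the sempy library to list a model's relationships and evaluate a measure; the model name (Sales Model), measure (Total Sales), and group-by column are assumptions for the example.

```python
# Minimal sketch: using semantic link (the sempy library available in Fabric
# notebooks) to inspect a semantic model and evaluate a measure. The model,
# measure, and column names are illustrative assumptions.
import sempy.fabric as fabric

# Relationships defined in the semantic model, returned as a pandas DataFrame.
relationships = fabric.list_relationships("Sales Model")
print(relationships.head())

# Evaluate a measure grouped by a column from the Date table.
result = fabric.evaluate_measure(
    "Sales Model",
    measure="Total Sales",
    groupby_columns=["Date[Year]"],
)
print(result)
```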
Microsoft Fabric: Getting Started with DAX Queries
DAX empowers organizations to perform advanced analytics by creating dynamic, meaningful calculations that drive better decision-making. In this course, learn the fundamentals of DAX, how DAX integrates with semantic models to enable dynamic data analysis, and how to write, test, and optimize DAX queries effectively in DAX Studio. Next, discover how to create measures, run queries, and write results to various sources, and construct DAX expressions and queries for meaningful insights. Finally, explore how to create and evaluate complex measures and the CALCULATE function's integration with the FILTER and ALL functions. This course is part of a series that prepares learners for Exam DP-600: Implementing Analytics Solutions Using Microsoft Fabric.
14 videos | 1h 35m
Assessment
Badge
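As a taste of the CALCULATE, FILTER, and ALL combination covered above, here is a minimal sketch that runs a DAX query from a Fabric notebook via semantic link; the model, table, column, and measure names are illustrative, and the same query can be pasted into DAX Studio.

```python
# Minimal sketch: running a DAX query that combines CALCULATE, FILTER, and ALL
# from a Fabric notebook via semantic link. All names are illustrative.
import sempy.fabric as fabric

dax_query = """
EVALUATE
SUMMARIZECOLUMNS(
    'Product'[Category],
    "All-region Sales",
        CALCULATE(
            SUM('Sales'[Amount]),
            ALL('Sales'[Region])    -- ignore any filter on Region
        ),
    "West Sales",
        CALCULATE(
            SUM('Sales'[Amount]),
            FILTER(ALL('Sales'[Region]), 'Sales'[Region] = "West")
        )
)
"""

result = fabric.evaluate_dax("Sales Model", dax_query)  # pandas DataFrame
print(result)
```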
Microsoft Fabric: Advanced DAX Queries and Tools to Manage Semantic Models
Mastering advanced DAX techniques enables organizations to enhance performance, validate data, and manage high-performing semantic models. In this course, learn advanced DAX concepts, starting with iterators for performing row-by-row operations and multi-column calculations, DAX variables, and information functions for data validation and error handling. Next, explore time intelligence, table filter and window functions, and DAX modeling time-savers for optimizing workflows. Finally, examine advanced tools for managing semantic models, including the Tabular Editor, Performance Analyzer, Best Practices Analyzer (BPA), VertiPaq Analyzer, ALM Toolkit, and SQL Server Management Studio (SSMS). This course is part of a series that prepares learners for Exam DP-600: Implementing Analytics Solutions Using Microsoft Fabric.
1h 50m
Assessment
Badge
Microsoft Fabric: Administration, T-SQL, and Data Analytics
Effective Microsoft Fabric administration enables organizations to manage resources, streamline workflows, and maintain secure, collaborative environments for data operations. By leveraging T-SQL and advanced analytics, businesses can optimize data quality, uncover actionable insights, and drive informed decision-making. In this course, learn the fundamentals of Fabric administration, including managing workspace roles and permissions, configuring admin portal options, and exploring Fabric capacities for resource allocation. Next, learn about Git integration in Fabric and Power BI file formats, and compare standard SQL with T-SQL. Finally, explore supported and unsupported T-SQL constructs in Fabric warehouses, useful T-SQL constructs, data profiling techniques, and descriptive, diagnostic, predictive, and prescriptive analytics. This course is part of a series that prepares learners for Exam DP-600: Implementing Analytics Solutions Using Microsoft Fabric.
2h 19m
Assessment
Badge
Microsoft Fabric: OneLake, KQL, and KQL DB
OneLake provides a unified data storage solution for seamless collaboration, while KQL DB enables powerful querying and analysis for actionable insights. In this course, you will learn about the changes to the DP-600 curriculum, including topics that have been moved to the DP-700 certification and new additions to the DP-600 like OneLake and the Real-Time Hub. Next, you will explore OneLake and the Fabric Real-Time Hub, two essential components of Microsoft Fabric. You will discover how OneLake enables unified data storage and sharing while the Real-Time Hub supports instant data ingestion and processing for dynamic insights. Finally, you will build expertise in KQL and KQL DB, starting with basic concepts and progressing to ingesting, querying, and analyzing data. By the end of the course, you will be well-prepared to tackle the DP-600 certification and leverage Microsoft Fabric effectively in real-world scenarios.
1h 6m
Assessment
Badge
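For a hands-on feel for KQL, here is a minimal sketch that runs a query against a Fabric KQL database from Python with the azure-kusto-data package; the query URI, database, table, and column names are placeholders, and the same query can be run interactively in a KQL queryset.

```python
# Minimal sketch: running a KQL query against a Fabric KQL database from Python
# with the azure-kusto-data package. URI, database, table, and column names
# are placeholders.
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

cluster_uri = "https://<your-eventhouse>.kusto.fabric.microsoft.com"  # placeholder
kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(cluster_uri)
client = KustoClient(kcsb)

kql = """
WebEvents
| where Timestamp > ago(1d)
| summarize events = count() by PageName
| top 5 by events
"""

response = client.execute("<your_kql_db>", kql)
for row in response.primary_results[0]:
    print(row["PageName"], row["events"])
```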
EARN A DIGITAL BADGE WHEN YOU COMPLETE THESE COURSES
Skillsoft provides you with the opportunity to earn a digital badge upon successful completion of some of our courses, which can be shared on any social network or business platform.
Digital badges are yours to keep, forever.