Study guide for Exam DP-600: Implementing Analytics Solutions Using Microsoft Fabric

Purpose of this document

This study guide helps you understand what to expect on the exam. It includes a summary of the topics the exam might cover and links to additional resources, so you can focus your studies as you prepare for the exam.

Useful links and descriptions:

  • Review the skills measured as of July 22, 2024: This list represents the skills measured after the date provided. Study this list if you plan to take the exam after that date.
  • Review the skills measured prior to July 22, 2024: Study this list of skills if you take your exam prior to the date provided.
  • Change log: You can go directly to the change log if you want to see the changes that will be made on the date provided.
  • How to earn the certification: Some certifications only require passing one exam, while others require passing multiple exams.
  • Certification renewal: Microsoft associate, expert, and specialty certifications expire annually. You can renew by passing a free online assessment on Microsoft Learn.
  • Your Microsoft Learn profile: Connecting your certification profile to Microsoft Learn allows you to schedule and renew exams, and to share and print certificates.
  • Exam scoring and score reports: A score of 700 or greater is required to pass.
  • Exam sandbox: You can explore the exam environment by visiting our exam sandbox.
  • Request accommodations: If you use assistive devices, require extra time, or need modification to any part of the exam experience, you can request an accommodation.
  • Take a free Practice Assessment: Test your skills with practice questions to help you prepare for the exam.

About the exam

Our exams are updated periodically to reflect the skills that are required to perform a role. This guide includes two versions of the skills measured objectives; use the version that corresponds to the date on which you take the exam.

We always update the English language version of the exam first. Some exams are localized into other languages, and those are updated approximately eight weeks after the English version is updated. While Microsoft makes every effort to update localized versions as noted, there may be times when the localized versions of an exam are not updated on this schedule. Other available languages are listed in the Schedule Exam section of the Exam Details webpage. If the exam isn't available in your preferred language, you can request an additional 30 minutes to complete the exam.

Note

The bullets that follow each of the skills measured are intended to illustrate how we are assessing that skill. Related topics may be covered in the exam.

Note

Most questions cover features that are generally available (GA). The exam may contain questions on Preview features if those features are commonly used.

Skills measured as of July 22, 2024

Audience profile

As a candidate for this exam, you should have subject matter expertise in designing, creating, and deploying enterprise-scale data analytics solutions.

Your responsibilities for this role include transforming data into reusable analytics assets by using Microsoft Fabric components, such as:

  • Lakehouses

  • Data warehouses

  • Notebooks

  • Dataflows

  • Data pipelines

  • Semantic models

  • Reports

You implement analytics best practices in Fabric, including version control and deployment.

To implement solutions as a Fabric analytics engineer, you partner with other roles, such as:

  • Solution architects

  • Data engineers

  • Data scientists

  • AI engineers

  • Database administrators

  • Power BI data analysts

In addition to in-depth work with the Fabric platform, you need experience with:

  • Data modeling

  • Data transformation

  • Git-based source control

  • Exploratory analytics

  • Programming languages (including Structured Query Language (SQL), Data Analysis Expressions (DAX), and PySpark)

Skills at a glance

  • Plan, implement, and manage a solution for data analytics (10–15%)

  • Prepare and serve data (40–45%)

  • Implement and manage semantic models (20–25%)

  • Explore and analyze data (20–25%)

Plan, implement, and manage a solution for data analytics (10–15%)

Plan a data analytics environment

  • Identify requirements for a solution, including components, features, performance, and capacity stock-keeping units (SKUs)

  • Recommend settings in the Fabric admin portal

  • Choose a data gateway type

  • Create a custom Power BI report theme

Implement and manage a data analytics environment

  • Implement workspace and item-level access controls for Fabric items

  • Implement data sharing for workspaces, warehouses, and lakehouses

  • Manage sensitivity labels in semantic models and lakehouses

  • Configure Fabric-enabled workspace settings

  • Manage Fabric capacity and configure capacity settings

Manage the analytics development lifecycle

  • Implement version control for a workspace

  • Create and manage a Power BI Desktop project (.pbip)

  • Plan and implement deployment solutions

  • Perform impact analysis of downstream dependencies from lakehouses, data warehouses, dataflows, and semantic models

  • Deploy and manage semantic models by using the XMLA endpoint

  • Create and update reusable assets, including Power BI template (.pbit) files, Power BI data source (.pbids) files, and shared semantic models

Prepare and serve data (40–45%)

Create objects in a lakehouse or warehouse

  • Ingest data by using a data pipeline, dataflow, or notebook

  • Create and manage shortcuts

  • Implement file partitioning for analytics workloads in a lakehouse

  • Create views, functions, and stored procedures

  • Enrich data by adding new columns or tables
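As a rough illustration of the objects this area covers, the following T-SQL sketch creates a view and a stored procedure in a warehouse. All schema, table, and column names (dbo.Sales, dbo.MonthlySales, and so on) are hypothetical examples, not part of the exam content.

```sql
-- Hypothetical names throughout; a minimal sketch, not exam content.
CREATE VIEW dbo.vw_MonthlySales AS
SELECT
    YEAR(OrderDate)  AS OrderYear,
    MONTH(OrderDate) AS OrderMonth,
    SUM(SalesAmount) AS TotalSales
FROM dbo.Sales
GROUP BY YEAR(OrderDate), MONTH(OrderDate);
GO

-- A stored procedure that reloads a summary table from the view above,
-- one way to "enrich data by adding new columns or tables".
CREATE PROCEDURE dbo.usp_LoadMonthlySales
AS
BEGIN
    DELETE FROM dbo.MonthlySales;
    INSERT INTO dbo.MonthlySales (OrderYear, OrderMonth, TotalSales)
    SELECT OrderYear, OrderMonth, TotalSales
    FROM dbo.vw_MonthlySales;
END;
GO
```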

Copy data

  • Choose an appropriate method for copying data from a Fabric data source to a lakehouse or warehouse

  • Copy data by using a data pipeline, dataflow, or notebook

  • Implement Fast Copy when using dataflows

  • Add stored procedures, notebooks, and dataflows to a data pipeline

  • Schedule data pipelines

  • Schedule dataflows and notebooks

Transform data

  • Implement a data cleansing process

  • Implement a star schema for a lakehouse or warehouse, including Type 1 and Type 2 slowly changing dimensions

  • Implement bridge tables for a lakehouse or a warehouse

  • Denormalize data

  • Aggregate or de-aggregate data

  • Merge or join data

  • Identify and resolve duplicate data, missing data, or null values

  • Convert data types by using SQL or PySpark

  • Filter data
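To make the slowly changing dimension bullet concrete, here is one hedged T-SQL sketch of a Type 2 update: expire the current dimension row when a tracked attribute changes, then insert a new current row. The table and column names (stg.Customer, dbo.DimCustomer, City, Segment) are hypothetical.

```sql
-- Step 1: expire current rows whose tracked attributes changed.
UPDATE dim
SET dim.IsCurrent = 0,
    dim.ValidTo = GETDATE()
FROM dbo.DimCustomer AS dim
JOIN stg.Customer AS src
    ON src.CustomerID = dim.CustomerID
WHERE dim.IsCurrent = 1
  AND (src.City <> dim.City OR src.Segment <> dim.Segment);

-- Step 2: insert a new current row for changed or brand-new customers
-- (changed customers no longer have a current row after step 1).
INSERT INTO dbo.DimCustomer (CustomerID, City, Segment, ValidFrom, ValidTo, IsCurrent)
SELECT src.CustomerID, src.City, src.Segment, GETDATE(), NULL, 1
FROM stg.Customer AS src
LEFT JOIN dbo.DimCustomer AS dim
    ON dim.CustomerID = src.CustomerID
   AND dim.IsCurrent = 1
WHERE dim.CustomerID IS NULL;
```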

Optimize performance

  • Identify and resolve data loading performance bottlenecks in dataflows, notebooks, and SQL queries

  • Implement performance improvements in dataflows, notebooks, and SQL queries

  • Identify and resolve issues with the structure or size of Delta table files (including v-order and optimized writes)
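The Delta table maintenance bullet can be sketched in Spark SQL from a Fabric notebook. The table name is hypothetical, and the optimized-write configuration name is an assumption based on the Fabric documentation, so verify it before relying on it.

```sql
-- Enable optimized writes for the session so Spark produces fewer, larger files
-- (configuration name is an assumption; check the Fabric docs).
SET spark.microsoft.delta.optimizeWrite.enabled = true;

-- Compact existing small files and apply V-Order to a hypothetical table.
OPTIMIZE sales_lakehouse.fact_sales VORDER;

-- Remove data files no longer referenced by the Delta log (default retention applies).
VACUUM sales_lakehouse.fact_sales;
```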

Implement and manage semantic models (20–25%)

Design and build semantic models

  • Choose a storage mode, including Direct Lake

  • Identify use cases for DAX Studio and Tabular Editor 2

  • Implement a star schema for a semantic model

  • Implement relationships, such as bridge tables and many-to-many relationships

  • Write calculations that use DAX variables and functions, such as iterators, table filtering, windowing, and information functions

  • Implement calculation groups, dynamic strings, and field parameters

  • Design and build a large format dataset

  • Design and build composite models that include aggregations

  • Implement dynamic row-level security and object-level security

  • Validate row-level security and object-level security

Optimize enterprise-scale semantic models

  • Implement performance improvements in queries and report visuals

  • Improve DAX performance by using DAX Studio

  • Optimize a semantic model by using Tabular Editor 2

  • Implement incremental refresh

Explore and analyze data (20–25%)

Perform exploratory analytics

  • Implement descriptive and diagnostic analytics

  • Integrate prescriptive and predictive analytics into a visual or report

  • Profile data

Query data by using SQL

  • Query a lakehouse in Fabric by using SQL queries or the visual query editor

  • Query a warehouse in Fabric by using SQL queries or the visual query editor

  • Connect to and query datasets by using the XMLA endpoint
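Queries against the SQL analytics endpoint of a lakehouse, or against a warehouse, are ordinary T-SQL. For example, with hypothetical fact and dimension tables:

```sql
-- T-SQL against a lakehouse SQL analytics endpoint; names are hypothetical.
SELECT TOP 10
    d.CalendarYear,
    SUM(f.SalesAmount) AS TotalSales
FROM dbo.fact_sales AS f
JOIN dbo.dim_date AS d
    ON d.DateKey = f.DateKey
GROUP BY d.CalendarYear
ORDER BY TotalSales DESC;
```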

Study resources

We recommend that you train and get hands-on experience before you take the exam. We offer self-study options and classroom training as well as links to documentation, community sites, and videos.

Study resources and links to learning and documentation:

  • Get trained: Choose from self-paced learning paths and modules or take an instructor-led course.
  • Find documentation: Microsoft Fabric; What is a lakehouse?; What is data warehousing?; Data warehousing and analytics
  • Ask a question: Microsoft Q&A | Microsoft Docs
  • Get community support: Analytics on Azure - Microsoft Tech Community; Microsoft Fabric Blog
  • Follow Microsoft Learn: Microsoft Learn - Microsoft Tech Community
  • Find a video: Exam Readiness Zone; Data Exposed; Browse other Microsoft Learn shows

Change log

Key to understanding the change log: The topic groups (also known as functional groups) are listed first, with the objectives of each group beneath them. Each entry compares the skill area across the two versions of the exam skills measured and notes the extent of the change.

  • Audience profile: Minor
  • Plan, implement, and manage a solution for data analytics: No change
      • Plan a data analytics environment: No change
      • Implement and manage a data analytics environment: Minor
      • Manage the analytics development lifecycle: No change
  • Prepare and serve data: No change
      • Create objects in a lakehouse or warehouse: No change
      • Copy data: Minor
      • Transform data: No change
      • Optimize performance: Minor
  • Implement and manage semantic models: No change
      • Design and build semantic models: No change
      • Optimize enterprise-scale semantic models: No change
  • Explore and analyze data: No change
      • Perform exploratory analytics: No change
      • Query data by using SQL: No change

Skills measured prior to July 22, 2024

Audience profile

As a candidate for this exam, you should have subject matter expertise in designing, creating, and deploying enterprise-scale data analytics solutions.

Your responsibilities for this role include transforming data into reusable analytics assets by using Microsoft Fabric components, such as:

  • Lakehouses

  • Data warehouses

  • Notebooks

  • Dataflows

  • Data pipelines

  • Semantic models

  • Reports

You implement analytics best practices in Fabric, including version control and deployment.

To implement solutions as a Fabric analytics engineer, you partner with other roles, such as:

  • Solution architects

  • Data engineers

  • Data scientists

  • AI engineers

  • Database administrators

  • Power BI data analysts

In addition to in-depth work with the Fabric platform, you need experience with:

  • Data modeling

  • Data transformation

  • Git-based source control

  • Exploratory analytics

  • Languages, including Structured Query Language (SQL), Data Analysis Expressions (DAX), and PySpark

Skills at a glance

  • Plan, implement, and manage a solution for data analytics (10–15%)

  • Prepare and serve data (40–45%)

  • Implement and manage semantic models (20–25%)

  • Explore and analyze data (20–25%)

Plan, implement, and manage a solution for data analytics (10–15%)

Plan a data analytics environment

  • Identify requirements for a solution, including components, features, performance, and capacity stock-keeping units (SKUs)

  • Recommend settings in the Fabric admin portal

  • Choose a data gateway type

  • Create a custom Power BI report theme

Implement and manage a data analytics environment

  • Implement workspace and item-level access controls for Fabric items

  • Implement data sharing for workspaces, warehouses, and lakehouses

  • Manage sensitivity labels in semantic models and lakehouses

  • Configure Fabric-enabled workspace settings

  • Manage Fabric capacity

Manage the analytics development lifecycle

  • Implement version control for a workspace

  • Create and manage a Power BI Desktop project (.pbip)

  • Plan and implement deployment solutions

  • Perform impact analysis of downstream dependencies from lakehouses, data warehouses, dataflows, and semantic models

  • Deploy and manage semantic models by using the XMLA endpoint

  • Create and update reusable assets, including Power BI template (.pbit) files, Power BI data source (.pbids) files, and shared semantic models

Prepare and serve data (40–45%)

Create objects in a lakehouse or warehouse

  • Ingest data by using a data pipeline, dataflow, or notebook

  • Create and manage shortcuts

  • Implement file partitioning for analytics workloads in a lakehouse

  • Create views, functions, and stored procedures

  • Enrich data by adding new columns or tables

Copy data

  • Choose an appropriate method for copying data from a Fabric data source to a lakehouse or warehouse

  • Copy data by using a data pipeline, dataflow, or notebook

  • Add stored procedures, notebooks, and dataflows to a data pipeline

  • Schedule data pipelines

  • Schedule dataflows and notebooks

Transform data

  • Implement a data cleansing process

  • Implement a star schema for a lakehouse or warehouse, including Type 1 and Type 2 slowly changing dimensions

  • Implement bridge tables for a lakehouse or a warehouse

  • Denormalize data

  • Aggregate or de-aggregate data

  • Merge or join data

  • Identify and resolve duplicate data, missing data, or null values

  • Convert data types by using SQL or PySpark

  • Filter data

Optimize performance

  • Identify and resolve data loading performance bottlenecks in dataflows, notebooks, and SQL queries

  • Implement performance improvements in dataflows, notebooks, and SQL queries

  • Identify and resolve issues with Delta table file sizes

Implement and manage semantic models (20–25%)

Design and build semantic models

  • Choose a storage mode, including Direct Lake

  • Identify use cases for DAX Studio and Tabular Editor 2

  • Implement a star schema for a semantic model

  • Implement relationships, such as bridge tables and many-to-many relationships

  • Write calculations that use DAX variables and functions, such as iterators, table filtering, windowing, and information functions

  • Implement calculation groups, dynamic strings, and field parameters

  • Design and build a large format dataset

  • Design and build composite models that include aggregations

  • Implement dynamic row-level security and object-level security

  • Validate row-level security and object-level security

Optimize enterprise-scale semantic models

  • Implement performance improvements in queries and report visuals

  • Improve DAX performance by using DAX Studio

  • Optimize a semantic model by using Tabular Editor 2

  • Implement incremental refresh

Explore and analyze data (20–25%)

Perform exploratory analytics

  • Implement descriptive and diagnostic analytics

  • Integrate prescriptive and predictive analytics into a visual or report

  • Profile data

Query data by using SQL

  • Query a lakehouse in Fabric by using SQL queries or the visual query editor

  • Query a warehouse in Fabric by using SQL queries or the visual query editor

  • Connect to and query datasets by using the XMLA endpoint