International Certification: Fabric Data Engineer Associate (DP-700)

  • 32 hours
  • Official Certificate
  • Expert Instructors
  • Online Learning

Pre-enrollment

You do not need to pay to pre-enroll. By pre-enrolling, you reserve a spot in the group for this course or program. Our team will contact you to complete your enrollment.

Pre-enroll now

Infinity Payments

Make your payment quickly, safely, and reliably.


- For bank transfer payments, request the details by email at capacita@aulamatriz.edu.co.

- If you wish to finance your payment through our credit options (Sufi, Cooperativa Unimos, or Fincomercio), click on the following link: View credit options.

Description

This program is designed for experienced data professionals who seek to design, implement, and operate enterprise-scale data engineering solutions using Microsoft Fabric. It combines advanced theory and practice in data ingestion, transformation, orchestration, security, and optimization of pipelines and lakehouses.

This program prepares participants to obtain the international certification:

Fabric Data Engineer Associate (DP-700)

Additionally, with the purchase of this course, participants will receive Azure Data Fundamentals (DP-900) completely free in offline mode (pre-recorded videos with certified instructors).

Participants will learn to design and implement data processing solutions in Azure, ensuring data flow, efficient storage, source integration, and secure data management.

Following the Practical Learning Method, students will engage in hands-on labs, simulations, and real-world projects to ensure the application of acquired knowledge in a professional environment.
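
As an illustration of the kind of hands-on lab work involved, the sketch below shows a minimal notebook-style PySpark transformation of the sort covered in the course. It is only a sketch: the table and column names (raw_sales, sales_daily_totals, order_ts) are hypothetical, and in a Fabric notebook the Spark session is already provided; it is created explicitly here only so the example is self-contained.

```python
# Minimal sketch of a notebook-style PySpark transformation (hypothetical table and column names).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# In a Fabric notebook a SparkSession named "spark" is pre-configured; this line simply
# makes the sketch runnable on its own.
spark = SparkSession.builder.appName("dp700-lab-sketch").getOrCreate()

# Read raw sales data from a (hypothetical) lakehouse table.
raw_sales = spark.read.table("raw_sales")

# Clean and enrich: drop incomplete rows, derive a line total, and aggregate per customer per day.
daily_totals = (
    raw_sales
    .dropna(subset=["customer_id", "quantity", "unit_price"])
    .withColumn("line_total", F.col("quantity") * F.col("unit_price"))
    .groupBy("customer_id", F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("line_total").alias("daily_total"))
)

# Write the result back as a managed table for downstream reporting.
daily_totals.write.mode("overwrite").saveAsTable("sales_daily_totals")
```

In the course itself, transformations like this are chained into pipelines and run on scheduled or event-based triggers, as described in the objectives and modules below.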

Objectives

Upon completing the course, participants will be able to:

  • Configure Fabric workspaces, including Spark, OneLake, and Data Workflows
  • Implement version control, CI/CD pipelines, and database project deployments
  • Apply security controls, such as row-level, column-level, and file-level access, sensitivity labels, and audit logs
  • Design and implement large-scale and streaming data ingestion and transformation using SQL, PySpark, or KQL
  • Orchestrate data flows with pipelines, notebooks, and scheduled or event-based triggers
  • Set up monitoring and alerts, and optimize the performance of data solutions

To participate in this training, attendees must meet the following requirements:

  • Previous experience in data ingestion, transformation, and loading (ETL)
  • Advanced knowledge of SQL, PySpark, or KQL
  • Familiarity with data warehouse and lakehouse concepts, pipelines, and data security models

Offers

  • International Certification Fabric Data Engineer Associate (DP-700): Applies
  • Duration: 32 hours

Learning Methodology

The learning methodology, regardless of the modality (in-person or remote), is based on workshops and labs that build toward a project emulating real activities in a company.

The instructor (live), a professional with extensive experience in work environments related to the topics covered, acts as a workshop leader, guiding students' practice through knowledge transfer and applying the concepts of the syllabus to the project.

The methodology aims to ensure that the student does not merely memorize, but rather understands the concepts and how they are applied in a work environment.

As a result of this work, at the end of the training the student will have gained real experience and will be prepared for work: to pass an interview or a technical test, and to achieve higher scores on international certification exams.

Conditions to guarantee successful results:
  • a. An institution that enforces the model through organization, logistics, and strict control over the activities carried out by participants in each training session.
  • b. An instructor, located anywhere in the world, with the required in-depth knowledge, expertise, experience, and outstanding values, ensuring a very high level of knowledge transfer.
  • c. A committed student, with the space, time, and attention the training process requires, who is willing to focus on understanding how concepts are applied in a work environment rather than memorizing them just to pass an exam.

Course Modules

Module 1: Introduction to Microsoft Fabric and Its Architecture

  • Fundamentals of Microsoft Fabric: Lakehouse, Data Warehouse, OneLake.

  • Main services: Dataflows Gen2, Pipelines, Notebooks, Eventstreams.

  • Differences from other platforms (Synapse, Databricks, etc.).

  • Navigation through the portal and initial configuration of workspaces.

  • Importing data from on-premises and cloud sources.

  • Available connectors and dataflow configuration.

  • Incremental vs full ingestion.

  • Automation and data quality control.

  • Fundamentals of Eventstreams.

  • Configuration of inputs and outputs.

  • Streaming with Spark Structured Streaming and KQL.

  • Use cases with sensors, logs, and live applications.

  • Use of notebooks with PySpark and Spark SQL.

  • Transformations with dataflows: cleaning, combining, enriching.

  • Error handling and optimization of transformation tasks.

  • Differences between Lakehouse and Data Warehouse in Fabric.

  • Definition of tables, partitions, and storage formats (Delta, Parquet).

  • Creation of views, relationships, and tabular models.

  • Optimization of data schemas.

  • Creation of Pipelines in Fabric.

  • Use of scheduled and event-based triggers.

  • Invoking notebooks, dataflows, and external data sources.

  • Managing dependencies and errors in complex pipelines.

  • Configuration of roles and permissions (RBAC).

  • Application of row, column, and file access policies.

  • Labeling of sensitive data and use of Microsoft Purview.

  • Auditing and regulatory compliance.

  • Use of real-time metrics and activity logs.

  • Detection of bottlenecks and performance optimization.

  • Integration with Power BI for visual monitoring.

  • Automated alerts and notifications.

  • Versioning and code control with Git in Fabric.

  • CI/CD for data flows, notebooks, and reports.

  • Use of templates and deployment in multiple environments.

  • Introduction to Terraform, ARM templates, and PowerShell.

  • Strategies for answering scenario-based questions.

  • Practice with labs and simulations.

  • Review of real case studies.

  • Resource guide and self-assessment.