International AWS Certified Data Engineer - Associate Certification



40 hours
Official Certificate
Expert Instructors
Online Learning

The AWS Certified Data Engineer - Associate (DEA-C01) course is designed to build practical skills in designing, implementing, and optimizing data pipelines on AWS.

Following a hands-on learning methodology, participants will receive a cloud access package to apply the concepts through workshops, labs, and projects in real environments, using tools such as AWS Glue, Amazon Redshift, Kinesis, DynamoDB, SageMaker, and Amazon EMR. They will learn to design storage, processing, and data analysis solutions for Big Data, Business Intelligence, and Machine Learning environments.

This course is aimed at data engineers, data analysts, data scientists, and ETL professionals who wish to earn the AWS Certified Data Engineer - Associate certification and develop key competencies in the cloud.

At the end of the course, participants will be able to:

  • Understand AWS infrastructure and data services, such as Amazon S3, RDS, DynamoDB, Redshift, and Glue
  • Design and implement data pipelines, optimizing ingestion, processing, and storage
  • Apply data security and governance principles, ensuring regulatory compliance
  • Scale data solutions on AWS, implementing high-performance, low-cost architectures
  • Automate ETL processes on AWS using AWS Glue, Step Functions, and Lambda
  • Optimize performance in Big Data environments with Amazon EMR and Apache Spark
  • Implement data analysis solutions, integrating Amazon Athena, QuickSight, and Kinesis
  • Prepare for the AWS Certified Data Engineer - Associate certification, validating mastery of data engineering best practices on AWS

Courses

  1. International AWS Certified Cloud Practitioner certification

To participate in this training, attendees must meet the following requirements:

  • Completion of the AWS Academy Cloud Foundations course at Aula Matriz, or demonstrated equivalent experience with AWS.
  • Basic knowledge of databases and the SQL language.
  • Familiarity with data storage, processing, and analysis concepts.

These requirements ensure that participants can focus on applying knowledge acquired in real data environments.


Learning Methodology

The learning methodology, regardless of the modality (in-person or remote), is based on the development of workshops or labs that lead to the construction of a project, emulating real activities in a company.

The instructor (live), a professional with extensive experience in work environments related to the topics covered, acts as a workshop leader, guiding students' practice through knowledge transfer processes, applying the concepts of the proposed syllabus to the project.

The methodology is designed so that the student does not memorize, but rather understands the concepts and how they are applied in a work environment.

As a result of this work, at the end of the training the student will have gained real experience, will be prepared for work and to pass an interview, a technical test, and/or achieve higher scores on international certification exams.

Conditions to guarantee successful results:
  • a. An institution that requires the application of the model through organization, logistics, and strict control over the activities to be carried out by the participants in each training session.
  • b. An instructor located anywhere in the world, who has the required in-depth knowledge, expertise, experience, and outstanding values, ensuring a very high-level knowledge transfer.
  • c. A committed student, with the space, time, and attention required by the training process, and the willingness to focus on understanding how concepts are applied in a work environment, and not memorizing concepts just to take an exam.

Pre-enrollment

You do not need to pay to pre-enroll. By pre-enrolling, you reserve a spot in the group for this course or program. Our team will contact you to complete your enrollment.

Pre-enroll now

Infinity Payments

Make your payment quickly, safely and reliably


- For bank transfer payments, request the details by email at capacita@aulamatriz.edu.co.

- If you wish to finance your payment through our credit options
(Sufi, Cooperativa Unimos, or Fincomercio), click the following link:
See credit options.



Course Modules

Module I: Welcome to AWS Academy Data Engineering

  • Prerequisites and course objectives
  • Course overview

  • Data-Driven Decisions
  • The Data Pipeline: Infrastructure for Data-Driven Decisions
  • The Role of the Data Engineer in Data-Driven Organizations
  • Modern Data Strategies
  • Lab: Accessing and Analyzing Data Using Amazon S3
  • Knowledge Check

  • The five V's of data: volume, velocity, variety, veracity, and value
  • Volume and velocity
  • Variety of data types
  • Variety of data sources
  • Veracity and value
  • Activities to improve veracity and value
  • Activity: Planning your pipeline
  • Knowledge check

  • AWS Well-Architected Framework and Lenses
  • Activity: Using the Well-Architected Framework
  • The Evolution of Data Architectures
  • Modern Data Architecture on AWS
  • Modern Data Architecture Pipeline: Ingestion and Storage
  • Modern Data Architecture Pipeline: Processing and Consumption
  • Streaming Analytics Pipeline
  • Lab: Querying Data with Athena
  • Knowledge Check

  • Cloud Security Review 
  • Security of Analytical Workloads
  • Security of Machine Learning
  • Scaling: Overview
  • Creating Scalable Infrastructure
  • Creating Scalable Components
  • Knowledge Check

  • Comparison of ETL and ELT
  • Introduction to Data Manipulation
  • Data Discovery
  • Data Structuring
  • Data Cleaning
  • Data Enrichment
  • Data Validation
  • Data Publishing
  • Knowledge Check
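The preparation steps listed above (structuring, cleaning, enrichment, validation, publishing) can be sketched as a minimal pure-Python pipeline. The record fields and rules below are illustrative assumptions, not part of the course material:

```python
# Illustrative data-wrangling pipeline: structure -> clean -> enrich -> validate.
# Field names and rules are hypothetical examples, not course content.
raw_rows = [
    '2024-01-05,store-7, 19.90',
    '2024-01-05,store-7,',          # missing amount: dropped in cleaning
    '2024-01-06,store-3,42.50',
]

def structure(row):
    """Split a CSV line into a record (structuring step)."""
    date, store, amount = [field.strip() for field in row.split(',')]
    return {'date': date, 'store': store, 'amount': amount}

def clean(records):
    """Drop records with a missing amount (cleaning step)."""
    return [r for r in records if r['amount']]

def enrich(record):
    """Derive a numeric amount and a month column (enrichment step)."""
    r = dict(record)
    r['amount'] = float(r['amount'])
    r['month'] = r['date'][:7]
    return r

def validate(records):
    """Fail fast on impossible values (validation step)."""
    assert all(r['amount'] > 0 for r in records)
    return records

published = validate([enrich(r) for r in clean(structure(row) for row in raw_rows)])
print(len(published))         # 2 records survive cleaning
print(published[0]['month'])  # 2024-01
```

In a real pipeline each step would typically be an AWS Glue transform or a notebook cell, but the step ordering is the same.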

  • Comparison of batch and stream ingestion
  • Batch ingestion processing
  • Purpose-built ingestion tools
  • AWS Glue for batch ingestion processing
  • Scaling considerations for batch processing
  • Lab: Performing ETL on a dataset using AWS Glue
  • Kinesis for stream processing
  • Scaling considerations for stream processing
  • IoT data ingestion via stream
  • Knowledge check
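The batch-versus-stream trade-off above can be illustrated with a minimal micro-batching sketch, the same buffer-and-flush pattern that services like Kinesis Data Firehose apply with size and time thresholds. The batch size and event payloads are illustrative assumptions:

```python
# Minimal micro-batching sketch: buffer incoming events and flush full batches.
# BATCH_SIZE and the event shape are illustrative assumptions.
BATCH_SIZE = 3
flushed_batches = []
buffer = []

def ingest(event):
    """Buffer one event; flush the buffer once it reaches BATCH_SIZE."""
    buffer.append(event)
    if len(buffer) >= BATCH_SIZE:
        flushed_batches.append(list(buffer))
        buffer.clear()

for event_id in range(7):
    ingest({'id': event_id})

print(len(flushed_batches))   # 2 full batches were flushed
print(len(buffer))            # 1 event still waiting in the buffer
```

Real stream delivery services add a time threshold as well, so a partially full buffer is still flushed after a bounded delay.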

  • Storage in modern data architecture
  • Data lake storage
  • Data warehouse storage
  • Purpose-built databases
  • Storage to support the pipeline
  • Securing storage
  • Lab: Storing and analyzing data using Amazon Redshift
  • Knowledge check
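One storage practice covered here, Hive-style partitioning of a data lake, can be sketched in a few lines. The dataset and file names are hypothetical:

```python
# Hive-style partitioned key layout, as commonly used for data lake storage
# in Amazon S3 so engines like Athena can prune partitions.
# The dataset name and file name below are hypothetical.
from datetime import date

def partition_key(dataset: str, event_date: date, filename: str) -> str:
    """Build an S3 object key partitioned by year/month/day."""
    return (f"{dataset}/year={event_date.year}"
            f"/month={event_date.month:02d}"
            f"/day={event_date.day:02d}/{filename}")

key = partition_key('sales', date(2024, 1, 5), 'part-0000.parquet')
print(key)  # sales/year=2024/month=01/day=05/part-0000.parquet
```

Laying keys out this way lets a query that filters on the date columns skip every object outside the matching prefixes.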

  • Big data processing concepts
  • Apache Hadoop
  • Apache Spark
  • Amazon EMR
  • Managing your Amazon EMR clusters
  • Lab: Processing logs using Amazon EMR
  • Apache Hudi
  • Lab: In-place updates of dynamic data
  • Knowledge check
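The MapReduce pattern underlying Hadoop and Spark can be sketched with the standard library alone: map each log line to a (key, 1) pair, then reduce by key. The log lines are illustrative:

```python
# The MapReduce pattern behind Hadoop and Spark, sketched with stdlib Python.
# The log lines are illustrative sample data.
from collections import Counter

log_lines = [
    'GET /index.html 200',
    'GET /missing 404',
    'GET /index.html 200',
]

# Map phase: emit one (status_code, 1) pair per line.
pairs = ((line.split()[-1], 1) for line in log_lines)

# Reduce phase: sum counts per key (what reduceByKey does in Spark).
status_counts = Counter()
for status, count in pairs:
    status_counts[status] += count

print(status_counts['200'])  # 2
print(status_counts['404'])  # 1
```

On Amazon EMR the same logic runs distributed: the map phase is split across executors and the shuffle groups pairs by key before the reduce.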

  • ML Concepts
  • The ML Lifecycle
  • Framing the ML Problem to Achieve Business Objectives
  • Collecting Data
  • Applying Labels to Training Data with Known Objectives
  • Activity: Labeling with SageMaker Ground Truth
  • Data Preprocessing
  • Feature Engineering
  • Developing a Model
  • Deploying a Model
  • Machine Learning Infrastructure on AWS
  • Amazon SageMaker
  • Demonstration: Data Preparation and Model Training with SageMaker
  • Demonstration: Data Preparation and Model Training with SageMaker Canvas
  • AI/ML Services on AWS

  • Consider the factors that influence the selection of tools
  • Comparison of AWS tools and services
  • Demonstration: data analysis and visualization with AWS IoT Analytics and QuickSight
  • Selection of tools for a gaming analytics use case
  • Lab: Streaming data analysis and visualization with Kinesis Data Firehose, OpenSearch Service, and OpenSearch Dashboards
  • Knowledge check
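Amazon Athena runs standard SQL directly over files in S3; the same query style can be previewed locally with stdlib sqlite3. The table and rows below are illustrative:

```python
# Previewing Athena-style analytical SQL locally with sqlite3.
# The events table and its rows are illustrative sample data.
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE events (device TEXT, temp REAL)')
conn.executemany('INSERT INTO events VALUES (?, ?)',
                 [('sensor-1', 20.5), ('sensor-1', 21.5), ('sensor-2', 30.0)])

# Aggregate per device, as you would in an Athena console query.
rows = conn.execute(
    'SELECT device, AVG(temp) FROM events GROUP BY device ORDER BY device'
).fetchall()
print(rows)  # [('sensor-1', 21.0), ('sensor-2', 30.0)]
```

The difference in Athena is only where the data lives: the table is a schema projected over S3 objects rather than rows stored in a database file.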

  • Automation of infrastructure deployment
  • CI/CD
  • Automation with Step Functions
  • Lab: Building and orchestrating ETL pipelines using Athena and Step Functions
  • Knowledge check
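The orchestration idea in this module, running pipeline steps in order and retrying failures, is what Step Functions expresses declaratively with Task states and Retry rules. A minimal procedural sketch, with hypothetical step names:

```python
# Sequential orchestration with retries: the pattern Step Functions expresses
# declaratively with Task states and Retry blocks.
# The three step functions are hypothetical stand-ins for pipeline stages.
completed = []

def crawl_source():
    completed.append('crawl')

def run_etl_job():
    completed.append('etl')

def run_athena_query():
    completed.append('query')

def orchestrate(steps, max_attempts=2):
    """Execute each step in order, retrying up to max_attempts on failure."""
    for step in steps:
        for attempt in range(max_attempts):
            try:
                step()
                break
            except Exception:
                if attempt == max_attempts - 1:
                    raise

orchestrate([crawl_source, run_etl_job, run_athena_query])
print(completed)  # ['crawl', 'etl', 'query']
```

Step Functions adds what this sketch omits: persisted execution state, per-state timeouts, and branching, all defined in JSON rather than code.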

  • Overview of the AWS certification