AWS Certified Big Data – Specialty (BDS‑C00)

Advanced certification for professionals who design and operate Big Data architectures on AWS. Validates skills in large-scale data collection, ingestion, storage, processing, analysis, visualization, and security.

  • Duration: 24 hours
  • Official certificate
  • Expert instructors
  • Online learning

AWS Academy Member Institution

Pre-enrollment

You do not need to pay to pre-enroll. By pre-enrolling, you reserve a spot in the group for this course or program. Our team will contact you to complete your enrollment.

Infinity Payments

Make your payment quickly, safely, and reliably.

- For bank transfer payments, request the details by email at capacita@aulamatriz.edu.co.

- If you wish to finance your payment through one of our credit options (Sufi, Cooperativa Unimos, or Fincomercio), click the following link: See credit options.

Objectives

At the end of the course, participants will be able to:

  • Design scalable ingestion pipelines and architectures
  • Select storage solutions based on requirements
  • Implement batch and real-time processing
  • Perform data analysis and modeling with EMR, Redshift, and Athena
  • Visualize results with QuickSight or compatible tools
  • Ensure data integrity, encryption, and access control

Recommended but not required:

  • 2 years of experience with AWS and 5 years in data analytics
  • AWS Cloud Practitioner or an Associate-level certification (Solutions Architect, Developer, or SysOps)
  • The exam's primary objective is to validate in-depth knowledge of AWS services and experience handling large volumes of data in real-world environments

Offers

  • Certification exam applies: AWS Certified Big Data – Specialty (BDS‑C00)
  • Duration: 24 hours

Learning Methodology

The learning methodology, regardless of the modality (in-person or remote), is based on workshops and labs that build toward a project, emulating real activities in a company.

The instructor (live), a professional with extensive experience in work environments related to the topics covered, acts as a workshop leader, guiding students' practice through knowledge-transfer processes and applying the concepts of the proposed syllabus to the project.

The methodology seeks to ensure that the student does not memorize, but rather understands the concepts and how they are applied in a work environment.

As a result of this work, by the end of the training the student will have gained real experience and will be prepared to work, pass an interview or a technical test, and achieve higher scores on international certification exams.

Conditions to guarantee successful results:
  • a. An institution that enforces the model through organization, logistics, and strict control over the activities participants carry out in each training session.
  • b. An instructor, located anywhere in the world, with the required in-depth knowledge, expertise, experience, and outstanding values, ensuring high-quality knowledge transfer.
  • c. A committed student with the space, time, and attention the training process requires, willing to focus on understanding how concepts are applied in a work environment rather than memorizing them just to pass an exam.

Course Modules

Data Collection

Design and implement data ingestion pipelines (batch and streaming) using S3, Kinesis Data Streams/Firehose, Snowball, and AWS IoT. Evaluate the performance, durability, ordering, and structure of the data flow.
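
A minimal producer sketch for the streaming path, assuming boto3 and a hypothetical Kinesis stream named clickstream-events; the region, event shape, and partition-key choice are illustrative only.

    import json
    import boto3

    # Hypothetical producer pushing JSON events into a Kinesis data stream.
    # Stream name, region, and event shape are illustrative assumptions.
    kinesis = boto3.client("kinesis", region_name="us-east-1")

    def send_event(event, stream_name="clickstream-events"):
        kinesis.put_record(
            StreamName=stream_name,
            Data=json.dumps(event).encode("utf-8"),
            # Records sharing a partition key keep their order within one shard.
            PartitionKey=str(event["user_id"]),
        )

    send_event({"user_id": 42, "page": "/checkout", "ts": "2024-01-01T12:00:00Z"})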

Data Storage

Analyze access patterns and data structure in S3, Redshift, DynamoDB, and HDFS/EMR, optimizing file formats (Parquet, ORC) and storage choices for different workloads.
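
A short sketch of writing columnar data to the lake, assuming pandas with pyarrow and s3fs installed; the bucket and Hive-style partition path are hypothetical.

    import pandas as pd

    # Columnar formats such as Parquet reduce scan volume for analytical queries.
    # The bucket and partition path are illustrative assumptions; writing
    # directly to an s3:// path requires the s3fs package.
    df = pd.DataFrame({"user_id": [1, 2], "amount": [9.99, 24.50]})
    df.to_parquet(
        "s3://example-data-lake/sales/year=2024/part-0000.parquet",
        engine="pyarrow",
        compression="snappy",
    )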

Data Processing

Select and design processing architectures with EMR (Spark/Hadoop), Glue, Lambda, and Redshift Spectrum, evaluating batch and real-time operations.
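
A minimal PySpark batch job of the kind submitted to EMR; the bucket names and the ts/amount columns are assumptions for illustration.

    from pyspark.sql import SparkSession, functions as F

    # Batch aggregation job; on EMR the s3:// paths resolve through EMRFS.
    # Bucket names and column names are illustrative assumptions.
    spark = SparkSession.builder.appName("daily-revenue").getOrCreate()

    events = spark.read.parquet("s3://example-data-lake/sales/")
    daily = (
        events.groupBy(F.to_date("ts").alias("day"))
        .agg(F.sum("amount").alias("revenue"))
    )
    daily.write.mode("overwrite").parquet("s3://example-data-lake/aggregates/daily/")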

Data Analysis

Determine analysis tools: EMR, Athena, Redshift, and integrated Machine Learning. Design analytical solutions optimized for latency, performance, and cost.
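
A sketch of submitting an ad-hoc Athena query with boto3; the database, table, and results bucket are hypothetical.

    import boto3

    # Athena queries data in place on S3; results land in the output location.
    # Database, table, and output bucket are illustrative assumptions.
    athena = boto3.client("athena", region_name="us-east-1")

    response = athena.start_query_execution(
        QueryString="SELECT day, revenue FROM daily_revenue ORDER BY day DESC LIMIT 7",
        QueryExecutionContext={"Database": "analytics"},
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )
    # Poll get_query_execution with this id until the state is SUCCEEDED.
    print(response["QueryExecutionId"])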

Data Visualization

Design and develop dashboards and reports using QuickSight, Jupyter notebooks, and Redshift clients, optimizing performance, scalability, and user experience.

Data Security

Implement encryption (KMS, SSE-S3, SSE-KMS), access policies (IAM, Lake Formation), auditing (CloudTrail), data integrity, and regulatory compliance (GDPR/HIPAA).
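
A sketch of SSE-KMS applied at the object level with boto3; the bucket, object key, and KMS key alias are hypothetical.

    import boto3

    # Server-side encryption under a customer-managed KMS key (SSE-KMS).
    # Bucket name, object key, and KMS alias are illustrative assumptions.
    s3 = boto3.client("s3")

    with open("part-0000.parquet", "rb") as body:
        s3.put_object(
            Bucket="example-data-lake",
            Key="sales/year=2024/part-0000.parquet",
            Body=body,
            ServerSideEncryption="aws:kms",
            SSEKMSKeyId="alias/data-lake-key",
        )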

End-to-End Architecture

Integrate the previous components into a complete architecture: ingestion → storage → processing → analysis → visualization, with monitoring and scalability.

Case Studies

Study real-world scenarios such as IoT pipelines, clickstream analysis, massive ETL with EMR, and query optimization/data distribution.

Hands-On Labs

Hands-on deployment of Big Data architectures on AWS: Kinesis → S3 → EMR/Glue → Redshift → QuickSight, controlling performance and cost metrics.
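
For the Kinesis → S3 leg of the lab pipeline, a Firehose producer sketch; the delivery stream name is an assumption and the stream must already exist with an S3 destination.

    import json
    import boto3

    # Firehose buffers records and delivers them to S3 with no consumer code.
    # The delivery stream (assumed to target S3) is an illustrative assumption.
    firehose = boto3.client("firehose", region_name="us-east-1")

    firehose.put_record(
        DeliveryStreamName="clickstream-to-s3",
        # Newline-delimited JSON keeps the delivered S3 objects query-friendly.
        Record={"Data": (json.dumps({"user_id": 42, "page": "/home"}) + "\n").encode("utf-8")},
    )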

Exam Preparation

Exam-style exercises, multiple-choice questions, and case analysis, plus strategies for question interpretation and time management.

Final Review

Review of official whitepapers (Big Data Analytics Options on AWS, Data Lakes, security), key resources, a personalized study plan, and a pre-exam checklist.