Big Data with Cloud Computing and AWS Architecture
Big Data on AWS introduces you to cloud-based big data solutions and Amazon Elastic MapReduce (EMR), the AWS big data platform. In this course, we show you how to use Amazon EMR to process data using the broad ecosystem of Hadoop tools like Pig and Hive. We also teach you how to create big data environments, work with Amazon DynamoDB, Amazon Redshift, and Amazon Kinesis, and leverage best practices to design big data environments for security and cost-effectiveness.
This course is designed to teach you how to:
Understand Apache Hadoop in the context of Amazon EMR
Understand the architecture of an Amazon EMR cluster
Launch an Amazon EMR cluster using an appropriate Amazon Machine Image and Amazon EC2 instance types
Choose appropriate AWS data storage options for use with Amazon EMR
Know your options for ingesting, transferring, and compressing data for use with Amazon EMR
Use common programming frameworks available for Amazon EMR including Hive, Pig, and Streaming
Work with Amazon Redshift to implement a big data solution
Leverage big data visualization software
Choose appropriate security options for Amazon EMR and your data
Perform in-memory data analysis with Spark and Shark on Amazon EMR
Choose appropriate options to manage your Amazon EMR environment cost-effectively
Understand the benefits of using Amazon Kinesis for big data
We recommend that attendees of this course have:
Basic familiarity with big data technologies, including Apache Hadoop and HDFS
Knowledge of big data technologies such as Pig, Hive, and MapReduce is helpful but not required
Working knowledge of core AWS services and public cloud implementation
Students should complete the AWS Essentials course or have equivalent experience
Basic understanding of data warehousing, relational database systems, and database design
This course will be delivered through:
Instructor-Led Training (ILT)
Note: the course outline may vary slightly based on the regional location and/or language in which the class is delivered.
This course will cover the following concepts on each day:
1. Introduction to Cloud Computing
Learning Objectives - In this module, you will learn what Cloud Computing is, the different models of Cloud Computing, and the key differentiators between those models. We will also introduce you to the virtual world of AWS along with key AWS vocabulary, services, and concepts.
Topics - Introduction to Cloud Computing, AWS Architecture, AWS Management Console, Setting Up an AWS Account.
2. Amazon EC2 and Amazon EBS
Learning Objectives - Introduction to the AWS compute offering, EC2. We will cover the different instance types and Amazon AMIs, with a demo of launching an AWS EC2 instance, connecting to the instance, and hosting a website on it. We will also cover the EBS storage architecture (AWS persistent storage) and the concepts of AMIs and snapshots.
Topics - Amazon EC2, Amazon EBS, Demo of AMI Creation, Backup and Restore, EC2 Services, and EBS Persistent Storage.
3. Introduction to Big Data on AWS
Topics - Overview of Big Data, Apache Hadoop, and the Benefits of Amazon EMR; Amazon EMR Architecture; Using Amazon EMR; Launching and Using an Amazon EMR Cluster; Hadoop Programming Frameworks.
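As a taste of the Streaming framework listed above, here is a minimal word-count mapper and reducer sketched in Python. This is illustrative only: on an actual EMR cluster these would be two separate scripts (e.g. mapper.py and reducer.py) passed to the hadoop-streaming jar and wired together by Hadoop's shuffle/sort phase, which we simulate here with a plain `sorted()` call.

```python
"""Word-count mapper and reducer in the Hadoop Streaming style (sketch only)."""
from itertools import groupby


def mapper(lines):
    """Emit (word, 1) pairs, one per word, as a Streaming mapper would via stdout."""
    for line in lines:
        for word in line.strip().lower().split():
            yield word, 1


def reducer(pairs):
    """Sum counts per word; input must be sorted by key, as Hadoop guarantees."""
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)


if __name__ == "__main__":
    # Simulate the map -> shuffle/sort -> reduce pipeline on sample input.
    sample = ["big data on aws", "big data with emr"]
    for word, count in reducer(sorted(mapper(sample))):
        print(f"{word}\t{count}")
```

The same logic, split into two stdin/stdout scripts, is what the course exercises run across a cluster; the point of Streaming is that any language able to read and write text streams can serve as a mapper or reducer.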
4. Advanced Big Data Applications on AWS
Topics - Using Hive for Advertising Analytics; Using Streaming for Life Sciences Analytics; Overview: Spark and Shark for In-Memory Analytics; Using Spark and Shark for In-Memory Analytics; Managing Amazon EMR Costs; Overview of Amazon EMR Security; Data Ingestion, Transfer, and Compression; Using Amazon Kinesis for Real-Time Big Data Processing.
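On the ingestion and compression topic above, a quick local sketch shows why compressing data before transfer to S3 or EMR matters. The log payload below is hypothetical; note also that plain gzip is not splittable by Hadoop (one .gz file becomes one map task), which is why the course discusses the trade-offs between formats.

```python
"""Sketch: raw vs. gzip size for a repetitive, log-like payload."""
import gzip

# A repetitive access-log sample; real logs compress similarly well.
log_lines = b"GET /index.html 200 0.013s\n" * 10_000

compressed = gzip.compress(log_lines)
ratio = len(compressed) / len(log_lines)

print(f"raw: {len(log_lines)} bytes, gzip: {len(compressed)} bytes "
      f"({ratio:.1%} of original)")
```

Smaller objects mean lower S3 storage cost and faster transfer into the cluster, at the price of CPU spent compressing and decompressing.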
5. Big Data Applications on AWS
Topics - Using Amazon Kinesis and Amazon EMR to Stream and Process Big Data; AWS Data Storage Options; Using DynamoDB with Amazon EMR; Overview: Amazon Redshift and Big Data; Using Amazon Redshift for Big Data; Visualizing and Orchestrating Big Data; Using Tableau Desktop or Jaspersoft BI to Visualize Big Data.
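To make the Kinesis streaming topic above concrete: Kinesis routes each record to a shard by taking the MD5 hash of its partition key and finding the shard whose hash-key range contains the result. The four-shard stream below is hypothetical, and the evenly split ranges are a simplification of what the service configures per stream.

```python
"""Sketch: how Amazon Kinesis maps partition keys to shards (simplified)."""
import hashlib

NUM_SHARDS = 4            # hypothetical stream with four shards
HASH_SPACE = 2 ** 128     # size of the MD5 output space


def shard_for(partition_key: str, num_shards: int = NUM_SHARDS) -> int:
    """Map a partition key to a shard index via MD5, Kinesis-style."""
    digest = int(hashlib.md5(partition_key.encode()).hexdigest(), 16)
    # Which of the evenly split hash-key ranges does the digest fall into?
    return digest * num_shards // HASH_SPACE


if __name__ == "__main__":
    for key in ("user-1", "user-2", "user-3"):
        print(key, "-> shard", shard_for(key))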
6. Amazon Storage Services: S3, RRS, and Glacier
Learning Objectives - AWS provides various kinds of scalable storage services. In this module, we will cover the different storage services (S3, RRS, and Glacier) and learn how to host a static website on AWS. This session also covers monitoring AWS resources and setting up alerts and notifications for usage and billing.
Topics - AWS Storage Services: S3, RRS & Glacier; Amazon CloudWatch; Alerts and Notifications.
7. Scaling and Load Distribution in AWS
Learning Objectives - This is one of the key modules of the course. You will learn about scaling and load distribution techniques in AWS. This session also includes a demo of load distribution and of scaling your resources horizontally based on time or activity.
Topics - Amazon Scaling Service: ELB and Auto Scaling.
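The decision logic behind a simple step-scaling policy of the kind this module demos can be sketched in a few lines. This is illustrative only: real Auto Scaling is configured through CloudWatch alarms and scaling policies rather than hand-written code, and the thresholds, step sizes, and group bounds below are hypothetical.

```python
"""Sketch: the decision rule inside a simple Auto Scaling step policy."""


def desired_capacity(current: int, avg_cpu: float,
                     min_size: int = 2, max_size: int = 10) -> int:
    """Scale out above 70% average CPU, scale in below 30%, else hold."""
    if avg_cpu > 70.0:
        target = current + 1          # scale out by one instance
    elif avg_cpu < 30.0:
        target = current - 1          # scale in by one instance
    else:
        target = current              # within the healthy band
    return max(min_size, min(max_size, target))  # clamp to group bounds


if __name__ == "__main__":
    for cpu in (85.0, 50.0, 12.0):
        print(f"avg CPU {cpu}% -> desired capacity {desired_capacity(4, cpu)}")
```

The clamp at the end mirrors an Auto Scaling group's min/max size settings, which keep a runaway metric from scaling a fleet to zero or to an unbounded cost.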
8. Identity and Access Management (IAM) Techniques
Learning Objectives - In this module, you will learn how to distribute access control in AWS using IAM. We will discuss AWS's managed relational database service, RDS, and will also cover the AWS NoSQL service, DynamoDB.
Topics - Amazon IAM Overview, Amazon RDS, Amazon DynamoDB.
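Access control in IAM is expressed as JSON policy documents attached to users, groups, or roles. The sketch below builds one such document granting read-only access to a single S3 bucket; the bucket name is hypothetical, and in practice the JSON would be attached through the IAM console or API rather than printed.

```python
"""Sketch: an IAM policy document granting read-only access to one S3 bucket."""
import json

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::example-logs-bucket",    # the bucket itself
                "arn:aws:s3:::example-logs-bucket/*",  # the objects in it
            ],
        }
    ],
}

print(json.dumps(policy, indent=2))
```

Note the two ARNs: `s3:ListBucket` applies to the bucket resource, while `s3:GetObject` applies to the objects inside it, so both are needed for useful read-only access.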
9. Multiple AWS Services and Managing the Resource Lifecycle
Learning Objectives - This module provides an overview of multiple AWS services. We will talk about how to manage the lifecycle of AWS resources and follow the DevOps model in AWS. We will also cover the AWS notification and email services, along with the content distribution service, in this module.
Topics - AWS CloudFront, AWS Import/Export, and an overview of AWS products such as Elastic Beanstalk, CloudFormation, AWS OpsWorks, SNS, and SES.
10. AWS Architecture and Design
Learning Objectives - This module covers various architecture and design aspects of AWS. We will also cover cost planning and optimization techniques, along with AWS security best practices, High Availability (HA), and Disaster Recovery (DR) in AWS.
Topics - AWS Backup and DR Setup, AWS High Availability Design, AWS Best Practices (Cost + Security), AWS Calculator, and Consolidated Billing.