Cloudera Certified Developer for Hadoop (CCDH)

Exam Code: CCD-410

Cloudera Certified Developer for Apache Hadoop Exam:
  • Number of Questions: 50-55 live questions
  • Item Types: multiple-choice and short-answer questions
  • Exam time: 90 minutes
  • Passing score: 70%
  • Price: $295 USD

Syllabus: Cloudera Developer Certification Exam

Infrastructure Objectives 25%
  • Recognize and identify Apache Hadoop daemons and how they function both in data storage and processing.
  • Understand how Apache Hadoop exploits data locality.
  • Identify the role and use of both MapReduce v1 (MRv1) and MapReduce v2 (MRv2 / YARN) daemons.
  • Analyze the benefits and challenges of the HDFS architecture.
  • Analyze how HDFS implements file sizes, block sizes, and block abstraction.
  • Understand default replication values and storage requirements for replication.
  • Determine how HDFS stores, reads, and writes files (a minimal read example follows this list).
  • Identify the role of Apache Hadoop Classes, Interfaces, and Methods.
  • Understand how Hadoop Streaming might apply to a job workflow.
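
One objective above concerns how HDFS stores, reads, and writes files. The short Java sketch below reads a file through the FileSystem API; it assumes a standard Hadoop client configuration is on the classpath, and the path /data/sample.txt is a hypothetical example, not part of the syllabus.

  import java.io.BufferedReader;
  import java.io.InputStreamReader;

  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.fs.FileSystem;
  import org.apache.hadoop.fs.Path;

  // Minimal sketch: reading a file from HDFS through the FileSystem API.
  public class HdfsRead {
      public static void main(String[] args) throws Exception {
          Configuration conf = new Configuration();    // picks up core-site.xml / hdfs-site.xml
          FileSystem fs = FileSystem.get(conf);        // the default file system, normally HDFS
          Path file = new Path("/data/sample.txt");    // hypothetical input file
          // The client asks the NameNode for block locations, then streams each
          // block directly from a DataNode that holds a replica.
          try (BufferedReader reader =
                   new BufferedReader(new InputStreamReader(fs.open(file)))) {
              String line;
              while ((line = reader.readLine()) != null) {
                  System.out.println(line);
              }
          }
      }
  }
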
Data Management Objectives 30%
  • Import a database table into Hive using Sqoop.
  • Create a table using Hive (during Sqoop import).
  • Given a MapReduce job, determine the lifecycle of a Mapper and the lifecycle of a Reducer.
  • Analyze and determine the relationship of input keys to output keys in terms of both type and number, the sorting of keys, and the sorting of values.
  • Given sample input data, identify the number, type, and value of emitted keys and values from the Mappers as well as the emitted data from each Reducer and the number and contents of the output file(s).
  • Understand the implementation, limitations, and strategies for joining datasets in MapReduce.
  • Understand how partitioners and combiners function, and recognize appropriate use cases for each.
  • Recognize the processes and role of the sort and shuffle process.
  • Understand common key and value types in the MapReduce framework and the interfaces they implement.
  • Use key and value types to write functional MapReduce jobs (see the Mapper/Reducer sketch after this list).
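
As a concrete instance of the key and value objectives above, the sketch below is a word-count Mapper and Reducer built on common Writable types (LongWritable, Text, IntWritable). The class names are illustrative only.

  import java.io.IOException;

  import org.apache.hadoop.io.IntWritable;
  import org.apache.hadoop.io.LongWritable;
  import org.apache.hadoop.io.Text;
  import org.apache.hadoop.mapreduce.Mapper;
  import org.apache.hadoop.mapreduce.Reducer;

  // Minimal sketch: word count with explicit key/value types.
  public class WordCount {

      // Input: (byte offset, line of text); output: (word, 1).
      public static class TokenMapper
              extends Mapper<LongWritable, Text, Text, IntWritable> {
          private static final IntWritable ONE = new IntWritable(1);
          private final Text word = new Text();

          @Override
          protected void map(LongWritable offset, Text line, Context context)
                  throws IOException, InterruptedException {
              for (String token : line.toString().split("\\s+")) {
                  if (!token.isEmpty()) {
                      word.set(token);
                      context.write(word, ONE);
                  }
              }
          }
      }

      // All values for one key reach a single reduce() call, already grouped by key.
      public static class SumReducer
              extends Reducer<Text, IntWritable, Text, IntWritable> {
          @Override
          protected void reduce(Text word, Iterable<IntWritable> counts, Context context)
                  throws IOException, InterruptedException {
              int sum = 0;
              for (IntWritable c : counts) {
                  sum += c.get();
              }
              context.write(word, new IntWritable(sum));
          }
      }
  }
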
Job Mechanics Objectives 25%
  • Construct proper job configuration parameters and the commands used in job submission.
  • Analyze a MapReduce job and determine how input and output data paths are handled.
  • Given a sample job, analyze and determine the correct InputFormat and OutputFormat to select based on job requirements.
  • Analyze the order of operations in a MapReduce job.
  • Understand the role of the RecordReader, and of sequence files and compression.
  • Use the distributed cache to distribute data to MapReduce job tasks (the driver sketch after this list shows one way).
  • Build and orchestrate a workflow with Oozie.
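
The driver sketch below pulls these job mechanics together: configuration, input and output paths, explicit InputFormat/OutputFormat choices, a distributed-cache file, and submission. It reuses the WordCount classes sketched earlier; the command-line paths and the lookup.txt cache file are hypothetical.

  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.fs.Path;
  import org.apache.hadoop.io.IntWritable;
  import org.apache.hadoop.io.Text;
  import org.apache.hadoop.mapreduce.Job;
  import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
  import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
  import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
  import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;

  // Minimal sketch: configure and submit a MapReduce job.
  public class WordCountDriver {
      public static void main(String[] args) throws Exception {
          Configuration conf = new Configuration();
          Job job = Job.getInstance(conf, "word count");
          job.setJarByClass(WordCountDriver.class);

          job.setMapperClass(WordCount.TokenMapper.class);
          job.setCombinerClass(WordCount.SumReducer.class);   // safe here: summing is associative
          job.setReducerClass(WordCount.SumReducer.class);

          job.setOutputKeyClass(Text.class);
          job.setOutputValueClass(IntWritable.class);

          job.setInputFormatClass(TextInputFormat.class);     // line-oriented text input
          job.setOutputFormatClass(TextOutputFormat.class);   // tab-separated key/value output

          FileInputFormat.addInputPath(job, new Path(args[0]));    // input directory in HDFS
          FileOutputFormat.setOutputPath(job, new Path(args[1]));  // must not already exist

          // Distributed cache: ship a small side file to every task
          // (older MRv1 code uses the DistributedCache class instead).
          job.addCacheFile(new Path("/user/foo/lookup.txt").toUri());

          System.exit(job.waitForCompletion(true) ? 0 : 1);   // submit and wait
      }
  }
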
Querying Objectives 20%
  • Write a MapReduce job to implement a HiveQL statement.
  • Write a MapReduce job to query data stored in HDFS (a map-only filter sketch follows this list).
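
As a rough instance of both querying objectives, the map-only Mapper below filters records stored in HDFS, much like the HiveQL statement SELECT * FROM logs WHERE line LIKE '%ERROR%'. The logs table, the ERROR condition, and the class name are hypothetical, and the driver would set the number of reduce tasks to zero.

  import java.io.IOException;

  import org.apache.hadoop.io.LongWritable;
  import org.apache.hadoop.io.NullWritable;
  import org.apache.hadoop.io.Text;
  import org.apache.hadoop.mapreduce.Mapper;

  // Minimal sketch: a map-only filter, the MapReduce analogue of a WHERE clause.
  public class ErrorFilterMapper
          extends Mapper<LongWritable, Text, Text, NullWritable> {
      @Override
      protected void map(LongWritable offset, Text line, Context context)
              throws IOException, InterruptedException {
          // Emit only records that satisfy the condition; with
          // job.setNumReduceTasks(0) the map output is the final output.
          if (line.toString().contains("ERROR")) {
              context.write(line, NullWritable.get());
          }
      }
  }
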

Drop us a query

Contact us: +919958433899 / +918252167867

Course Features

Real-Life Case Studies
Assignments
Lifetime Access
Expert Support
Global Certification
Job Portal Access