The Ultimate Hands-On Hadoop – Tame your Big Data is a Udemy course that introduces you to Hadoop tools and working with big data. The world of Hadoop and big data can be scary and confusing, given the vastness of hundreds of new technologies with cryptic codenames. Today, almost all major companies, including Amazon, Google, Facebook, Twitter, and IBM, use Hadoop in some form, so gaining skills in this ecosystem can greatly improve your career prospects. With the help of this course, you will not only learn about Hadoop tools and how they relate to each other, but also how to use them to solve real business problems.
This course teaches you over 25 different big-data technologies and introduces you to the complete set of Hadoop tools. Topics covered include installing and working with the Hortonworks Data Platform and the Ambari UI, managing large batches of data with HDFS and MapReduce, writing applications to analyze data on Hadoop using Pig and Spark, storing and querying data with Hive and HBase, designing real-world systems using the Hadoop ecosystem, managing clusters with YARN and Mesos, and relaying streaming data with Kafka and Flume.
What you will learn in this course:
- Design distributed systems to manage large volumes of data with Hadoop
- Use HDFS and MapReduce to store and analyze data
- Use Pig and Spark to build scripts that process information
- Analyze relational data with Hive and MySQL
- Analyze non-relational data with HBase, Cassandra, and MongoDB
- Query data with Drill, Phoenix, and Presto
- Choose a suitable data store for your applications
- Publish data to your Hadoop cluster with Kafka, Sqoop, and Flume
- Process streaming data with Spark Streaming, Flink, and Storm
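To give a flavor of the MapReduce paradigm covered above, here is a minimal word-count sketch in plain Python (no Hadoop cluster required). It is a hypothetical illustration of the map, shuffle/sort, and reduce phases, not code from the course itself:

```python
from itertools import groupby
from operator import itemgetter

def map_phase(lines):
    """Map: emit a (word, 1) pair for every word in every input line."""
    for line in lines:
        for word in line.lower().split():
            yield (word, 1)

def reduce_phase(pairs):
    """Shuffle/sort by key, then reduce: sum the counts for each word."""
    for word, group in groupby(sorted(pairs, key=itemgetter(0)), key=itemgetter(0)):
        yield (word, sum(count for _, count in group))

lines = ["big data big tools", "tame your big data"]
print(dict(reduce_phase(map_phase(lines))))
# {'big': 3, 'data': 2, 'tame': 1, 'tools': 1, 'your': 1}
```

On a real cluster, the map and reduce functions run in parallel across many machines, and HDFS supplies the input splits; the logic, however, is exactly this shape.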
Course details for The Ultimate Hands-On Hadoop – Tame your Big Data:
- Language : English
- Duration: 14:37:46
- Number of lessons: 100
- Training level : Intermediate
- Instructor : Frank Kane
- File format : mp4
Course outline:
100 lectures 14:37:46
Learn all the buzzwords! And install the Hortonworks Data Platform Sandbox.
7 lectures 50:24
Using Hadoop's Core: HDFS and MapReduce
11 lectures 01:34:29
Programming Hadoop with Pig
7 lectures 56:08
Programming Hadoop with Spark
8 lectures 01:14:07
Using relational data stores with Hadoop
9 lectures 01:03:03
Using non-relational data stores with Hadoop
13 lectures 02:28:27
Querying your Data Interactively
9 lectures 01:21:54
Managing your Cluster
13 lectures 01:59:14
Feeding Data to your Cluster
6 lectures 54:47
Analyzing Streams of Data
8 lectures 01:16:28
Designing Real-World Systems
7 lectures 52:35
2 lectures 06:10
Requirements for The Ultimate Hands-On Hadoop – Tame your Big Data:
- You will need access to a PC running 64-bit Windows, macOS, or Linux with an Internet connection if you want to participate in the hands-on activities and exercises. Your system should have at least 8GB of free RAM; 10GB or more is recommended. If your PC does not meet these requirements, you can still follow along with the course without doing the hands-on activities.
- Some activities require prior programming experience, preferably in Python or Scala.
- A basic familiarity with the Linux command line will be very helpful.
Sample video: The Ultimate Hands-On Hadoop
After extracting the files, watch them with your preferred video player.
The 2020/4 version adds 4 lessons and about 7 minutes of total running time compared to the 2019/5 version.
Password file(s): www.downloadly.ir