

Flume and Sqoop for Ingesting Big Data

Brand: Shop Hacker

Product Type: Digital Course

Regular price: $14.00 Sale price: $9.99



Course Description:


Hours of Content: 2.5

Taught by a team that includes two Stanford-educated ex-Googlers, with decades of combined practical experience working with Java and with billions of rows of data.

Use Flume and Sqoop to import data into HDFS, HBase, and Hive from a variety of sources, including Twitter and MySQL.

Let’s parse that.

Import data: Flume and Sqoop play a special role in the Hadoop ecosystem. They transport data from sources that hold or produce it, like local file systems, HTTP, MySQL, and Twitter, to data stores like HDFS, HBase, and Hive. Both tools come with built-in functionality and shield users from the complexity of moving data between these systems.

Flume: Flume Agents can transport data produced by a streaming application to data stores like HDFS and HBase.
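As a sketch of what a Flume Agent looks like in practice, here is a minimal configuration wiring a source to an HDFS sink through a channel. The agent and component names (`agent1`, `src1`, `ch1`, `sink1`) and the HDFS path are illustrative placeholders, not values from the course.

```
# Minimal Flume agent: netcat source -> memory channel -> HDFS sink.
# All names and paths below are illustrative.
agent1.sources = src1
agent1.channels = ch1
agent1.sinks = sink1

# Source: listen for newline-separated events on a TCP port
agent1.sources.src1.type = netcat
agent1.sources.src1.bind = localhost
agent1.sources.src1.port = 44444
agent1.sources.src1.channels = ch1

# Channel: buffer events in memory between source and sink
agent1.channels.ch1.type = memory
agent1.channels.ch1.capacity = 1000

# Sink: write events out to HDFS
agent1.sinks.sink1.type = hdfs
agent1.sinks.sink1.hdfs.path = hdfs://localhost:9000/flume/events
agent1.sinks.sink1.channel = ch1
```

An agent defined this way is started with `flume-ng agent --conf-file agent1.conf --name agent1`.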

Sqoop: Use Sqoop to bulk import data from traditional RDBMS to Hadoop storage architectures like HDFS or Hive.
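A typical Sqoop bulk import is a single command; the sketch below imports a MySQL table into HDFS. The connection string, credentials, table name, and target directory are all placeholders.

```
# Bulk-import every row of the MySQL table `orders` into HDFS as files.
# Connection string, username, table, and paths are placeholders.
sqoop import \
  --connect jdbc:mysql://localhost/shopdb \
  --username dbuser -P \
  --table orders \
  --target-dir /user/hadoop/orders \
  --num-mappers 4
```

Under the hood, Sqoop runs the import as parallel map tasks (here four, via `--num-mappers`), each pulling a slice of the table.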

What's Covered:

Practical implementations for a variety of sources and data stores:

  • Sources: Twitter, MySQL, Spooling Directory, HTTP
  • Sinks: HDFS, HBase, Hive

Flume features:

Flume Agents, Flume Events, Event bucketing, Channel selectors, Interceptors
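To illustrate two of those features together: event bucketing in Flume's HDFS sink is driven by escape sequences in the sink path, which in turn rely on a timestamp header that an interceptor can attach. The agent and component names below are illustrative placeholders.

```
# Attach a timestamp header to each event with the timestamp interceptor
agent1.sources.src1.interceptors = ts
agent1.sources.src1.interceptors.ts.type = timestamp

# Bucket events into per-day/per-hour HDFS directories using
# the %Y, %m, %d, %H escape sequences resolved from that header
agent1.sinks.sink1.type = hdfs
agent1.sinks.sink1.hdfs.path = /flume/events/%Y-%m-%d/%H
agent1.sinks.sink1.channel = ch1
```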

Sqoop features:

Sqoop import from MySQL, Incremental imports using Sqoop Jobs
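Incremental imports are typically packaged as saved Sqoop Jobs, so that Sqoop remembers the high-water mark between runs. A sketch, with all connection details and names as placeholders:

```
# Save an incremental-append import as a reusable Sqoop job.
# Sqoop records the last seen value of `id` after each run,
# so re-executing the job imports only rows added since.
sqoop job --create orders_sync -- import \
  --connect jdbc:mysql://localhost/shopdb \
  --username dbuser -P \
  --table orders \
  --incremental append \
  --check-column id \
  --last-value 0

# Run (and later re-run) the saved job:
sqoop job --exec orders_sync
```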

Using discussion forums

What are the requirements?

  • Knowledge of HDFS is a prerequisite for the course
  • HBase and Hive examples assume basic understanding of HBase and Hive shells
  • A working HDFS installation is required to run most of the examples

What am I going to get from this course?

  • Use Flume to ingest data to HDFS and HBase
  • Use Sqoop to import data from MySQL to HDFS and Hive
  • Ingest data from a variety of sources including HTTP, Twitter, and MySQL

What is the target audience?

  • Yep! Engineers building an application with HDFS/HBase/Hive as the data store
  • Yep! Engineers who want to port data from legacy data stores to HDFS
