Kafka to Hadoop Ingest with Parsing, Dedup and other Big Data Transformations

TRANSCRIPT

Page 1: Kafka to Hadoop Ingest with Parsing, Dedup and other Big Data Transformations

© 2016 DataTorrent

Chaitanya Chebolu
Committer, Apache Apex
Engineer, DataTorrent

Sep 14, 2016

Data Ingestion - Kafka ETL

Page 2: Kafka to Hadoop Ingest with Parsing, Dedup and other Big Data Transformations

Agenda

• Introduction to Apache Apex (Architecture, Application, Native Hadoop Integration)
• What is Data Ingestion
• Use Case: Kafka ETL
• Brief about Kafka
• Kafka ETL App
• Kafka ETL Demo

Page 3: Kafka to Hadoop Ingest with Parsing, Dedup and other Big Data Transformations

Apache Apex

• Platform and runtime engine that enables development of scalable and fault-tolerant distributed applications
• Hadoop native (Hadoop >= 2.2)
  ᵒ No separate service to manage stream processing
  ᵒ Streaming Engine built into the Application Master and Containers
• Process streaming or batch big data
• High throughput and low latency
• Library of commonly needed business logic
• Write any custom business logic in your application
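
The last point is easiest to see in code. Below is a minimal sketch of a custom operator, assuming the standard Apex operator API (BaseOperator, DefaultInputPort, DefaultOutputPort); the class name, port names, and the upper-casing logic are made up for illustration.

import com.datatorrent.api.DefaultInputPort;
import com.datatorrent.api.DefaultOutputPort;
import com.datatorrent.common.util.BaseOperator;

// Hypothetical custom operator: upper-cases each incoming line.
public class UpperCaseOperator extends BaseOperator
{
  // Output port: emits transformed tuples to the downstream operator.
  public final transient DefaultOutputPort<String> output = new DefaultOutputPort<>();

  // Input port: process() is invoked once per incoming tuple.
  public final transient DefaultInputPort<String> input = new DefaultInputPort<String>()
  {
    @Override
    public void process(String line)
    {
      // Custom business logic goes here.
      output.emit(line.toUpperCase());
    }
  };
}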

Page 4: Kafka to Hadoop Ingest with Parsing, Dedup and other Big Data Transformations

Apex Architecture

Page 5: Kafka to Hadoop Ingest with Parsing, Dedup and other Big Data Transformations

An Apex Application is a DAG (Directed Acyclic Graph)

● A DAG is composed of vertices (Operators) and edges (Streams)
● A Stream is a sequence of data tuples which connects operators at end-points called Ports
● An Operator takes one or more input streams, performs computations, and emits one or more output streams
● Each operator is the user's business logic, or a built-in operator from our open source library
● An operator may have multiple instances that run in parallel
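
As a concrete sketch of this model, the application below wires two operators into a DAG. The operator classes (LineByLineFileInputOperator, ConsoleOutputOperator) are taken from the Apex Malhar library, but their exact package locations can vary by release, so treat this as an illustrative assumption rather than a verbatim recipe.

import org.apache.hadoop.conf.Configuration;

import com.datatorrent.api.DAG;
import com.datatorrent.api.StreamingApplication;
import com.datatorrent.api.annotation.ApplicationAnnotation;
import com.datatorrent.lib.io.ConsoleOutputOperator;
import com.datatorrent.lib.io.fs.LineByLineFileInputOperator;

@ApplicationAnnotation(name = "SimpleDagDemo")
public class SimpleDagDemo implements StreamingApplication
{
  @Override
  public void populateDAG(DAG dag, Configuration conf)
  {
    // Vertices (operators)
    LineByLineFileInputOperator reader = dag.addOperator("reader", new LineByLineFileInputOperator());
    ConsoleOutputOperator console = dag.addOperator("console", new ConsoleOutputOperator());

    // Edge (stream): connects the reader's output port to the console's input port
    dag.addStream("lines", reader.output, console.input);
  }
}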

Page 6: Kafka to Hadoop Ingest with Parsing, Dedup and other Big Data Transformations

Apex - Native Hadoop Integration

• YARN is the resource manager

• HDFS used for storing any persistent state

Page 7: Kafka to Hadoop Ingest with Parsing, Dedup and other Big Data Transformations

What is Data Ingestion?

• Data Ingestion
  ᵒ A process of obtaining, importing, and analyzing data for later use or storage in a database
• Big Data Ingestion
  ᵒ Discovering the data sources
  ᵒ Importing the data
  ᵒ Processing data to produce intermediate data
  ᵒ Sending data out to durable data stores

Page 8: Kafka to Hadoop Ingest with Parsing, Dedup and other Big Data Transformations

Use Case: Kafka ETL

• Consuming data from Kafka
• Processing data to produce intermediate data
• Writing the processed data to HDFS

Page 9: Kafka to Hadoop Ingest with Parsing, Dedup and other Big Data Transformations

Brief about Kafka

● Distributed Messaging System.

● Data Partitioning Capability.

● Fast Reads and Writes.

● Basic Terminology
  ○ Topic
  ○ Producer
  ○ Consumer
  ○ Broker
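
To put the terminology in context, here is a minimal sketch using the standard Kafka Java client: a Producer publishes records to a Topic via a Broker, and a Consumer would subscribe to the same topic. The broker address, topic name, and record contents are placeholders.

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class DemoProducer
{
  public static void main(String[] args)
  {
    Properties props = new Properties();
    props.put("bootstrap.servers", "localhost:9092");  // Broker(s) to bootstrap from (placeholder address)
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

    // Producer: publishes one record to the "transactions" topic (placeholder name)
    try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
      producer.send(new ProducerRecord<>("transactions", "txn-1", "{\"amount\": 42}"));
    }
  }
}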

Page 10: Kafka to Hadoop Ingest with Parsing, Dedup and other Big Data Transformations

Kafka ETL App

Kafka → Parser → Dedup → Transform → Formatter → HDFS
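
A sketch of how this pipeline could be wired as an Apex DAG is shown below. The operator classes are the ones the Apex Malhar library provides for these stages, but the package names, port names, and the per-operator properties (Kafka brokers and topic, parser schema, dedup key and expiry, transform expressions, HDFS output path) are assumptions that vary by version and are normally supplied through the application configuration.

import org.apache.hadoop.conf.Configuration;

import org.apache.apex.malhar.kafka.KafkaSinglePortInputOperator;
import org.apache.apex.malhar.lib.dedup.TimeBasedDedupOperator;

import com.datatorrent.api.DAG;
import com.datatorrent.api.StreamingApplication;
import com.datatorrent.contrib.formatter.CsvFormatter;
import com.datatorrent.contrib.parser.CsvParser;
import com.datatorrent.lib.io.fs.AbstractFileOutputOperator;
import com.datatorrent.lib.transform.TransformOperator;

public class KafkaEtlApplication implements StreamingApplication
{
  @Override
  public void populateDAG(DAG dag, Configuration conf)
  {
    // Operators (DAG vertices), in pipeline order
    KafkaSinglePortInputOperator kafkaInput = dag.addOperator("kafkaInput", new KafkaSinglePortInputOperator());
    CsvParser parser = dag.addOperator("parser", new CsvParser());
    TimeBasedDedupOperator dedup = dag.addOperator("dedup", new TimeBasedDedupOperator());
    TransformOperator transform = dag.addOperator("transform", new TransformOperator());
    CsvFormatter formatter = dag.addOperator("formatter", new CsvFormatter());
    HdfsLineWriter fileOutput = dag.addOperator("fileOutput", new HdfsLineWriter());

    // Streams (DAG edges): Kafka -> Parser -> Dedup -> Transform -> Formatter -> HDFS
    // (port names below follow the Malhar operators and are assumptions)
    dag.addStream("kafkaToParser", kafkaInput.outputPort, parser.in);
    dag.addStream("parsedRecords", parser.out, dedup.input);
    dag.addStream("uniqueRecords", dedup.unique, transform.input);
    dag.addStream("transformed", transform.output, formatter.in);
    dag.addStream("formatted", formatter.out, fileOutput.input);
  }

  // Minimal concrete HDFS writer for this sketch; the base class and its two
  // abstract methods come from Malhar, while the file name is a placeholder.
  public static class HdfsLineWriter extends AbstractFileOutputOperator<String>
  {
    @Override
    protected String getFileName(String tuple)
    {
      return "kafka-etl-output.txt";
    }

    @Override
    protected byte[] getBytesForTuple(String tuple)
    {
      return (tuple + "\n").getBytes();
    }
  }
}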

Page 11: Kafka to Hadoop Ingest with Parsing, Dedup and other Big Data Transformations

Kafka ETL Demo

Demo

Page 12: Kafka to Hadoop Ingest with Parsing, Dedup and other Big Data Transformations

Resources

• Apache Apex - http://apex.apache.org/
• Subscribe - http://apex.apache.org/community.html
• Download - https://www.datatorrent.com/download/
• Twitter
  ᵒ @ApacheApex; Follow - https://twitter.com/apacheapex
  ᵒ @DataTorrent; Follow - https://twitter.com/datatorrent
• Meetups - http://www.meetup.com/topics/apache-apex
• Webinars - https://www.datatorrent.com/webinars/
• Videos - https://www.youtube.com/user/DataTorrent
• Slides - http://www.slideshare.net/DataTorrent/presentations
• Startup Accelerator Program - Full featured enterprise product
  ᵒ https://www.datatorrent.com/product/startup-accelerator/

Page 13: Kafka to Hadoop Ingest with Parsing, Dedup and other Big Data Transformations

We Are Hiring

[email protected]
• Developers/Architects
• QA Automation Developers
• Information Developers
• Build and Release
• Community Leaders