InterviewSolution
1. What is Apache Flume in Hadoop?
Answer» Apache Flume is a tool/service/data-ingestion mechanism for collecting, aggregating, and moving large amounts of streaming data, such as log files and events, from many different sources to a centralized data store. Flume is a reliable, distributed, and configurable tool. It is primarily designed to copy streaming data (log data) from web servers into HDFS.
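As a sketch of how this works in practice, a Flume agent is defined by a properties file wiring a source, a channel, and a sink together. The agent name (`agent1`), log path, and HDFS URL below are hypothetical placeholders, not values from the question:

```properties
# Name the components of this (hypothetical) agent
agent1.sources  = src1
agent1.channels = ch1
agent1.sinks    = sink1

# Source: tail a web-server log file as it grows
agent1.sources.src1.type     = exec
agent1.sources.src1.command  = tail -F /var/log/app/access.log
agent1.sources.src1.channels = ch1

# Channel: buffer events in memory between source and sink
agent1.channels.ch1.type     = memory
agent1.channels.ch1.capacity = 10000

# Sink: write the streaming events into HDFS, bucketed by date
agent1.sinks.sink1.type          = hdfs
agent1.sinks.sink1.hdfs.path     = hdfs://namenode:8020/flume/events/%Y-%m-%d
agent1.sinks.sink1.hdfs.fileType = DataStream
agent1.sinks.sink1.channel       = ch1
```

Such a config is typically launched with `flume-ng agent --conf conf --conf-file agent1.conf --name agent1`, after which the agent continuously ships new log lines into HDFS.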