Answer» Deploying a model on a big data platform involves three key steps:
- Data Ingestion
- Data Storage
- Data Processing
Let’s look at each of these in turn:
- Data Ingestion: This step involves collecting data from different sources such as social media platforms, business applications, and log files.
- Data Storage: Once ingestion is complete, the challenge is to store this large volume of data, and this is where the Hadoop Distributed File System (HDFS) plays a vital role.
- Data Processing: After the data is stored in HDFS or HBase, the next task is to analyze and visualize it using appropriate algorithms. This task becomes more straightforward with frameworks such as Hadoop, Apache Spark, and Pig.
After performing these essential steps, one can deploy a big data model successfully. A minimal sketch of such a pipeline is shown below.
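The following PySpark sketch illustrates how the three steps might fit together in practice, assuming a running Spark installation with access to HDFS. The file paths, column names (timestamp, event_type), and the aggregation itself are hypothetical placeholders chosen for illustration, not requirements of any particular platform.

```python
# A minimal, illustrative sketch of the three steps above using PySpark.
# All paths and column names below are hypothetical placeholders.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("big-data-pipeline-sketch")
    .getOrCreate()
)

# 1. Data Ingestion: collect raw records from a source such as application log files.
raw_logs_path = "logs/app/*.json"          # hypothetical source location
raw_df = spark.read.json(raw_logs_path)

# 2. Data Storage: persist the ingested data to HDFS in a columnar format (Parquet).
hdfs_path = "hdfs:///data/events"          # hypothetical HDFS location
raw_df.write.mode("overwrite").parquet(hdfs_path)

# 3. Data Processing: read the stored data back and run an aggregation with Spark.
events = spark.read.parquet(hdfs_path)
daily_counts = (
    events
    .withColumn("event_date", F.to_date("timestamp"))   # assumes a 'timestamp' column
    .groupBy("event_date", "event_type")                 # assumes an 'event_type' column
    .count()
)
daily_counts.show()

spark.stop()
```

In a real deployment, the ingestion step is often handled by dedicated tools (for example, Flume or Kafka) rather than a direct file read, but the overall ingest-store-process flow remains the same.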