  * Intelligent
Main tools are [[HDFS]] (Hadoop Distributed File System), [[MapReduce]] and [[YARN]].

===== Installing Hadoop =====
Hadoop can run as a single-node [[Cluster]] or across many machines, in one of three modes (a minimal pseudo-distributed config sketch follows the list):

  * Standalone mode - all Hadoop components run in a single [[JVM]]
  * Pseudo-distributed mode - each daemon runs in its own [[JVM]] on the same machine
  * Fully distributed mode - each daemon runs on a separate machine

see [[Install Hadoop eco system (single mode)]]
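For pseudo-distributed mode, the Apache single-node setup guide uses two small config files. A minimal sketch, assuming the guide's defaults (a local HDFS on port 9000 and a single block replica, since there is only one node):

<code xml>
<!-- etc/hadoop/core-site.xml: point the default filesystem at a local HDFS -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
</code>

<code xml>
<!-- etc/hadoop/hdfs-site.xml: single node, so keep one block replica -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
</code>

With those in place, format the namenode with ''bin/hdfs namenode -format'' and start the HDFS daemons with ''sbin/start-dfs.sh''.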
===== Hadoop Technology stack =====
see more at http://
==== Data Access ====
[[Hive]], [[Pig]]
[[Ambari]], [[Zookeeper]],
==== More ====
[[HDT]], [[Knox]], [[Spark]]
===== Use-cases =====
  * New-York Times - wanted to convert 4 TB of articles to PDF. They did it on AWS in less than 24 hours and it cost them about $240! (Reportedly around 100 EC2 instances at the then-standard $0.10 per instance-hour: 100 × 24 h × $0.10 ≈ $240.)