
MapReduce – heart of Hadoop

About MapReduce

MapReduce is the heart of Hadoop®. It is this programming paradigm that allows for massive scalability across hundreds or thousands of servers in a Hadoop cluster. The MapReduce concept is fairly simple to understand for those who are familiar with clustered scale-out data processing solutions.

For people new to this topic, it can be somewhat difficult to grasp, because it’s not typically something people have been exposed to previously. If you’re new to Hadoop’s MapReduce jobs, don’t worry: we’re going to describe it in a way that gets you up to speed quickly.

The term MapReduce actually refers to two separate and distinct tasks that Hadoop programs perform. The first is the map job, which takes a set of data and converts it into another set of data, where individual elements are broken down into tuples (key/value pairs). The reduce job takes the output from a map as input and combines those data tuples into a smaller set of tuples. As the sequence of the name MapReduce implies, the reduce job is always performed after the map job.
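The two phases can be sketched in plain Python. This is only an illustrative, single-process skeleton with names of our own choosing, not the Hadoop API (real Hadoop jobs are typically written in Java and run the two phases across many machines, with the framework handling the grouping-by-key "shuffle" in between):

```python
from itertools import groupby
from operator import itemgetter

def run_mapreduce(records, mapper, reducer):
    """Illustrative single-process sketch of the MapReduce flow."""
    # Map: turn each input record into (key, value) tuples.
    tuples = [pair for record in records for pair in mapper(record)]
    # Shuffle: group tuples by key (here, a simple sort + groupby;
    # Hadoop does this across the cluster).
    tuples.sort(key=itemgetter(0))
    # Reduce: combine each key's values into a smaller set of tuples.
    return {key: reducer(key, [value for _, value in group])
            for key, group in groupby(tuples, key=itemgetter(0))}
```

The caller supplies the `mapper` (record in, tuples out) and the `reducer` (key plus its values in, combined value out); the skeleton only wires them together in the map-then-reduce order described above.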

An example of MapReduce

Let’s look at a simple example. Assume you have five files, and each file contains two columns (a key and a value in Hadoop terms) that represent a city and the temperature recorded in that city on various measurement days. We’ve made this example very simple so it’s easy to follow. A real application won’t be quite so simple: it’s likely to contain millions or even billions of rows, and they might not be neatly formatted rows at all. No matter how big or small the amount of data you need to analyze, however, the key principles we’re covering here remain the same. In this example, city is the key and temperature is the value.

Toronto, 20
Whitby, 25
New York, 22
Rome, 32
Toronto, 4
Rome, 33
New York, 18

Out of all the data we have collected, we want to find the maximum temperature for each city across all of the data files (note that each file might list the same city multiple times). Using the MapReduce framework, we can break this down into five map tasks, one per file; each mapper goes through its file’s data and returns the maximum temperature for each city. For example, the results produced by one mapper task for the data above would look like this:

(Toronto, 20) (Whitby, 25) (New York, 22) (Rome, 33)
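A mapper for this example can be sketched in plain Python (the function and variable names here are our own; a real Hadoop mapper would emit its tuples through the framework’s API rather than returning a list):

```python
def mapper(lines):
    """Return the maximum temperature seen for each city in one file."""
    maxima = {}
    for line in lines:
        # Each line is "<city>, <temperature>"; split on the last comma
        # so city names containing spaces (e.g. "New York") stay intact.
        city, temp = line.rsplit(", ", 1)
        temp = int(temp)
        if city not in maxima or temp > maxima[city]:
            maxima[city] = temp
    return sorted(maxima.items())

file1 = ["Toronto, 20", "Whitby, 25", "New York, 22", "Rome, 32",
         "Toronto, 4", "Rome, 33", "New York, 18"]
# mapper(file1) → [("New York", 22), ("Rome", 33),
#                  ("Toronto", 20), ("Whitby", 25)]
```

Note how Toronto appears twice in the file but only its larger reading (20) survives the map task, matching the mapper output shown above.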

Let’s assume the other four mapper tasks (working on the other four files not shown here) produced the following intermediate results:

(Toronto, 18) (Whitby, 27) (New York, 32) (Rome, 37)
(Toronto, 32) (Whitby, 20) (New York, 33) (Rome, 38)
(Toronto, 22) (Whitby, 19) (New York, 20) (Rome, 31)
(Toronto, 31) (Whitby, 22) (New York, 19) (Rome, 30)

All five of these output streams would be fed into the reduce tasks, which combine the input results and output a single value for each city, producing a final result set as follows:

(Toronto, 32) (Whitby, 27) (New York, 33) (Rome, 38)
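The reduce step can be sketched the same way, again in plain Python with hypothetical names (a real Hadoop reducer receives one key and its values at a time from the framework, rather than all five streams at once):

```python
def reducer(streams):
    """Combine per-file maxima into a single maximum per city."""
    final = {}
    for stream in streams:
        for city, temp in stream:
            if city not in final or temp > final[city]:
                final[city] = temp
    return final

streams = [
    [("Toronto", 20), ("Whitby", 25), ("New York", 22), ("Rome", 33)],
    [("Toronto", 18), ("Whitby", 27), ("New York", 32), ("Rome", 37)],
    [("Toronto", 32), ("Whitby", 20), ("New York", 33), ("Rome", 38)],
    [("Toronto", 22), ("Whitby", 19), ("New York", 20), ("Rome", 31)],
    [("Toronto", 31), ("Whitby", 22), ("New York", 19), ("Rome", 30)],
]
# reducer(streams) → {"Toronto": 32, "Whitby": 27,
#                     "New York": 33, "Rome": 38}
```

The reducer applies the same "keep the larger value" rule as the mappers, just across files instead of within one, which is what produces the final result set shown above.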

As an analogy, you can think of map and reduce tasks as the way a census was conducted in Roman times: the census bureau would dispatch its people to each city in the empire. Each census taker would be tasked with counting the number of people in that city and then returning the results to the capital. There, the results from each city would be reduced to a single count (the sum across all cities) to determine the overall population of the empire. This mapping of people to cities in parallel, and then combining the results (reducing), is much more efficient than sending a single person to count every person in the empire in serial fashion.

Source:  http://www-01.ibm.com/software/data/infosphere/hadoop/mapreduce/

