JOB SCHEDULING ALGORITHMS FOR EFFICIENT TASK EXECUTION
Abstract
MapReduce is a programming model that enables massive scalability across hundreds or thousands of machines in a
cluster framework such as Hadoop. The term MapReduce in fact refers to two distinct operations performed by Hadoop
programs. The first is the map task, which takes a set of data and converts it into another data set in which individual
elements are broken down into key/value pairs. The second is the reduce task, which takes the map output as its input and
combines the values associated with each key into a smaller, aggregated result. To make optimal use of resources while
meeting deadlines, the scheduling algorithms must be improved by configuring the capacity and priority of job queues.
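The map/reduce division of work described above can be illustrated with a minimal sketch. This is an assumed, in-memory word-count example in plain Python, not Hadoop's actual API; the function names `map_phase` and `reduce_phase` are hypothetical:

```python
from collections import defaultdict

def map_phase(lines):
    """Map task: turn each input record into (key, value) pairs.

    Here each word becomes the key and the constant 1 the value.
    """
    pairs = []
    for line in lines:
        for word in line.split():
            pairs.append((word, 1))
    return pairs

def reduce_phase(pairs):
    """Reduce task: combine all values that share the same key."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

if __name__ == "__main__":
    data = ["to be or not to be"]
    print(reduce_phase(map_phase(data)))
    # → {'to': 2, 'be': 2, 'or': 1, 'not': 1}
```

In a real cluster the pairs emitted by the map phase would be shuffled across machines so that all values for a given key reach the same reducer; here both phases simply run in one process.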