Example Using the Hadoop Compiler App Workflow

Supported Platform: Linux® only.

This example shows you how to use the Hadoop Compiler app to create a deployable archive consisting of MATLAB® map and reduce functions, and then pass the deployable archive as a payload argument to a job submitted to a Hadoop® cluster.

Goal: Calculate the maximum arrival delay of an airline from the given dataset.

Dataset: airlinesmall.csv

Description: Airline departure and arrival information from 1987–2008.

Location: /usr/local/MATLAB/R2017b/toolbox/matlab/demos

Prerequisites

  1. Start this example by creating a new work folder that is on the MATLAB search path.

  2. Before starting MATLAB, at a terminal, set the environment variable HADOOP_PREFIX to point to the Hadoop installation folder. For example:

    Shell        Command
    csh / tcsh   % setenv HADOOP_PREFIX /usr/lib/hadoop
    bash         $ export HADOOP_PREFIX=/usr/lib/hadoop

    Note

    This example uses /usr/lib/hadoop as the directory where Hadoop is installed. Your Hadoop installation directory may be different.

    If you forget to set the HADOOP_PREFIX environment variable before starting MATLAB, set it using the MATLAB function setenv at the MATLAB command prompt as soon as you start MATLAB. For example:

    setenv('HADOOP_PREFIX','/usr/lib/hadoop')

  3. Install the MATLAB Runtime in a folder that is accessible by every worker node in the Hadoop cluster. This example uses /usr/local/MATLAB/MATLAB_Runtime/v## as the location of the MATLAB Runtime folder.

    If you don’t have the MATLAB Runtime, you can download it from the website at: http://www.mathworks.com/products/compiler/mcr.

    Note

    Replace all references to the MATLAB Runtime version v## in this example with the MATLAB Runtime version number corresponding to your MATLAB release. For example, MATLAB R2017b has MATLAB Runtime version number v92. For information about the MATLAB Runtime version numbers that correspond to MATLAB releases, see this list.

  4. Copy the map function maxArrivalDelayMapper.m from the /usr/local/MATLAB/R2017b/toolbox/matlab/demos folder to the work folder.

     maxArrivalDelayMapper.m

    For more information, see Write a Map Function (MATLAB).

  5. Copy the reduce function maxArrivalDelayReducer.m from the matlabroot/toolbox/matlab/demos folder to the work folder.

     maxArrivalDelayReducer.m

    For more information, see Write a Reduce Function (MATLAB).

  6. Create the directory /user/<username>/datasets on HDFS™ and copy the file airlinesmall.csv to that directory. Here <username> refers to your user name in HDFS.

    $ ./hadoop fs -mkdir -p hdfs://host:54310/user/<username>/datasets
    $ ./hadoop fs -copyFromLocal airlinesmall.csv hdfs://host:54310/user/<username>/datasets

Procedure

  1. Start MATLAB and verify that the HADOOP_PREFIX environment variable has been set. At the command prompt, type:

    >> getenv('HADOOP_PREFIX')

    If ans is empty, review the Prerequisites section above to see how you can set the HADOOP_PREFIX environment variable.
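
    You can also combine the check and the fix in one short MATLAB sketch. This sketch assumes the /usr/lib/hadoop location used in the Prerequisites; adjust the path to your own installation.

    % Set HADOOP_PREFIX if MATLAB was started without it.
    % '/usr/lib/hadoop' is the example location from the Prerequisites;
    % yours may differ.
    if isempty(getenv('HADOOP_PREFIX'))
        setenv('HADOOP_PREFIX','/usr/lib/hadoop')
    end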

  2. Create a datastore to the file airlinesmall.csv and save it to a .mat file. This datastore object is meant to capture the structure of your actual dataset on HDFS.

    ds = datastore('airlinesmall.csv','TreatAsMissing','NA',...
         'SelectedVariableNames','ArrDelay','ReadSize',1000);
    
    save('infoAboutDataset.mat','ds')

    In most cases, you start off by working on a small sample dataset residing on a local machine that is representative of the actual dataset on the cluster. This sample dataset has the same structure and variables as the actual dataset on the cluster. By creating a datastore object to the dataset residing on your local machine, you take a snapshot of that structure. With access to this datastore object, a Hadoop job executing on the cluster knows how to access and process the actual dataset residing on HDFS.
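
    Before saving, you can confirm that the datastore captured the intended structure. The preview method, a standard datastore operation, returns the first few rows without reading the entire file; this is an optional sanity check, not a required step in the workflow.

    % Quick sanity check: preview shows the first few rows, so you can
    % confirm that only the ArrDelay variable is selected.
    preview(ds)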

    Note

    In this example, the sample dataset (local) and the actual dataset on HDFS are the same.

  3. Launch the Hadoop Compiler app through the MATLAB command line (>> hadoopCompiler) or through the apps gallery.

  4. In the Map Function section of the toolstrip, click the plus button to add mapper file maxArrivalDelayMapper.m.

  5. In the Reduce Function section of the toolstrip, click the plus button to add reducer file maxArrivalDelayReducer.m.

  6. In the Datastore File section, click the plus button to add the .mat file infoAboutDataset.mat containing the datastore object.

  7. In the Output Types section, select keyvalue as the output type. Selecting keyvalue as your output type means your results can be read only within MATLAB. If you want your results to be accessible outside of MATLAB, select tabulartext as the output type.

  8. Rename the MapReduce job payload information to maxArrivalDelay.

  9. Click Package to build a deployable archive.

    The Hadoop Compiler app creates a log file PackagingLog.txt and two folders, for_redistribution and for_testing.

      for_redistribution        for_testing
      readme.txt                readme.txt
      maxArrivalDelay.ctf       maxArrivalDelay.ctf
      run_maxArrivalDelay.sh    run_maxArrivalDelay.sh
                                mccExcludedFiles.log
                                requiredMCRProducts.txt

    You can use the log file PackagingLog.txt to see the exact mcc syntax used to package the deployable archive.

  10. From a Linux shell, navigate to the for_redistribution folder.

    1. Incorporate the deployable archive containing MATLAB map and reduce functions into a Hadoop mapreduce job from a Linux shell using the following command:

      $ hadoop \
      jar /usr/local/MATLAB/MATLAB_Runtime/v##/toolbox/mlhadoop/jar/a2.2.0/mwmapreduce.jar \
      com.mathworks.hadoop.MWMapReduceDriver \
      -D mw.mcrroot=/usr/local/MATLAB/MATLAB_Runtime/v## \
      maxArrivalDelay.ctf \
      hdfs://host:54310/user/<username>/datasets/airlinesmall.csv \
      hdfs://host:54310/user/<username>/results
    2. Alternatively, you can incorporate the deployable archive containing MATLAB map and reduce functions into a Hadoop mapreduce job using the shell script generated by the Hadoop Compiler app. At the Linux shell, type the following command:

      $ ./run_maxArrivalDelay.sh \
      /usr/local/MATLAB/MATLAB_Runtime/v## \
      -D mw.mcrroot=/usr/local/MATLAB/MATLAB_Runtime/v## \
      hdfs://host:54310/user/<username>/datasets/airlinesmall.csv \
      hdfs://host:54310/user/<username>/results
  11. To examine the results, switch to the MATLAB desktop and create a datastore to the results on HDFS. You can then view the results using the read method.

    d = datastore('hdfs:///user/<username>/results/part*');
    read(d)
    ans = 
    
               Key           Value 
        _________________    ______
    
        'MaxArrivalDelay'    [1014]
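
    If the job produced multiple part files, the readall method, another standard datastore operation, gathers all of them into a single table in one call:

    % Read every part file matched by the datastore into one table.
    t = readall(d);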

Other examples of map and reduce functions are available in the toolbox/matlab/demos folder. You can use these examples to prototype similar deployable archives to run on a Hadoop cluster. For more information, see Build Effective Algorithms with MapReduce (MATLAB).
