Building MXNet on EC2 ARM instances

Recently AWS announced the availability of ARM-based EC2 instances, so I thought I'd try compiling MXNet to run on one of them, since MXNet has already been used on ARM-based edge devices.

So, here we go!

First, log in to your AWS account and open the EC2 service. We’re going to launch an ARM instance to do this build. After clicking the “Launch Instance” button, choose the Ubuntu 16.04 AMI and be sure to select the (Arm) radio button.

[Screenshot: selecting the Ubuntu 16.04 AMI with the (Arm) radio button]

On the instance type page, I chose an a1.xlarge instance with 4 ARM cores. Continue through the wizard until you get to the Storage page, and choose 100GB of space. We won't need it all, but it'll give us some elbow room. Continue through the rest of the wizard and launch the instance.

After it starts up, connect to it via SSH. Now let's set up the OS by running the following commands:

sudo apt-get update
sudo apt-get install -y libopenblas-dev liblapack-dev libopencv-dev make pkg-config python-pip

Then download MXNet from the GitHub repo:

git clone https://github.com/apache/incubator-mxnet.git --recursive

Next, cd into the incubator-mxnet directory and run these commands to configure the build:

echo "USE_OPENCV = 1" >> ./config.mk
echo "USE_BLAS = openblas" >> ./config.mk
echo "USE_SSE = 0" >> ./config.mk
echo "USE_CUDA = 0" >> ./config.mk
echo "MSHADOW_STAND_ALONE = 1" >> ./config.mk

Now we’re ready to build, so run make -j$(nproc) to start the process. This takes about 30 minutes on my a1.xlarge instance. If you see errors like g++: internal compiler error: Killed (program cc1plus), the compiler is most likely being killed for running out of memory; try adding:

echo "DEBUG = 1" >> ./config.mk

This will use the -O0 and -g compiler flags, which reduce the amount of memory gcc needs for its optimization passes. Also rerun with make -j1 to compile with just one thread. It will take longer, but the single compiler process won't have to share memory with others.

After it's done, cd into the python directory and run pip install -e . to add this build to our Python path. Don't forget the “.” at the end!
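To make sure the install worked, here's a quick sanity check (a minimal sketch; any small NDArray op on the CPU will do):

import mxnet as mx

print(mx.__version__)

# run a tiny NDArray op to confirm the native library loads and computes
a = mx.nd.ones((2, 3))
print((a * 2).asnumpy())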

Finally, we're ready to test. Since we've installed our local build, we can run it from anywhere, so I'm going to cd back to ~ and run the test. Here's a simple script that runs ResNet-50:

import mxnet as mx
import numpy as np
from collections import namedtuple
Batch = namedtuple('Batch', ['data'])

path='http://data.mxnet.io/models/imagenet/'

[mx.test_utils.download(path+'resnet/50-layers/resnet-50-0000.params'),
 mx.test_utils.download(path+'resnet/50-layers/resnet-50-symbol.json'),
 mx.test_utils.download(path+'synset.txt')]

ctx = mx.cpu()

sym, arg_params, aux_params = mx.model.load_checkpoint('resnet-50', 0)
mod = mx.mod.Module(symbol=sym, context=ctx, label_names=None)
mod.bind(for_training=False, data_shapes=[('data', (1,3,224,224))],
         label_shapes=mod._label_shapes)
mod.set_params(arg_params, aux_params, allow_missing=True)
with open('synset.txt', 'r') as f:
    labels = [l.rstrip() for l in f]

fname = mx.test_utils.download('https://github.com/dmlc/web-data/blob/master/mxnet/doc/tutorials/python/predict_image/cat.jpg?raw=true')
img = mx.image.imread(fname)

# convert into the format MXNet expects: (batch, channel, height, width)
img = mx.image.imresize(img, 224, 224) # resize
img = img.transpose((2, 0, 1)) # Channel first
img = img.expand_dims(axis=0) # batchify

mod.forward(Batch([img]))
prob = mod.get_outputs()[0].asnumpy()
# print the top-5
prob = np.squeeze(prob)
a = np.argsort(prob)[::-1]
for i in a[0:5]:
    print('probability=%f, class=%s' % (prob[i], labels[i]))

You should see the following output:

$ python test.py 
[20:43:23] src/nnvm/legacy_json_util.cc:209: Loading symbol saved by previous version v0.8.0. Attempting to upgrade...
[20:43:23] src/nnvm/legacy_json_util.cc:217: Symbol successfully upgraded!
probability=0.418679, class=n02119789 kit fox, Vulpes macrotis
probability=0.293494, class=n02119022 red fox, Vulpes vulpes
probability=0.029321, class=n02120505 grey fox, gray fox, Urocyon cinereoargenteus
probability=0.026230, class=n02124075 Egyptian cat
probability=0.022557, class=n02085620 Chihuahua
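If you're curious about raw inference speed on the ARM cores, you can tack something like this onto the end of the script. It's just a rough sketch that reuses the mod, img, and Batch objects from above; the run count of 10 is arbitrary:

import time

# warm up once so one-time initialization doesn't skew the numbers
mod.forward(Batch([img]))
mx.nd.waitall()

runs = 10
start = time.time()
for _ in range(runs):
    mod.forward(Batch([img]))
    mx.nd.waitall()  # block until the asynchronous compute finishes
print('average inference time: %.1f ms' % ((time.time() - start) / runs * 1000))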

And that's it!
