Step 2:
Change the hidden-layer activation functions to ReLU and the output-layer activation function to softmax.
clear;
clc;
close all;
net=feedforwardnet([20 20 20]);
for i = 1:3
    net.layers{i}.transferFcn = 'poslin'; % set each hidden layer's activation to ReLU
end
% poslin ("positive linear") is MATLAB's name for the ReLU function.
% Note: net.layers{1:3}.transferFcn = 'poslin' is not valid MATLAB, because
% layers{1:3} produces a comma-separated list and you cannot assign to one;
% the layers must be set individually, as in the loop above.
% Type "help nntransfer" in the command window to see a list of all
% activation functions that MATLAB provides.
net.layers{4}.transferFcn = 'softmax'; % output layer (layer 4 of the network)
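
After running the script, the configuration can be double-checked by printing each layer's transfer function (a minimal sketch using the same `net.layers{i}.transferFcn` property set above):

```matlab
% Verify the per-layer activation functions
for i = 1:4
    fprintf('Layer %d: %s\n', i, net.layers{i}.transferFcn);
end
% Layers 1-3 should report poslin and layer 4 should report softmax
```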