$\begingroup$

I have a data set that contains 135 input features and 132 output values to be predicted. The input features are all numeric floating point values, and each output value is an integer in {0, 1, 2, 3, 4}.

I am basically new to data science and machine learning, so I need to understand what kind of neural network model (regression or classification) would fit this kind of data best. On the one hand, the output values represent different classes; on the other, I am not sure how a single neural network can predict multiple classes from the same input data.
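For concreteness, the data could be pictured like this (a hypothetical sketch only; the sample count and random values are placeholders, not my actual data set):

    import numpy as np

    n_samples = 1000                               # placeholder sample count
    X = np.random.rand(n_samples, 135)             # 135 floating point input features
    y = np.random.randint(0, 5, (n_samples, 132))  # 132 targets, each in {0, 1, 2, 3, 4}
    print(X.shape, y.shape)                        # (1000, 135) (1000, 132)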

$\endgroup$

    3 Answers

    $\begingroup$

    Welcome to the site!

    I think the key term that defines your task is: multi-target classification (or regression).

    You can find an explanation and some possible techniques at this link.
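    As a quick non-neural baseline, scikit-learn can wrap any single-target classifier into a multi-target one. A minimal sketch (the names X_train, y_train, X_test are placeholders for your feature matrix of shape (n, 135) and integer label matrix of shape (n, 132)):

    # Multi-target classification baseline: one classifier is fitted per output column.
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.multioutput import MultiOutputClassifier

    clf = MultiOutputClassifier(RandomForestClassifier(n_estimators=100))
    clf.fit(X_train, y_train)      # X_train: (n, 135) floats, y_train: (n, 132) labels in 0..4
    y_pred = clf.predict(X_test)   # shape (n_test, 132)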

    For neural networks:

    The key is to remember that the last layer should have linear activations (i.e. no activation at all).

    As per your requirements, the input layer takes vectors of shape (135,) and the output layer has shape (132,).

    The usual loss function for regression problems is mean squared error (MSE); mean absolute error (MAE), as used below, also works. Here's an example of multidimensional regression using Keras:

    from keras.models import Sequential
    from keras.layers import Dense, Activation, Dropout

    model = Sequential()
    model.add(Dense(200, input_dim=135))   # 135 input features
    model.add(Activation('relu'))
    model.add(Dense(200))
    model.add(Activation('relu'))
    model.add(Dropout(0.3))
    model.add(Dense(132))                  # 132 outputs, linear activation
    model.compile(loss='mean_absolute_error', optimizer='adam')
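    Training and prediction would then look roughly like this (X_train, y_train, X_test are placeholder arrays of shapes (n, 135), (n, 132) and (n_test, 135); rounding the continuous predictions back to integers is just one simple post-processing choice):

    import numpy as np

    model.fit(X_train, y_train, epochs=50, batch_size=32, validation_split=0.2)
    preds = model.predict(X_test)                           # shape (n_test, 132), floats
    preds_int = np.clip(np.round(preds), 0, 4).astype(int)  # map back to {0, ..., 4}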
    $\endgroup$
      $\begingroup$

      Take a look at the softmax activation function for the output layer if you want to treat this as classification.
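      For the setup in the question (132 targets with 5 classes each), one way to apply a softmax per target is to reshape the output so the softmax runs over the class axis. A rough sketch only, assuming a reasonably recent Keras, not something tested on the asker's data:

      from keras.layers import Input, Dense, Reshape, Activation
      from keras.models import Model

      inp = Input(shape=(135,))
      h = Dense(256, activation='relu')(inp)
      h = Dense(132 * 5)(h)            # one score per (target, class) pair
      h = Reshape((132, 5))(h)         # group the scores by target
      out = Activation('softmax')(h)   # softmax over the last (class) axis
      model = Model(inp, out)
      model.compile(optimizer='adam', loss='sparse_categorical_crossentropy')
      # labels: integer array of shape (n_samples, 132) with values in 0..4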

      $\endgroup$
      • $\begingroup$What I understood is that softmax is used for single-label multi-class classification, i.e. when only a single label needs to be predicted. But in my case I need to predict multiple labels from the same input.$\endgroup$
        – Ali Akber
        Commented Oct 3, 2018 at 15:28
      • $\begingroup$I think that you should use a loss suitable for multi-label classification, e.g. here$\endgroup$
        – GrozaiL
        Commented Oct 3, 2018 at 15:37
      $\begingroup$

      The Keras functional API makes it easy to add more than one target (output).

      It is a little daunting to try this out, since there are not many good examples of this technique online. Below is a simple model based on the Boston housing data that shows how multi-target (or multi-output) models can be implemented using Keras. See also here.

      import numpy as np
      import random
      from keras.datasets import boston_housing
      from keras.layers import Input, Dense
      from keras.models import Model
      from keras import regularizers

      (train_data, train_targets), (test_data, test_targets) = boston_housing.load_data()

      # Standardise data
      mean = train_data.mean(axis=0)
      train_data -= mean
      std = train_data.std(axis=0)
      train_data /= std
      test_data -= mean
      test_data /= std

      # Add an additional target (just add some random noise to the original one)
      train_targets2 = train_targets + random.uniform(0, 0.1)
      test_targets2 = test_targets + random.uniform(0, 0.1)

      # https://keras.io/models/model/
      # Input and model architecture
      Input_1 = Input(shape=(13,))
      x = Dense(1024, activation='relu', kernel_regularizer=regularizers.l2(0.05))(Input_1)
      x = Dense(512, activation='relu', kernel_regularizer=regularizers.l2(0.05))(x)
      x = Dense(256, activation='relu', kernel_regularizer=regularizers.l2(0.05))(x)
      x = Dense(128, activation='relu', kernel_regularizer=regularizers.l2(0.05))(x)
      x = Dense(8, activation='relu', kernel_regularizer=regularizers.l2(0.05))(x)

      # Outputs
      out1 = Dense(1)(x)
      out2 = Dense(1)(x)

      # Compile/fit the model
      model = Model(inputs=Input_1, outputs=[out1, out2])
      model.compile(optimizer='rmsprop', loss='mse')

      # Add actual data here in the fit statement
      model.fit(train_data, [train_targets, train_targets2],
                epochs=500, batch_size=4, verbose=0, validation_split=0.8)

      # Predict / check type and shape
      preds = np.array(model.predict(test_data))
      # print(type(preds), preds.shape)  # is a 3D numpy array

      # Get first part of prediction (3D layer / row / column)
      preds0 = preds[0, :, 0]
      # Second part
      preds1 = preds[1, :, 0]

      # Check MAE
      from sklearn.metrics import mean_absolute_error
      print(mean_absolute_error(test_targets, preds0))
      print(mean_absolute_error(test_targets2, preds1))
      $\endgroup$
