I have a network model that takes about 25 inputs and outputs 3 actions: the robot's delta X, delta Y, and angle. When I feed data into the model, I get very different (and strange) predictions for the angle. What could be the reason for this, and how can it be fixed? If someone can attach a patch (or suggest another model), it would help me a lot! :)
Here are some details.
Unique values in the first column of Y: [-65. -45. 0. 10. 15. 25. 35. 45. 65. 90. 115. 135. 155. 180. 225. 245. 250. 270.]
Angle labels: tensor([ 0., 270., 270., 0., 0., 0., 180., 180., 270., 0., 90., 0., 270., 0., 90., 90., 0., 90., 90., 0., 0., 90., 90., 0., 270., 270., 0., 270., 270., 270., 0., 180., 90., 180., 0., 90., 0., 270., 0., 180., 90., 180., 0., 90., 180., 180., 180., 90., 0., 180., 0., 270., 270., 90., 180., 180., 0., 0., 90., 180., 270., 90., 90., 0.])
Angle outputs (predictions): tensor([ 58.0496, 134.1157, 86.4644, 38.5840, 207.9981, 45.4016, 145.8846, 95.8378, 85.2003, 149.1076, 138.2106, 194.3036, 38.5840, 76.8651, 57.8443, 109.1870, 45.4016, 146.5283, 40.7307])
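One thing I have been wondering about (not sure if this is the right direction): since the angle wraps around, maybe it should be encoded as a (sin, cos) pair instead of a raw degree value before training. A rough sketch of what I mean — angles_deg and angle_targets are just placeholder names, not my real variables:

import torch

# Sketch: encode the angle (in degrees) as a (sin, cos) pair so that 0 and 360
# count as the same direction. angles_deg stands in for the angle column of Y.
angles_deg = torch.tensor([0., 270., -65., 90., 180.])
angles_rad = torch.deg2rad(angles_deg % 360)  # wrap negative angles into [0, 360)
angle_targets = torch.stack((torch.sin(angles_rad), torch.cos(angles_rad)), dim=1)  # shape (N, 2)

# A predicted (sin, cos) pair can be turned back into degrees like this
# (using the targets here just to show the round trip):
recovered_deg = torch.rad2deg(torch.atan2(angle_targets[:, 0], angle_targets[:, 1])) % 360

Would that be a sensible way to set up the angle target, or is the problem somewhere else?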
This is my model:
import torch.nn as nn

# Define the neural network model
class NeuralNetwork(nn.Module):
    def __init__(self, input_size, num_actions):
        super(NeuralNetwork, self).__init__()
        self.layer1 = nn.Linear(input_size, 64*2)
        self.relu = nn.ReLU()
        self.layer2 = nn.Linear(64*2, 32*2)
        self.output_layer = nn.Linear(32*2, num_actions)

    def forward(self, x):
        out = self.layer1(x)
        out = self.relu(out)
        out = self.layer2(out)
        out = self.relu(out)
        out = self.output_layer(out)
        return out
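For the "another model" part of my question: at least in this batch the angle labels only take the values 0/90/180/270, so would it make more sense to split the outputs, keeping delta X / delta Y as regression and treating the angle as a class? This is only a rough sketch of what I am imagining — the class list, layer sizes, and names are my own assumptions, not working code from my project:

import torch
import torch.nn as nn

ANGLE_CLASSES = [0., 90., 180., 270.]  # assumed discrete angle set

class TwoHeadNetwork(nn.Module):
    def __init__(self, input_size, num_angle_classes=len(ANGLE_CLASSES)):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Linear(input_size, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
        )
        self.delta_head = nn.Linear(64, 2)                   # delta X, delta Y (regression)
        self.angle_head = nn.Linear(64, num_angle_classes)   # angle logits (classification)

    def forward(self, x):
        features = self.backbone(x)
        return self.delta_head(features), self.angle_head(features)

# Usage sketch: MSE loss for the deltas, cross-entropy for the angle logits.
model = TwoHeadNetwork(input_size=25)
x = torch.randn(8, 25)
deltas, angle_logits = model(x)
predicted_angles = torch.tensor(ANGLE_CLASSES)[angle_logits.argmax(dim=1)]

Does splitting the heads like this make sense, or should the angle stay as a regression output?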
Thank you so much!