
Multi-layer Perceptron classifier #493

Open
wants to merge 2 commits into base: ml
Conversation

PondiB
Member

@PondiB PondiB commented Jan 4, 2024

No description provided.

@PondiB PondiB changed the base branch from master to ml January 4, 2024 22:23
@PondiB PondiB added new process ML DL Deep Learning labels Jan 4, 2024
Member

@soxofaan soxofaan left a comment


some small notes

proposals/dl_fit_class_mlp.json (outdated, resolved)
proposals/dl_fit_class_mlp.json (outdated, resolved)
@PondiB PondiB mentioned this pull request Feb 21, 2024
Comment on lines +106 to +117
"name": "activation_function",
"description": "Activation function for the hidden layers.",
"schema": {
  "type": "string",
  "enum": ["relu", "tanh", "sigmoid"],
  "default": "relu"
}
},


This might need some minor corrections. As specified, we cannot set a different activation function per layer. Offering a choice might be more appropriate:

  • a single string: the activation function applied to all layers
  • an array of strings: one activation function per layer

It might also be helpful to include softmax here; it is commonly used on the output layer and is helpful for uncertainty quantification.

This could be the schema then:

{
  "oneOf": [
    {
      "type": "string",
      "enum": ["relu", "tanh", "sigmoid", "softmax"]
    },
    {
      "type": "array",
      "items": {
        "type": "string",
        "enum": ["relu", "tanh", "sigmoid", "softmax"]
      }
    }
  ],
  "default": "relu"
}
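A back-end accepting both forms would then need to normalize the parameter to one activation per layer. A minimal sketch (the helper name and the assumption that the layer count is known up front are illustrative, not part of the proposal):

```python
# Illustrative helper: expand an "activation_function" argument that is
# either a single string or a list of strings into one entry per layer.
VALID_ACTIVATIONS = {"relu", "tanh", "sigmoid", "softmax"}

def normalize_activations(activation_function, n_layers):
    if isinstance(activation_function, str):
        # Single string: apply the same activation to all layers.
        activations = [activation_function] * n_layers
    else:
        # Array form: must provide exactly one activation per layer.
        activations = list(activation_function)
        if len(activations) != n_layers:
            raise ValueError("expected one activation function per layer")
    unknown = set(activations) - VALID_ACTIVATIONS
    if unknown:
        raise ValueError(f"unsupported activation function(s): {unknown}")
    return activations
```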

}
},
{
"name": "hidden_layers",


We define the shape of hidden layers here. But where do we define the shape of input and output layers?
Suggestion: Rename this parameter to layers to include input and output.
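If the renamed parameter covered input and output as suggested, a single array of layer sizes would fully determine the network's dense weight-matrix shapes. An illustrative sketch (assuming a plain fully-connected MLP; the function name is hypothetical):

```python
def weight_shapes(layers):
    """Weight-matrix shapes of a dense MLP whose layer sizes,
    including the input and output layers, are given as a list of ints."""
    return [(layers[i], layers[i + 1]) for i in range(len(layers) - 1)]
```

For example, a classifier with 4 input features, two hidden layers of 16 and 8 units, and 3 output classes would be described by `[4, 16, 8, 3]`.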

"deep learning"
],
"experimental": true,
"parameters": [


It could be helpful to somehow divide the data into train/test subsets, to get accuracy statistics that are independent of the training data, e.g.

{
  "name": "train_test_split",
  "description": "Defines the ratio by which data is split into training and test samples.",
  "schema": {
    "type": "number",
    "minimum": 0.0,
    "maximum": 1.0,
    "default": 0.8
  }
}
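For illustration, a back-end could apply such a ratio with a shuffled index split along these lines (a sketch only; the function name and the fixed seed are assumptions, not part of the suggested schema):

```python
import random

def train_test_split(samples, ratio=0.8, seed=0):
    """Split samples into (train, test), with roughly `ratio` of them
    going to the training set. Shuffles deterministically via `seed`."""
    indices = list(range(len(samples)))
    random.Random(seed).shuffle(indices)
    cut = int(len(samples) * ratio)
    train = [samples[i] for i in indices[:cut]]
    test = [samples[i] for i in indices[cut:]]
    return train, test
```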
