Keras improvements + simplification
This merge request makes a few improvements and simplifications to the Keras classifiers:
- Remove some unused imports and import the relevant classes by their actual class name (`Model` vs. `keras_model`).
- Add options to set a random seed and to set class weights to balance out the classes, like what's being done for the scikit-learn classifiers (a rough sketch of how these could be used is at the end of this description).
- Clean up how classifier kwargs make it into the classifier. The biggest change is combining `layers`, `activationfunction` and `numnodes` and folding them into the `layers` kwarg, with the goal of making this simpler. Before, we would have to do something like:
```ini
[ANN]
...
layers=["Dense","Dense"]
activationfunction=['relu','sigmoid']
numnodes=[1000,1]
loss="binary_crossentropy"
optimizer="adam"
metrics=["accuracy"]
safe_channels_path = safe_channel_list.txt
feature_columns = ['delta_t', 'snr']
```
where `safe_channels_path` and `feature_columns` have to be passed in just to figure out the size of the input layer. Now, it's more like:
```ini
[ANN]
...
num_columns = 3
loss = binary_crossentropy
optimizer = adam
metrics = accuracy
layers =
    Dense 1000 relu
    Dense 1 sigmoid
```
In order to support the nicer formatting for `layers`, I added a `config2layers` function in `configparser.py`.
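For context, here is a minimal sketch of what such a parser could do, assuming each `Dense 1000 relu` style line maps to a `(class name, number of nodes, activation)` tuple; the actual signature and return type of `config2layers` in `configparser.py` may well differ:

```python
# Illustrative sketch only; the real config2layers in configparser.py may differ.
from configparser import ConfigParser

def config2layers(config, section="ANN"):
    """Parse the multi-line `layers` option into (class_name, num_nodes, activation) tuples."""
    specs = []
    for line in config.get(section, "layers").strip().splitlines():
        name, num_nodes, activation = line.split()
        specs.append((name, int(num_nodes), activation))
    return specs

# quick usage check against the example config above
config = ConfigParser()
config.read_string("""
[ANN]
num_columns = 3
layers =
    Dense 1000 relu
    Dense 1 sigmoid
""")
print(config2layers(config))  # [('Dense', 1000, 'relu'), ('Dense', 1, 'sigmoid')]
```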
I'm still not too happy with passing in `num_columns` just to figure out the input dimensions, but I don't see a way around it: the only way to know is either to calculate it explicitly from how many channels and columns you have, or to figure it out from the quiver directly. The problem with the quiver approach is that calling `quiver.vectorize` needs the model to already be defined... If you have a good idea, I'd love to hear it.
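To make the random seed / class weight bullet concrete, here is a rough sketch (not the actual classifier code; using `tensorflow.keras` and the same balanced-weight convention as scikit-learn's `class_weight='balanced'` purely as assumptions) of how the parsed layer specs, `num_columns`, a seed, and class weights could fit together:

```python
# Illustrative only: one way num_columns, a random seed, and balanced class
# weights could be wired into the Keras model; not the actual classifier code.
import numpy as np
import tensorflow as tf
from tensorflow import keras

def build_model(layer_specs, num_columns, seed=None,
                loss="binary_crossentropy", optimizer="adam", metrics=("accuracy",)):
    if seed is not None:
        np.random.seed(seed)
        tf.random.set_seed(seed)
    model = keras.Sequential()
    # the first layer is the only one that needs to know the input size
    name, num_nodes, activation = layer_specs[0]
    model.add(getattr(keras.layers, name)(num_nodes, activation=activation,
                                          input_shape=(num_columns,)))
    for name, num_nodes, activation in layer_specs[1:]:
        model.add(getattr(keras.layers, name)(num_nodes, activation=activation))
    model.compile(loss=loss, optimizer=optimizer, metrics=list(metrics))
    return model

def balanced_class_weights(labels):
    # same convention as scikit-learn's class_weight='balanced'
    classes, counts = np.unique(labels, return_counts=True)
    weights = len(labels) / (len(classes) * counts)
    return dict(zip(classes.tolist(), weights.tolist()))

# usage
specs = [("Dense", 1000, "relu"), ("Dense", 1, "sigmoid")]
model = build_model(specs, num_columns=3, seed=42)
# model.fit(X, y, class_weight=balanced_class_weights(y), ...)
```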