thanderrine OP t1_it8qwlf wrote

So I used to do transfer learning for my models before, right... But it kind of feels like I'm using someone else's work. If I'm using the weights and architecture of someone else's work, then how does it show my skills... You know what I mean.

All I do is take the image dataset and preprocess it so that it fits the model. So how can I possibly present something like that as my project if the majority of the work was done by someone else?

About tweaking the loss: I am kind of doing that for my model. I'm using focal Tversky loss, with gamma set to 0.65.
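For anyone curious, here's a minimal NumPy sketch of one common formulation of focal Tversky loss, FTL = (1 - TI)^gamma, where TI is the Tversky index. The alpha/beta weights (0.7/0.3 below) are illustrative defaults, not values from this thread; only gamma = 0.65 comes from the comment above.

```python
import numpy as np

def focal_tversky_loss(y_true, y_pred, alpha=0.7, beta=0.3, gamma=0.65, eps=1e-7):
    """Focal Tversky loss: FTL = (1 - TI)^gamma.

    TI = TP / (TP + alpha*FN + beta*FP); alpha/beta trade off false
    negatives against false positives. alpha/beta here are illustrative,
    not the thread author's settings.
    """
    y_true = np.asarray(y_true, dtype=float).ravel()
    y_pred = np.asarray(y_pred, dtype=float).ravel()
    tp = np.sum(y_true * y_pred)
    fn = np.sum(y_true * (1.0 - y_pred))
    fp = np.sum((1.0 - y_true) * y_pred)
    tversky = (tp + eps) / (tp + alpha * fn + beta * fp + eps)
    return (1.0 - tversky) ** gamma

# A perfect prediction drives the loss to ~0; a fully wrong one to ~1.
perfect = focal_tversky_loss([1, 0, 1], [1, 0, 1])
wrong = focal_tversky_loss([1, 0, 1], [0, 1, 0])
```

Gamma < 1 flattens the loss near TI = 1, which down-weights easy examples relative to plain Tversky loss.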


thanderrine OP t1_it3dyfy wrote

So yes, I do have independent training and validation sets (no image appears in both training and validation).
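A quick way to sanity-check that claim is to compare content hashes of the two splits, which catches duplicates even when files have been renamed. This is an illustrative helper, not part of the author's actual pipeline:

```python
import hashlib

def split_overlap(train_images, val_images):
    """Return hashes of images appearing in both splits.

    Arguments are iterables of raw image bytes. Hashing file contents
    (rather than comparing filenames) catches renamed duplicates.
    """
    train_hashes = {hashlib.sha256(b).hexdigest() for b in train_images}
    val_hashes = {hashlib.sha256(b).hexdigest() for b in val_images}
    return train_hashes & val_hashes

# Disjoint splits -> empty set; a shared image -> its hash is reported.
clean = split_overlap([b"img-A", b"img-B"], [b"img-C"])
leaky = split_overlap([b"img-A", b"img-B"], [b"img-B"])
```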

  1. I understand that. In this case, is there a specific metric you'd suggest?

  2. Totally agree. An increase in validation error could be overfitting, but what I'd also expect with overfitting is that my training accuracy should increase... Unless I'm wrong, in which case please correct me.

  3. Yeah, by 'learning rate' I meant the increase in accuracy with each epoch. Sorry about misusing the term.

  4. Unless I'm misunderstanding your question, the 128-neuron dense layer is a hidden layer; the last layer is of course a dense layer with 17 neurons. If you were talking about something else, do let me know.
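The classifier head described in point 4 can be sketched as a plain NumPy forward pass: a Dense(128, ReLU) hidden layer followed by a Dense(17, softmax) output. The 128/17 widths come from the comment above; the feature size (512) and random weights are purely illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x):
    e = np.exp(x - x.max())  # subtract max for numerical stability
    return e / e.sum()

# 128 hidden neurons and 17 output classes are from the thread;
# the 512-dim input feature vector is a hypothetical placeholder.
n_features, n_hidden, n_classes = 512, 128, 17
W1 = rng.normal(0.0, 0.05, (n_features, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 0.05, (n_hidden, n_classes))
b2 = np.zeros(n_classes)

def head(features):
    """Dense(128, relu) hidden layer, then Dense(17, softmax) output."""
    h = relu(features @ W1 + b1)
    return softmax(h @ W2 + b2)

probs = head(rng.normal(size=n_features))  # one probability per class
```

The softmax output sums to 1 over the 17 classes, so the final prediction is just the argmax.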

So I have studied the architectures and hyperparameters of VGG and ResNet, but you see, I want to understand what goes into saying "we're going to stack x convolutional layers and x2 dense layers with this many neurons." Where does that confidence come from? You know what I mean...

Sure, the hyperparameter tuning is great, and the result of every stack of convolutional layers is also great. But the core architecture, i.e. the number of layers and the number of neurons in each layer, is still a bit of a mystery in the papers.

So this is a project to give me a ballpark estimate: okay, "for an image of size f×f that could belong to x classes, the number of layers and neurons is around this range."

Anywho, thank you for replying, and thank you for your insights.
