Rune Johan Borgli
We use Saga, a work-in-progress system for automatic hyperparameter optimization.
We use transfer learning with models pre-trained on ImageNet.
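As a rough illustration of this setup, here is a minimal Keras sketch of loading an ImageNet-pretrained DenseNet169 and attaching a new classification head for this dataset; the frozen base and the single dense layer are assumptions, not necessarily the exact architecture used.

```python
# Minimal transfer-learning sketch (assumed setup): load an ImageNet-pretrained
# DenseNet169, freeze the convolutional base, and train a new classification head.
from tensorflow.keras.applications import DenseNet169
from tensorflow.keras import layers, models

NUM_CLASSES = 16  # number of classes in the dataset (see the class list below)

base = DenseNet169(weights="imagenet", include_top=False, pooling="avg",
                   input_shape=(224, 224, 3))
base.trainable = False  # keep the pre-trained convolutional features fixed

model = models.Sequential([
    base,
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="sgd",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```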
We use Bayesian optimization to drive the hyperparameter search.
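The search itself can be sketched as a generic Bayesian-optimization loop. The scikit-optimize library, the search space, and the train_and_evaluate helper below are illustrative assumptions, not Saga's actual implementation.

```python
# Illustrative Bayesian optimization over a few hyperparameters with
# scikit-optimize; the search space and objective are placeholder assumptions.
from skopt import gp_minimize
from skopt.space import Categorical, Real

space = [
    Real(1e-6, 1e-2, prior="log-uniform", name="learning_rate"),
    Real(0.0, 0.99, name="momentum"),
    Categorical(["sgd", "adam"], name="optimizer"),
]

def objective(params):
    learning_rate, momentum, optimizer_name = params
    # train_and_evaluate is a hypothetical helper that trains the transfer-learning
    # model with these hyperparameters and returns validation accuracy.
    accuracy = train_and_evaluate(learning_rate, momentum, optimizer_name)
    return -accuracy  # gp_minimize minimizes, so negate accuracy

result = gp_minimize(objective, space, n_calls=50, random_state=0)
print("Best hyperparameters found:", result.x)
```

Bayesian optimization fits this problem because each evaluation is a full training run: a surrogate model proposes promising configurations and so reduces the number of expensive runs needed.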
We also make changes to the dataset itself in an attempt to improve performance.
Best configuration found by the hyperparameter optimization:
Model: DenseNet169
Optimizer: SGD
Learning rate: 0.00005
Momentum: 0
Decay: 0.004
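Read as a Keras 2 style SGD configuration, the three numbers would map to the learning rate, momentum, and learning-rate decay; that mapping is an assumption based on the standard parameter order.

```python
# Recreating the reported best optimizer configuration, assuming the values are
# the SGD learning rate, momentum, and learning-rate decay (Keras 2 style API).
from keras.optimizers import SGD

optimizer = SGD(lr=0.00005, momentum=0.0, decay=0.004)
```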
The 16 dataset classes: (A) Ulcerative colitis, (B) Esophagitis, (C) Normal z-line, (D) Dyed lifted polyps, (E) Dyed resection margins, (F) Out of patient, (G) Normal pylorus, (H) Stool inclusions, (I) Stool plenty, (J) Blurry nothing, (K) Polyps, (L) Normal cecum, (M) Colon clear, (N) Retroflex rectum, (O) Retroflex stomach, and (P) Instruments.
Automatic hyperparameter optimization achieves good results without manual tuning.
Creating the optimal model is not the whole picture. We must also improve the dataset.
Future work should include preprocessing and k-fold cross-validation.
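A rough sketch of what the planned k-fold cross-validation could look like; the stratified five-fold split and the train_and_evaluate helper are assumptions.

```python
# Illustrative stratified k-fold cross-validation over image paths and labels;
# the fold count and the train_and_evaluate helper are placeholder assumptions.
import numpy as np
from sklearn.model_selection import StratifiedKFold

def cross_validate(image_paths, labels, n_splits=5):
    paths, labels = np.array(image_paths), np.array(labels)
    skf = StratifiedKFold(n_splits=n_splits, shuffle=True, random_state=0)
    scores = []
    for train_idx, val_idx in skf.split(paths, labels):
        # train_and_evaluate is a hypothetical helper that trains on the training
        # fold and returns accuracy on the validation fold.
        scores.append(train_and_evaluate(paths[train_idx], labels[train_idx],
                                         paths[val_idx], labels[val_idx]))
    return float(np.mean(scores)), float(np.std(scores))
```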