A Measure of Confidence of Artificial Neural Network Classifiers
21.05.2018

Andreas Gschossmann <andreas.gschossmann@oth-regensburg.de>
Simon Jobst <simon.jobst@oth-regensburg.de>
Jürgen Mottok <juergen.mottok@oth-regensburg.de>
Rudolf Bierl <rudolf.bierl@oth-regensburg.de>

Slide text:

- Blind faith: the model is treated as a black box (input -> model -> prediction). [Image placeholder; original note in German: teacher pointing at complicated formulas on a blackboard]
- Major uncertainties come from issues in the data: correlation, bias, noise.
- Method: an observer observes all activations of the model (input layer, hidden layer, output layer). The observer is an autoencoder used for anomaly detection, separating correctly classified samples from misclassifications.
- Threshold: a chosen threshold on the encoding error of a sample separates normal from untrustworthy behaviour.
- Experiment (MNIST), panels (a)-(d): distributions of encoding error for correctly classified vs. misclassified samples under increasing noise (0.25, 0.5, 0.75), rotation (13, 34, 89, 233) and translation (3, 5, 8, 13 pixels).
- Discussion, diagnostic coverage: great impact of the data itself. The more the data differs from the training data, the easier it is to recognize, and the more misclassifications are produced. The two classes are highly imbalanced.
- Currently working on (Scarabot Technologies, www.scarabot.de): surface plasmon resonance spectroscopy (polarised light source, prism, optical detection unit, angle vs. intensity), microfluidics, photoacoustics.
- Thank you! >> link website <<
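The observer idea above (an autoencoder over the model's activations, with a threshold on the encoding error) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the autoencoder is simplified to a linear one (PCA via SVD), the "activations" are synthetic, and all names, dimensions, and the 95th-percentile threshold are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "activations" of the observed model on training-like data:
# 500 samples, 32 units, lying near a 4-dimensional subspace plus small noise.
basis = rng.normal(size=(4, 32))
train_act = rng.normal(size=(500, 4)) @ basis + 0.05 * rng.normal(size=(500, 32))

# Fit the linear observer autoencoder: keep the top principal components
# (encoder and decoder weights are tied).
mean = train_act.mean(axis=0)
_, _, vt = np.linalg.svd(train_act - mean, full_matrices=False)
components = vt[:4]

def encoding_error(acts):
    """Per-sample reconstruction error of the observer autoencoder."""
    centered = acts - mean
    recon = centered @ components.T @ components
    return np.linalg.norm(centered - recon, axis=1)

# Threshold chosen from the training distribution (assumed: 95th percentile).
threshold = np.quantile(encoding_error(train_act), 0.95)

# In-distribution activations mostly stay below the threshold...
in_dist = rng.normal(size=(100, 4)) @ basis + 0.05 * rng.normal(size=(100, 32))
# ...while activations caused by perturbed inputs (noise, rotation,
# translation in the experiments) leave the subspace and score high.
out_dist = rng.normal(size=(100, 32))

flag_in = encoding_error(in_dist) > threshold    # few false alarms
flag_out = encoding_error(out_dist) > threshold  # many detections
print(flag_in.mean(), flag_out.mean())
```

The flagged fraction plays the role of the diagnostic-coverage discussion on the slides: the further the data drifts from the training distribution, the larger the encoding error and the easier the untrustworthy behaviour is to recognize.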
  1. contact
  2. black box
  3. blind faith
  4. major error
  5. data issues
  6. correlation
  7. bias
  8. noise
  9. all issues
  10. head
  11. observer
  12. observer
  13. observer
  14. observer
  15. head
  16. method
  17. distribution
  18. experiment
  19. mnist
  20. threshold
  21. distribution noise
  22. distribution rotation
  23. distribution translation
  24. further work
  25. thank you