We define a methodology that directly measures how well a deep neural network generalizes to unseen samples, without the need for a testing dataset. To do this, consider a network that maps an input variable x to an output variable y. For each specific input, the network uses a subset of its parameters to compute the corresponding output. We show that if this subset of parameters is similar across most inputs, the network has learned to generalize, whereas if the subset of parameters used depends strongly on the input, the network has relied on local parameters to memorize samples. These functional differences can be readily computed using algebraic topology, yielding an approach that accurately estimates the testing error of any given network. We report results on several computer vision problems, including object recognition, facial expression analysis, and image segmentation.
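The core intuition, that each input activates a subset of the network's parameters and that these subsets can be compared across inputs, can be illustrated with a toy sketch. Note this is an illustrative assumption on our part: the weight matrices, the active-unit masks, and the Jaccard similarity below are stand-ins, not the paper's actual algebraic-topology descriptors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer ReLU network with random weights (a hypothetical
# stand-in for a trained model).
W1 = rng.normal(size=(16, 8))

def active_mask(x):
    """Binary mask of hidden units that fire (ReLU output > 0) for input x."""
    return (W1 @ x > 0).astype(int)

def jaccard(a, b):
    """Overlap of two binary masks; 1.0 means identical active subsets."""
    union = np.maximum(a, b).sum()
    return np.minimum(a, b).sum() / union if union else 1.0

# Sample inputs and compare which parameter subsets each one activates.
# High mean similarity would suggest shared, input-independent parameter
# usage (generalization); low similarity suggests input-specific usage
# (memorization).
X = rng.normal(size=(20, 8))
masks = [active_mask(x) for x in X]
sims = [jaccard(masks[i], masks[j])
        for i in range(len(masks)) for j in range(i + 1, len(masks))]
mean_similarity = float(np.mean(sims))
print(f"mean pairwise mask similarity: {mean_similarity:.3f}")
```

In this sketch a single scalar summarizes the dependence of parameter usage on the input; the paper instead characterizes these functional differences topologically, which lets it estimate the testing error directly.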
PAPER: Computing Testing Error without a Testing Set