The Lund Neural Network Program - JETNET version 3.1
****** Latest date of change: Feb. 2, 1994 ******

Initialized for a 3 layered net with
   1 nodes in layer number 2 (output layer)
   8 nodes in layer number 1
   4 nodes in layer number 0 (input layer)
Standard Back-Propagation updating.
Weights and thresholds set randomly

Training using 2 event classes over 10 epochs with 50 bins
UNSETR called
Minimum decay length significance: 3.000
Minimum vertex multiplicity      : 3
Initialisation complete - total of 6032 good patterns
Fraction in class 1: .2455
Fraction in class 2: .7545
Training using 4524 events, testing using 1508
Event cache filled with 6032 events (space for 10000)

After  1 epochs, trainFoM= .1010 +- .0154   test FoM= .1776 +- .0221
After  2 epochs, trainFoM= .1828 +- .0138   test FoM= .1990 +- .0220
After  3 epochs, trainFoM= .1923 +- .0137   test FoM= .1806 +- .0220
After  4 epochs, trainFoM= .1991 +- .0136   test FoM= .1822 +- .0220
After  5 epochs, trainFoM= .2101 +- .0135   test FoM= .1891 +- .0219
After  6 epochs, trainFoM= .2054 +- .0134   test FoM= .1892 +- .0216
After  7 epochs, trainFoM= .2043 +- .0134   test FoM= .2025 +- .0213
After  8 epochs, trainFoM= .2108 +- .0133   test FoM= .1912 +- .0216
After  9 epochs, trainFoM= .2098 +- .0133   test FoM= .1952 +- .0213
After 10 epochs, trainFoM= .2124 +- .0132   test FoM= .1927 +- .0215

NETTRA finished - summary

Nodename          FoM +- error
Input 1 class 1   .1215 +- .0132
Input 1 class 2   .1215 +- .0132
Input 2 class 1   .0431 +- .0144
Input 2 class 2   .0431 +- .0144
Input 3 class 1   .1700 +- .0118
Input 3 class 2   .1700 +- .0118
Input 4 class 1   .0346 +- .0146
Input 4 class 2   .0346 +- .0146

           trainFoM +- error   testFoM +- error
Output 1   .2124 +- .0132      .1927 +- .0215

UNFIN: Input ntuple closed
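For readers without the JETNET manual at hand, the following is a minimal sketch, in Python rather than JETNET's Fortran 77, of the network the log describes: a 3-layer (4-8-1) feed-forward net trained by standard back-propagation, with weights and thresholds set randomly. The learning rate, the initialization range, and the sigmoid activations are assumptions chosen for illustration, not JETNET's documented defaults, and none of the names below come from the JETNET source.

    # Illustrative re-implementation (NOT the JETNET Fortran code) of the
    # 4-8-1 architecture and standard back-propagation update from the log.
    # eta, the init range, and the sigmoid activations are assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    # Layer sizes from the log: layer 0 (input) = 4, layer 1 = 8, layer 2 (output) = 1
    W1 = rng.uniform(-0.1, 0.1, (8, 4))   # hidden-layer weights, set randomly
    b1 = rng.uniform(-0.1, 0.1, 8)        # hidden-layer thresholds
    W2 = rng.uniform(-0.1, 0.1, (1, 8))   # output-layer weights
    b2 = rng.uniform(-0.1, 0.1, 1)        # output-layer threshold

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def forward(x):
        h = sigmoid(W1 @ x + b1)          # 8 hidden activations
        y = sigmoid(W2 @ h + b2)          # single output node
        return h, y

    def backprop_step(x, target, eta=0.1):
        """One standard back-propagation update for a single pattern."""
        global W1, b1, W2, b2
        h, y = forward(x)
        # Output delta for squared error with a sigmoid output node
        d2 = (y - target) * y * (1.0 - y)
        # Hidden deltas propagated back through the output weights
        d1 = (W2.T @ d2) * h * (1.0 - h)
        W2 -= eta * np.outer(d2, h)
        b2 -= eta * d2
        W1 -= eta * np.outer(d1, x)
        b1 -= eta * d1

    # Hypothetical usage mirroring the run above: 10 epochs over the
    # 4524-event training sample (train_x, train_y are placeholders).
    # for epoch in range(10):
    #     for x, target in zip(train_x, train_y):
    #         backprop_step(x, target)

Per-pattern (online) updating is shown here because the log reports a plain "Standard Back-Propagation updating" mode; batched variants would accumulate the deltas before applying them.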