The Lund Neural Network Program - JETNET version 3.1
****** Latest date of change: Feb. 2, 1994 ******

Initialized for a 3 layered net with
    1 nodes in layer number 2 (output layer)
   10 nodes in layer number 1
    6 nodes in layer number 0 (input layer)
Standard Back-Propagation updating.
Weights and thresholds set randomly

Training using 2 event classes over 30 epochs with 50 bins
UNSETR called
Minimum decay length significance: 3.000
Minimum vertex multiplicity      : 3
Initialisation complete - total of 8273 good patterns
Fraction in class 1: .3046
Fraction in class 2: .6954
Training using 6204 events, testing using 2069
Event cache filled with 8273 events (space for 12000)

After  1 epochs, trainFoM= .1258+- .0101  test FoM= .2522+- .0172
After  2 epochs, trainFoM= .2381+- .0097  test FoM= .3032+- .0166
After  3 epochs, trainFoM= .2769+- .0094  test FoM= .3140+- .0165
After  4 epochs, trainFoM= .2876+- .0093  test FoM= .3215+- .0165
After  5 epochs, trainFoM= .2973+- .0093  test FoM= .3304+- .0163
After  6 epochs, trainFoM= .3055+- .0091  test FoM= .3424+- .0160
After  7 epochs, trainFoM= .3123+- .0091  test FoM= .3357+- .0159
After  8 epochs, trainFoM= .3176+- .0090  test FoM= .3445+- .0157
After  9 epochs, trainFoM= .3199+- .0089  test FoM= .3463+- .0156
After 10 epochs, trainFoM= .3257+- .0089  test FoM= .3516+- .0152
After 11 epochs, trainFoM= .3286+- .0088  test FoM= .3533+- .0155
After 12 epochs, trainFoM= .3336+- .0088  test FoM= .3601+- .0153
After 13 epochs, trainFoM= .3356+- .0088  test FoM= .3567+- .0153
After 14 epochs, trainFoM= .3364+- .0088  test FoM= .3595+- .0153
After 15 epochs, trainFoM= .3398+- .0088  test FoM= .3604+- .0154
After 16 epochs, trainFoM= .3397+- .0087  test FoM= .3586+- .0154
After 17 epochs, trainFoM= .3426+- .0087  test FoM= .3591+- .0153
After 18 epochs, trainFoM= .3424+- .0087  test FoM= .3610+- .0153
After 19 epochs, trainFoM= .3436+- .0087  test FoM= .3626+- .0153
After 20 epochs, trainFoM= .3461+- .0087  test FoM= .3636+- .0153
After 21 epochs, trainFoM= .3452+- .0087  test FoM= .3655+- .0152
After 22 epochs, trainFoM= .3476+- .0087  test FoM= .3652+- .0152
After 23 epochs, trainFoM= .3478+- .0087  test FoM= .3646+- .0152
After 24 epochs, trainFoM= .3471+- .0087  test FoM= .3668+- .0153
After 25 epochs, trainFoM= .3494+- .0087  test FoM= .3680+- .0152
After 26 epochs, trainFoM= .3471+- .0087  test FoM= .3642+- .0152
After 27 epochs, trainFoM= .3485+- .0087  test FoM= .3680+- .0152
After 28 epochs, trainFoM= .3483+- .0087  test FoM= .3661+- .0152
After 29 epochs, trainFoM= .3488+- .0087  test FoM= .3671+- .0152
After 30 epochs, trainFoM= .3505+- .0087  test FoM= .3712+- .0152

NETTRA finished - summary
Nodename           FoM +- error
Input 1 class 1   .0792+- .0085
Input 1 class 2   .0792+- .0085
Input 2 class 1   .1013+- .0090
Input 2 class 2   .1013+- .0090
Input 3 class 1   .1252+- .0081
Input 3 class 2   .1252+- .0081
Input 4 class 1   .0346+- .0089
Input 4 class 2   .0346+- .0089
Input 5 class 1   .0861+- .0086
Input 5 class 2   .0861+- .0086
Input 6 class 1   .0437+- .0090
Input 6 class 2   .0437+- .0090

            trainFoM +- error   testFoM +- error
Output 1     .3505+- .0087       .3712+- .0152

UNFIN: Input ntuple closed
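The event counts reported in the log are internally consistent, which is a quick way to check that the pattern selection and train/test split ran as expected. A minimal sketch of those sanity checks, assuming only the numbers printed above (the roughly 75/25 split is inferred from the counts, not stated by the program):

```python
# Sanity checks on the counts reported in the JETNET log above.
total = 8273                      # "total of 8273 good patterns"
n_train, n_test = 6204, 2069      # "Training using 6204 events, testing using 2069"
frac_1, frac_2 = 0.3046, 0.6954   # reported class fractions

# Train and test samples together must account for every good pattern.
assert n_train + n_test == total

# The split works out to roughly 75% training / 25% testing (inferred, not stated).
print(f"train fraction: {n_train / total:.4f}")

# The class fractions imply the per-class event counts and must sum to the total.
n_class_1 = round(frac_1 * total)   # ~2520 class-1 events
n_class_2 = round(frac_2 * total)   # ~5753 class-2 events
assert n_class_1 + n_class_2 == total
print(f"class 1: {n_class_1} events, class 2: {n_class_2} events")
```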