The neuron is active (y=1) when x1w1 + x2w2 + x3w3 > T (where T is the threshold) and zero (inactive) otherwise. For programming purposes, it is easier to eliminate T by introducing another input (x0) which is always equal to 1, and another synapse or "weight" (w0). Depending on what w0 is, we can say that y=1 when x0w0 + x1w1 + x2w2 + x3w3 > 0 and y=0 otherwise. For reasons to be discussed later, it will be best to describe this as one continuous formula:

y = f(w0x0 + w1x1 + ... + wNxN)   (eqn 1)

where

f(x) = 1/(1 + e^(-x))   (eqn 2)

One can see that y will be pretty close to 1 when x0w0 + x1w1 + x2w2 + x3w3 > 0 and y will be pretty close to 0 otherwise.

Let us consider a simple application of a neural network. Consider a logical OR gate with two inputs, x1 and x2. The gate will return a value of 1 if x1, x2, or both are 1, and 0 if x1 and x2 are both zero. A single neuron can be made to emulate an OR gate. Considering equation 1 above, if one were to substitute w0 = -1/2, w1 = 1, w2 = 1, one will notice that y will be approximately 1 (at least greater than .62) when either x1 or x2 is 1. You will, of course, remember that x0=1. If they are both zero, the value of y will be approximately zero.

The key to neural programming is finding out what the "weights" (w0, w1, and w2) should be. Perhaps the simplest way to determine the weights is graphically, so we will start with that. Using the example of the OR gate, let us plot x1 vs. x2. The following figure is a diagram of an OR gate:

Figure 2: Graphical Representation of an OR Gate

In the above figure, the inputs that would make the output of the OR gate 1 are shown by a closed circle. Those which make it zero are shown by an open circle. The line w1x1 + w2x2 + w0 = 0 is drawn on the graph. Note that all of the points where the output would be 1 are on one side of the line, and all of those where the output is zero are on the other (w1 = 1, w2 = 1, w0 = -1/2). Note that instead of using conventional OR logic, we can make an OR gate by simply looking at x1 + x2 - 1/2.
If it is less than zero, the output is zero. If it is greater than zero, then the output is 1. If we apply our thresholding function f(x) from above, the quantity f(x1 + x2 - 1/2) will approximate that of the OR gate. This is not exactly the same as the OR gate, however. Note that (2,2) will give an output of 1 on the above graph, and (-1,-3) will give an output of zero. It will be obvious that if one were to try to make an AND gate using the same type of approach, there would be a different line that would divide the "1" and "0" points.

Can this approach be used for more than simple logic gates? Of course. For the last several years, the United States Consumer Product Safety Commission (CPSC) has had a project on cooking fires. It appears that every year, tens of thousands of cooking fires occur, costing hundreds of millions in property damage. The problem is to stop the (stovetop) cooking fire before it begins. One obvious solution is to put a thermocouple in the burner to prevent it from getting too hot. Let us attempt to solve this problem with a simple neural network.

We know that there are four basic elements to a fire: heat, fuel, oxygen, and an uninhibited chain reaction releasing hydroxyl radicals. We can get a measurement representative of heat by measuring temperature (with a thermocouple). The fuel in a cooking fire is a flammable gas (when you set your bacon on fire, it's actually the flammable gases that go first), which can be measured with a gas sensor. Oxygen we can't do much about, except assume that it's always there. As for the uninhibited chain reaction, by that point there's already a fire, so we'll forget about that one (many fire suppressants operate by not allowing the uninhibited chain reaction to take place, thereby preventing fire). The CPSC placed a thermocouple in a stove burner so it touches the bottom of the pan. A gas sensor (a general alcohol sensor--which also detects cooking hydrocarbons) is placed on the stove hood.
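Before moving on, the single-neuron OR gate described above can be sketched in a few lines of code. This is my illustration, not one of the article's listings, and it is in modern Python rather than Color BASIC so it can be tried on any machine; the weights w0 = -1/2, w1 = 1, w2 = 1 are the ones given in the text for equations 1 and 2:

```python
import math

def f(x):
    # Thresholding function from eqn 2: squashes any input into (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def neuron(x1, x2, w0=-0.5, w1=1.0, w2=1.0):
    # Eqn 1 with the constant input x0 = 1: y = f(w0*x0 + w1*x1 + w2*x2).
    return f(w0 * 1.0 + w1 * x1 + w2 * x2)

# Run the four OR-gate input combinations through the neuron.
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    y = neuron(x1, x2)
    print(x1, x2, round(y, 2), round(y))
```

Rounding y to the nearest whole number reproduces the OR truth table, and, as the text notes, y is about .62 when exactly one input is 1.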
For testing, several foods were cooked normally and burned to the point where they caught fire. Foods tested were chicken, bacon, sugar, and oil. Readings from the gas sensor (which outputs a voltage) and the thermocouple (which outputs a voltage) are shown in the figure below:

Figure 3: One Application of a Neural Network

Note that a diagonal line separates points taken at times within 120 seconds of ignition (a probable fire hazard) from points which are not. Note that some points which are not within 120 seconds of ignition are above the line. In a network, this is an "error" (which is not good, but is not necessarily a reason to throw away the network). In the case of the graph above, take the voltage of the gas (alcohol) sensor as x1, and the temperature as x2. Plugging into equations 1 and 2, we see that f(77x1 + x2 - 206) to a good approximation will be 1 when there is about to be a fire, and 0 when there is not.

Suppose for the moment that we just used a simple thermostat instead of a combination of temperature and gas sensor readings. A constant trip temperature would equate to a horizontal line on the graph above. Note that there really isn't a single good horizontal line which divides the points which are within 120 seconds of ignition from those that are not. In that sense, adding a second sensor and using the neural network allows more cooking scenarios without reducing safety.

It is tempting to say that the neural network is the "be all and end all" of computer programs. After all, it can not only be an OR or an AND gate, but also save millions in property damage and perhaps even save lives. It is very important, however, to recognize the limitations of the neural network. Going back to logic gates, consider the following Exclusive-OR (XOR) gate:

Figure 4: Graphical Representation of an XOR Gate

Here we run into a problem with our one-neuron neural network. This time there is no one line which separates the "1" from the "0" values.
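That claim can be checked numerically. The following sketch (mine, not from the article; Python for convenience) brute-forces a grid of candidate weights looking for a line w0 + w1x1 + w2x2 = 0 that puts every "1" point on one side and every "0" point on the other. A line turns up for OR and AND, but never for XOR:

```python
from itertools import product

def separable(truth_table, grid=None):
    # Search a coarse grid of weights (w0, w1, w2) for a line that puts
    # every "1" point strictly on the positive side and every "0" point
    # on the other side.
    grid = grid or [i / 2 for i in range(-6, 7)]  # -3.0 .. 3.0 in steps of 0.5
    for w0, w1, w2 in product(grid, repeat=3):
        if all((w0 + w1 * x1 + w2 * x2 > 0) == (t == 1)
               for (x1, x2), t in truth_table.items()):
            return True
    return False

OR  = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 1}
AND = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}
XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

print(separable(OR), separable(AND), separable(XOR))  # True True False
```

The failure for XOR is not an artifact of the coarse grid: requiring w0 + w1 > 0, w0 + w2 > 0, w0 <= 0, and w0 + w1 + w2 <= 0 simultaneously is a contradiction (add the first two and compare with the last two), so no line exists at all.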
The neural network solution to this problem is to use a different (more sophisticated) network. One that works is shown below:

Figure 5: Two Layer Neural Network

This is more of a true neural network than the one-neuron examples shown before. Note that it has two layers. The first layer has two neurons, each with two inputs. The second layer has one neuron with two inputs. The output of neuron 1, which I'll call x'1 (not shown), is f(w11x1 + w21x2 + w01). Remember that w01 is a constant. Likewise, the output of neuron 2 is x'2 = f(w12x1 + w22x2 + w02). The final output (y) will be y = f(v1x'1 + v2x'2 + v0). In this case, the easiest way to find the solution is graphically:

Figure 6: Two Layer Network Solving the XOR Gate Problem

The two neurons correspond to two lines. We can see that one solution is w11=1, w12=1, w21=1, w22=1, w01= -1/2, and w02= 1/2. We'll choose the second layer weights (v1, v2 -- there's also a v0, which is a constant input to the second layer neuron) such that when the output of the first neuron (x'1) is zero and the output of the second neuron (x'2) is 1, it will output 1; otherwise, it will output 0. One solution is v1=1, v2=1, v0= -1/2.

So far, we have examined neural networks with two inputs. Suppose one had three. In this case, we might be able to find our "decision regions" graphically by drawing a three dimensional diagram. For networks with more than three inputs, this becomes a problem, because we can't simply draw a picture to find what the "weights" (the w's and v's) should be (or at least I haven't been able to do so). To solve the problem of more than three inputs, we go back to the concept of error. We have been concerned with networks whose output is either 1 or 0. An error occurs when the network gives us the wrong answer. We can define a total error as the sum of the errors. For reasons that make the mathematics simpler, we will define the total error as the sum of the squares of the errors.
In other words:

E = (y - T)^2   (eqn 3)

In equation 3, y is the actual output of the network, and T is the desired output. Using the example of the OR gate, T would be 1 when either x1 or x2 was 1 (0 otherwise), and y would be the result obtained from actually evaluating

y = f(w0x0 + w1x1 + w2x2)   (eqn 4)

To find the w's without using a graph, we will attempt to minimize the error from eqn 3. When the error is at a minimum, its derivative will be zero. Note that in the OR gate, we have 3 weights: w0, w1, and w2 (remember that the equation of the line was w0 + w1x1 + w2x2 = 0). For simplicity, instead of leaving w0 by itself, we'll think of it as being w0x0, where x0 = 1. Given our four training points (1,1 => 1), (1,0 => 1), (0,1 => 1), (0,0 => 0), we will find the w's such that the error E is minimized. In this simple case, it could be solved in closed form, but we'll do it numerically, so that we will have a method to use on larger networks. One numerical approach is to start the unknowns (the w's in this case) off with small, different values, and adjust them using the method of gradient descent:

dw/dt = -h dE/dw   (eqn 5)

Equation 5 states that the rate at which you change the weight is proportional (h is a constant) to the gradient of the error (which in our case is the derivative of the error with respect to w). We have three w's. For purposes of example, I'll start with w1. First, let's find the gradient:

dE/dw1 = 2(y - T) f'(w0x0 + w1x1 + w2x2) x1   (eqn 6)

One of the first things to notice is that f' (the derivative of f) can be written in terms of f itself:

df(x)/dx = d/dx [1/(1 + e^(-x))] = e^(-x)/(1 + e^(-x))^2 = f(x)(1 - f(x))   (eqn 7)

Keeping that in mind, we will apply equation 5 to obtain

dw1/dt = -2h y(1 - y)(y - T) x1   (eqn 8)

Provided that we don't take too big of a time increment, we can approximate

dw1/dt = Δw1/Δt   (eqn 9)

and apply this to equation 8. Since 2, h, and Δt are constant, we will combine them into one constant (still written h), which will be referred to as the "learning rate".
We obtain:

Δw1 = h y(1 - y)(T - y) x1   (eqn 10)

or in computerese we would say that for each iteration we adjust w1 in the following way:

w1 = w1 + h y(1 - y)(T - y) x1   (eqn 11)

We would then repeat the process for the rest of the w's. In general, we would update the ith weight using the relation:

wi = wi + h y(1 - y)(T - y) xi   (eqn 12)

In order to find the weights of the two layer network in figure 5, the process would be repeated, only using the functions for the two layer network. I'll spare the reader by not going through the calculations and simply state that the update laws for the two layer network are:

vj = vj + h y(1 - y)(T - y) x'j
wij = wij + h y(1 - y)(T - y) vj x'j(1 - x'j) xi   (eqn 13)

where x'j is the output of the jth middle layer neuron. Enough theory. Let's get down to programming. First, the following source code is the one-neuron OR gate example:

10 REM *ONE NEURON NEURAL NETWORK*
20 H=10:REM LEARNING RATE
30 X(3)=1
40 L=1:REM GAIN
50 CLS
60 DEF FN F(X)=1/(1+EXP(-L*X))
70 PRINT
80 PRINT" ONE NEURON NEURAL NETWORK"
90 PRINT" BY AARON BANERJEE"
100 PRINT
110 PRINT" (OR GATE)"
120 PRINT:PRINT:PRINT
130 REM
140 REM INITIALIZE W(I,J)
150 REM NOTE: SINCE THIS IS
160 REM A SINGLE NEURON, OUR
170 REM W'S WILL BE W(I), WHERE
180 REM I GOES FROM 1 TO 3
190 REM
200 FOR I=1 TO 3:W(I)=2*RND(0):NEXT I:REM SMALL RANDOM NUMBERS
210 REM
220 LET E9=0:REM INITIALIZE COUNTER
230 REM
240 REM MAIN LOOP
250 REM
260 RESTORE:REM RESET DATA
270 F1=0 :REM ASSUME W'S ARE GOOD
280 E9=E9+1:REM ADD 1 TO THE COUNTER
290 FOR G=1 TO 4:REM 4 SAMPLES
300 READ X(1),X(2),T
310 GOSUB 550:REM EVALUATE Y
320 GOSUB 450:REM CHANGE W(I)
330 IF ABS(Y-T)>.1 THEN F1=1:REM ERROR TOO GREAT. SET FLAG
340 NEXT G
350 IF F1=1 THEN 260:REM THE ERROR IS TOO GREAT. DO AGAIN.
360 CLS
370 PRINT"I HAVE FINISHED LEARNING."
380 PRINT"LEARNING RATE: "H:PRINT"NUMBER OF ITERATIONS: "E9
390 PRINT"TRY A TEST POINT: (X1,X2) TO STOP"
400 INPUT X(1),X(2)
410 GOSUB 550:REM FIND Y
420 PRINT"FOR THAT INPUT, Y="Y" ("INT(Y+.5)")":PRINT
430 PRINT"TRY ANOTHER."
440 GOTO 400
450 REM
460 REM FIND DELTA W(I)'S
470 REM AND ADD THEM
480 REM
490 FOR I=1 TO 3
500 D(I)=H*Y*(1-Y)*(T-Y)*X(I)
510 NEXT I:REM FOUND THEM
520 FOR I=1 TO 3:W(I)=W(I)+D(I):NEXT I:REM ADDED THEM..
530 RETURN
540 REM
550 REM EVALUATE Y
560 REM
570 Q=0
580 FOR I=1 TO 3:Q=Q+W(I)*X(I):NEXT I
590 Y=FNF(Q)
600 RETURN
610 DATA 1,0,1
620 DATA 0,1,1
630 DATA 1,1,1
640 DATA 0,0,0
650 REM
660 REM THE ABOVE DATA IS IN THE FORM:
670 REM DATA X1,X2,T
680 END

In the above program, the data used to train the network is in lines 610 to 640. It is currently configured to be an OR gate, but you could change it to an AND gate (or NOR, NAND, etc.) if you like. Run the program. When it finishes training, it will ask you to input two numbers. Go ahead and test it to make sure it is an OR gate if you like. Since you can change the function (e.g. OR, AND) of the program without changing the code, we can say that this is a generic computer program. If you try to change the data to an XOR gate, the program will run, but it will never get out of the loop from lines 260-350, because no matter where the computer draws the line, one point will be on the wrong side (where the error will be more than .1).

Note that the learning rate has been assigned a value of 10. Where did "10" come from? I guessed at it. In general, you will want the largest learning rate you can get away with (time to convergence is less), but if you choose a learning rate which is too high, the program will become unstable and usually crash. The following listing is the two layer network used for the XOR gate:

10 REM
20 CLS
30 PRINT"DOES THIS TERMINAL ACCEPT THE":PRINT CHR$(34)"VITAMIN E"CHR$(34)" POKE? (M=65495)"
40 PRINT"IF THIS IS NOT A TRS-80 COLOR":PRINT"COMPUTER, THE ANSWER IS NO (N)"
50 INPUT A$:IF A$="Y" THEN POKE 65495,0
60 IF NOT(A$="Y") THEN RANDOMIZE 432
70 CLS
80 PRINT"TWO LAYER NEURAL NETWORK"
90 PRINT
100 PRINT" BY AARON BANERJEE"
110 PRINT" 28 FEB 1998":PRINT:PRINT:PRINT
120 H=20
130 FOR I=1 TO 3:FOR J=1 TO 3:W(I,J)=RND(0):NEXT J,I
140 IF A$="Y" THEN 160
150 FOR I=1 TO 3:FOR J=1 TO 3:W(I,J)=RND:NEXT J,I
160 FOR I=1 TO 3:W(I,3)=0:NEXT I
170 L=1
180 X(3)=1:X1(3)=1
190 DEF FN F(X)=1/(1+EXP(-X*L))
200 REM ENTER INITIAL DATA
210 PRINT"TEACHING THE SYSTEM: PLEASE WAIT":PRINT"THIS CAN TAKE UP TO 5 MINUTES."
220 RESTORE
230 F1=1
240 REM
250 READ X(1),X(2),T
260 IF T=-1 THEN RESTORE:GOTO 310
270 GOSUB 640
280 GOSUB 420
290 IF ABS(Y-T)>.1 THEN F1=0
300 GOTO 240
310 REM
320 E9=E9+1:REM KEEP COUNT OF HOW MANY ITERATIONS
330 IF F1=1 THEN 360
340 IF E9>600 THEN PRINT"*** THE PROGRAM HAS FAILED ***":PRINT:PRINT"..RESTARTING...":E9=0:GOTO 120
350 GOTO 220
360 PRINT STRING$(30,".")
370 PRINT"LEARNING RATE: "H:PRINT"NUMBER OF ITERATIONS: "E9
380 PRINT"TRY A TEST POINT.":INPUT"X1,X2";X(1),X(2)
390 GOSUB 640:PRINT"THAT INPUT CORRESPONDS TO AN":PRINT"OUTPUT OF: "Y" ("INT(Y+.5)")"
400 GOTO 380
410 REM
420 REM FIND DELTA WIJ
430 REM
440 FOR I=1 TO 3:FOR J=1 TO 3
450 D1=Y*(1-Y)*(T-Y)
460 D2=X1(J)*(1-X1(J))*D1*W1(J)
470 C1(J)=H*D1*X1(J)
480 C(I,J)=H*D2*X(I)
490 NEXT J,I
500 REM
510 REM WE'VE FOUND THE WEIGHTS... NOW LET'S ADD THEM ON
520 REM
530 FOR I=1 TO 3:FOR J=1 TO 2
540 W(I,J)=W(I,J)+C(I,J)
550 NEXT J,I
560 FOR J=1 TO 3
570 W1(J)=W1(J)+C1(J):NEXT J
580 REM
590 REM DONE FIXING W(IJ) AND W1(J)
600 RETURN
610 REM
620 REM EVALUATE Y,X1
630 REM
640 Q=0
650 FOR J=1 TO 2
660 Q=0
670 X(3)=1
680 FOR I=1 TO 3
690 Q=Q+W(I,J)*X(I)
700 NEXT I
710 X1(J)=FNF(Q)
720 NEXT J
730 Q=0
740 FOR J=1 TO 3
750 Q=Q+W1(J)*X1(J)
760 NEXT J
770 Y=FNF(Q)
780 RETURN
790 REM
800 REM ENTER TRAINING DATA HERE
810 REM
820 REM FORM: DATA X1,X2,(DESIRED OUTPUT)
830 REM
840 REM EXCLUSIVE OR GATE
850 REM
860 DATA 0,0,0
870 DATA 1,0,1
880 DATA 0,1,1
890 DATA 1,1,0
900 DATA -1,-1,-1

I've written this program to allow you to have any number of data points you wish. So far, we've been concerned with two input logic gates, which have 4 input data points. There's no reason why that has to be the case. For that matter, input values do not have to be 0 or 1. Note the case of the fire prediction application in figure 3, where the output is binary (either there is going to be a fire or there isn't), but the inputs are analog (voltage, temperature). Add any number of data points you want (just make sure the last one is -1,-1,-1).

The number of applications of the neural network is endless. The following example will attempt to identify who is at the keyboard by the manner in which he or she types (e.g. the network is trained with several people's typing samples and guesses at an unknown one). The neural network requires numerical inputs. Telegraph operators of the past could sometimes recognize who was on the other end of the line by the way the other person was tapping the key. In this example, we will make an assumption that people have certain (and hopefully measurable) idiosyncrasies in their typing. For example, a left handed person might have shorter timings between letters typed with the left hand (e.g.
"as", "cat"), which could be used to distinguish themselves from a right handed person (where presumably the reverse is true). For illustration, the phrase chosen was "NOW IS THE TIME FOR ALL GOOD MEN". The program is set to "know" up to 8 people. Each person enters a typing sample. The times between keystrokes are measured and used as data. Those who ran both the one and two layer neural net programs above will appreciate that the two layer is significantly slower than the one layer. In this case, instead of having two inputs, we will have 31. The TRS-80 Color Computer is not a fast computer; big number-crunchers quickly become unmanageable on the Coco. On the other hand, if we opt for a simple one-layer network, it might not be able to classify the problem. The most obvious solution is to forget about the Coco and write the program on another computer. While this is the easy way out, it is not necessarily the best. Necessity is the mother of invention. If we are concerned about the program taking too long, we will attempt to invent a method that does the same job in less time. The first step is to examine the learning law in equation 12. Note that the term y(1-y) will always result in a small number (because y is always very close to either 1 or zero). In a single layer network, we don't really need the thresholding function (equations 1 and 2) for training purposes. Repeating the process of deriving the learning law for a neuron without the thresholding function, we obtain the following: EMBED Equation  (eqn 13) The only problem with not using a threshold function is we don't know what to set the desired output to. It is stated without proof that using the function EMBED Equation  (eqn 14) can be used with the learning law in equation 13 to allow a neuron with basically two states, -1 and 1. This function only works in a single layer network. 
To increase the ability of the single layer network to classify data, instead of just plugging in the data (the 31 times between keystrokes in this case), each time will also be squared and presented as an additional input. The reason why this will most likely help is simple: if we did not use the squared terms, the decision region for a neuron is marked off by a line (w1x1 + w2x2 + w0 = 0). Adding the squared terms (w1x1 + w2x2 + w3x1^2 + w4x2^2 + w0 = 0) describes the equation of an ellipse (which is better at classifying in general than a single line). This is not a "perfect" substitution. Note that for the range fire data in figure 3, two lines (neurons) would perform better than an ellipse. Technically, an ellipse should involve an x1x2 term (cross products), but we will see if the network will do without (all possible cross products for 31 inputs would lead to a huge array).

The network to identify people on the keyboard is shown in figure 7 below:

Figure 7: Network to Identify the Person at the Keyboard

Suppose there are n people. We establish a neuron for each person and collect one or more data samples from each person (e.g. if there were 5 people, you would need 5 neurons). The samples are fed to the inputs (the t's in this case). Note that in the case of 31 times, we will have 63 inputs (31 t's, 31 t squareds, and 1 constant input). The neurons are trained such that each will produce an output of 1 only when one of that person's data samples is shown to the network, and -1 if anyone else's is. After the network is trained, one of the people (preferably one for which it is trained) types in a sample and presents it to the network. The neuron whose output is 1 is who they most likely are. Note that for some unknown samples, more than one neuron can have an output of 1, or perhaps none of them do. This indicates that the network is not sure (no outputs are 1) or thinks that you could be one of a group of people.
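A small numerical illustration (mine, not the article's) of why the squared inputs help: take points on a line, with class 1 inside the interval [-1, 1] and class 0 outside. No single threshold on (1, x) can separate them, but adding the squared feature (1, x, x*x) makes it easy; the weights (1.5, 0, -1) below are one hand-picked solution:

```python
from itertools import product

def side(weights, feats):
    # Which side of the decision surface a feature vector falls on.
    return sum(w * f for w, f in zip(weights, feats)) > 0

# Points on a line: class 1 inside [-1, 1], class 0 outside.
points = [(-2, 0), (-1, 1), (0, 1), (1, 1), (2, 0)]

# With the squared feature, 1.5 - x*x > 0 exactly when |x| is small.
w = (1.5, 0.0, -1.0)
results = [side(w, (1, x, x * x)) == (label == 1) for x, label in points]
print(all(results))  # True

# Exhaustive grid search confirms no purely linear threshold (w0, w1) works.
grid = [i / 2 for i in range(-8, 9)]
linear_ok = any(
    all(side((w0, w1), (1, x)) == (label == 1) for x, label in points)
    for w0, w1 in product(grid, repeat=2)
)
print(linear_ok)  # False
```

The linear case is provably impossible (the constraints at x = -1, 1 force w0 > 0 while those at x = -2, 2 force w0 <= 0), which is exactly the kind of problem the squared terms solve.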
Typing patterns are not a very accurate way to identify people, so some error is to be expected. The following is the code for the neural net program to identify people at the keyboard:

10 CLS
20 DIM D(63,8),W(63,8),N$(8),ID(8),B(32)
30 DIM A1(64)
40 PC=0:UN=0
50 EM=0
60 H=500
70 PRINT" NEURAL IDENTIFIER"
80 PRINT:PRINT
90 GOSUB 110
100 GOTO 190
110 REM get usernames
120 INPUT "HOW MANY USERS";U
130 FOR I=1 TO U
140 PRINT"REMEMBER YOUR USER NUMBER!"
150 PRINT "USER "I", WHAT IS YOUR NAME?"
160 LINE INPUT "> ";N$(I)
170 NEXT I
180 RETURN
190 REM GET DATA FROM USERS
200 CLS
210 PRINT" MAIN MENU"
220 PRINT:PRINT
230 PRINT"1. ADD DATA"
240 PRINT"2. QUIT TO EMULATION MODE"
250 PRINT
260 INPUT C:C=INT(C):IF C<1 OR C>2 THEN 260
270 IF C=1 THEN GOSUB 300
280 IF C=2 THEN GOSUB 580:GOTO 880
290 GOTO 190
300 REM GET DATA
310 IF EM=0 THEN PC=PC+1
320 CLS
330 IF EM=1 THEN 360
340 PRINT"NOW ENTERING DATA TAB #"PC
350 INPUT"PLEASE ENTER YOUR USER NUMBER";ID(PC)
360 PRINT"PLEASE TYPE THIS PHRASE AS FAST"
370 PRINT"AS YOU CAN WITHOUT MAKING ANY"
380 PRINT"MISTAKES. IF YOU MAKE A MISTAKE";
390 PRINT"PRESS THE BACKSPACE KEY."
400 PRINT
410 PRINT"NOW IS THE TIME FOR ALL GOOD MEN"
420 TIMER=0
430 B$=""
440 FOR I=1 TO 32
450 A$=INKEY$:IF A$="" THEN 450
460 B(I)=TIMER
470 IF A$=CHR$(8) THEN I=40 ELSE B$=B$+A$:PRINT A$;
480 NEXT I
490 IF I>39 THEN PRINT:INPUT"PRESS ENTER TO TRY AGAIN";A$:PC=PC-1:GOTO 300
500 IF B$<>"NOW IS THE TIME FOR ALL GOOD MEN" THEN PRINT"PLEASE MAKE SURE THE LINES ARE IDENTICAL.":I=42:GOTO 490
510 FOR I=1 TO 31:A1(2*I)=B(I+1)-B(I):A1(2*I+1)=A1(2*I)*A1(2*I):NEXT I
520 A1(1)=1
530 FOR I=1 TO 63
540 IF EM=1 THEN D(I,0)=A1(I) ELSE D(I,PC)=A1(I)
550 NEXT I
560 RETURN
570 REM
580 REM find the weights
590 REM
600 FOR J=1 TO U
610 CLS
620 PRINT@229,"NOW TRAINING NEURON "J
630 IC=0
640 FOR I=1 TO 63:W(I,J)=RND(0):NEXT I
650 E=1
660 FOR S=1 TO PC
670 IF ID(S)=J THEN T=1 ELSE T= -1
680 GOSUB 790
690 IF ABS(Y-T)<.2 THEN 740 ELSE E=0
700 FOR I=1 TO 63
710 W(I,J)=W(I,J)+H*(T-Y)*D(I,S)
720 NEXT I
730 IC=IC+1
740 PRINT @ 261,USING" ... ITERATION ####";IC;
750 NEXT S
760 IF E=0 THEN 650
770 NEXT J
780 RETURN
790 REM EVALUATE Y
800 REM
810 Q=0
820 FOR II=1 TO 63
830 Q=Q+W(II,J)*D(II,S)
840 NEXT II
850 IF Q< -1 THEN Y= -1 ELSE IF Q>1 THEN Y=1 ELSE Y=Q
860 RETURN
870 REM
880 REM emulation section
890 REM
900 CLS:EM=1
910 GOSUB 360
920 S=0
930 PRINT:PRINT"YOU ARE MOST LIKELY TO BE ONE "
940 PRINT"OF THE FOLLOWING PEOPLE:":PRINT
950 FL=0
960 FOR J=1 TO U
970 GOSUB 790
980 IF Y>.5 THEN PRINT N$(J):FL=1
990 NEXT J
1000 IF FL=0 THEN PRINT"I HAVE NO IDEA WHO YOU ARE."
1010 PRINT
1020 INPUT"PRESS ENTER TO TRY AGAIN";A$
1030 GOTO 880
1040 END

This program first asks how many users there are and for each to enter his/her name. Although it records the user's name, it keeps track of them by user number. There are two modes to the program: adding data/training, and emulation. A menu will give the user a choice to add data or to go to emulation mode. It is important that each user enter at least one sample of data, and preferably at least two or three.
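The feature-building step in lines 510-520 above is worth seeing in isolation. The sketch below (mine; Python for convenience) turns raw keystroke timestamps into the interleaved time/time-squared inputs plus the constant input; the five timestamp values are hypothetical, and a real sample from the program would have 32 of them:

```python
def feature_vector(timestamps):
    # Mirrors lines 510-520 of the BASIC program: from the keystroke
    # timestamps, form the inter-keystroke times, then present each
    # time and its square as inputs, plus one constant input of 1.
    gaps = [t2 - t1 for t1, t2 in zip(timestamps, timestamps[1:])]
    feats = [1.0]            # constant input (like A1(1)=1)
    for g in gaps:
        feats.append(g)      # the inter-keystroke time itself
        feats.append(g * g)  # its square, for the "ellipse" terms
    return feats

# Hypothetical timestamps (in timer ticks) for a short 5-key sample.
ticks = [0, 4, 9, 11, 18]
v = feature_vector(ticks)
print(len(v), v)  # 9 features: 1 constant + 4 gaps + 4 squares
```

With the full 32-keystroke phrase this yields the 63 inputs (1 constant + 31 times + 31 squares) that each neuron weighs.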
When the add data choice is selected, the user at the keyboard will enter his/her user number. The Coco will then proceed to have the user type in a sample. This is equivalent to a data point in figures 2, 4, or 6. The more data that is entered, the more accurate the system will be, but the longer it will take to train. I've found that 3 samples for each user seem to suffice.

Experimentation will indicate that this is not a perfect identification scheme. It could be used in conjunction with passwords in certain applications for added security, but I have found it to be more a form of entertainment. If the network is poorly trained, or if it is trained with several "hunt and peck" people (inherently inconsistent), it can produce unpredictable and sometimes amusing results. More work could be put into this network; I've designed it to be feasible on a TRS-80 Color Computer. More complex networks might be able to do a more accurate job of identifying people, but then again, they may simply take more time without adding any more accuracy at all. As for security, this network could be used in conjunction with passwords so that it would matter not only what the password was, but how it was typed. Then again, what if the user were to break a finger...

There are some important quirks to this program:

1. The program uses the TIMER function to measure time between keystrokes. BASIC interpreters which don't have a TIMER function (e.g. non-Extended Color BASIC, GWBASIC, etc.) won't be able to run the program without modification. High speed or "Vitamin E" pokes alter the rate of the timer. If you use a high speed poke, make sure it is in effect during both training and emulation or the system won't work properly.

2. When the computer collects the typing sample, there is no cursor (which is disorienting to some people). It doesn't start timing until you press the first key (an "N"). Once you do, the clock is running.
If you stop to look at the screen (or do something you otherwise wouldn't do), you will be affecting your data.

3. In some (rare) cases, the network won't train. This means the people in question have typing habits which are too similar to discern (at least by measuring timing between keystrokes) with this network. This is analogous to trying to solve the exclusive-or problem with a single neuron. Sometimes changing the learning rate in line 60 to a smaller number will get the system to converge.

4. Flashy human-computer interfaces were never my strong suit in programming. This program is somewhat user unfriendly.

5. Although the program will work with only two people, it seems to work best with 3 or 4 (that's all it's been tested for). With two people, sometimes it sets one as a "default" person. It seems to work best for typists who use all 10 fingers, as those who "hunt and peck" tend to be more erratic in their typing.

Aaron Banerjee
7620 Willow Point
Falls Church, VA 22042

Reference: Lim, et al. "Study of Technology for Detecting Pre-Ignition Conditions of Cooking Related Fires Associated with Electric and Gas Ranges: Phase III". Feb 23, 1998. Prepared by the Directorates of Laboratory Sciences and Engineering Sciences.
Times New Roman-!'Times New Roman-!1"System- Equation w i,j =w i,j +hy(1-y)(T-y)v j x j (1-x j )x i S METAFILEPICT{+  .1  @ & MathType` Times New Roman-!w@!w !y !y!T!y!v_!x!x!x Times New Roman+-!i!j.!i!j!j6!j!jf!ij Times New Roman-!,!,$!'s!' Times New Roman+-!( !)H!(}!)!( !)Symbol-!=!+w !-H!-Q!-1Symbol-!hg Times New Roman+-!1A!1"System- Equation w i,j =w i,j +h(T-y)x i METAFILEPICT4^-  .1    & MathType`@ Times New Roman+-!w@@!w@ !T@ !y@!x Times New Roman-!i!j.!i!j!ij Times New Roman+-!,!,@ Times New Roman-!(@T !)@Symbol-!=@!+@ !-@1Symbol-!h"System-$P6 Equation y=1x-1        (x1)        (-1<x<1)        (x -1)gso METAFILEPICT,  .1  @` & MathType`8Times New Roman+-!yC!xxSymbol-!=!-@ ! !@7Times New Roman+-!1!1@ ! @!(@m!x@ !1@ !) ! !(a!-1 !<n !x !<!1!) ! !(m!x !  !- !1o !)"System-@ !-@1Symbol-!h"System-$P6 Equation y=1x-1        (x1)        (-1<x<1)        (x -1)gsfgW X a b q r s t b c ɿ @ @l PuKc m n u v  ~cdefjklmqr!")*MNRSno|} (")"<">"?"@"D"F"G"H"L"N"a"c"""""""""$UxDб  P"""""""""""""""0#1#############$$$$2$3$$$$$ %!%&%'%,%-%f(g(v(w(x(y())))n)o)~))))****************,,0| @TTy @h N,,,,,+-,---......e.f.u.v.w.x.......(/)/8/9/:/;/o/p/u/v////000i0j0y0z0Ży @( @D$ܓ @  @ @ @   @ .z0{0|0111111<2=2L2M2N2O2Q2R2a2b2c2d2x2y2"3A:>GRRRRRRySzSSSSSUUUUUUUUUUUUUUUUUŻ @C @|> ( @7 @l2О @-7UUUUUUUUUUUUUUWWWWXX\fssPtQtKuMuJH <  <>QS fi|~a v  ! # ~57,. ſŤ!!! !!0 !H!!!!!!!!!!!!H: !!0#3#n#p#%6%8%&&f({((n))),,,..!.e.z.....(/>/G//00i0~0011"1<2Q2f2p222 3"3I3h3s33333344-4Ԝ!h!!!!!!!H!!!!!!!!!!=-4D4M4k44444475@5s5|555556<6R6q66667%7K7777738L8Z8c8888888,989A9X9a9j999999999:8:A:C:/=1=p>r>>>>>)?x??????@N@X@@@@@AA9AWAAAAAABB!B?B!!!!]?BMBVBBBC!CeGedeme{eeeeee ff=fIffffffffwgygijmmmm\o^opp1r3rrrssstt-tPtIuKuMu!!!! !!!!1 F  s s $ }#-4;BGV\
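As a sketch, the single-neuron update rule w_i = w_i + ηy(1−y)(T−y)x_i can be applied to the OR-gate example from the text. The learning rate and epoch count below are arbitrary illustrative choices, not values from the article:

```python
import math

def f(x):
    """Sigmoid activation, eqn 2: f(x) = 1 / (1 + e^-x)."""
    return 1.0 / (1.0 + math.exp(-x))

# OR-gate training set; x0 is the constant bias input (always 1).
# Each sample: inputs (x0, x1, x2) -> target T.
samples = [
    ((1, 0, 0), 0),
    ((1, 0, 1), 1),
    ((1, 1, 0), 1),
    ((1, 1, 1), 1),
]

eta = 0.5              # learning rate (eta); illustrative value
w = [0.0, 0.0, 0.0]    # weights w0, w1, w2

for epoch in range(5000):
    for x, T in samples:
        # Forward pass, eqn 1: y = f(sum_i w_i * x_i)
        y = f(sum(wi * xi for wi, xi in zip(w, x)))
        # Update rule: w_i += eta * y * (1 - y) * (T - y) * x_i
        for i in range(len(w)):
            w[i] += eta * y * (1 - y) * (T - y) * x[i]

# After training, the neuron approximates the OR gate.
for x, T in samples:
    y = f(sum(wi * xi for wi, xi in zip(w, x)))
    print(x[1], x[2], "->", round(y, 3), "target", T)
```

Note that, as the text's graphical argument predicts, training drives the weights toward a line such as x1 + x2 − 1/2 = 0 that separates the "1" points from the "0" point.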
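The two-layer update rules (for the hidden-to-output weights v_j and the input-to-hidden weights w_{i,j}) can be sketched the same way. Here the OR gate is relearned through an assumed hidden layer of four sigmoid neurons; the layer size, random seed, learning rate, and epoch count are all illustrative assumptions, not from the article:

```python
import math
import random

def f(x):
    """Sigmoid activation: f(x) = 1 / (1 + e^-x)."""
    return 1.0 / (1.0 + math.exp(-x))

random.seed(0)
n_in, n_hid = 3, 4  # input 0 is the constant bias input; 4 hidden units is an arbitrary choice
# w[i][j]: weight from input i to hidden neuron j; v[j]: hidden neuron j to output
w = [[random.uniform(-1, 1) for _ in range(n_hid)] for _ in range(n_in)]
v = [random.uniform(-1, 1) for _ in range(n_hid)]

# OR gate with the bias input prepended: (x0, x1, x2) -> T
samples = [((1, 0, 0), 0), ((1, 0, 1), 1), ((1, 1, 0), 1), ((1, 1, 1), 1)]
eta = 0.5  # learning rate; illustrative value

def predict(inp):
    """Forward pass: hidden outputs x_j = f(sum_i w_ij inp_i), then y = f(sum_j v_j x_j)."""
    xh = [f(sum(w[i][j] * inp[i] for i in range(n_in))) for j in range(n_hid)]
    return xh, f(sum(v[j] * xh[j] for j in range(n_hid)))

initial_error = sum((predict(inp)[1] - T) ** 2 for inp, T in samples)

for epoch in range(20000):
    for inp, T in samples:
        xh, y = predict(inp)
        delta = y * (1 - y) * (T - y)
        for j in range(n_hid):
            # hidden-layer rule: w_ij += eta * delta * v_j * x_j * (1 - x_j) * x_i
            # (uses the old v_j, before it is itself updated)
            for i in range(n_in):
                w[i][j] += eta * delta * v[j] * xh[j] * (1 - xh[j]) * inp[i]
            # output-layer rule: v_j += eta * delta * x_j
            v[j] += eta * delta * xh[j]

final_error = sum((predict(inp)[1] - T) ** 2 for inp, T in samples)
print("squared error:", round(initial_error, 4), "->", round(final_error, 4))
```

The hidden-layer rule is just the single-neuron rule with one more application of the chain rule: the error signal y(1−y)(T−y) is passed back through v_j and through the hidden neuron's own derivative x_j(1−x_j).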