TDL version 1.1 
Copyright (C) 1995 Universal Problem Solvers, Inc. 
All Rights Reserved 
 
***About System*** 
 
TDL Version 1.1 Demo is freely available (FreeWare) to anyone and may be distributed to others, as long 
as it is used ONLY for personal or educational purposes.  This Demo version RESTRICTS access to 
some of TDL's program options and LIMITS system parameters (e.g., size of network, number of training 
examples, etc.).  A complete, unrestricted version of the system may be purchased.  For more information 
refer to the file pricing.txt. 
 
 
***Purpose*** 
 
The purpose of TDL is to provide users of neural networks with a dedicated platform for pattern 
recognition tasks.  The system allows for the fast creation of automatically constructed neural networks: 
there is no need to build networks by hand or to fiddle with learning parameters, and TDL's Wizard 
can help you optimize pattern recognition accuracy. 
Besides automatically constructing a neural network for a given pattern recognition task, the system 
supports trans-dimensional learning.  Simply put, this allows several tasks to be learned within a single 
network, even when those tasks differ in the number of input stimuli and output responses used to 
describe them.  With TDL it is possible to incrementally learn various pattern recognition tasks within 
a single coherent neural network structure.  Furthermore, TDL supports semi-weighted neural networks, 
a hybrid of standard weighted neural networks and weightless multi-level threshold units.  Combining 
both can result in extremely compact network structures (i.e., fewer connections and hidden units) 
and improved predictive accuracy on yet unseen patterns.  Of course, the user has the option to create 
networks which use only standard weighted neurons. 
 
***What's New*** 
 
Thanks to user suggestions, TDL has received a number of improvements: 
 
(1)	The user is now in control of TDL's memory system.  You decide how many units TDL can 
allocate and how many training examples can be loaded (no more limitations, except for your 
computer's memory). 
 
(2)	TDL's Wizard supports hassle-free development of neural networks, the goal of course being 
optimization of predictive accuracy on unseen patterns.  It's as easy as one, two, three... 
 
(3)	Automatic and manual load and save of network parameters and menu settings. 
 
(4)	A completely re-designed parameter menu system. 
 
(5)	An improved graphics display. 
 
(6)	Introduction of TDL's Help Viewer. 
 
(7)	Expanded help with more examples. 
 
(8)	The History option allows users to capture their favorite keystrokes and save them for easy 
recall and future use. 
 
(9)	Introduces a new local learning rule: fastprop. 
 
(10)	Adds two new unit transfer functions: Gaussian and linear. 
 
(11)	Improves the new n-level threshold function to deal with continuous outputs.	 
 
 
***Highlights*** 
 
(1)	Provides a symbolic interface which allows the user to create: 

		(a) Input and output definition files. 
		(b) Pattern files. 
		(c) Help files for objects (i.e., inputs, input values, and outputs). 
 
(2)	Supports categorization of inputs.  This allows the user to readily access inputs via a popup 
menu within the main TDL menu.  The hierarchical structure of the popup menu is under the 
full control of the application developer (i.e., user). 
 
(3)	Symbolic object manipulation tool: 

	(a) Allows the user to interactively design the input/output structure of an application.  The 
user can create, delete, or modify inputs, outputs, input values, and categories. 

	(b) Inputs and categories can be moved from one location to another. 

	(c) Provides a quick overview of the hierarchical category structure. 
 
(4)	Supports Rule representation: 
 
	(a) Extends the standard Boolean operators (i.e., and, or, not) with several quantifiers (i.e., 
atmost, atleast, exactly, between). 

	(b) Provides mechanisms for rule revision (i.e., refinement) and extraction. 

	(c) Allows partial rule recognition.  Both first-fit and best-fit are supported. 
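
The quantifiers named in (a) can be read as counting operators over a rule's conditions.  Below is a 
minimal sketch of that reading in Python; the function names and signatures are illustrative 
assumptions, not TDL's actual rule syntax.

```python
def atleast(k, conditions):
    """True if at least k of the conditions hold."""
    return sum(bool(c) for c in conditions) >= k

def atmost(k, conditions):
    """True if at most k of the conditions hold."""
    return sum(bool(c) for c in conditions) <= k

def exactly(k, conditions):
    """True if exactly k of the conditions hold."""
    return sum(bool(c) for c in conditions) == k

def between(lo, hi, conditions):
    """True if the number of holding conditions lies in [lo, hi]."""
    return lo <= sum(bool(c) for c in conditions) <= hi
```

For example, atleast(2, [True, False, True]) holds, while exactly(1, [True, True]) does not. 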
 
(5)	Allows co-evolution of different subpopulations (based on the type of transfer function chosen 
for each subpopulation). 
 
(6)	Provides three types of crossover operators: simple random, weighted, and blocked. 
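
TDL's weighted and blocked crossover variants are not specified in this document, so only the 
simplest of the three is sketched below: a uniform ("simple random") crossover over a flat gene 
vector.  The function name and the list-of-genes representation are assumptions for illustration.

```python
import random

def simple_random_crossover(parent_a, parent_b, rng=random):
    """Uniform crossover: each gene of the child is copied from a
    randomly chosen parent.  The weighted and blocked variants would
    bias or group these per-gene choices."""
    assert len(parent_a) == len(parent_b)
    return [a if rng.random() < 0.5 else b
            for a, b in zip(parent_a, parent_b)]
```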
 
(7)	Supports both one-shot and multi-shot learning.  Multi-shot learning allows for the 
incremental acquisition of different data sets.  A single expert network is constructed, capable of 
recognizing all the data sets supplied during learning.  Quick context switching between 
different domains is possible. 
 
(8)	Three types of local learning rules are included: perceptron, delta, and fastprop. 
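
Fastprop is TDL-specific and not defined in this document, but the two classical rules can be 
sketched.  The sketch below assumes a unit with a flat weight list and a scalar error signal; both 
rules share the form w <- w + lr * (target - output) * x, differing only in how the output is computed.

```python
def perceptron_update(weights, x, target, output, lr=0.1):
    """Perceptron rule: output is the thresholded (0/1) unit response,
    so weights change only on misclassified examples."""
    return [w + lr * (target - output) * xi for w, xi in zip(weights, x)]

def delta_update(weights, x, target, output, lr=0.1):
    """Delta (Widrow-Hoff) rule: output is the unit's real-valued
    activation, so the correction is graded rather than all-or-nothing."""
    return [w + lr * (target - output) * xi for w, xi in zip(weights, x)]
```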
 
(9)	Implements seven types of unit transfer functions: simple threshold, sigmoid, sigmoid-squash, 
n-level threshold, new n-level threshold, Gaussian, and linear. 
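
The TDL-specific functions (sigmoid-squash and the new n-level threshold) are not defined in this 
document and are omitted below.  The sketch covers the common textbook forms, plus one plausible 
reading of an n-level threshold as a staircase over sorted (threshold, value) steps; that reading is an 
assumption, not TDL's exact definition.

```python
import math

def threshold(net):
    """Simple threshold unit: fires when net input is non-negative."""
    return 1.0 if net >= 0 else 0.0

def sigmoid(net):
    """Standard logistic sigmoid, mapping net input to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-net))

def gaussian(net, width=1.0):
    """Gaussian bump centred on zero net input."""
    return math.exp(-(net / width) ** 2)

def linear(net):
    """Identity (linear) transfer."""
    return net

def n_level_threshold(net, levels):
    """One plausible n-level threshold: walk ascending (threshold, value)
    steps and return the value of the highest step reached."""
    out = levels[0][1]
    for thresh, value in levels:
        if net >= thresh:
            out = value
    return out
```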
 
(10)	Data sets can contain either binary or continuous inputs and outputs. 
 
(11)	Automatically constructed networks can be either tested (i.e., to measure performance 
accuracy) or used for classifying new patterns. 
 
(12)	Batch training and testing are supported in both one-shot and multi-shot mode. 
 
(13)	The graphical interface allows the user to view the construction of a network over time and the 
change in unit activation during testing or classification.  Detailed information on individual 
network units (i.e., unit id, weights, connections, transfer functions, etc.) can be obtained.  In the 
case of an n-level threshold or new n-level threshold function, the user can view the 
activation/output function.  Using the network dependency option the user can also view 
subnetworks. 
 
(14)	Over a dozen statistics are collected during various batch training sessions.  These can be 
viewed using the chart option. 
 
(15)	Network and memory resources can be viewed directly. 
 
(16)	A hypertext online help menu is available.   
 
(17)	A DEMONSTRATION of TDL can be invoked when initially starting the program. 
 
 
***IMPORTANT: Getting Started*** 
 
(1)	Select the Play Demo option when you start up the TDL software.  This DEMO gives a basic 
overview of TDL's learning and generalization capabilities as a trans-dimensional pattern 
recognizer. 
 
(2)	Run TDL and select "Help On ..." from the Help option of the main menu.  TDL's help 
viewer will appear.  Select the Show option, select the heading "Getting Started ...", and press the 
View button; the appropriate page will appear.  Exit the dialog box.  Several pages will guide 
you through some of the TDL basics.  You can move forward or backward using the appropriate 
menu buttons.  The Help View Tool also allows you to jump to other pages by clicking on select 
words.  You can highlight select words of a page by clicking the right mouse button (another click 
will turn this option off).  If the cursor is placed on a jump word, the cursor will change (jump 
cursor). 
 
***Bibliography*** 
 
Many of the features incorporated in TDL have been reported in the scientific literature.  Below is a 
selection of published articles: 
 
Romaniuk, S.G., Evolutionary Growth Perceptrons, in S. Forrest (Ed.), 
Genetic Algorithms: Proceedings of the 5th International Conference, Morgan Kaufmann, 1993. 

Romaniuk, S.G., Evolutionary Grown Semi-Weighted Neural Networks, to appear in: International 
Conference on Genetic Algorithms (ICGA-95), Morgan Kaufmann, 1995. 

Romaniuk, S.G., Application of Learning to Learn to Real-World Pattern Recognition, International 
Conference on Artificial Neural Networks and Genetic Algorithms (ICANNGA-95), France, 1995. 

Romaniuk, S.G., Trans-Dimensional Learning, International Journal of Neural Systems, Vol. 4, 
No. 2 (June), 171-185, 1993. 

Romaniuk, S.G., Learning to Learn: Automatic Adaptation of Learning Bias, AAAI-94, Seattle, WA, 
USA, 1994. 
 
Some of these articles can be obtained as technical reports via anonymous ftp from: 
 
	ftp.nus.sg 
	/pub/NUS/ISCS/techreports 
 
Look for reports: TRG7/93, TR20/93, and TRB1/94. 
 
 
*** Installation *** 
 
First create the directory TDL in the root directory of the C drive.  Then place the zip files 
TDL11-*.zip into C:\TDL\.  Unzip the files, PRESERVING their subdirectory structure! 
Next, install the program with the Windows Program Manager: 
 
(1)	Select the File menu of the Windows Program Manager. 

(2)	Select the menu option New and check the Program Group radio button.  Then select OK. 

(3)	For the Description of the Program Group Properties enter: TDL v. 1.1 DEMO 

(4)	Now select the OK button. 

(5)	Next, again select the New option from the File menu of the Windows Program Manager. 

(6)	This time the Program Item radio button should be selected.  Hit OK. 

(7)	Now enter for the selections: 
 	Description: Trans-Dimensional Learning 
     	Command Line: C:\TDL\TDL11.EXE 
    	Working Directory: C:\TDL 

(8)	Once you have entered these choices, hit the OK button. 
 
You should now be able to view the TDL icon within the newly created program group TDL v. 1.1 DEMO. 
You are ready to get started. 
 
 
*** Requirements *** 
 
This version requires Windows 3.1 or Windows for Workgroups. 
 
*** Documentation *** 
 
For documentation, consult the online help option. 
 
***Other Products*** 
 
See the file products.txt. 
 
 
***FILE_ID.DIZ*** 
 
v(1.1) TDL - Win Neural Net Pattern Recog. Tool 
Quick, hassle-free automatic construction of NN by  
use of evolutionary processes.  Trans-Dimensional 
Learning allows multiple data sets to be learned incre- 
mentally, regardless of differences in number of inputs 
and outputs. Complete graphical support of  training 
and testing.  Point and click recall of network units, 
unit info and unit dependency. Various user param. 
supported. TDL Wizard helps find good solutions. 
Freeware. Contact: zlxx69a@prodigy.com. 
 
### 

