BazEkon - The Main Library of the Cracow University of Economics

Huk Maciej
Wybrane właściwości sieci neuronowej Sigma-if
Chosen Properties of the Sigma-if Neural Network
Prace Naukowe Akademii Ekonomicznej we Wrocławiu, 2005, nr 1064, pp. 80-87, bibliography: 14 items
Issue title
Pozyskiwanie wiedzy i zarządzanie wiedzą (Knowledge Acquisition and Knowledge Management)
Artificial intelligence, Knowledge management, Neural networks
The paper discusses the Sigma-if neural network model and its properties: the ability to non-destructively eliminate interneuronal connections, and extended classification capabilities.

This article presents the results of research on the Sigma-if neural network model. Despite its simple structure and the use of standard, synchronous working and learning methods, it possesses important properties that are unattainable for classic perceptron networks. By expanding the domain of neuronal activation functions to include the time dimension and by extending the interneuronal connection attribute set, this structure realises the idea of nondestructive interneuronal connection elimination. Both during operation and during training, it can exclude interneuronal connections that are irrelevant at a given moment, without completely eliminating the possibility of using them in other cases. Special attention is devoted to the ability of a single Sigma-if neuron to correctly solve linearly inseparable problems, even though it uses a simple sigmoidal threshold function (unlike the classical perceptron, which lacks this property). The analysis of this fact is supplemented by its biological interpretation. Results of experiments are presented along with graphical illustrations of the decision spaces of example functioning models. This presentation provides the necessary background for a discussion of the network's mathematical functional model and properties. The author derives the functions that describe the dynamics of each individual Sigma-if neuron and of the automaton as a whole. The theoretical description is supplemented by a number of application examples, which show the legitimacy and promise of the Sigma-if model. The article is augmented by descriptions of key experiments, along with analyses of their results. (original abstract)
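The nondestructive connection elimination described in the abstract can be illustrated with a minimal sketch: inputs are scanned in groups of decreasing importance, and aggregation stops as soon as the partial weighted sum reaches an aggregation threshold, so later connections are merely skipped for that input, not removed from the network. The function name, the group partition, and the threshold parameter `phi_star` below are illustrative assumptions, not the paper's exact formulation.

```python
import math

def sigma_if_neuron(inputs, weights, groups, phi_star):
    """Sketch of conditional aggregation: process connection groups in
    order of importance; once the partial weighted sum reaches phi_star,
    the remaining groups are skipped (but remain available for other
    inputs). The output is a standard sigmoid of the accumulated sum."""
    partial = 0.0
    for group in groups:                       # e.g. [[0, 1], [2, 3]]
        partial += sum(inputs[i] * weights[i] for i in group)
        if partial >= phi_star:                # enough evidence gathered
            break                              # skip remaining connections
    return 1.0 / (1.0 + math.exp(-partial))    # sigmoid of partial sum

# With a low threshold, only the first group is consulted:
# sigma_if_neuron([1, 1, 1, 1], [1, 1, 1, 1], [[0, 1], [2, 3]], 1.5)
# aggregates 2.0 from group [0, 1] and stops before group [2, 3].
```

Because the decision of which connections contribute depends on the current input, the neuron's effective decision boundary varies across the input space, which is one intuition for how it can handle problems a single classical sigmoidal neuron cannot.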
The Main Library of the Cracow University of Economics
The Library of the University of Economics in Katowice
The Main Library of Poznań University of Economics and Business
The Main Library of the Wroclaw University of Economics
  1. Abeles M., Role of the Cortical Neuron: Integrator or Coincidence Detector?, Israel J. Med. Sci., 18:83-92, 1982.
  2. Andrews R., Diederich J., Tickle A.B., Survey and Critique of Techniques for Extracting Rules from Trained Artificial Neural Networks, Queensland University of Technology, 1995.
  3. Cohen S., Intrator N., A Hybrid Projection Based and Radial Basis Function Architecture, Initial values and global optimization, Lecture Notes in Computer Science, 2001.
  4. Craven M., Shavlik J., Using Neural Networks for Data Mining, Carnegie Mellon University, 1997.
  5. Duch W., Jankowski N., Transfer Functions: Hidden Possibilities for Better Neural Networks, Nicholas Copernicus University, 2001.
  6. Huk M., Określanie istotności atrybutów w zadaniach klasyfikacyjnych przez niedestruktywną eliminację połączeń w sieci neuronowej, Pozyskiwanie Wiedzy z Baz Danych, AE, Wrocław 2003.
  7. Huk M., Modelowanie sieci neuronowej Sigma-if, Metody i Systemy Komputerowe w Nauce i Technice, Kraków, 2003.
  8. Huk M., The Sigma-if Neural Network as a Method of Dynamic Selection of Decision Subspaces for Medical Reasoning Systems, "Journal of Medical Informatics & Technologies" 2004 vol. 7.
  9. Kaski S., Data Exploration Using Self-Organizing Maps, Helsinki University of Technology, 1997.
  10. Kavzoglu T., Mather P.M., The Use of Feature Selection Techniques in the Context of Artificial Neural Networks, University of Nottingham, 2000.
  11. Luo Z.-Q., Tseng P., Analysis of an Approximate Gradient Projection Method with Application to the Backpropagation Algorithm, "Optimization Methods and Software", 1994 vol. 4, no 2.
  12. Prechelt L., Connection Pruning with Static and Adaptive Schedules, Neurocomputing, 1997.
  13. Pui-Fai Sum J., Extended Kalman Filter Based Pruning Algorithms and Several Aspects of Neural Network Learning, The Chinese University of Hong Kong, 1998.
  14. Setiono R., Leow W.K., FERNN: An Algorithm for Fast Extraction of Rules from Neural Networks, "Applied Intelligence" 2000 vol. 12.