Connectionist Model Based on Cellular Neural Network for Object Extraction.
Date of Submission
December 1993
Date of Award
Winter 12-12-1994
Institute Name (Publisher)
Indian Statistical Institute
Document Type
Master's Dissertation
Degree Name
Master of Technology
Subject Name
Computer Science
Department
Center for Soft Computing Research (CSCR-Kolkata)
Supervisor
Pal, Sankar K. (MIU-Kolkata; ISI)
Abstract (Summary of the Work)
Cellular neural networks are made of a massive aggregate of analog circuit components, called cells, and the interconnections between them. Each cell in a cellular neural network is connected only to its nearest neighbour cells and interacts only with them; cells that are not directly connected still affect each other through propagation effects. Since a cellular neural network is a highly parallel analog circuit, it can process signals in real time, and its local interconnections make it more suitable for VLSI implementation than general neural networks. A typical cell of a cellular neural network contains both linear and non-linear circuit elements such as resistors, capacitors, controlled voltage and current sources, and independent voltage and current sources. The dynamics of a cellular neural network involve both feedback and control operators; the feedback and control to a cell can be set using appropriate cloning templates. The input to a cell is restricted to [-1,+1] and the output of a cell is binary valued (-1 or +1). A properly chosen cloning template can give a cellular neural network the ability to extract certain spatial properties from the input. We have conducted the following two investigations.

(i) The first part demonstrates an application of cellular neural networks to the object extraction problem. Object extraction involves classifying every pixel of an image as belonging to the object or the background, depending on the spatial and grey level properties of that pixel. To apply a cellular neural network to object extraction, a cell is assigned to each pixel and the grey value of the pixel (normalized to [-1,+1]) is the input to that cell, as sketched below. Global and local information of the image, together with the cloning template in use, determine the network's dynamics. The output of each cell decides whether the pixel belongs to the object or the background (-1 or +1). The study was conducted on synthetic images with different signal to noise ratios and also on real images, using different cloning templates. The performance has also been quantified in terms of the percentage of pixels correctly classified. The results are compared with those obtained from other neural network based techniques (such as the Hopfield network and the self-organizing neural network). It has been found that 2-connected and 3-connected cloning templates perform better on images with predominantly thin, elongated objects, while a 4-connected cloning template performs better for compact objects. Extraction of the boundaries of object regions has also been accomplished using a cellular neural network as part of the experiment.

(ii) Since the cloning template plays a major role in the dynamics of a cellular neural network, selection of an appropriate cloning template is an important step in object extraction using cellular neural networks. An attempt has been made to generate a cloning template automatically using Genetic Algorithms, an adaptive, robust and parallel search technique for machine learning. The fitness function used in the Genetic Algorithm was the divergence measure between two fuzzy sets. The cloning template obtained by this technique was found to perform uniformly well under different signal to noise ratios.
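Below is a minimal Python sketch of the cell dynamics described above (the standard Chua-Yang form x' = -x + A*y + B*u + I with a piecewise-linear output), applied to pixel-wise object extraction. The forward-Euler discretization, the template values, the step size, and the iteration count are illustrative assumptions only and are not taken from the dissertation.

import numpy as np
from scipy.signal import convolve2d

def cnn_object_extraction(u, A, B, I=0.0, dt=0.1, steps=200):
    """Run CNN dynamics x' = -x + A*y + B*u + I on a grey image u in [-1, +1].

    A is the 3x3 feedback cloning template, B the 3x3 control cloning template.
    Returns a binary map: -1 for background pixels, +1 for object pixels.
    """
    u = np.asarray(u, dtype=float)
    x = u.copy()                          # initial state: the normalized image itself
    Bu = convolve2d(u, B, mode="same")    # control term; fixed during the run
    for _ in range(steps):
        # piecewise-linear output nonlinearity y = 0.5 * (|x + 1| - |x - 1|)
        y = 0.5 * (np.abs(x + 1.0) - np.abs(x - 1.0))
        x = x + dt * (-x + convolve2d(y, A, mode="same") + Bu + I)
    return np.sign(x)                     # each cell settles near -1 or +1

# Usage with a hypothetical 4-connected cloning template (values assumed):
A = np.array([[0.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 0.0]])
B = np.array([[0.0, 0.0, 0.0],
              [0.0, 4.0, 0.0],
              [0.0, 0.0, 0.0]])
rng = np.random.default_rng(0)
clean = np.full((64, 64), -1.0)                      # synthetic two-level image
clean[32:, :] = 1.0
noisy = np.clip(clean + 0.3 * rng.standard_normal((64, 64)), -1.0, 1.0)
labels = cnn_object_extraction(noisy, A, B)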
Control Number
ISI-DISS-1993-134
Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.
DOI
http://dspace.isical.ac.in:8080/jspui/handle/10263/6304
Recommended Citation
P., Harish, "Connectionist Model Based on Cellular Neural Network for Object Extraction." (1994). Master’s Dissertations. 73.
https://digitalcommons.isical.ac.in/masters-dissertations/73
Comments
ProQuest Collection ID: http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqm&rft_dat=xri:pqdiss:28843086