IJRET: International Journal of Research in Engineering and Technology eISSN: 2319-1163 | pISSN: 2321-7308


Volume: 03 Special Issue: 03 | May-2014 | NCRIET-2014, Available @ http://www.ijret.org 385

A SURVEY: RESEARCH SUMMARY ON NEURAL NETWORKS

Shruti B. Hiregoudar¹, Manjunath K.², K. S. Patil³

¹Dept. of Computer Science, Basaveshware Engineering College, Bagalkot, Karnataka, India
²Dept. of Computer Science, Basaveshware Engineering College, Bagalkot, Karnataka, India
³Professor, Dept. of Computer Science, Basaveshware Engineering College, Bagalkot, Karnataka, India

Abstract

Neural networks are relatively crude electronic models based on the neural structure of the brain, which learns primarily from experience. The brain is natural proof that some problems beyond the scope of current computers can indeed be solved by small, energy-efficient packages. In this paper we present the fundamentals of neural network topologies, activation functions, and learning algorithms, classified by whether information flows in one direction or in both directions. We outline the main features of a number of popular neural networks and provide an overview of their topologies and their learning capabilities.

Keywords: Neural Network, Feed Forward, Recurrent, Radial Basis Function Network (RBFN), Kohonen Self-Organizing Map (KSOM).

-----------------------------------------------------------------------***----------------------------------------------------------------------

1. INTRODUCTION

The human brain has remarkable capabilities for processing information and making instantaneous decisions. Many researchers have shown that the human brain computes in a radically different manner from binary computers: it is a massive network of parallel, distributed computing elements, the neurons. For the last few decades, scientists have been working to build computational systems modeled on this structure, called neural networks (also known as connectionist models). A neural network is composed of a set of parallel, distributed processing units called nodes or neurons; these neurons are interconnected by unidirectional or bidirectional links and are arranged in layers.

Fig-1:Basic Structure of Neural Network

The basic unit of a neural network is the neuron. It receives N inputs, represented by x(n), and each input is multiplied by a connection weight, represented by w(n). The products of inputs and weights are summed and fed through a transfer function (activation function) to generate the result (output).
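This weighted-sum-then-activation step can be written as a minimal Python sketch; the function name and sample values below are hypothetical, and the sigmoid is just one possible transfer function:

```python
import math

def neuron(inputs, weights, bias=0.0):
    # Weighted sum: s = sum of x(n) * w(n), plus an optional bias term
    s = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Sigmoid transfer (activation) function squashes s into (0, 1)
    return 1.0 / (1.0 + math.exp(-s))

out = neuron([1.0, 0.5], [0.4, -0.2])   # weighted sum = 0.3
```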

2. NEURAL NETWORK DESIGN

A neural network design mainly consists of three elements:

Network Topology

Network Transfer Function

Network Learning Algorithm

2.1 Network Topology

Neural network topologies are classified by how the interconnections are arranged within the layers. The two well-known topologies are:

Feed Forward Topology

Recurrent Topology

2.1.1 Feed Forward Topology

In a feed-forward topology, the nodes are hierarchically arranged in layers, starting with the input layer and ending with the output layer. Between the input layer and the output layer, a number of hidden layers provide most of the network's computational power. The nodes in each layer connect to the next layer through unidirectional paths, starting from one layer (source) and ending at the subsequent layer (sink). The output of a given layer feeds the nodes of the following layer in the forward direction only; no feedback flow of information is allowed in the structure. Applications: the multilayer perceptron network and the radial basis function network.
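The layer-by-layer forward flow can be sketched in Python; the weights and layer sizes here are illustrative, and sigmoid activations are assumed:

```python
import math

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

def layer(inputs, weights):
    # Each row of `weights` feeds one node of the next layer
    return [sigmoid(sum(x * w for x, w in zip(inputs, row))) for row in weights]

def feed_forward(x, hidden_w, output_w):
    # Information flows one way only: input -> hidden -> output, no feedback
    return layer(layer(x, hidden_w), output_w)

y = feed_forward([1.0, 0.0], [[0.5, -0.3], [0.8, 0.2]], [[1.0, -1.0]])
```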


Fig-2: Feed Forward Topology

2.1.2 Recurrent Topology

A recurrent topology (RNT) allows information to flow between the connected nodes in both directions, i.e., it supports both feed-forward and feedback connections. A recurrent network structure has some sort of memory, which permits the storage of information in its output nodes through dynamic states; the mapping between inputs and outputs is therefore dynamic in nature. Applications: the Hopfield network and the time-delay neural network (TDNN).
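The feedback idea can be sketched as a single recurrent step in Python, where a hidden state carries the memory between time steps; the names, weights, and the tanh activation are illustrative choices:

```python
import math

def recurrent_step(x, h_prev, w_in, w_rec):
    # The hidden state acts as memory: the new state depends on the
    # current input AND the previous state (the feedback path)
    s = [sum(xi * wi for xi, wi in zip(x, w_row)) +
         sum(hj * wj for hj, wj in zip(h_prev, r_row))
         for w_row, r_row in zip(w_in, w_rec)]
    return [math.tanh(v) for v in s]

h = [0.0, 0.0]
for x in ([1.0], [0.0], [1.0]):     # a short input sequence over time
    h = recurrent_step(x, h, [[0.5], [-0.3]], [[0.1, 0.2], [0.0, 0.4]])
```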

Fig-3: Recurrent Topology

2.2 Network Transfer Function

The basic unit of the neural network is the neuron. Neurons are simple processors that take the weighted sum of their inputs from other nodes and apply to it a non-linear mapping function, called an activation function, before delivering the output to the next neuron.

Fig-4: Transfer Function
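Three commonly used activation functions can be sketched as follows; this list is illustrative, and the sigmoid is the one assumed elsewhere in this paper:

```python
import math

def sigmoid(s):                 # output in (0, 1)
    return 1.0 / (1.0 + math.exp(-s))

def tanh(s):                    # output in (-1, 1)
    return math.tanh(s)

def relu(s):                    # output in [0, inf)
    return max(0.0, s)

samples = {f.__name__: [round(f(s), 3) for s in (-2.0, 0.0, 2.0)]
           for f in (sigmoid, tanh, relu)}
```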

2.3 Neural Network Learning Algorithms

Learning algorithms are used to update the weight parameters of a neuron's input connections during the training process of the network. Three types of learning algorithms are distinguished:

Supervised

Unsupervised and

Reinforcement

2.3.1 Supervised Learning

In a supervised learning mechanism, an external source provides the network with a set of input stimuli for which the outputs are known a priori, and during training the output results are continuously compared with the desired data. After a number of iterations, the gradient descent rule uses the error between the actual output and the target data to adjust the connection weights so as to obtain the closest match between the target output and the actual output. Application: feed-forward networks.

Fig-5: Supervised Learning
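The gradient-descent weight adjustment described above can be sketched for a single linear neuron (the delta rule); the learning rate and training data here are illustrative:

```python
def train_step(x, target, w, lr=0.1):
    # Forward pass: linear output for simplicity
    out = sum(xi * wi for xi, wi in zip(x, w))
    err = target - out                       # compare with the desired data
    # Gradient descent on E = err^2 / 2: move each weight toward a closer match
    return [wi + lr * err * xi for xi, wi in zip(x, w)], err

w = [0.0, 0.0]
for _ in range(200):                         # a number of iterations
    w, err = train_step([1.0, 2.0], 1.0, w)
```

After these iterations the error shrinks toward zero, so the neuron's output for the input [1.0, 2.0] approaches the target 1.0.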

2.3.2 Unsupervised Learning Algorithm:

It is also called a self-organizing learning algorithm, because there is no external source supervising the network; it relies instead on local information and internal control. The training data and input patterns are presented to the system, and the system organizes the data into clusters or categories. A set of training data is presented to the system at the input layer, and the network connection weights are then adjusted through some sort of competition among the nodes of the output layer, where the winning candidate is the node with the highest value.

Fig-6: Unsupervised learning

2.3.3 Reinforcement Learning:

The reinforcement learning algorithm, also called graded learning, mimics in a way the adaptive behaviour of humans interacting with a given physical environment. The network connections are modified according to feedback information provided to the network by its environment. This information simply tells the system whether or not a correct response has been obtained. In case of a correct response, the connections leading to that output are strengthened; otherwise they are weakened. Reinforcement learning does not receive information on what the output should be when the network is presented with a given input pattern.

Fig-7: Reinforcement learning

3. MAJOR CLASSES OF NEURAL NETWORKS

There are four major classes of neural networks:

The Multilayer Perceptron

The Radial Basis Function Network

The Kohonen Self-Organizing Map

The Hopfield Network

3.1 The Multilayer Perceptron

Topology: The multilayer perceptron belongs to the class of feed-forward networks, meaning that information flows among the network nodes exclusively in the forward direction. The number of hidden layers required within a multilayer perceptron depends in large part on the type of problem being addressed. For instance, a system with a single hidden layer is able to solve the XOR function and related problems in which the separating boundaries are relatively simple.

Activation function: in a multilayer perceptron with a single hidden layer composed of an appropriate number of nodes, the sigmoid function is used as the activation function for all the neurons of the network. The network cumulative error is defined as

E_c = (1/2) Σ_{k=1}^{n} Σ_{i=1}^{q} e_i(k)²,

where e_i(k) is the error of the i-th output neuron for the k-th training pattern.

Learning algorithm: the algorithm is based on the gradient descent technique for solving an optimization problem that involves minimizing the network cumulative error E_c, where E_c is the sum of the n squared errors E(k) = (1/2) Σ_{i=1}^{q} e_i(k)², and the index i represents the i-th neuron of an output layer composed of a total of q neurons. The algorithm is designed to update the weights in the direction of the gradient descent of the cumulative error. Applications: signal processing, weather forecasting, financial market prediction, pattern recognition, and signal compression.
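A compact sketch of this gradient-descent training for a one-hidden-layer perceptron on the XOR problem follows; all names, sizes, and hyperparameters are illustrative, and how well it converges depends on the random initialization:

```python
import math, random

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

def forward(x, w_h, w_o):
    # Bias handled by appending a constant 1.0 input to each layer
    h = [sigmoid(sum(xi * wi for xi, wi in zip(x + [1.0], row))) for row in w_h]
    o = [sigmoid(sum(hi * wi for hi, wi in zip(h + [1.0], row))) for row in w_o]
    return h, o

def train(data, hidden=3, lr=0.5, epochs=5000):
    random.seed(0)
    w_h = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(hidden)]
    w_o = [[random.uniform(-1, 1) for _ in range(hidden + 1)]]
    for _ in range(epochs):
        for x, d in data:
            h, o = forward(x, w_h, w_o)
            # Output-layer delta: e_i(k) * o_i * (1 - o_i) for the sigmoid
            d_o = [(d[i] - o[i]) * o[i] * (1 - o[i]) for i in range(len(o))]
            # Hidden-layer deltas, backpropagated through the output weights
            d_h = [h[j] * (1 - h[j]) * sum(d_o[i] * w_o[i][j] for i in range(len(o)))
                   for j in range(hidden)]
            for i in range(len(o)):
                for j, hj in enumerate(h + [1.0]):
                    w_o[i][j] += lr * d_o[i] * hj
            for j in range(hidden):
                for k, xk in enumerate(x + [1.0]):
                    w_h[j][k] += lr * d_h[j] * xk
    return w_h, w_o

xor = [([0.0, 0.0], [0.0]), ([0.0, 1.0], [1.0]),
       ([1.0, 0.0], [1.0]), ([1.0, 1.0], [0.0])]
w_h, w_o = train(xor)
```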

3.2 Radial Basis Function Networks:

Topology: radial basis function networks represent a special category of the feed-forward neural network architecture. The basic RBFN structure consists of an input layer, a hidden layer with activation functions, and an output layer. The network uses non-linear transformations at its hidden layer but linear transformations between the hidden and output layers. The general form of an RBF is g_i(x) = exp(−‖x − v_i‖² / (2σ_i²)), where x is the input vector, v_i is the vector denoting the center of the receptive field unit g_i, and σ_i is its unit width parameter.

Activation function: the logistic form of the RBF is given by g_i(x) = 1 / (1 + exp(‖x − v_i‖² / σ_i²)).

Learning algorithm: the RBF network uses a two-stage learning strategy:

Step 1: train the RBFN hidden layer to adapt the centres and scaling parameters using unsupervised training.

Step 2: adapt the weights of the output layer using a supervised training algorithm.

Applications: control systems, audio and video signal processing, pattern recognition, and weather and power load forecasting.
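The two-stage strategy can be sketched as follows; for brevity, stage 1 simply places the centres on the training points rather than running a clustering algorithm, and all values are illustrative:

```python
import math

def rbf(x, center, width=1.0):
    # Gaussian receptive field: g(x) = exp(-||x - c||^2 / (2 * width^2))
    d2 = sum((xi - ci) ** 2 for xi, ci in zip(x, center))
    return math.exp(-d2 / (2 * width ** 2))

def predict(x, centers, w):
    # Non-linear hidden layer, linear output layer
    g = [rbf(x, c) for c in centers]
    return sum(gi * wi for gi, wi in zip(g, w))

# Stage 1 (unsupervised): here we simply place the centres on the data points
data = [([0.0], 0.0), ([1.0], 1.0), ([2.0], 0.0)]
centers = [x for x, _ in data]

# Stage 2 (supervised): fit the linear output weights by gradient descent
w = [0.0] * len(centers)
for _ in range(2000):
    for x, d in data:
        err = d - predict(x, centers, w)
        g = [rbf(x, c) for c in centers]
        w = [wi + 0.2 * err * gi for wi, gi in zip(w, g)]
```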

3.3 Kohonen's Self-Organizing Network

Topology: the Kohonen self-organizing network (KSON), also called a self-organizing map (SOM), belongs to the class of unsupervised learning networks. In a KSON, the nodes distribute themselves across the input space to recognize groups of similar input vectors, while the output nodes compete among themselves to be fired one at a time in response to a particular input vector; this process is called competitive learning. Because the nodes of the KSON can recognize groups of similar input vectors, they generate a topographic mapping from the input vectors to the output vectors that depends primarily on the pattern of the input vectors and results in a dimensionality reduction of the input space.

Learning algorithm: the learning here permits the clustering of input data into a smaller set of elements having similar characteristics. It is based on the competitive learning technique, also known as the winner-take-all strategy. Applications: speech recognition, vector coding, texture segmentation, and the design of nonlinear controllers.
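The winner-take-all update can be sketched as follows; the node positions, data, and learning rate are illustrative, and a full SOM would also update the winner's grid neighbours, which is omitted here:

```python
def closest(nodes, x):
    # Competition: the node nearest the input vector "fires" (wins)
    dists = [sum((ni - xi) ** 2 for ni, xi in zip(n, x)) for n in nodes]
    return dists.index(min(dists))

def train(data, nodes, lr=0.3, epochs=50):
    for _ in range(epochs):
        for x in data:
            w = closest(nodes, x)
            # Only the winning node moves toward the input (clustering)
            nodes[w] = [ni + lr * (xi - ni) for ni, xi in zip(nodes[w], x)]
    return nodes

# Two obvious clusters, around (0, 0) and (1, 1)
data = [[0.0, 0.1], [0.1, 0.0], [0.9, 1.0], [1.0, 0.9]]
nodes = train(data, [[0.2, 0.2], [0.8, 0.8]])
```

After training, each node has settled near the mean of one cluster, so the two clusters are told apart by which node wins.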

3.4 Hopfield Network

Topology: the Hopfield network has a recurrent topology, and its operation is based on the associative memory concept: the network is able to recognize a newly presented pattern using an already stored, complete version of that pattern. Hopfield argued that any physical system whose dynamics in phase space is dominated by a substantial number of locally stable states to which it is attracted can be regarded as a general content-addressable memory.

The activation function of each neuron in the Hopfield network is defined by the equation

o_i = sign( Σ_{j=1, j≠i}^{n} w_ij · o_j ),

where o_i is the output of the current processing unit.

Learning algorithm: the learning algorithm of the Hopfield network is based on the Hebbian learning rule, applied to a set of q presented patterns P_k (k = 1, 2, ..., q), each with dimension n, where n is the number of neuron units in the Hopfield network. Applications: it has been used to solve optimization problems, including combinatorial optimization problems.
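A sketch of Hebbian storage and associative recall for bipolar (±1) patterns; the stored patterns and the asynchronous update schedule are illustrative choices:

```python
def hebbian_weights(patterns):
    n = len(patterns[0])
    # Hebbian rule: w_ij = sum over patterns of p_i * p_j, zero diagonal
    return [[0 if i == j else sum(p[i] * p[j] for p in patterns)
             for j in range(n)] for i in range(n)]

def sign(v):
    return 1 if v >= 0 else -1

def recall(w, state, sweeps=5):
    # Asynchronous update: o_i = sign(sum_j w_ij * o_j), one neuron at a
    # time, repeated until the state settles into a stored attractor
    n = len(state)
    for _ in range(sweeps):
        for i in range(n):
            state[i] = sign(sum(w[i][j] * state[j] for j in range(n)))
    return state

stored = [[1, 1, 1, -1, -1, -1], [-1, 1, 1, 1, -1, 1]]
w = hebbian_weights(stored)
noisy = [-1, 1, 1, -1, -1, -1]      # first stored pattern, one bit flipped
result = recall(w, noisy)
```

Starting from the corrupted version, the dynamics fall back to the complete stored pattern, which is the content-addressable-memory behaviour described above.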

4. APPLICATION OF NEURAL NETWORKS

Neural networks for process monitoring and optimal control.

Neural networks in semiconductor manufacturing processes.

Neural networks in power systems.

Neural networks in robotics.

Neural networks in communications.

Neural networks in pattern recognition.

5. CONCLUSIONS

In this paper we have described the main features of neural networks in terms of topology, learning algorithms, and activation functions. These include the feed-forward and recurrent topologies, as well as networks with supervised and unsupervised learning algorithms. We have provided detailed descriptions of a number of the most frequently used neural network models: the multilayer perceptron, the radial basis function network, Kohonen's self-organizing network, and the Hopfield network, along with highlights of their fields of application.



BIOGRAPHIES

1 Shruti B. Hiregoudar, M.Tech (CSE), Basaveshware Engineering College, Bagalkot. shrutishining411@gmail.com

2 Manjunath K., M.Tech (CSE), Basaveshware Engineering College, Bagalkot. manjuise.026@gmail.com

3 Prof. K. S. Patil, M.Tech (CSE), Basaveshware Engineering College, Bagalkot. kamalashashibec@rediffmail.com
