




The Neural Network (NN)

Done by:

ID:

Instructor:

Course:

The Neural Network

There are many definitions of neural networks in the technology world. The most comprehensive one is "a system of hardware and/or software patterned after the operation of neurons in the human brain." Networks of this type, built around a set of learning techniques, are labeled "artificial neural networks" (ANNs). In general, commercial applications of the technology have addressed problems of pattern recognition and complex signal processing. There are many useful business applications, such as handwriting recognition for check processing, speech-to-text transcription, oil-exploration data analysis, weather forecasting, and face recognition.
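To make the definition above concrete, the sketch below implements a single artificial neuron in Python with NumPy: a weighted sum of inputs plus a bias, passed through a sigmoid activation. All weights and input values here are hypothetical, chosen only for illustration.

```python
import numpy as np

def neuron(inputs, weights, bias):
    """A single artificial neuron: a weighted sum of the inputs, shifted
    by a bias and squashed by a sigmoid activation into the range (0, 1).
    This loosely mirrors how a biological neuron fires more strongly as
    its combined input stimulation grows."""
    z = np.dot(weights, inputs) + bias
    return 1.0 / (1.0 + np.exp(-z))  # sigmoid activation

# Illustrative example: three inputs with made-up weights
out = neuron(np.array([0.5, -1.0, 2.0]), np.array([0.4, 0.3, 0.1]), bias=0.1)
```

A full network is simply many such units wired together, with the weights adjusted during training rather than chosen by hand.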

Biological Neural Networks

Living creatures today have the ability to cope with changes in their environments, because they possess a controlling unit that lets them learn and stay up to date. To perform this task, humans and other highly developed creatures need a very complex network of neurons. This is what makes the biological nervous system so valuable to artificial NNs: it offers a huge space of inspiration, and knowledge of how this system is organized informs the design of artificial networks.

The Features of Neural Networks

Neural networks have five notable features. The first is that high-accuracy models can be constructed when training algorithms are run on large labeled training datasets. Learning algorithms can work directly on raw data such as signals, text, and images. The Neural Network Toolbox, for example, supports training convolutional neural networks and autoencoders, applying deep learning algorithms to image classification and feature learning tasks.

The second feature of NNs is the ability to train networks quickly and to simulate large datasets by using the neural network tools together with parallel computing tools, which is one of the most important capabilities the toolbox possesses. Training and simulation involve many parallel calculations, which can be accelerated with multicore processors, NVIDIA GPUs that support CUDA, and computer clusters with multiple processors and GPUs. A GPU is required to train deep neural networks.

The third feature is the set of Neural Network Toolbox apps designed to create, train, and simulate neural networks. These apps make it quick and easy to develop neural networks for tasks such as classification, regression (including time-series regression), and clustering. As you build your networks with these tools, you can directly generate MATLAB code to capture your work and automate tasks.

The fourth feature is that the Neural Network Toolbox supports an assortment of supervised and unsupervised network architectures. With the toolbox's modular approach to building networks, you can develop custom network architectures for the problems you deal with, and you can view the network architecture including all inputs, layers, outputs, and interconnections.

The last feature is the training and learning functions: mathematical procedures used to automatically adjust the network's weights and biases. A training function dictates a global algorithm that affects all the weights and biases of a given network, while a learning function can be applied to individual weights and biases within a network.
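The weight-and-bias adjustment described above can be sketched with one gradient-descent training step for a single linear unit minimizing squared error. This is a hypothetical minimal example, not the toolbox's own implementation; the input, target, weights, and learning rate are all made up.

```python
import numpy as np

x = np.array([1.0, 2.0])    # one input sample
t = 1.0                     # desired (target) output
w = np.array([0.1, -0.2])   # current weights
b = 0.0                     # current bias
lr = 0.1                    # learning rate

y = w @ x + b               # network output before the update
err = y - t                 # output error
# The gradient of the squared error 0.5*err**2 is err*x for the
# weights and err for the bias, so the update moves against it:
w_new = w - lr * err * x
b_new = b - lr * err
```

After the update the output `w_new @ x + b_new` lies closer to the target than the original output did, which is exactly what repeated application of a training function achieves across a whole network.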

The Neural Network Toolbox supports a variety of training algorithms, including several gradient-descent methods, conjugate-gradient methods, the Levenberg-Marquardt algorithm (LM), and the resilient backpropagation algorithm (Rprop). The toolbox's modular framework lets you quickly develop custom training algorithms that can be integrated with built-in algorithms. While training your neural network, you can use error weights to define the relative importance of desired outputs, which can be prioritized in terms of sample, time step (for time-series problems), output element, or any combination of these. You can access training algorithms from the command line or via apps that show diagrams of the network being trained and provide network performance plots and status information to help you monitor the training process.

A suite of learning functions, including gradient descent, Hebbian learning, LVQ, Widrow-Hoff, and Kohonen, is also provided.
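Of the learning functions named above, Hebbian learning is the simplest to illustrate: a weight grows in proportion to the correlation between its input and the unit's output ("cells that fire together wire together"). The sketch below is a hypothetical single update, with made-up values.

```python
import numpy as np

x = np.array([1.0, 0.0, 1.0])   # input pattern
w = np.array([0.2, 0.2, 0.2])   # initial weights
lr = 0.1                        # learning rate

y = w @ x                       # unit output for this pattern
w = w + lr * y * x              # Hebbian rule: dw = lr * y * x
```

Note that only the weights on active inputs (the first and third) are strengthened; the weight on the silent input is left unchanged.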

Preprocessing the network inputs and targets improves the efficiency of neural network training, and postprocessing enables detailed analysis of network performance. The Neural Network Toolbox provides preprocessing and postprocessing functions and Simulink blocks that enable you to:

– Reduce the dimensions of the input vectors using principal component analysis

– Perform regression analysis between the network response and the corresponding targets

– Scale inputs and targets so they fall in the range [-1, 1]

– Normalize the mean and standard deviation of the training set

– Use automated data preprocessing and data division when creating your networks
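Two of the preprocessing steps listed above, scaling into [-1, 1] and normalizing the mean and standard deviation, can be sketched in a few lines of NumPy (comparable in spirit to the toolbox's mapminmax and mapstd functions; the data matrix here is made up for illustration).

```python
import numpy as np

# Each row is one input variable, each column one sample
X = np.array([[2.0, 4.0, 6.0, 8.0],
              [10.0, 20.0, 30.0, 40.0]])

# Scale each row linearly so its minimum maps to -1 and its maximum to +1
mn = X.min(axis=1, keepdims=True)
mx = X.max(axis=1, keepdims=True)
X_scaled = 2 * (X - mn) / (mx - mn) - 1

# Normalize each row to zero mean and unit standard deviation
X_std = (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True, ddof=1)
```

Whichever transform is applied to the training inputs must also be applied, with the same stored parameters, to any new data fed to the trained network.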

Comparison between Feedforward Neural Networks and SOM (Self-Organizing Map):

Feedforward neural networks were inspired by biology: they consist of simple processing units, analogous to neurons, organized into several layers. Each unit in a layer is connected to all the units in the previous layer. These connections are not all equal; each may have a different strength, or weight.

After the input data are entered, they pass through the network layer by layer until they reach the output. During normal operation, when the network acts as a classifier, there is no feedback between layers.

The figure below shows an example of a 2-layered network: from top to bottom, an output layer with 5 units and a hidden layer with 4 units; the network has 3 input units.

The 3 inputs are shown as circles and these do not belong to any layer of the network (although the inputs sometimes are considered as a virtual layer with layer number 0). Any layer that is not an output layer is a hidden layer. This network therefore has 1 hidden layer and 1 output layer. The figure also shows all the connections between the units in different layers. A layer only connects to the previous layer.

The operation of this network can be divided into two stages: the learning stage and the classification stage.
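The classification stage of the 2-layered network described above (3 inputs, 4 hidden units, 5 outputs) is just two matrix multiplications with an activation after each. The sketch below uses randomly initialized weights in place of trained ones, purely to show the data flow.

```python
import numpy as np

rng = np.random.default_rng(0)

# Untrained (random) weights for a 3-input, 4-hidden-unit, 5-output network
W1 = rng.standard_normal((4, 3)); b1 = np.zeros(4)   # input  -> hidden
W2 = rng.standard_normal((5, 4)); b2 = np.zeros(5)   # hidden -> output

def forward(x):
    """Forward pass: each layer computes a weighted sum of the previous
    layer's activations plus a bias, passed through tanh."""
    h = np.tanh(W1 @ x + b1)      # hidden layer activations (4 values)
    return np.tanh(W2 @ h + b2)   # output layer activations (5 values)

y = forward(np.array([0.2, -0.5, 0.9]))  # one 3-dimensional input
```

In the learning stage, the weight matrices W1 and W2 would be adjusted (for example by backpropagation) so that these outputs match the desired targets.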

Self-Organizing Map

The self-organizing map (SOM) was invented by Teuvo Kohonen. It supports data visualization, helping people comprehend data by reducing its dimensions to a map. It also governs the grouping of similar data; in other words, it reduces data dimensions and displays similarities among the data.

In a SOM, the map units compete for the current input object. The artificial neurons are trained by feeding input data into the system; the unit whose weight vector is closest to the input becomes the most active one, the winner. The weight vectors are adjusted during training so that the neighborhood relationships present in the input data set are preserved.

The SOM is regarded by Teuvo Kohonen as a new and effective software tool for visualizing high-dimensional data. It converts complex, nonlinear statistical relationships between high-dimensional data items into simple geometric relationships on a low-dimensional display, while preserving the most important information.

Reducing Data Dimensions

Unlike many other learning techniques in neural networks, training a SOM requires no target vector. A SOM learns to classify the training data without any external supervision.

Feedforward neural network and SOM (self-organizing map) diagrams:

Feedforward neural network

 

SOM (self-organizing map)

The Use of NNs in Social Life

The idea behind these neural networks was to gather as much data as possible for use in other processes, resembling the way neurons act within the human brain when they learn something new. Neural networks are also used in personal life, not only in industrial applications: they have been used, for example, to forecast data pollution from a series of measured data within a margin of error, and they have applications in cell phones.

Taking a new approach to digitizing our social lives is a great challenge and an attractive prospect. A complete, hand-updated list of your favorite bands tends to lose its appeal over time; if we want good social networks, we need to make them understand our needs, and they must become no less intelligent and no less clear.

So we should say that the future of social networks depends on intelligent management and on a focus on building new neural network technology that can identify our potential.


Contents

The Neural Network

Biological Neural Networks

The Features of Neural Networks

Comparison between Feedforward Neural Networks and SOM (Self-Organizing Map)

Self-Organizing Map

Reducing Data Dimensions

The Use of NNs in Social Life
