in its environmental conditions. Moreover, when it is operating in a nonstationary environment, an ANN can be designed to adapt its parameters in real time.

4. Evidential Response. In the context of data classification, an ANN can be designed to provide information not only about which particular class to select for a given sample, but also about the confidence in the decision made. This latter information may be used to reject ambiguous data, should they arise, and thereby improve the classification performance, or the performance of other tasks modeled by the network.

5. Fault Tolerance. An ANN has the potential to be inherently fault-tolerant, or capable of robust computation. Its performance does not degrade significantly under adverse operating conditions such as disconnection of neurons and noisy or missing data. There is some empirical evidence for robust computation, but usually it is uncontrolled.

6. Uniformity of Analysis and Design. Basically, ANNs enjoy universality as information processors. The same principles, notation, and steps in methodology are used in all domains involving application of ANNs.

To explain a classification of different types of ANNs and their basic principles, it is necessary to introduce an elementary component of every ANN. This simple processing unit is called an artificial neuron.

7.1 MODEL OF AN ARTIFICIAL NEURON

An artificial neuron is an information-processing unit that is fundamental to the operation of an ANN. The block diagram (Fig. 7.1), which is a model of an artificial neuron, shows that it consists of three basic elements:

1. A set of connecting links from different inputs xi (or synapses), each of which is characterized by a weight or strength wki. The first index refers to the neuron in question and the second index refers to the input of the synapse to which the weight refers. In general, the weights of an artificial neuron may lie in a range that includes negative as well as positive values.

2. An adder for summing the input signals xi weighted by the respective synaptic strengths wki. The operation described here constitutes a linear combiner.

3. An activation function f for limiting the amplitude of the output yk of a neuron.

Figure 7.1. Model of an artificial neuron.

The model of the neuron given in Figure 7.1 also includes an externally applied bias, denoted by bk. The bias has the effect of increasing or lowering the net input of the activation function, depending on whether it is positive or negative.

In mathematical terms, an artificial neuron is an abstract model of a natural neuron, and its processing capabilities are formalized using the following notation. First, there are several inputs xi, i = 1, … , m. Each input xi is multiplied by the corresponding weight wki, where k is the index of a given neuron in an ANN. The weights simulate the biological synaptic strengths in a natural neuron. The weighted sum of products xi wki for i = 1, … , m is usually denoted as net in the ANN literature:

netk = x1 wk1 + x2 wk2 + … + xm wkm + bk

Using the adopted notation wk0 = bk and the default input x0 = 1, a new uniform version of the net summation will be:

netk = x0 wk0 + x1 wk1 + x2 wk2 + … + xm wkm = Σ xi wki (i = 0, … , m)

The same sum can be expressed in vector notation as a scalar product of two (m + 1)-dimensional vectors:

netk = X · W

where

X = {x0, x1, x2, … , xm}
W = {wk0, wk1, wk2, … , wkm}

Finally, an artificial neuron computes the output yk as a certain function of the netk value:

yk = f(netk)

The function f is called the activation function. Various forms of activation functions can be defined. Some commonly used activation functions are given in Table 7.1.

TABLE 7.1. A Neuron’s Common Activation Functions (graph column omitted)

Activation Function            Input/Output Relation
Hard limit                     y = 1 if net ≥ 0; y = 0 if net < 0
Symmetrical hard limit         y = 1 if net ≥ 0; y = −1 if net < 0
Linear                         y = net
Saturating linear              y = net for 0 ≤ net ≤ 1; y = 0 if net < 0; y = 1 if net > 1
Symmetric saturating linear    y = net for −1 ≤ net ≤ 1; y = −1 if net < −1; y = 1 if net > 1
Log-sigmoid                    y = 1 / (1 + e^(−net))
Hyperbolic tangent sigmoid     y = (e^net − e^(−net)) / (e^net + e^(−net))
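As a compact, hypothetical illustration (the function names below are ours, not taken from the book or from any particular library), the input/output relations of Table 7.1 can be written as simple Python functions of the net value:

import math

def hard_limit(net):
    # 1 for net >= 0, 0 otherwise.
    return 1.0 if net >= 0 else 0.0

def symmetrical_hard_limit(net):
    # 1 for net >= 0, -1 otherwise.
    return 1.0 if net >= 0 else -1.0

def linear(net):
    # Identity: the output equals net.
    return net

def saturating_linear(net):
    # net clipped to the interval [0, 1].
    return max(0.0, min(1.0, net))

def symmetric_saturating_linear(net):
    # net clipped to the interval [-1, 1].
    return max(-1.0, min(1.0, net))

def log_sigmoid(net):
    # Squashes net into the interval (0, 1).
    return 1.0 / (1.0 + math.exp(-net))

def hyperbolic_tangent_sigmoid(net):
    # Squashes net into the interval (-1, 1); equivalent to math.tanh(net).
    return (math.exp(net) - math.exp(-net)) / (math.exp(net) + math.exp(-net))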

Now that we have introduced the basic components of an artificial neuron and its functionality, we can analyze all the processing phases in a single neuron. For example, for the neuron with three inputs and one output, the corresponding input values, weight factors, and bias are given in Figure 7.2a. It is necessary to find the output y for different activation functions such as symmetrical hard limit, saturating linear, and log-sigmoid; a sketch of these computations, with hypothetical values, follows the list below.

1. Symmetrical hard limit

2. Saturating linear

3. Log-sigmoid
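Since the numeric values for this example appear only in Figure 7.2a, the sketch below uses hypothetical inputs, weights, and bias; it shows only how the same net value produces different outputs y under the three activation functions listed above.

import math

# Hypothetical values standing in for those given in Figure 7.2a.
x = [1.0, 0.5, 0.2]     # inputs x1, x2, x3
w = [0.2, -0.6, 1.0]    # weights w1, w2, w3
b = 0.3                 # bias

net = sum(xi * wi for xi, wi in zip(x, w)) + b     # net = 0.4

y_hardlims = 1.0 if net >= 0 else -1.0             # symmetrical hard limit -> 1.0
y_satlin = max(0.0, min(1.0, net))                 # saturating linear      -> 0.4
y_logsig = 1.0 / (1.0 + math.exp(-net))            # log-sigmoid            -> about 0.60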

Figure 7.2. Examples of artificial neurons and their interconnections. (a) A single node; (b) three interconnected nodes.

The basic principles of computation for one node may be extended to an ANN with several nodes, even if they are in different layers, as given in Figure 7.2b. Suppose that for the given configuration of three nodes all bias values are equal to 0 and the activation functions for all nodes are symmetric saturating linear. What is the final output y3 from node 3?

The processing of input data is layered. In the first step, the neural network performs the computation for nodes 1 and 2, which are in the first layer.

Outputs y1 and y2 from the first-layer nodes then become the inputs for node 3 in the second layer; a sketch of the whole two-layer computation, with hypothetical weights, is given below.
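The weights in the following sketch are hypothetical (the actual values appear only in Figure 7.2b), but the computation follows the layered scheme just described: nodes 1 and 2 transform the network inputs, node 3 transforms y1 and y2, all biases are 0, and every node uses the symmetric saturating linear activation.

def sym_sat_lin(net):
    # Symmetric saturating linear: net clipped to the interval [-1, 1].
    return max(-1.0, min(1.0, net))

# Hypothetical network inputs and weights (not the values from Figure 7.2b).
x = [1.0, 0.5]
w1 = [0.6, -0.4]    # weights into node 1
w2 = [0.2, 0.9]     # weights into node 2
w3 = [1.0, -0.5]    # weights into node 3, applied to y1 and y2

# First layer: nodes 1 and 2 operate on the network inputs.
y1 = sym_sat_lin(sum(xi * wi for xi, wi in zip(x, w1)))    # 0.4
y2 = sym_sat_lin(sum(xi * wi for xi, wi in zip(x, w2)))    # 0.65

# Second layer: node 3 takes y1 and y2 as its inputs.
y3 = sym_sat_lin(y1 * w3[0] + y2 * w3[1])                  # 0.075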

As we can see from the previous examples, the processing steps at the node level are very simple. In highly connected networks of artificial neurons, however, the amount of computation grows with the number of nodes. The complexity of processing depends on the ANN architecture.

7.2 ARCHITECTURES OF ANNS

The architecture of an ANN is defined by the characteristics of a node and the characteristics of the node’s connectivity in the network. The basic characteristics of a single node have been given in a previous section, and in this section the parameters of connectivity will be introduced. Typically, network architecture is specified by the number of inputs to the network, the number of outputs, the total number of elementary nodes (which are usually equal to the processing elements for the entire network), and their organization and interconnections. Neural networks are generally classified into two categories
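As a minimal, hypothetical sketch of these architectural parameters (the layer sizes and the random initialization below are invented for illustration, not taken from the text), a layered feedforward network can be specified by the number of network inputs, the node count of each layer, and one weight matrix per layer describing the interconnections:

from dataclasses import dataclass, field
from typing import List
import random

@dataclass
class LayeredArchitecture:
    # Number of network inputs, node counts per layer (the last entry is the
    # number of network outputs), and one weight matrix per layer.
    num_inputs: int
    layer_sizes: List[int]
    weights: List[List[List[float]]] = field(default_factory=list)

    def initialize(self):
        # One weight matrix per layer; each node gets an extra weight that
        # absorbs its bias (wk0 = bk with the default input x0 = 1).
        sizes = [self.num_inputs] + self.layer_sizes
        self.weights = [
            [[random.uniform(-0.5, 0.5) for _ in range(n_in + 1)]
             for _ in range(n_out)]
            for n_in, n_out in zip(sizes[:-1], sizes[1:])
        ]

# Example: 4 inputs, a hidden layer of 3 nodes, and 1 output node.
arch = LayeredArchitecture(num_inputs=4, layer_sizes=[3, 1])
arch.initialize()
print(len(arch.weights), len(arch.weights[0]), len(arch.weights[0][0]))   # 2 3 5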
