
Modelling and Simulation of a Neuron and a Neuron Network using a Visual HDL

Kate R Worland, John D Zakis and Brian J Lithgow

ABSTRACT

This work used Summit Visual HDL for VHDL, a visual hardware description package which compiles flowcharts, state diagrams, truth tables and similar descriptions into synthesisable VHDL (VHSIC Hardware Description Language). We used this design environment to design and simulate a digital model of a single neuron. This neuron models the superficial input/output characteristics of a biological neuron, producing an action-potential-like waveform when the inputs to the cell sum to reach a pre-determined threshold. It was shown that by connecting a number of these digital neuron models in a network configuration with appropriate input weighting, it was possible to emulate the function of a simple feed-forward neural network. This finding is important because previous work has suggested that artificial neurones that are more biological in their function have the potential to form neuron networks that are more robust and noise-tolerant than conventional artificial neural networks.

INTRODUCTION

This project aimed to produce a model of a neuron that is ultimately implementable in silicon. The motivation for this is twofold. Firstly, a neuron chip has the potential to provide a more realistic and faster method for simulating the physiological properties of neurones than traditional modelling methods. Secondly, the development of such a model could be a useful new development in the field of artificial neural networks and neural computing. Thus, we need to have substantial grounding in two very broad areas, both of which are detailed and in-depth disciplines in their own right. The first is the field of neural modelling, that is, the quest to produce an accurate and useful model of a single neuron or a group of neurones in order to further understand the working of the nervous system. The second is the field of Artificial Neural Networks (ANNs), which are used in computation, often in research into artificial intelligence. This style of computing is based not on a sequential, algorithmic paradigm, but rather on a dynamic network of units that can be trained (more like a physiological system of neurones).

THE BIOLOGICAL NEURON

The neuron, as depicted in Figure 1, receives inputs from other cells at the dendrites in the form of excitatory or inhibitory post-synaptic potentials. These small, transient membrane depolarisations propagate passively to the cell body, where, if they sum to reach threshold, an action potential (shown in Figure 2) is generated [1]. This all-or-nothing electrical pulse is conducted at constant speed and amplitude down the axon to the terminal endings (synapses) that pass the signal to the target cells.

Figure 1. The biological neuron.

Figure 2. Action potential example. A typical action potential waveform recorded at a particular point on the axon over time. The initial membrane voltage of -70 mV is the resting membrane potential. Excitation causes the membrane potential to reach the threshold of -55 mV, resulting in sudden, massive depolarisation of the cell. Rapid repolarisation and a period of hyperpolarisation follow, during which the membrane voltage is actually more negative than the resting membrane potential. Over a period of a few milliseconds, the membrane voltage returns to its resting level. Such a pulse propagates along the axon, each depolarisation causing the neighbouring section of membrane to reach threshold. The refractory nature of the membrane ensures that the signal only travels in the forward direction.

THE MODEL

We took a functional, or black-box, approach to neuron modelling, with the model based on the fundamental input/output characteristics of a biological neuron. The nature of the model was influenced by the tools used to create it. The model is digital, with inputs and outputs (as well as internal signals) represented as eight-bit digital signals. Model behaviour is object oriented and concurrent; in other words, the design is made up of autonomous units representing summing units, weighting functions and an action potential generation unit.

Units perform their operation concurrently, with internal operations occurring only when there is a change in unit inputs. The model accepts a number of synaptic inputs, and produces an action-potential-like pulse output if the sum of the inputs at any particular time reaches threshold.

The model was implemented using Summit Visual HDL, a tool used to design, simulate and synthesise complex integrated circuit designs for implementation in silicon. Block diagrams, asynchronous state machines and VHDL code were used to implement the desired structure and behaviour of the model. The design is hierarchical, as depicted in Figure 3, which shows the layered nature of the Summit HDL design, with a top-level block diagram hiding the complex operation of individual components. System elements are described in terms of their behaviour as block diagrams, state diagrams and VHDL code. Signals take the form of eight-bit signed integers, scaled and shifted so as to represent a biological range of values. Like a biological neuron, the model adds a number of weighted inputs (four in this model), and outputs an action-potential-like waveform if the sum reaches threshold.

The top level of the design describes the neuron model in the barest manner, simply defining its input and output signals. The model is synchronous and hence requires a clock signal. There are four other inputs to the cell, representing synaptic connections to the neuron. The inputs can take any values, but were originally selected to allow two excitatory and two inhibitory inputs, grouped into two sets. These assignments are purely arbitrary: the signals do not have to take the form of post-synaptic potentials, and the weights can be set to any desired value. These four inputs are of a type defined in VHDL as std_logic_vector. The type std_logic is an enhanced binary signal that can, as well as logical zero and one, take other values such as 'U' (uninitialised), 'X' (unknown) and 'Z' (high impedance). These inputs are processed within a block called Soma, which produces an output sequence (also of type std_logic_vector) called to_axon. This represents the signal that is transmitted down the axon. The next layer adds the four input signals together and feeds them to the unit designed to generate an action potential.
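The paper does not reproduce its VHDL source. Purely as an illustration of the interface just described (a clock, four eight-bit synaptic inputs and the to_axon output), a top-level entity might be declared as follows; the port names are assumptions adapted from the input labels used later in Table 2.

```vhdl
library ieee;
use ieee.std_logic_1164.all;

-- Hypothetical top-level interface for the neuron model, reconstructed from
-- the description in the text: a clock, four eight-bit synaptic inputs and
-- an eight-bit output representing the signal sent down the axon.
entity neuron is
  port (
    clk        : in  std_logic;
    e_dendrite : in  std_logic_vector(7 downto 0);  -- excitatory input (dendritic synapse)
    i_dendrite : in  std_logic_vector(7 downto 0);  -- inhibitory input (dendritic synapse)
    e_cellbody : in  std_logic_vector(7 downto 0);  -- excitatory input (cell-body synapse)
    i_cellbody : in  std_logic_vector(7 downto 0);  -- inhibitory input (cell-body synapse)
    to_axon    : out std_logic_vector(7 downto 0)   -- action-potential-like output
  );
end entity neuron;
```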

Figure 3

The units adder1, adder2 and adder3 are basically identical eight-bit carry-lookahead adders. The next layer shows the implementation of a carry-lookahead adder (CLA). The block adder8bit_1 hides a (flat) unit of VHDL code. The blocks adder2 and adder3 in the block diagram Top.Soma.Sum contain identical units.

The action potential is generated in the ap_generate block and was implemented using a state diagram. State diagrams are a form of behavioural description available in the Summit Visual design environment. A state diagram details the states that the system may reside in, and the transitions between states. Transit between states is governed by particular guard conditions. For these reasons, state diagrams are a convenient means of representing sequential control.

MODEL VERIFICATION

It was shown that for simple sets of inputs (up to four excitatory or inhibitory input waveforms) the model behaved as desired. In the example shown, inputs consisted of a weighted excitatory and an inhibitory post-synaptic potential. In order to simplify calculations, signals were scaled so that the resting membrane potential (-70 mV in this model) was represented by the binary vector 0000 0000. Two's complement binary representation was used, hence the allowed range using an eight-bit std_logic_vector was -128 to +127. A resolution of 1 mV per scaled unit was chosen, hence the range of membrane potentials able to be represented was -198 mV to +57 mV. The maximum potential normally experienced by a nerve membrane is that generated during an action potential (+55 mV in the case of this model), hence this range is adequate, though not generous. The scale used is shown in Table 1.

Membrane potential (mV)   Shifted scale (8-bit)
 +57                       +127
   0                        +70
 -55                        +15
 -70                          0
-198                       -128

Table 1. The shifted scale column gives the actual output of the model in decimal notation, as signed eight-bit integer values.
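The ap_generate block was drawn as a Summit state diagram rather than written by hand. Purely as a sketch of the same idea, using the shifted scale of Table 1, a two-state process that watches for threshold (+15 on the shifted scale, i.e. -55 mV) and then plays out a stored pulse shape could look like the following; the state names, stored waveform values and pulse length are assumptions, not the authors' design.

```vhdl
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

-- Sketch of an action-potential generator in the spirit of the ap_generate
-- block: when the summed, shifted-scale membrane potential reaches threshold,
-- a fixed pulse shape is played out sample by sample; otherwise the summed
-- input is passed through unchanged.
entity ap_generate is
  port (
    clk     : in  std_logic;
    soma_in : in  std_logic_vector(7 downto 0);  -- summed, weighted inputs
    to_axon : out std_logic_vector(7 downto 0)
  );
end entity;

architecture behav of ap_generate is
  type state_t is (resting, firing);
  signal state : state_t := resting;
  -- Illustrative pulse shape only; the real model would store sampled
  -- action-potential values (peak +127 corresponds to +57 mV).
  type wave_t is array (0 to 7) of integer range -128 to 127;
  constant ap_wave : wave_t := (40, 90, 127, 60, -10, -25, -15, 0);
  signal idx : integer range 0 to 7 := 0;
begin
  process (clk)
  begin
    if rising_edge(clk) then
      case state is
        when resting =>
          if to_integer(signed(soma_in)) >= 15 then  -- threshold (-55 mV) reached
            state   <= firing;
            idx     <= 0;
            to_axon <= std_logic_vector(to_signed(ap_wave(0), 8));
          else
            to_axon <= soma_in;  -- sub-threshold: follow the summed input
          end if;
        when firing =>
          if idx = 7 then
            state   <= resting;  -- refractory behaviour omitted for brevity
            to_axon <= (others => '0');
          else
            idx     <= idx + 1;
            to_axon <= std_logic_vector(to_signed(ap_wave(idx + 1), 8));
          end if;
      end case;
    end if;
  end process;
end architecture;
```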

Signals typical of excitatory and inhibitory post-synaptic potentials were used to test the response of this artificial neuron to biological-like stimuli. These signals, obtained from the work of Curtis and Eccles [2], were sampled at 0.0625 ms intervals, scaled (as described above) so that RMP (-70 mV) was set to zero, and represented using eight-bit two's complement std_logic_vectors. The model was tested with various combinations of excitatory and inhibitory signals. In biological neurones, however, different synaptic connections have different synaptic strengths, related to (among other factors) the distance of the synapse from the cell body. We model this characteristic by introducing weighted signals, as shown in Table 2. These weights are implemented within the units weighting1 to weighting4 and take the form of a simple integer multiplication.

Input:    EDendritic   IDendritic   ECellBody   ICellBody
Signal:   epsp         ipsp         epsp        ipsp
Weight:   3            2            2

Table 2.
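Since the weighting units amount to an integer multiplication of the signed, shifted-scale signal, a minimal sketch of such a unit is given below; the generic weight, the interface and the saturation to the representable range are assumptions.

```vhdl
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

-- Sketch of one weighting unit (weighting1 .. weighting4 in the text):
-- a simple integer multiplication of the signed, shifted-scale input.
entity weighting is
  generic (weight : integer := 3);
  port (
    psp_in  : in  std_logic_vector(7 downto 0);
    psp_out : out std_logic_vector(7 downto 0)
  );
end entity;

architecture behav of weighting is
begin
  process (psp_in)
    variable product : integer;
  begin
    product := to_integer(signed(psp_in)) * weight;
    -- clamp to the representable shifted-scale range (-128 .. +127)
    if product > 127 then
      product := 127;
    elsif product < -128 then
      product := -128;
    end if;
    psp_out <= std_logic_vector(to_signed(product, 8));
  end process;
end architecture;
```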
As can be seen in Figure 4, reduction in the weight of the inhibitory signal means that the excitation dominates, allowing the signal S0 to reach threshold and an action potential to be initiated. The neuron model thus performs as designed: a summing unit with a threshold that produces a waveform closely resembling a biological action potential. Maass and Natschlager [3] suggest that networks of more biological cell models have the potential to be more robust than some conventional neural nets.

Figure 4. The weighted excitatory signal, weighted inhibitory signal, S0 and to_axon plotted (shifted scale) against time in ms. Max and Min represent the maximum and minimum membrane potentials able to be represented by the model. Threshold refers to the cell's chosen threshold of excitation. RMP stands for resting membrane potential, the membrane potential in the presence of no excitation. The time axis gives equivalent biological time; the model actually executes much more rapidly, with each point of the output corresponding to a single clock cycle of the simulation.
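For completeness, a minimal testbench sketch in the spirit of this verification step is shown below, driving sampled PSP values onto the hypothetical neuron interface sketched earlier, one sample per clock cycle (each cycle standing for 0.0625 ms of biological time). The clock period and the sample values are placeholders, not the Curtis and Eccles data actually used.

```vhdl
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

-- Minimal simulation testbench sketch: generate a clock and drive a
-- sampled excitatory PSP onto one synaptic input of the neuron model.
entity neuron_tb is
end entity;

architecture sim of neuron_tb is
  signal clk : std_logic := '0';
  signal e_in, i_in, unused1, unused2 : std_logic_vector(7 downto 0) := (others => '0');
  signal to_axon : std_logic_vector(7 downto 0);
  type sample_array is array (natural range <>) of integer;
  constant epsp : sample_array := (0, 4, 9, 12, 10, 7, 4, 2, 1, 0);  -- placeholder EPSP shape
begin
  clk <= not clk after 10 ns;  -- one clock cycle = one 0.0625 ms sample

  dut : entity work.neuron
    port map (clk, e_in, i_in, unused1, unused2, to_axon);

  stimulus : process
  begin
    for k in epsp'range loop
      e_in <= std_logic_vector(to_signed(epsp(k), 8));
      wait until rising_edge(clk);
    end loop;
    wait;
  end process;
end architecture;
```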

A NEURONE NETWORK

Having established that the model behaves like a biological neurone, the next step is to test the behaviour of a network of these artificial neurones. This was done by first formulating a simple network using traditional neural computing methods, and then attempting to realise a similar network using the model developed in this project. Three neurones were incorporated into a structure designed to emulate a simple feed-forward neural network, as shown in Figure 5. The conventional ANN (Artificial Neural Network), called OddNet, was trained to accept a four-bit binary input, providing an output of zero when the input vector is odd and one when the input vector is even.

Figure 5. Simple 4-bit in, 1-bit out functions could be implemented using this configuration, with a bit of information in a conventional neural net corresponding to an action potential in this neurone network.

Additional synapse blocks were included in order to transform an action potential output from one digital neurone into a post-synaptic potential input for the next. An extra offset input for each digital neurone was also required. Appropriate weights to implement simple functions, such as an odd-number detector or a prime-number detector, were obtained using conventional neural net modelling software.

The ANNs used in these simple networks differ from the neurones designed in this project in a number of obvious ways. Firstly, input to cells in OddNet consists of single-bit binary signals, whereas the inputs and outputs of the neurones described here are waveforms that vary over time. Secondly, the activation functions used also differ. Nonetheless, there are no immediately evident reasons why a network of similar weights and bias created using a more biological style of artificial neurone should not work.
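A purely illustrative structural sketch of such an arrangement is given below, reusing the hypothetical neuron interface sketched earlier. The synapse interface, the exact topology (two first-layer neurones each seeing the four primary inputs, feeding one output neurone) and the handling of the offset (bias) input are assumptions; the authors' Summit design is not reproduced.

```vhdl
library ieee;
use ieee.std_logic_1164.all;

-- Illustrative-only wiring for a three-neurone feed-forward arrangement in
-- the spirit of Figure 5: synapse blocks reshape each action-potential
-- output into a post-synaptic-potential-like input for the next layer.
entity oddnet_digital is
  port (
    clk                : in  std_logic;
    in1, in2, in3, in4 : in  std_logic_vector(7 downto 0);  -- four primary inputs
    bias               : in  std_logic_vector(7 downto 0);  -- extra offset input
    out4               : out std_logic_vector(7 downto 0)
  );
end entity;

architecture structural of oddnet_digital is
  -- hypothetical synapse block: action-potential pulse in, PSP-shaped signal out
  component synapse is
    port (clk     : in  std_logic;
          ap_in   : in  std_logic_vector(7 downto 0);
          psp_out : out std_logic_vector(7 downto 0));
  end component;
  signal ap0, ap1, psp0, psp1 : std_logic_vector(7 downto 0);
  constant zero : std_logic_vector(7 downto 0) := (others => '0');
begin
  -- first layer: two neurones each see the four primary inputs
  n0 : entity work.neuron port map (clk, in1, in2, in3, in4, ap0);
  n1 : entity work.neuron port map (clk, in1, in2, in3, in4, ap1);

  -- synapse blocks turn action potentials into PSP-shaped inputs
  s0 : synapse port map (clk, ap0, psp0);
  s1 : synapse port map (clk, ap1, psp1);

  -- output neurone sums the two PSPs and the offset signal
  n2 : entity work.neuron port map (clk, psp0, psp1, bias, zero, out4);
end architecture;
```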

The waveforms seen in Figure 6 are taken from the Summit visual simulation window. There are three input waveforms, displayed in hexadecimal format for each clock cycle. This produces an action potential, as required, for an odd number of inputs.

Figure 6. Simulation waveforms from the Summit visual simulation window.

The salient waveforms from the simulation window have been converted to a graphical representation using Excel and are shown in Figure 7. This is a much more intuitive representation, which now displays the shape of the action potential. The horizontal axis is in simulation clock cycles. If this neurone network were built in silicon, then 50 clock cycles might correspond to one microsecond using a very slow PLD.

Figure 7. The network signals (bias, S0-S2, S3 and Out4) plotted against simulation clock cycles.

In the nervous system, action potentials are all-or-nothing, invariant events. The wave shape of an action potential generally does not carry information. Rather, it is the rate of firing or the spatio-temporal patterns amongst groups of neurones that are important in coding information about the stimulus. Therefore it is possible that our neurone network could identify odd and even sequences with skewed arrival times as well as simultaneous inputs.

In showing that a network of digital neurones can emulate a simple neural net, we have quite an important result. In line with the findings of Maass and Natschlager [3], we can conclude that conventional neural nets can be made more biological without a discernible loss of function. The network demonstrated here models very closely the structure and method of execution of the net from which it was taken, taking into account the unique characteristics of the neurone used to implement it.

The interpretation of the operation of the neurone net was based on simple binary logic: either the net fired, or it didn't. This neglects an important factor regarding both biological neurone networks and, indeed, the network shown here: timing. It is almost certain that considerations of real time are important in the way signals are coded in the nervous system, be it in terms of firing rate or the timing of firing patterns [4]. Timing clearly has a role to play in the neurone net when trains of signals are being input. If we plan to create more realistic neurone nets, it is important to begin to think about how temporal characteristics might be taken into account.

A number of approaches to integrating temporal characteristics into neural networks have been tackled. It is possible to translate patterns of timing into spatial patterns by means of some kind of buffering and pre-processing. This has been shown to be effective, but is quite un-biological in its structure and assumptions [5]. Other options include using back connections (like the Hopfield net described by Maass and Natschlager [3]), allowing the network to create a signal sequence. Another method that has been used with some success is based on the biological phenomenon of long- and short-term post-synaptic potentials. In many neurones a number of kinds of post-synaptic potentials are observed, some persisting for a period of minutes rather than milliseconds. These are the long-term post-synaptic potentials. Kleinfeld and Sompolinsky (1989) developed a network model featuring the unique attribute of each connection between neurones being encoded as two connections, one with a much longer response time than the other [6]. This network was able to generate both linear and cyclic temporal patterns.

CONCLUSIONS

As far as can be ascertained, the idea of a synthesisable digital neurone is a novel one, and so this work is more in the nature of a preliminary qualitative exploration than an in-depth study. Future work should examine the temporal characteristics of both the single neurone model and the network.

The successful emulation of conventional neural nets using more biological neurones without a discernible loss of function is an important result. The software used is ideal for both simulation and synthesis; thus this work suggests the possibility of a more intuitively modular paradigm for neural computing, as well as future possibilities for intelligent hardware created using integrated circuit neurone networks.

The most exciting projection of this work is in the field of neurone networks. This and other recent work has shown that not only are models of biological neurones able to perform effectively using conventional neural computing paradigms, but they also have the potential to be more effective in terms of robustness and noise tolerance. As new, more organic methods of computing become more and more utilised, especially in the field of artificial intelligence, digital neurone networks offer the potential for an entirely new paradigm of computation: one using digital technology to realise methods of computation based on the structures of the human mind. Implementation of many innovative network schemes using digital neurones is well within the realms of possibility once the temporal behaviour of these neurones and networks of neurones is better understood.

REFERENCES

1. Horackova, M., Nonner, W. & Stampfli, R. (1968). Action potentials and voltage clamp currents of single rat Ranvier nodes. In XXIV International Congress on Physiological Science, p. 198.
2. Curtis & Eccles (1959). Repetitive synaptic activation. Journal of Physiology 148: 43-44.
3. Maass, W. & Natschlager, T. (1997). Networks of spiking neurons can emulate arbitrary Hopfield nets in temporal coding. Network: Computation in Neural Systems 8: 355-371.
4. Nicholls, J. G., Martin, R. A. & Wallace, B. G. (1992). From Neuron to Brain: A Cellular and Molecular Approach to the Function of the Nervous System, 3rd ed. Sinauer Associates, Inc.
5. Simpson, P. K. & Deitch, R. O. (1989). Neural networks, fuzzy logic and acoustic pattern generation. AAAIC 88.
6. Kleinfeld, D. & Sompolinsky, H. (1989). Associative network models for central pattern generators. In Koch, C. & Segev, I. (eds), Methods in Neuronal Modelling. Cambridge University Press, Chapter 7, pp. 195-246.
