
Radial Basis Function Functional Networks and Function Approximation*

Wang Dongdong
College of Basic Science
Qingdao Binhai University
Qingdao, P.R. China
dondonwang@126.com

Liang Li
College of Basic Science
Qingdao Binhai University
Qingdao, P.R. China
Liangli2002@163.com
Abstract—To solve function approximation problems, a mathematical model of Radial Basis Function Functional Networks (RBFFN) is proposed and a learning algorithm for function approximation is presented. The algorithm uses the least squares method, constructs an auxiliary function by the Lagrange multiplier method, and determines the parameters of the radial basis function functional networks by solving a system of linear equations. RBFFN is then generalized to approximate functions of two variables. Experimental results illustrate the effectiveness of the radial basis function functional networks in solving approximation problems for functions with a pole.

Index Terms—radial basis function, functional networks, learning algorithm, Lagrange multiplier method, function approximation
I. INTRODUCTION
In 1998, the Spanish scholar Enrique Castillo, inspired by functional equations in engineering applications, put forward Functional Networks (FN) [1]. Functional networks are a general extension of artificial neural networks. Unlike neural networks, they deal with general functional models instead of sigmoid-like ones, and there are no weights associated with the links connecting neurons. In this model, the neuron functions are not fixed but can be learned. Usually, the neuron functions are approximated by basis functions (such as polynomials, trigonometric functions, etc.) or linear combinations of basis functions. Appropriate neuron functions can be chosen using prior knowledge of the function being approximated, which makes the choice of neuron functions more flexible. Functional networks have been used successfully in many fields, including nonlinear system identification [2], chaotic time series prediction [3], solving differential equations, difference equations, and functional equations [4], CAD, and linear and nonlinear regression [5]. These successful applications show that functional networks can indeed solve many computational problems [6-12]. Functional networks are an effective extension of neural networks: they can solve not only the problems that neural networks can solve, but also problems that neural networks cannot [13-16].
As noted in [17], if a function f(x) is unbounded in the neighborhood of a point x_0, or f(x) → A as x → x_0 (A a constant), it is not easy to approximate it with polynomials, and radial basis function approximation is more appropriate [18-22]. We therefore design a mathematical model of Radial Basis Function Functional Networks (RBFFN) that takes this special kind of function approximation as its application background and combines it with the topological structure of general functional networks. The output of RBFFN is y_i = f(x_i)/g(x_i), where f(x_i) and g(x_i) are linear combinations of radial basis functions, and the radial basis functions can be given arbitrarily, for example, polynomials, trigonometric functions, Fourier expansions, and so on. In this paper, we use the least squares method, construct an auxiliary function by the Lagrange multiplier method, and determine the parameters of the radial basis function functional networks by solving a system of linear equations. Experimental results illustrate the effectiveness of the radial basis function functional networks in solving approximation problems for functions with a pole.
II. THE MODEL OF RBFFN
A. Topological Structure of General Functional Networks
Functional networks are a generalization of neural networks that combine knowledge about the structure of the problem, to determine the architecture of the network, with data, to estimate the unknown neuron functions [5]. As can be seen in Fig. 1, a functional network consists of:
1) An input unit layer. The input unit layer is {x_i} in Fig. 1, and its function is to input information.
2) One or several layers of functional neurons. Every functional neuron is a computing unit, which produces output by processing one or more inputs. In Fig. 1, there are two layers of functional computing units, {f_1, f_2} and {f_3}.
3) Several intermediate storage unit layers. These layers store the information produced by the functional neurons. In Fig. 1, there is only one intermediate storage unit layer, {x_{i1}, x_{i2}}.
4) One output layer. It contains the output information, {y_i} in Fig. 1.
5) A set of directed links. Arrows connect the input layer, intermediate layers, and output layer, and the direction of an arrow shows the direction of information transmission in Fig. 1.
* This work is supported by the National Science Foundation of China (61165015) and the National Science Foundation of Qingdao Binhai University (2011K10).
Figure 1. A single-input, single-output functional network topological structure
B. The Model of RBFFN
Functional networks are an effective extension of neural networks and, like neural networks, have various structures. We therefore cannot give one uniform structure, or one uniform functional equation, that describes all functional networks. According to this characteristic of functional networks, and guided by prior knowledge of practical problems, we have designed the structure of the radial basis function functional networks, as shown in Fig. 2.
Figure 2. RBFFN for approximating a function of one variable
The output of RBFFN is y_i = f(x_i)/g(x_i), where f(x_i) and g(x_i) are linear combinations of radial basis functions. In other words,

f(x_i) = a_1 φ_1(x_i) + a_2 φ_2(x_i) + … + a_m φ_m(x_i) = Σ_{j=1}^{m} a_j φ_j(x_i),   (1)

g(x_i) = b_1 ψ_1(x_i) + b_2 ψ_2(x_i) + … + b_n ψ_n(x_i) = Σ_{k=1}^{n} b_k ψ_k(x_i).   (2)

So the output of RBFFN is

y_i = f(x_i)/g(x_i) = [a_1 φ_1(x_i) + … + a_m φ_m(x_i)] / [b_1 ψ_1(x_i) + … + b_n ψ_n(x_i)] = Σ_{j=1}^{m} a_j φ_j(x_i) / Σ_{k=1}^{n} b_k ψ_k(x_i),   (3)

where {φ_j | j = 1, 2, …, m} and {ψ_k | k = 1, 2, …, n} are arbitrarily given radial basis functions, and a_j (j = 1, 2, …, m) and b_k (k = 1, 2, …, n) are the parameters of the radial basis function functional networks.
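The rational form (3) is straightforward to evaluate once the basis sets and parameters are fixed. A minimal sketch (assuming numpy; the function name and the coefficient values are ours, chosen only for illustration):

```python
import numpy as np

def rbffn_output(x, phis, a, psis, b):
    """Evaluate (3): y = sum_j a_j*phi_j(x) / sum_k b_k*psi_k(x)."""
    f = sum(aj * phi(x) for aj, phi in zip(a, phis))   # numerator f(x)
    g = sum(bk * psi(x) for bk, psi in zip(b, psis))   # denominator g(x)
    return f / g

# Basis sets {1, sin x, cos x} for both f and g, as in model 1 of Table II;
# the coefficients below are illustrative, not fitted values.
phis = [lambda t: np.ones_like(t), np.sin, np.cos]
psis = [lambda t: np.ones_like(t), np.sin, np.cos]
x = np.linspace(0.1, 3.0, 5)
y = rbffn_output(x, phis, [1.0, 0.5, 0.2], psis, [1.0, 0.1, 0.3])
```

Because g(x) appears in the denominator, the chosen b_k must keep g nonzero on the interval of interest (here g(x) = 1 + 0.1 sin x + 0.3 cos x ≥ 0.6).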
III. LEARNING ALGORITHM OF RBFFN
Like neural networks, radial basis function functional networks also need to learn. However, they do not learn weights; they learn the structure and parameters of the network. Once the structure of the radial basis function functional networks has been determined, the next step is to learn the neuron functions, and the objective of learning is to find the exact or approximate expressions of the neuron functions. Usually, there are two methods for learning the neuron functions: exact learning and approximate learning. We often use the approximate learning method, which evaluates the neuron functions based on the training data [20]. The basic approach is to find a linear combination of radial basis functions and to optimize the parameters of that linear combination. In this paper, we use the least squares method and the Lagrange multiplier method to solve for the parameters of RBFFN.
In Fig. 2, the neuron functions of the radial basis function functional networks are linear combinations of arbitrarily given radial basis functions, as in (1) and (2). So the output of RBFFN is ŷ_i = f(x_i)/g(x_i), and the error cost function can be defined as

e_i = y_i − ŷ_i = y_i − f(x_i)/g(x_i).   (4)

In order to find the optimal network parameters, we need to minimize the error sum of squares

E_0 = Σ_{i=1}^{N} e_i² = Σ_{i=1}^{N} (y_i − ŷ_i)²,   (5)

where N is the number of training data.

In order to ensure the uniqueness of the network, we set the initial values of the network to

f(x_0) = α, g(x_0) = β,   (6)

where x_0 is an arbitrary initial value, and α and β are arbitrary constants with β ≠ 0.
Thus, using the Lagrange multiplier method, we construct the auxiliary function

E = E_0 + λ_1 (Σ_{j=1}^{m} a_j φ_j(x_0) − α) + λ_2 (Σ_{k=1}^{n} b_k ψ_k(x_0) − β).   (7)

Computing the partial derivatives of (7) with respect to a_j, b_k, λ_1 and λ_2, and setting them to zero, we get

∂E/∂a_j = −2 Σ_{i=1}^{N} [y_i − f(x_i)/g(x_i)] φ_j(x_i)/g(x_i) + λ_1 φ_j(x_0) = 0,  j = 1, 2, …, m,
∂E/∂b_k = 2 Σ_{i=1}^{N} [y_i − f(x_i)/g(x_i)] f(x_i) ψ_k(x_i)/g(x_i)² + λ_2 ψ_k(x_0) = 0,  k = 1, 2, …, n,
∂E/∂λ_1 = Σ_{j=1}^{m} a_j φ_j(x_0) − α = 0,
∂E/∂λ_2 = Σ_{k=1}^{n} b_k ψ_k(x_0) − β = 0,   (8)

with f(x_i) and g(x_i) as in (1) and (2).
For arbitrarily given radial basis functions {φ_j | j = 1, 2, …, m} and {ψ_k | k = 1, 2, …, n}, the parameters a_j (j = 1, 2, …, m) and b_k (k = 1, 2, …, n) of the radial basis function functional networks are obtained by solving (8).
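As a concrete illustration of the learning step, here is a minimal numerical sketch (assuming numpy; the function names are ours, not the paper's). Instead of the nonlinear residual y_i − f(x_i)/g(x_i), it minimizes the common linearization y_i·g(x_i) − f(x_i), which keeps the stationarity conditions linear in a and b, and it enforces the normalization (6) with Lagrange multipliers, yielding one KKT system of linear equations:

```python
import numpy as np

def fit_rbffn(x, y, phis, psis, x0, alpha, beta):
    """Least-squares fit of (1)-(2) under the constraints f(x0)=alpha, g(x0)=beta."""
    Phi = np.column_stack([p(x) for p in phis])      # N x m, phi_j(x_i)
    Psi = np.column_stack([q(x) for q in psis])      # N x n, psi_k(x_i)
    m, n = Phi.shape[1], Psi.shape[1]
    # Linearized residual r = y*g(x) - f(x) = M @ [a; b]
    M = np.hstack([-Phi, y[:, None] * Psi])
    # Constraint rows: f(x0) = alpha and g(x0) = beta
    C = np.zeros((2, m + n))
    C[0, :m] = [float(p(np.asarray(x0))) for p in phis]
    C[1, m:] = [float(q(np.asarray(x0))) for q in psis]
    # KKT system of the Lagrangian ||M p||^2 + lambda^T (C p - c)
    K = np.block([[2.0 * M.T @ M, C.T],
                  [C, np.zeros((2, 2))]])
    rhs = np.concatenate([np.zeros(m + n), [alpha, beta]])
    sol = np.linalg.lstsq(K, rhs, rcond=None)[0]     # robust if K is singular
    return sol[:m], sol[m:m + n]                     # a, b

# Recover a known rational model y = (1 + 0.5 sin x)/(2 + cos x) with basis
# {1, sin x, cos x}; choosing alpha = y(x0), beta = 1 keeps the constraints
# consistent with the scale ambiguity of the ratio f/g.
basis = [lambda t: np.ones_like(t), np.sin, np.cos]
x = np.linspace(0.1, 3.0, 30)
y = (1.0 + 0.5 * np.sin(x)) / (2.0 + np.cos(x))
a, b = fit_rbffn(x, y, basis, basis, x0=x[0], alpha=y[0], beta=1.0)
y_hat = sum(aj * p(x) for aj, p in zip(a, basis)) / \
        sum(bk * q(x) for bk, q in zip(b, basis))
```

On data generated from a rational model that the basis can represent exactly, the fitted ratio reproduces the targets up to numerical precision.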
IV. GENERALIZATION OF RBFFN
The RBFFN for approximating a function of one variable was given in Part II, and its learning algorithm in Part III. In order to generalize RBFFN, in this part we give the model for approximating a function of two variables, together with its learning algorithm.
The model of RBFFN for approximating a function of two variables is shown in Fig. 3.
Figure 3. RBFFN for approximating a function of two variables
The output of RBFFN is

y_i = f(x_{1i})/g(x_{2i}) = [a_1 φ_1(x_{1i}) + … + a_m φ_m(x_{1i})] / [b_1 ψ_1(x_{2i}) + … + b_n ψ_n(x_{2i})] = Σ_{j=1}^{m} a_j φ_j(x_{1i}) / Σ_{k=1}^{n} b_k ψ_k(x_{2i}).

In order to ensure the uniqueness of the network, we set the initial values of the network to

f(x_{10}) = α, g(x_{20}) = β,

where x_{10} and x_{20} are arbitrary initial values, and α and β are arbitrary constants with β ≠ 0.
We construct the auxiliary function

E = E_0 + λ_1 (Σ_{j=1}^{m} a_j φ_j(x_{10}) − α) + λ_2 (Σ_{k=1}^{n} b_k ψ_k(x_{20}) − β).   (9)

Computing the partial derivatives of (9) with respect to a_j, b_k, λ_1 and λ_2, and setting them to zero, we get

∂E/∂a_j = −2 Σ_{i=1}^{N} [y_i − f(x_{1i})/g(x_{2i})] φ_j(x_{1i})/g(x_{2i}) + λ_1 φ_j(x_{10}) = 0,  j = 1, 2, …, m,
∂E/∂b_k = 2 Σ_{i=1}^{N} [y_i − f(x_{1i})/g(x_{2i})] f(x_{1i}) ψ_k(x_{2i})/g(x_{2i})² + λ_2 ψ_k(x_{20}) = 0,  k = 1, 2, …, n,
∂E/∂λ_1 = Σ_{j=1}^{m} a_j φ_j(x_{10}) − α = 0,
∂E/∂λ_2 = Σ_{k=1}^{n} b_k ψ_k(x_{20}) − β = 0.   (10)
In the same way as in Part III, we can obtain the parameters a_j (j = 1, 2, …, m) and b_k (k = 1, 2, …, n) of the radial basis function functional networks by solving (10).
V. EXPERIMENTAL RESULTS
In order to verify the effectiveness and efficiency of the radial basis function functional networks, we conduct experiments on three examples. To estimate the approximation performance, the Root Mean Square Error (RMSE) is defined as

RMSE = √( (1/N) Σ_{i=1}^{N} (y_i − ŷ_i)² ).
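The RMSE above can be computed as, for instance (a sketch assuming numpy):

```python
import numpy as np

def rmse(y, y_hat):
    """Root mean square error between targets y and predictions y_hat."""
    y, y_hat = np.asarray(y), np.asarray(y_hat)
    return np.sqrt(np.mean((y - y_hat) ** 2))
```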
The experimental simulation platform: operating system: Windows 7; CPU: Core i3-370; frequency: 2.40 GHz; RAM: 2 GB; integrated development environment: Matlab 7.0.
Example 1: y = cos(sin x), x ∈ [0, π].
The training data are given randomly in Table I.
TABLE I. TRAINING DATA
(x_i, y_i)           (x_i, y_i)
(1.9005, 0.5848)     (0.5350, 0.8728)
(2.0719, 0.6394)     (1.6952, 0.5468)
(0.5761, 0.8553)     (1.9585, 0.6012)
(1.9998, 0.6142)     (2.1548, 0.6717)
(2.1279, 0.6609)     (2.4475, 0.8023)
(2.7546, 0.9296)     (0.9654, 0.6806)
(0.0405, 0.9992)     (2.9112, 0.9740)
(0.9752, 0.6765)     (2.1323, 0.6626)
(0.6725, 0.8122)     (2.3676, 0.7655)
(1.8916, 0.5825)     (2.0723, 0.6396)
According to the basic idea of this algorithm, we use different radial basis functions to approximate the neuron functions based on the model in Fig. 2. The experimental results are in Table II.
TABLE II. EXPERIMENTAL RESULTS
Model   φ_j, j = 1, 2, 3     ψ_k, k = 1, 2, 3     RMSE
1       {1, sin x, cos x}    {1, sin x, cos x}    0
2       {1, x}               {1, x, x²}           1.2575×10⁻⁷
3       {1, x, sin x}        {1, cos x, eˣ}       2.0012×10⁻⁹
The RMSE of model 1 is the smallest, and the approximating function is shown in Fig. 4.
Figure 4. Objective function and approximating function
Example 2: y = 1/sin x, x ∈ (0, 1).
The training data are given randomly in Table III.
TABLE III. TRAINING DATA
(x_i, y_i)           (x_i, y_i)
(0.4235, 2.4334)     (0.6405, 1.6733)
(0.5155, 2.0285)     (0.2091, 4.8181)
(0.3340, 3.0508)     (0.3798, 2.6972)
(0.4329, 2.3837)     (0.7833, 1.4171)
(0.2259, 4.4636)     (0.6808, 1.5887)
(0.5798, 1.8253)     (0.4611, 2.2475)
(0.7604, 1.4510)     (0.5678, 1.8594)
(0.5298, 1.9787)     (0.7942, 1.4019)
(0.4692, 2.2114)     (0.9883, 1.1974)
(0.0648, 15.4474)    (0.5828, 1.8170)
According to the basic idea of this algorithm, we use different radial basis functions to approximate the neuron functions based on the model in Fig. 2. The experimental results are in Table IV.
The RMSE of model 3 is the smallest, and the approximating function is shown in Fig. 5.
TABLE IV. EXPERIMENTAL RESULTS
Model   φ_j, j = 1, 2, 3            ψ_k, k = 1, 2, 3     RMSE
1       {1, sin x, cos x}           {1, sin x, cos x}    2.7601×10⁻⁹
2       {1, x}                      {1, x, x²}           1.0750×10⁻⁷
3       {1, log(2+x), log(3+x)}     {1, log(2+x)}        1.0010×10⁻¹²
Figure 5. Objective function and approximating function
Example 3: The Hénon function is x_n = 1 − 1.4 x_{n−1}² + 0.3 x_{n−2}.
The results of approximation based on the Castillo method are shown in Table V, with initial values x_0 = 0.5 and x_1 = 0.5 [23].
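For this example the training data come from iterating the map; a small sketch of how such a training set can be generated (plain Python; the initial values follow the text, everything else is our illustration):

```python
def henon_series(x0, x1, length):
    """Iterate x_n = 1 - 1.4*x_{n-1}**2 + 0.3*x_{n-2} from (x0, x1)."""
    xs = [x0, x1]
    while len(xs) < length:
        xs.append(1.0 - 1.4 * xs[-1] ** 2 + 0.3 * xs[-2])
    return xs

xs = henon_series(0.5, 0.5, 100)
# Two-variable training samples for the model of Fig. 3:
# inputs (x_{1i}, x_{2i}) = (x_{n-1}, x_{n-2}), target y_i = x_n.
samples = [((xs[n - 1], xs[n - 2]), xs[n]) for n in range(2, len(xs))]
```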
TABLE V. EXPERIMENTAL RESULTS BASED ON CASTILLO
Model   f_j, j = 1, 2                                  f_3       RMSE
1       {1, x, x²}                                               0
2       {sin x, sin 2x, cos x, cos 2x}                 {1, x}    0.00593219
3       {sin x, sin 2x, cos x, cos 2x}                           0.00337892
4       {1, log(2+x), log(3+x), log(4+x), log(5+x)}              0.00001598
According to the basic idea of this algorithm, we use different radial basis functions to approximate the neuron functions based on the model in Fig. 3. The experimental results are in Table VI.
TABLE VI. EXPERIMENTAL RESULTS
Model   φ_j, j = 1, 2, 3       ψ_k, k = 1, 2, 3      RMSE
1       {1, x, x², x³}         {1, x, x²}            1.1061×10⁻⁶
2       {1, x, 2x²−1, x³}      {1, sin x, cos x²}    1.0931×10⁻⁷
3       {1, sin x, cos x}      {1, log(2+x)}         1.9016×10⁻⁶
Although none of the RMSE values reaches zero, they are all better than those of models 2 to 4 of the Castillo method. The RMSE of model 2 is the smallest, and the approximating function is shown in Fig. 6.
Figure 6. Objective function and approximating function
VI. CONCLUSIONS
In this paper, a mathematical model of RBFFN is proposed and a learning algorithm for function approximation is presented. The parameters of the network are determined by solving a system of linear equations. Experimental results illustrate the effectiveness of the radial basis function functional networks in solving approximation problems for functions with a pole. However, because functional networks are so flexible, it is difficult to give a quantitative description of functional network approximation. Many types of functional networks can be chosen for the same problem, and they cannot be expressed by one uniform structure. But if the choice of the functional network structure and the number of neuron functions is combined with prior knowledge of the original problem, the approximation speed and accuracy will be improved.
ACKNOWLEDGMENT
Thanks to Zhou Yongquan, of Guangxi University for Nationalities, Nanning, who gave us great help with the implementation of the learning algorithm.
REFERENCES
[1] E. Castillo, "Functional networks," Neural Processing Letters, Vol. 7, pp. 151-159, July 1998.
[2] Li Chunguang, Liao Xianfeng, He Songbai and Yu Juebang, "Functional network method for the identification of nonlinear systems," Systems Engineering and Electronics, Vol. 23, No. 11, pp. 50-53, November 2001.
[3] E. Castillo and J. M. Gutierrez, "Nonlinear time series modeling and prediction using functional networks. Extracting information masked by chaos," Physics Letters A, Vol. 244, No. 2, pp. 71-84, February 1998.
[4] E. Castillo, A. Cobo and J. M. Gutierrez, "Working with differential, functional and difference equations using functional networks," Applied Mathematical Modelling, Vol. 23, pp. 89-107, February 1999.
[5] E. Castillo, A. Cobo and J. M. Gutierrez, Functional networks with applications, Kluwer Academic Publishers, Boston, 1999.
[6] P. Shankar, R. K. Yedavalli and J. J. Burken, "Self-organizing radial basis function networks for adaptive flight control," Journal of Guidance, Control, and Dynamics, Vol. 34, No. 3, pp. 783-794, May-June 2011.
[7] Wang Renhong and Zhu Gongqin, Rational function approximation and applications, Science Publishers, Beijing, 2002.
[8] J. R. Pontes, A. P. D. Paiva, P. P. Balestrassi, J. R. Ferreira and M. B. D. Silva, "Optimization of radial basis function neural network employed for prediction of surface roughness in hard turning process using Taguchi's orthogonal arrays," Expert Systems with Applications, Vol. 39, No. 9, pp. 7776-7787, July 2011.
[9] M. Jalili and M. G. Knyazeva, "EEG-based functional networks in schizophrenia," Computers in Biology and Medicine, Vol. 41, No. 12, pp. 1178-1186, December 2011.
[10] A. Khouki, M. Oloso, M. Elshafei, A. Abdulraheem and A. Al-Majed, "Support vector regression and functional networks for viscosity and gas/oil ratio curves estimation," International Journal of Computational Intelligence and Applications, Vol. 10, No. 3, pp. 269-293, September 2011.
[11] A. A. Schuppert, "Efficient reengineering of meso-scale topologies for functional networks in biomedical applications," Journal of Mathematics in Industry, Vol. 1, No. 1, pp. 1-20, 2011.
[12] H. U. Voss, L. A. Heier and N. D. Schiff, "Multimodal imaging of recovery of functional networks associated with reversal of paradoxical herniation after cranioplasty," Clinical Imaging, Vol. 35, No. 4, pp. 253-258, August 2011.
[13] E. A. El-Sebakhy, "Functional networks as a novel data mining paradigm in forecasting software development efforts," Expert Systems with Applications, Vol. 38, No. 3, pp. 2187-2194, March 2011.
[14] N. T. Tai and K. K. Ahn, "A hysteresis functional link artificial neural network for identification and model predictive control of SMA actuator," Journal of Process Control, Vol. 22, No. 4, pp. 766-777, April 2012.
[15] S. Tomasiello, "A functional network to predict fresh and hardened properties of self-compacting concretes," International Journal for Numerical Methods in Biomedical Engineering, Vol. 27, No. 6, pp. 840-847, June 2011.
[16] S. Y. S. Leung, Tang Yang and W. K. Wong, "A hybrid particle swarm optimization and its application in neural networks," Expert Systems with Applications, Vol. 39, No. 1, pp. 395-405, January 2012.
[17] K. S. Narendra and K. Parthasarathy, "Identification and control of dynamical systems using neural networks," IEEE Transactions on Neural Networks, Vol. 1, No. 1, pp. 4-27, 1990.
[18] H. Pomares, I. Rojas, M. Awad and O. Valenzuela, "An enhanced clustering function approximation technique for a radial basis function neural network," Mathematical and Computer Modelling, Vol. 55, No. 3-4, pp. 286-302, February 2012.
[19] L. J. Herrera, H. Pomares, I. Rojas, A. Guillen and O. Valenzuela, "The TaSe-NF model for function approximation problems: Approaching local and global modelling," Fuzzy Sets and Systems, Vol. 171, No. 1, pp. 1-21, May 2011.
[20] E. F. Arruda, M. D. Fragoso and J. B. R. Do Val, "Approximate dynamic programming via direct search in the space of value function approximations," European Journal of Operational Research, Vol. 211, No. 2, pp. 343-351, June 2011.
[21] Zhou Yongquan and Jiao Licheng, "One-variable interpolation function based on functional networks," International Journal of Information Technology, Vol. 12, No. 2, pp. 120-129, May 2006.
[22] Zhou Yongquan and Jiao Licheng, "Approximate factorization learning algorithm of multivariate polynomials based on functional networks," Journal of Information and Computational Science, Vol. 2, No. 1, pp. 205-210, June 2005.
[23] Zhou Yongquan, Lv Yongmei and Shen Yun, "Orthogonal function network approximate theory and learning algorithm," Computer Science, Vol. 36, No. 1, pp. 138-141, January 2009.
