
ARTICLE

pubs.acs.org/jchemeduc

Periodic Table of the Elements in the Perspective of Artificial Neural Networks

Maurício R. Lemes* and Arnaldo Dal Pino

Faculdade Anhanguera de Taubaté, Engenharia, Av. Charles Schnneider, 585, Parque Senhor Bonfim, Taubaté, São Paulo 12062-350, Brazil

Instituto Tecnológico de Aeronáutica, Praça Marechal do Ar Eduardo Gomes, 50, Vila das Acácias, São José dos Campos, SP 12228-900, Brazil
ABSTRACT: Although several chemical elements were not known by the end of the 19th
century, Mendeleev came up with an astonishing achievement, the periodic table of the
elements. He was not only able to predict the existence of (then) new elements, but also
to provide accurate estimates of their chemical and physical properties. This is a profound
example of human intelligence. Here, we try to shed some light on the following
question: Can an artificial intelligence system yield a classification of the elements that
resembles, in some sense, the periodic table? To achieve our goal, we have used a self-organized map (SOM) with information available at Mendeleev's time. Our results show
that similar elements tend to form individual clusters. Thus, although the SOM generates
clusters of halogens, alkali metals, and transition metals that show a similarity with the
periodic table of the elements, the SOM did not achieve the sophistication that Mendeleev achieved.
KEYWORDS: General Public, Graduate Education/Research, Interdisciplinary/Multidisciplinary, Physical Chemistry, Computer-Based Learning, Atomic Properties/Structure, Chemometrics, Periodicity/Periodic Table, Physical Properties

In 1869 Mendeleev1 presented the periodic law of the elements
to the scientific community. Mendeleev knew of the existence
and some properties of about 60 elements. For the vast majority
of these elements, his knowledge was restricted to atomic weight,
reaction of the element with oxygen, atomic radius, and melting
point.2 He had so much confidence in his discovery that he left
empty positions in his table. These spaces were dedicated to
those elements that, according to him, would still have to be
discovered. If one takes into consideration the limited information available, the table developed by Mendeleev deserves the
greatest admiration.
At that time, scientists knew nothing about the atomic
structure and atomic numbers that are used in the organization
of the elements in the current table. Over 40 years later, in 1913,
Moseley established the concept of atomic number.3 This discovery, however, provoked only minor rearrangements in the
classification of the elements created by Mendeleev. Possibly the
biggest triumph of the periodic table of the elements was to
foresee the existence and properties of elements unknown at the
time. For example, Mendeleev not only claimed the existence of
the element eka-silicon, today known as germanium, but also
inferred its properties and its reactions with chlorine and oxygen
with considerable precision.
The periodic table identifies similarities between two or more
elements and arranges them in the format of periods and
groups. The intervals at which these similarities repeat are consistently related to the atomic number. In the table, the elements
are arranged horizontally, in numerical sequence according to
their atomic numbers, thus giving rise to seven
horizontal lines (or periods). Each period, with the exception of
the first one, starts with a metal and finishes with a noble gas. The
Copyright © 2011 American Chemical Society and
Division of Chemical Education, Inc.

length of a period differs, ranging from a minimum of 2 elements
to a maximum of 32. The vertical lines are formed by elements
whose external electronic structures are similar. These columns
are called groups. In some of them, the elements are so closely
related that they are called families. For example, group 2 is the
family of alkaline earth metals (beryllium, magnesium, calcium,
strontium, barium, and radium).
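As a quick illustration of the period structure just described, the period of an element can be recovered from the cumulative period lengths. The sketch below uses the modern table's period boundaries; the function name and interface are ours, not the article's.

```python
from itertools import accumulate

# Lengths of the seven periods in the modern table.
PERIOD_LENGTHS = [2, 8, 8, 18, 18, 32, 32]
# Cumulative upper bounds of each period: 2, 10, 18, 36, 54, 86, 118.
PERIOD_ENDS = list(accumulate(PERIOD_LENGTHS))

def period_of(atomic_number: int) -> int:
    """Return the period (horizontal row) that contains the given atomic number."""
    if not 1 <= atomic_number <= PERIOD_ENDS[-1]:
        raise ValueError("atomic number out of range")
    for period, end in enumerate(PERIOD_ENDS, start=1):
        if atomic_number <= end:
            return period
    raise AssertionError("unreachable")

# Hydrogen (Z = 1) falls in period 1, sodium (Z = 11) in period 3,
# and gold (Z = 79) in period 6.
```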
Such great success of human intelligence yields a fertile field
for exploring the capacity of artificial intelligence systems to
produce similar results. Kohonen networks,4 self-organized
maps, and other techniques have been commonly used in
classification efforts, such as in silicon clusters, spectrometry,
modeling, optimization, chemical problems,5-11 and others.12-15
The goal of this article is to investigate the capacity of an
artificial intelligence system to classify chemical elements. To this
end, a Kohonen network (KN) is supplied with the information
known by the end of the 19th century. The KN is, therefore, fed
with knowledge similar to that available to Mendeleev. We
show that the 8 × 8 KN places the elements in such a way that it
obeys many properties presented in the original periodic table.
Such a fact reinforces the efficiency of the method. We also show
that some elements are so similar that they share the same cell.

KOHONEN NEURAL NETWORK


Neural networks were originally developed16 in the 1940s by
the neurophysiologist Warren McCulloch of Massachusetts
Institute of Technology and the mathematician Walter Pitts of
the University of Illinois. They proposed a simple model of the
Published: September 16, 2011

dx.doi.org/10.1021/ed100779v | J. Chem. Educ. 2011, 88, 1511-1514


Figure 1. Schematic of a Kohonen network; input data are represented
by the black circles, with the solid lines representing possible pathways to
the network, and the processing represented by the dotted lines on the
5 × 4 grid.


Figure 3. The neuron with the strongest response captures the letter.
On the left, the first letter (E) has been previously captured and the letter
B is being captured. On the right, the group of letters has been organized;
note that uppercase and lowercase letters are near each other, and
vowels and consonants are near each other.

Table 1. The 69 Elements Used in This Work

Figure 2. Uppercase and lowercase letters, vowels and consonants
make up the group to be classified. The neuron in the KN that responds
most strongly to an object wins it (neighboring cells are affected).

neuron that revealed itself as a powerful computing device and
proved that a synchronized arrangement of these neurons is
capable, in principle, of universal computation. Thus, an artificial
neural network can perform any calculation that an ordinary
computer can, using a model that is based on the human brain.
An artificial neural network (ANN) is composed of several
processing units whose individual functioning is simple. These
units are connected by communication channels that are associated with certain weights. The units operate on their local data,
which are the inputs received through their connections. The intelligent
behavior of an ANN is a global effect that emerges from interactions
among the processing units of the network. There are two types
of ANN according to the learning scheme: supervised and
unsupervised. In this work, a well-known type of unsupervised
learning network called the Kohonen network (KN)17 is used.
The KNs are formed by a set of simple elements organized into
more complex structures that work together. Each neuron is a
processing unit that receives stimuli (from outside the system
or from other neurons) and produces a response (to other
neurons or to outside the system). Similar to the structure of the
brain, the neurons of the network are interconnected by
branches through which the stimuli propagate. The learning
process consists of strengthening the links that lead the system to
produce more efficient responses. The goal of a KN is to map
input patterns of arbitrary dimension N onto a discrete two-dimensional
geometric arrangement (Figure 1). What distinguishes
Kohonen networks from others is a double-layered structure:
one layer for input and another for processing, where the map is
formed. The processing layer consists of a geometric arrangement of neurons connected only to their immediate neighbors.

H, Li, Ca*, Zn*, Y*, In*, Mg*, Ce*, Ga*, N*, Nb*, Sb, I, Fe, Na*, Sr*, La*, Hf, P*, Ta, Pt, K*, Cd*, Er, Pb, V*, Bi, Ni*, Cu*, Ba, Tl, Th, As*, F*, Cu*, Rb*, Hg, O*, Mo*, Cl*, Os, Ag*, Be, Si, S*, Te*, Mn*, Pd*, Cs, Ti*, Cr*, Br*, Ir, Au, Be, Al*, Sc*, Zr*, Sn*, Se*, Tc, U, Co, Ru*, Rh*, Ag*

The asterisk (*) denotes the 41 elements chosen for the training of the KN.

The objects to be grouped for subsequent segmentation (for
example, the chemical elements) are presented, one at a time, to
the input neurons. At each presentation, the stimuli generated by
the object (for example, atomic weight, atomic radius, density,
fusion temperature, etc.) are captured by the input layer and
transmitted equally to all the neurons of the map layer. In
the network, the neuron that responds most strongly to the
stimuli of the presented object wins it for itself. Furthermore, it
reinforces its links with its neighbors, making them more
sensitive to the characteristics of the captured object.
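The winner-takes-the-object step described above is simply a nearest-neighbor search over the map's weight vectors. A minimal sketch follows; the 2 × 2 map and its weight values are illustrative, not taken from the article.

```python
import math

def winning_neuron(weights, stimulus):
    """Return the (row, col) of the neuron whose weight vector is closest
    to the stimulus vector, measured by Euclidean distance."""
    best, best_dist = None, math.inf
    for row, row_weights in enumerate(weights):
        for col, w in enumerate(row_weights):
            dist = math.dist(w, stimulus)
            if dist < best_dist:
                best, best_dist = (row, col), dist
    return best

# A 2 x 2 map with 3-component weight vectors (illustrative values).
weights = [
    [[0.1, 0.2, 0.3], [0.9, 0.8, 0.7]],
    [[0.5, 0.5, 0.5], [0.0, 1.0, 0.0]],
]
# A stimulus close to the weights of the neuron at (0, 1):
print(winning_neuron(weights, [0.85, 0.8, 0.75]))  # (0, 1)
```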
The presentation of all input objects to the neural network, together with
the update of the weights for each item, is termed an epoch. In a
new epoch, when an object is presented to the map, the entire
sensitized region will react more intensely. However, as the
neighboring neurons differ from the winning neuron, each
will react more intensely to a slightly different object. With each
new presentation of an object to the map, the sensitivity profile of
the neurons changes. This is what is termed training the
network (Figure 2). These changes, however, become smaller
each time, so that the configuration of the map converges to a
stable arrangement. When this happens, the map has learned to
classify individuals.
The result of processing a trained network is that each neuron
becomes the owner of a number of objects (Figure 3) similar to
those captured by neighboring neurons. Thus, similar individuals
are placed near each other, forming a gradient of characteristics.
The KNs may be referred to as self-organized maps; they are an
example of unsupervised learning networks.
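The epoch-by-epoch training just described, with updates that shrink over time until the map stabilizes, can be sketched as below. The linear decay schedule, grid size, and parameters here are our assumptions for a toy illustration; the article's actual update rule is given in the section that follows.

```python
import math
import random

def train_som(objects, grid=4, dim=2, epochs=50, lr0=0.5, radius0=1.5, seed=0):
    """Train a toy square SOM: in each epoch, every object pulls its winning
    neuron (and, more weakly, that neuron's grid neighbors) toward itself."""
    rng = random.Random(seed)
    w = [[[rng.random() for _ in range(dim)] for _ in range(grid)]
         for _ in range(grid)]
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)          # learning rate decays each epoch
        radius = radius0 * (1 - epoch / epochs)  # so does the neighborhood radius
        for x in objects:
            # winner = neuron with the closest weight vector
            br, bc = min(((r, c) for r in range(grid) for c in range(grid)),
                         key=lambda rc: math.dist(w[rc[0]][rc[1]], x))
            for r in range(grid):
                for c in range(grid):
                    d = math.hypot(r - br, c - bc)
                    if d <= radius or (r, c) == (br, bc):
                        h = math.exp(-d)  # neighbors move less than the winner
                        w[r][c] = [wi + lr * h * (xi - wi)
                                   for wi, xi in zip(w[r][c], x)]
    return w

# Two well-separated clusters should end up owned by different cells.
clusters = [[0.05, 0.05], [0.1, 0.0], [0.9, 0.95], [1.0, 0.9]]
w = train_som(clusters)
```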

The learning process uses a set of known elements and their
properties to determine the optimal values for the connections
between neurons, represented by the weights, w. Mathematically,
the learning process of a KN may be described by

w_ij(k+1) = w_ij(k) + Δw_ij(k)

Δw_ij(k) = η e^(−σ d(l,j)) [ξ_i − w_ij(k)]

where η is called the learning rate, σ is the neighborhood factor (the
higher the value of σ, the less the neighborhood will be affected),
ξ_i represents the ith training property, w_ij are the weights to be
trained, l and j are indexes that characterize the cells, and d(l,j) is
the distance between cells l and j. The weights are initialized from
random values and are submitted to training. The process is
iterative; that is, the weights obtained in iteration k + 1 are
calculated from the values of iteration k, until the values w(k + 1)
and w(k) remain essentially unchanged. Each of these iterations is an epoch. It is important to note that the KN possesses
periodic boundary conditions. The values of ξ_i are normalized;
that is, upon entering the network, they become values between 0
and 1. This is done to ensure uniformity of the input data,
because various properties with different orders of magnitude
are used.

TRAINING AND PREDICTION

For the training of the KN, the following properties, known to
Mendeleev, were used: atomic weight, covalent radius,
atomic radius, melting point, and reaction with oxygen. After the
training, an investigation of the behavior of properties different
from those used in training was conducted. These properties
were the boiling point, atomic number, ionization potential,
electronegativity, and density. The KN was able to map the
features that were not part of the training. A list of the 69
elements studied in this work is given in Table 1. Among these,
41 chemical elements were randomly chosen for the training of
the networks, and the five properties previously identified were used.

For convenience, a KN of square architecture, whose sides
were composed of 8 neurons, was used. Training was conducted
over 5000 epochs for all tests. Through systematic variation of
the learning parameter (0.04 ≤ η ≤ 0.2) and of the neighborhood parameter (0.7 ≤ σ ≤ 1.5), it was found that the 8 × 8 network with
the highest number of cells filled with a single element was
obtained when σ = 1.4 and η = 0.09.
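The update rule transcribes directly into code. The sketch below assumes the periodic-boundary (toroidal) cell distance mentioned in the text, uses the best-performing η = 0.09 and σ = 1.4 as defaults, and adds a min-max normalization helper reflecting the scaling to [0, 1] described above; the function names are ours.

```python
import math

GRID = 8  # the article uses an 8 x 8 map

def torus_distance(l, j, grid=GRID):
    """Distance d(l, j) between cells l and j with periodic boundaries."""
    dr = min(abs(l[0] - j[0]), grid - abs(l[0] - j[0]))
    dc = min(abs(l[1] - j[1]), grid - abs(l[1] - j[1]))
    return math.hypot(dr, dc)

def update_weight(w_ij, xi_i, l, j, eta=0.09, sigma=1.4):
    """w_ij(k+1) = w_ij(k) + eta * exp(-sigma * d(l, j)) * (xi_i - w_ij(k)),
    where l is the winning cell and j is the cell being updated."""
    return w_ij + eta * math.exp(-sigma * torus_distance(l, j)) * (xi_i - w_ij)

def normalize(values):
    """Min-max scale one property so all inputs fall between 0 and 1."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

# The winning cell itself (d = 0) moves by the full learning rate:
w_new = update_weight(0.5, 1.0, (3, 3), (3, 3))  # 0.5 + 0.09 * 0.5 = 0.545
```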
Table 2. Map Found

In (4), La (1), Sr (2), Rb (3), K (3), Na (3), Mg (2), Sn (4), Ce (5), Y (1), Ca (2), Te (6), Zr (1), Sc (1), Al (4), P (6), N (6), Ag (1), Pd (1), Ru (1), Mo (1), V (1), Ti (1), O (6), F (6), Cr (1), S (6), Cl (6), Ni (1), Mn (1), Br (6), Cu (1), Ag (1), Zn (1), Ga (4), As (6), Se (6)
RESULTS

The KN after the training process is presented in Table 2. By
inspecting Table 2, it can be seen that the KN recognized and
grouped elements with high electronegativity: the elements
fluorine, chlorine, bromine, oxygen, and nitrogen occupy neighboring cells. The transition metals were also grouped: silver and
palladium; nickel and copper; manganese (Mn), chromium (Cr),
vanadium (V), and titanium (Ti). There were groupings of alkali
metals such as rubidium (Rb), potassium (K), and sodium (Na).
Another line group that formed was potassium (K), calcium
(Ca), and scandium (Sc). There was also a lineup of strontium
(Sr), yttrium (Y), and zirconium (Zr). From group 5A,
phosphorus (P) and nitrogen (N) were grouped.
Using the trained weights, the cells occupied by the elements
erbium (Er), platinum (Pt), gold (Au), and hydrogen (H) were
identified and added to Table 2. This result is shown in Table 3.
Note the proposed positions in the KN, placing Er and Ce
together, hydrogen in the same cell as fluorine, and platinum
together with gold. Compared to the current periodic table, it is
noted that erbium and cerium, which occupy the same cell,
are lanthanides. Platinum and gold, which are metals, are close to

The numbers in Table 2 refer to transition metals (1), alkaline earth metals (2),
alkali metals (3), other metals (4), lanthanides (5), and nonmetals (6).

Table 3. Map with Predictions Using the Trained Weights

In (4), La (1), Sr (2), Sn (4), Ce (5) + Er, Y (1), Rb (3), K (3), Na (3), Mg (2), Ca (2), Te (6), Ag (1), Pt + Au, Zr (1), Mo (1), Sc (1), Al (4), Ti (1), P (6), N (6), O (6), Pd (1), Ru (1), V (1), F + H, Cr (1), S (6), Cl (6), Ni (1), Mn (1), Br (6), Cu (1), Ag (1), Zn (1), Ga (4), As (6), Se (6)

Table 4. Element Properties, Actual Values, and Normalized Values Used for the Training

Each property is listed as Actual (Normalized).

Element | Atomic Weight/amu | Covalent Radius/Å | Atomic Radius/Å | Melting Point/K | Specific Heat/(J g−1 °C−1) | Reaction with O2
Nb | 92.91 (0.45) | 1.34 (0.55) | 2.08 (0.59) | 2740 (0.70) | 0.26 (0.11) | 2.5 (—)
Mo | 95.94 (0.46) | 1.30 (0.53) | 2.01 (0.57) | 2890 (0.73) | 0.25 (0.11) | — (0.74)
Cd | 112.41 (0.52) | 1.48 (0.61) | 1.71 (0.47) | 594.18 (0.23) | 0.23 (0.11) | 1.2 (0.23)
In | 114.82 (0.53) | 1.44 (0.60) | 2.00 (0.56) | 429.76 (0.19) | 0.23 (0.11) | 1.5 (0.36)
Cu | 63.546 (0.34) | 1.17 (0.48) | 1.57 (0.42) | 1357.6 (0.40) | 0.38 (0.12) | 0.5 × 10−4 (0.1)
Ag | 107.868 (0.51) | 1.34 (0.55) | 1.75 (0.48) | 1234 (0.37) | 0.235 (0.11) | 0.5 × 10−4 (0.1)
Rh | 102.9 (0.49) | 1.25 (0.51) | 1.83 (0.51) | 2236 (0.60) | 0.242 (0.11) | — (—)
Pd | 106.4 (0.50) | 1.28 (0.52) | 1.79 (0.50) | 1825 (0.50) | 0.24 (0.11) | — (—)


Table 5. Element Properties Not Used in Training

Element | Atomic Number | Ionization Potential/eV | Electronegativity | Boiling Point/K | Density/(g/cm3)
Cd | 48 | 8.993 | 1.69 | 1040 | 8.65
In | 49 | 5.786 | 1.78 | 2346 | 7.31
Cu | 29 | 7.726 | 1.9 | 2836 | 8.96
Ag | 47 | 7.576 | 1.93 | 2436 | 10.5
Rh | 45 | 7.46 | 2.28 | 3970 | 12.4
Pd | 46 | 8.34 | 2.2 | 3237 | 12
Nb | 41 | 6.88 | 1.6 | 5017 | 8.55
Mo | 42 | 7.099 | 2.16 | 4912 | 10.2

silver, ruthenium, and molybdenum, and hydrogen was predicted
next to fluorine, which is also a nonmetal.
The final results, starting from Table 2, show 33 cells
occupied by one element and 4 cells occupied by 2 elements.
(This table is not shown.) Cadmium (Cd) and indium (In),
copper (Cu) and silver (Ag), rhodium (Rh) and palladium
(Pd), and niobium (Nb) and molybdenum (Mo) occupy the same
cells. The properties used for training and the
elements that shared the same cell are presented in Table 4.
The pair niobium and molybdenum presents similar values for all the
trained properties. The pair cadmium
and indium has similar atomic weights, covalent radii, and
atomic radii, and differs by only 20% in melting point. A
similar situation occurs for the pair rhodium and palladium.
The pair copper and silver has different atomic weights, but the
other trained properties are similar. The network therefore
shows that the atomic weight is not the most important feature
for classifying the elements.
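These similarity claims can be checked numerically: treating each element's normalized trained properties as a vector, elements that share a cell lie much closer together than those that do not. The sketch below uses four normalized columns transcribed from Table 4; the dictionary layout and function name are ours.

```python
import math

# Normalized (atomic weight, covalent radius, atomic radius, melting point),
# transcribed from Table 4.
props = {
    "Nb": [0.45, 0.55, 0.59, 0.70],
    "Mo": [0.46, 0.53, 0.57, 0.73],
    "Cu": [0.34, 0.48, 0.42, 0.40],
    "Ag": [0.51, 0.55, 0.48, 0.37],
}

def distance(a, b):
    """Euclidean distance between two elements' normalized property vectors."""
    return math.dist(props[a], props[b])

# Nb and Mo (same cell) are far closer than Nb and Cu (different cells).
print(distance("Nb", "Mo") < distance("Nb", "Cu"))  # True
```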
Some properties not used in training are presented in Table 5.
The pair cadmium and indium has similar atomic numbers and
electronegativities. The pair copper and silver features different
atomic numbers, but the other untrained properties are similar.
The pair rhodium and palladium has different ionization potentials, but the other properties are similar. The pair niobium and
molybdenum has different densities, but the other untrained properties are similar.

REFERENCES

(1) Mendeleev, D. The Relation between the Properties and Atomic
Weights of the Elements. J. Russ. Chem. Soc. 1869, 1, 60-77.
(2) Mendeleev, D. Z. Chem. 1869, 12, 405.
(3) Moseley, H. G. J. The High Frequency Spectra of the Elements.
Phil. Mag. 1913, 26, 1024.
(4) Kohonen, T. Self-Organized Formation of Topologically Correct
Feature Maps. Biological Cybernetics 1982, 43, 59-69.
(5) Vander Heyden, Y.; Vankeerberghen, P.; Novic, M.; Zupan, J.;
Massart, D. L. The Application of Kohonen Neural Networks To Diagnose
Calibration Problems in Atomic Absorption Spectrometry. Talanta 2000,
51, 455-466.
(6) Tusar, M.; Zupan, J.; Gasteiger, J. J. Chem. Phys. 1992, 89, 1517.
(7) Favata, F.; Walker, R. Biological Cybernetics 1991, 64, 463.
(8) Lemes, M. R.; Pino, A. D., Jr. Quim. Nova 2002, 25, 539.
(9) Lemes, M. R.; Marim, L. R.; Pino, A. D., Jr. Phys. Rev. A 2002,
66, 23203.
(10) Zupan, J.; Gasteiger, J. Anal. Chim. Acta 1991, 1, 248.
(11) Zupan, J.; Gasteiger, J. Neural Networks for Chemists; VCH: New
York, 1993.
(12) Lambert, J. M. Proceedings of the 5th ICNN, 1991.
(13) Mhaskar, H. N.; Hahm, N. Neural Computation 1997, 9, 144.
(14) Suzuki, Y. Self-Organizing QRS-Wave Recognition in ECG Using
Neural Networks. IEEE Trans. Neural Networks 1995, 6, 1469-1477.
(15) Haykin, S.; Li, L. 16 kb/s Adaptive Differential PCM of Speech.
In Applications of Neural Networks to Telecommunications; Alspector, J.,
Goodman, R., Brown, T. X., Eds.; Lawrence Erlbaum Associates: Hillsdale,
NJ, 1993.
(16) McCulloch, W.; Pitts, W. A Logical Calculus of the Ideas
Immanent in Nervous Activity. Bull. Math. Biophys. 1943, 5, 115-133.
(17) Kohonen, T. Self-Organization and Associative Memory, 3rd ed.;
Springer-Verlag: Berlin, 1989.

CONCLUSIONS

Using information known at the time of Mendeleev, an
artificial intelligence system was tested for its ability to classify chemical
elements. The KN was able to map the chemical elements and to
organize them according to various trained as well as untrained
properties. The KN organized alkali metals, transition metals,
and even properties that were not present during training, for
instance, electronegativity. Using the 8 × 8 architecture, the
system was efficient and managed to map many different aspects
of the elements. However, some chemical elements occupied
the same cell because they had similar general properties.

AUTHOR INFORMATION
Corresponding Author

*E-mail: ruvlemes@terra.com.br.
