
Saint Jude Catholic School High School Department S.Y.

2013-2014

Chaotic Systems: The Sensitivity of Human Decisions on Wiener Entropy

In Partial Fulfillment of the Requirements in Physics IV 3rd Quarter

Submitted by #14 Tan, Matthew #17 Tiu, Steven #21 Ang, Princess #37 Sy, Nyra #40 Tan, Hana HS4C

Submitted to: Mr. Odi

Abstract

The study revolves around a particular dynamic system, the human brain, and its decision-making capabilities, which are of great importance to the advancement of man. Its main objective is to determine how sensitive human decisions are to the background audio present while those decisions are being made. Five grayscale images, each containing the same abstract shapes in different shades of gray, were presented to respondents (n = 30) while the ambient sound was recorded. Results showed that 68.08 percent of the respondents were influenced by the chaos carried within the sound waves. By combining concepts of dynamic systems, especially the human mind, with entropy measurement techniques, this study could well change the way society views noise pollution and the entropy around us.

Acknowledgements

The researchers would like to acknowledge their classmates and batchmates, most of whom were chosen to be the test subjects in the experiment proper. The enthusiasm the respondents displayed facilitated the progress of the study. The authors of the various resources consulted, integral sources of knowledge without which the research could not have advanced, also receive the utmost gratitude of the researchers. Finally, the researchers would like to acknowledge the help of Mr. Odi, who suggested the use of the Broken Windows Theory as our topic. Though the research itself is slightly tangential to the original proposal of Mr. Odi, without his insight this research would not have come to fruition.

Table of Contents

Title Page
Abstract
Acknowledgements
I. Introduction
   A. Background of the Study
   B. Objectives
   C. Significance of the Study
   D. Scope and Limitations
II. Review of Related Literature
III. Methodology
   A. Materials
   B. Methodology
   C. Flowchart
IV. Data and Analysis
   A. Image Analysis Program
   B. Interpretation of Image Entropy
   C. Sound Analysis Program
   D. Interpretation of Wiener Entropy
V. Results
VI. Conclusion
VII. Recommendations
VIII. Bibliography
IX. Appendices
   A. Entropy Formulae
   B. Statistical Formulae

I. Introduction

A. Background of the Study

Chaos, a word commonly used to describe a state of bedlam, is actually the manifestation of randomness in a system, which consequently results in the inability to predict future events. An example of such a system would be the weather; no forecaster can accurately predict conditions a year from now. The Butterfly Effect, proposed by meteorologist Edward Lorenz, aptly describes chaos within not only weather systems but also nature itself. It states that, theoretically, the flapping of the wings of a butterfly may cause the formation of a hurricane on a distant continent several weeks later. Though the example is somewhat improbable, the abundance of chaos within natural and artificial systems gave rise to a new field of study: chaos theory.

Another significant theory that is currently utilized in societal reforms is the Broken Windows Theory. Proposed by James Wilson and George Kelling, this criminological theory states that visible signs of disorder, such as broken windows or graffiti, can induce anti-social behavior in exposed citizens. The presence of these stimuli suggests government absence or anarchy; hence, citizens no longer feel obligated to conform to the law. This theory has been applied with apparent success in New York. In the 1990s, sky-high crime rates presented a serious problem to the police force of the Big Apple; however, after the city began replacing broken windows and recoating vandalized walls, the numbers of violent crimes and property crimes tumbled by 56 and 65 percent respectively over the span of a decade, somewhat restoring the idyllic reputation of the city.

To draw a parallel between the different theories above, an analysis of the decisions or behavior of people when exposed to systems with varying levels of entropy was conducted by the researchers.

B. Objectives

Our study aims to determine the extent to which the entropy of the surroundings affects the entropy of one's choices. Furthermore, other objectives include calculating the predictability of human choices given the ambient noise, and corroborating the validity of Wilson and Kelling's Broken Windows Theory through the gathering and analysis of empirical data. In addition, our research tackles alternative ways of measuring the entropy within a system, owing to the appalling lack of entropy-measuring devices.

C. Significance of the Study

The great debate between nature and nurture has yet to be resolved, yet even the advocates of the former cannot completely deny the latter's influence on the development of a human being. The personal experiences and the surroundings of a person play a vital role in his future decisions and mannerisms. Our experiment deals with the inclination of humans in disordered systems to make chaotic decisions, consciously or subconsciously. This could have considerable implications and applications in a myriad of real-world settings: from the tonal arrangement of music to the permutation of books in a library, from the interior design of various structures to the optimal screensavers for students. The endless streams of visual and audio stimuli flooding our brains may induce a personality change that goes unnoticed by most. Although these extremely minute details may seem insignificant, according to the Butterfly Effect the long-term effects may be as disastrous as the proverbial hurricane; hence, it is imperative that we take into consideration the disorder within a system before making decisions.

Other beneficiaries of such results would include various sectors of society as well. Psychologists can utilize the information gathered from the study to estimate the extent to which the entropy of the environment affects the subconscious actions of the individual, providing further grounds for future studies. Criminologists can also apply this concept, in conjunction with the Broken Windows Theory, to predict the actions or hypothesize the motives of criminals. Market analysts can use these concepts to estimate the most plausible probability distribution of sales during certain political situations. DJs, composers and music artists can take this concept into consideration when developing an album for an intended target market. In general, the repercussions of such research would affect not only those who seek knowledge, but also innumerable segments of society.

D. Scope and Limitations

Given the limited time interval allocated for the research, no more than approximately thirty trials will be executed, which increases the margin of error. Furthermore, high school students of Saint Jude Catholic School will generally be used as subjects to eliminate the need to search for participants from the wider community; a consequence, however, is a sample that is not randomly distributed relative to the population. Several variables, including but not limited to mood, personality, and taste, may also affect the results of the experiment; these cannot be controlled and are only partially mitigated by our choice of images. Computer analysis also has its limitations: notifications such as "warning: your version of GraphicsMagick limits images to 16 bits per pixel" pop up now and then. Lastly, although the processes conducted will be systematic, there is also an empirical element in this research, which may produce different results when recreated under similar yet slightly different conditions.

II. Review of Related Literature

Chaos

The big news about chaos is that the smallest of changes in a system can result in very large differences in that system's behavior. The so-called butterfly effect has become one of the most popular images of chaos. The idea is that the flapping of a butterfly's wings in Argentina could cause a tornado in Texas three weeks later. By contrast, in an identical copy of the world sans the Argentinean butterfly, no such storm would have arisen in Texas. The mathematical version of this property is known as sensitive dependence. However, it turns out that sensitive dependence is somewhat old news, so some of the implications flowing from it are perhaps not such big news after all. Still, chaos studies have highlighted these implications in fresh ways and led to thinking about other implications as well.

In addition to exhibiting sensitive dependence, chaotic systems possess two other properties: they are deterministic and nonlinear. This entry discusses systems exhibiting these three properties and what their philosophical implications might be for theories and theoretical understanding, confirmation, explanation, realism, determinism, free will and consciousness, and human and divine action.

Butterfly Effect

The butterfly effect is a term used in chaos theory to describe how small changes to a seemingly unrelated thing or condition (also known as an initial condition) can affect large, complex systems. The term comes from the suggestion that the flapping of a butterfly's wings in South America could affect the weather in Texas, meaning that the tiniest influence on one part of a system can have a huge effect on another part. Taken more broadly, the butterfly effect is a way of describing how, unless all factors can be accounted for, large systems like the weather remain impossible to predict with total accuracy because there are too many unknown variables to track.

The concept of the butterfly effect is attributed to Edward Norton Lorenz, a mathematician and meteorologist, who was one of the first proponents of chaos theory. Lorenz was running global climate models on his computer one day and, hoping to save himself some time, ran one model from the middle rather than the beginning. The two weather predictions, one based on the entire process, including initial conditions, and another based on a portion of the data, starting with the process already part way completed, diverged drastically. Lorenz, along with most scientists of his time, had expected the computer models to be identical regardless of where they started. Instead, tiny, unpredictable variations caused the two models to differ.

Intrigued by the results, Lorenz began developing a mathematical explanation that would show the sensitive dependence of large, complex systems like the weather. Sensitive dependence means that the long-term development of the system depends delicately on its initial conditions: arbitrarily small differences at the start can grow into large differences later on. To simplify his findings, Lorenz coined the butterfly metaphor that has since become so widely known.


Broken Window Theory

In the March 1982 issue of the Atlantic Monthly, political scientist James Wilson and criminologist George Kelling published an article under the title Broken Windows, in which they argued that policing in neighborhoods should be based on a clear understanding of the connection between order-maintenance and crime prevention. In their view the best way to fight crime was to fight the disorder that precedes it. They used the image of broken windows to explain how neighborhoods might decay into disorder and crime if no one attends to their maintenance: a broken factory window suggests to passers-by that no one is in charge or cares; in time a few more windows are broken by rock-throwing youths; passers-by begin to think that no one cares about the whole street; soon, only the young and criminal are prepared to use the street, which then attracts prostitution, drug-dealing, and the like; until, in due course, someone is murdered. In this way, small disorders lead to larger disorders, and eventually to serious crimes.

This analysis implies that if disorderly behaviors in public places (including all forms of petty vandalism, begging, vagrancy, and so forth) are controlled then a significant drop in serious crime will follow. Wilson and Kelling therefore argue in favor of community policing in neighborhoods. This means many more officers involved in foot-patrol (in Britain the philosophy of bobbies on the beat) and fewer involved in riding around in police cars merely following up 911 calls. In this way law enforcement comes to be seen as a technique for crime prevention rather than as a vehicle for reacting to crime.


Fourier Transformation

Fourier series provides an alternate way of representing data: instead of representing the signal amplitude as a function of time, we represent the signal by how much information is contained at different frequencies. The blinking lights on a stereo equalizer are simply Fourier analysis at work: the lights represent whether the music contains lots of bass or treble. Jean Baptiste Joseph Fourier, a French mathematician who once served as a scientific adviser to Napoleon, is credited with the discovery of the results that now bear his name. Fourier analysis is important in data acquisition just as it is in stereos. Just as you might want to boost the power of the bass on your stereo, you might want to filter out high-frequency noise from the nearby radio towers in Needham when you are conducting a lab experiment. Fourier analysis allows you to isolate certain frequency ranges.
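To make the frequency-domain picture concrete, here is a minimal GNU Octave sketch (the same software used later in this study) that builds a signal out of a low "bass" tone and a weaker high "treble" tone and locates the dominant frequency with fft; the sampling rate and tone frequencies are arbitrary illustrative choices, not values from the experiment:

fs = 8000;                                    % sampling rate in Hz (arbitrary choice)
t  = (0:fs-1)/fs;                             % one second of time samples
x  = sin(2*pi*100*t) + 0.5*sin(2*pi*2000*t);  % low "bass" tone plus a weaker high "treble" tone
X  = fft(x);                                  % represent the signal by frequency content instead of time
mag = abs(X(1:fs/2));                         % magnitude spectrum up to the Nyquist frequency
f = 0:fs/2 - 1;                               % frequency axis in Hz (1 Hz resolution for a 1 s record)
[mx, idx] = max(mag);                         % the largest peak sits at the 100 Hz component
printf("dominant frequency: %d Hz\n", f(idx));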


Power Spectrum

The power spectrum answers the simple question: how much of a given signal is at a frequency f? Periodic signals give peaks at a fundamental frequency and its harmonics; quasiperiodic signals give peaks at linear combinations of two or more irrationally related frequencies (often giving the appearance of a main sequence and sidebands); and chaotic dynamics give broad-band components to the spectrum. Indeed, this latter property may be used as a criterion for identifying the dynamics as chaotic. These are all statements about the ideal power spectrum, obtainable if infinitely long sequences of continuous data are available to process. In practice there are always limitations of restricted data length and sampling frequency, and it is important to investigate how these limitations affect the appearance of the power spectrum.
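As a rough illustration of that criterion, the short Octave sketch below compares the power spectrum of a periodic signal with that of a random one; the signals themselves are illustrative assumptions rather than data from this study:

N = 1024;
t = (0:N-1)/N;
periodic = sin(2*pi*50*t);              % periodic signal: power concentrated at one frequency
noisy    = randn(1, N);                 % random signal: power spread across all frequencies
Pp = abs(fft(periodic)).^2;             % power spectrum of the periodic signal
Pn = abs(fft(noisy)).^2;                % power spectrum of the random signal
% Summarize "sharp peak versus broad band" by the fraction of total power in the single largest component.
printf("largest-component share, periodic: %.2f\n", max(Pp)/sum(Pp));   % about 0.5 (the peak and its mirror image share the power)
printf("largest-component share, random:   %.4f\n", max(Pn)/sum(Pn));   % small: the power is spread out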


Wiener Entropy/Spectral Flatness/Tonality Coefficient

Abstract: Spectral flatness is a feature of acoustic signals that has been useful in many audio signal processing applications. The traditional definition of spectral flatness is the ratio of the geometric mean to the arithmetic mean of the magnitude spectrum of the signal, as obtained from the DFT. Presented is an analysis of this measure and the shortcomings. To overcome these shortcomings, a robust measure based on the concept of entropy is proposed. Although this entropy-based measure is derived from a completely different standpoint, it is closely related to the traditional measure, the relation of which is also derived here. Given the importance that spectral flatness is gaining in the field of audio signal processing, it is believed that an understanding of this measure and the development of a robust version as proposed is exigent.
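In symbols, the traditional definition quoted above can be written as follows, where $|X(k)|$ is the magnitude of the $k$-th component of an $N$-point DFT (the notation is ours, chosen to match the verbal description):

$$\text{SFM} = \frac{\left(\prod_{k=0}^{N-1} |X(k)|\right)^{1/N}}{\dfrac{1}{N}\displaystyle\sum_{k=0}^{N-1} |X(k)|}$$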

Abstract: This paper presents an investigation of spectral entropy features, used for voice activity detection, in the context of speech recognition. The entropy is a measure of disorganization and it can be used to measure the peakiness of a distribution. We compute the entropy features from the short-time Fourier transform spectrum, normalized as a PMF. The concept of entropy shows that the voiced regions of speech have lower entropy since there are clear formants. The flat distribution of silence or noise would induce high entropy values. In this paper, we investigate the use of the entropy as speech features for speech recognition purpose. We evaluate different sub-band spectral entropy features on the TI-DIGIT database. We have also explored the use of multi-band entropy features to create higher dimensional entropy features. Furthermore, we append the entropy features to baseline MFCC 0 and evaluate them in clean, additive babble noise and reverberant environments. The results show that entropy features improve the baseline performance and robustness in additive noise.
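A minimal Octave sketch of the idea described in this second abstract, assuming a single short-time frame of audio samples (here a random placeholder vector stands in for real speech):

frame = randn(256, 1);                 % placeholder frame; a real application would use a windowed speech segment
S = abs(fft(frame)).^2;                % short-time power spectrum of the frame
p = S / sum(S);                        % normalize the spectrum so it behaves like a PMF
p = p(p > 0);                          % drop empty bins so the logarithm is defined
H = -sum(p .* log2(p));                % spectral entropy: low for peaky (voiced) spectra, high for flat (noisy) ones
disp(H)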


III. Methodology

A. Materials

The researchers will utilize the following materials for the experiment: grayscale images of various abstract shapes, the GNU Octave 3.6.4 software, a t-test calculator, a printer, and a sound recorder.

B. Methodology

First, the creation of the grayscale images is required before the subsequent processes can commence. These grayscale images, rescaled to the same size, will contain the same shapes but in different shades of gray, to minimize biased decision making. Now, for the trial itself: randomly selected individuals will be asked to select an image from the array of five printed images. During the selection process, the sound recorder will be used to record the ambient noise. Next, utilizing the Octave 3.6.4 software, the researchers will determine the entropy content of each image. All the decisions of the respondents will be tabulated, and the recorded sound, in particular its Wiener entropy, will be analyzed through the same Octave software. The correlation between the background noise levels and the entropy of the image each respondent opted for will be demonstrated through graphs that will facilitate the presentation of the results.


C. Flowchart


IV. Data and Analysis

A. Image Analysis Program

The following is the GNU Octave program used to calculate the image entropy:

I = imread('image.jpg');            % read the grayscale image as a matrix of pixel shade values
H = histc(double(I(:)), 0:10:1000); % sort the shade values into bins of fixed width 10
s = sum(sum(H));                    % total number of pixels counted
J = H./s;                           % relative frequency (probability) of each bin
K = find(J);                        % indices of the bins that are actually occupied
L = J(K);                           % keep only the nonzero probabilities
E = 0;
for p = L'
  E = E - p.*log2(p);               % accumulate the Shannon entropy term by term
end
disp(E)

The first function, imread( ), converts the input image.jpg into a matrix of numbers denoting the shades of gray of the pixels. histc( ) then sorts these numbers into bins of fixed width (e.g. there is a bin for the values 1-10, another for 11-20, and so on up to 991-1000). H is then a matrix containing the number of pixels that fall into each bin. The sum(sum( )) adds up all the values H contains, assigning the sum to the variable s, and J is the matrix resulting from dividing H by the scalar s. Essentially, J contains the p(x)'s necessary for the calculation of the entropy (specifically, Shannon entropy). Given the probability $p(x_i)$ of each distinct event $x_i$, the entropy is given by $H(X) = -\sum_i p(x_i)\log_2 p(x_i)$. However, J may contain zeros

as not every bin must be occupied; these zeros then create an error, as the logarithm function cannot accept an input of 0. Hence, the find( ) function is used to obtain the indices of all nonzero elements within the matrix J. Inputting J(K) then outputs all nonzero elements of matrix J, and L is the resulting trimmed-down version of J. A for loop is then used in lieu of the summation, and for each element p within L, the term $-p\log_2 p$ is added to the initially zero E. Finally, the disp( ) function outputs the value of E after the loop has terminated: the value of the entropy.


B. Interpretation of Image Entropy

Shannon entropy, given by the formula $H(X) = -\sum_{i=1}^{n} p(x_i)\log_2 p(x_i)$, observes the following properties:

i. $H(X) \ge 0$, with equality if and only if a single event is certain; for $n$ equally likely events the entropy reaches $\log_2 n$, so there is no universal upper bound.

The proof of this claim is as follows. Simplifying $H(X)$ yields the expression $H(X) = -\log_2 \prod_i p(x_i)^{p(x_i)}$, and each factor is of the form $f(p) = p^p$ with derivative $f'(p) = p^p(\ln p + 1)$. With $f'(p)$ equal to zero if and only if $p = 1/e$ (a minimum of $f$), the maximum value of the function within the interval $[0, 1]$ must lie at an endpoint; in any case, both $0^0$ and $1^1$ yield a value of 1. Going back, each individual factor $p^p \le 1$ implies $H(X) \ge 0$, with equality if and only if, within the event probability set, one and only one $p(x_i) = 1$, leaving the rest 0. This establishes the lower bound of the inequality. On the other hand, the largest possible value for $n$ events is obtained by taking every $p(x_i) = 1/n$, which gives $H(X) = \log_2 n$. Taking the limit as $n$ approaches infinity, $H(X) \to \infty$.

ii. $-p\log_2 p - (1-p)\log_2(1-p) \ge 0 = H(1)$ for two complementary events X and 1-X: splitting a certain event into two complementary events never decreases the entropy.

Take the probabilities of events X and 1-X to be $p$ and $1-p$ respectively. The symmetry of the function $h(p) = -p\log_2 p - (1-p)\log_2(1-p)$ about $p = 1/2$ implies a maximum at $p = 1/2$ and minima at the endpoints $p = 0$ and $p = 1$, where $h(p) = 0$. Then $h(p) \ge 0 = H(1)$, and thus it is true for all such events X and 1-X. A similar proof can also be established to show that this holds not only for X, 1-X and 1, but for any three probabilities, two of which add up to the third: $-p_1\log_2 p_1 - p_2\log_2 p_2 \ge -(p_1 + p_2)\log_2(p_1 + p_2)$. The rationale behind this is that a high event probability leaves little room for uncertainty; hence, the low entropy. However, if this event is split into two events, the unpredictability increases and is maximized when the two events are equally likely (e.g. $p_1 = p_2$). Applying this property recursively also shows that more events yield higher and higher entropy.

These two properties are essential in understanding the grayscale image entropy. An image is considered void of entropy if and only if it is completely monochromatic: not even a single pixel is a different shade of gray, since this would yield a probability set of $\{1\}$. On the other hand, the more intermediary shades of gray there are within the image, the more events there are, and the more chaotic the image. A theoretical infinite-entropy image would be one consisting of equal amounts of infinitely many shades of gray.
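To make properties (i) and (ii) concrete, the sketch below applies the same binning-and-summing idea as the image analysis program to two synthetic images; the 8-bit shade values and the 50-by-50 image size are illustrative assumptions, not the images used in the experiment:

flat = 128 * ones(50, 50);                     % a completely monochromatic image: one shade only
halves = [zeros(50, 25), 255 * ones(50, 25)];  % half black, half white: two equally likely shades

function E = shade_entropy(I)
  H = histc(double(I(:)), 0:255);   % count how many pixels fall on each possible 8-bit shade
  p = H(H > 0) / numel(I);          % probabilities of the shades that actually occur
  E = -sum(p .* log2(p));           % Shannon entropy of the shade distribution
end

disp(shade_entropy(flat))     % 0: a single certain shade carries no uncertainty (property i)
disp(shade_entropy(halves))   % 1: two equally likely shades give one full bit (property ii)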


C. Sound Analysis Program

The following GNU Octave program was used to calculate the Wiener entropy of the audio files:

W = wavread('wavefile.wav');      % read the recorded audio samples into a column vector
F = fft(W);                       % Fast Fourier Transform of the signal
P = abs(F).^2;                    % power spectrum: squared magnitude of each frequency component
[nr, nc] = size(P);               % nr = number of spectral components (nc = 1 for a single channel)
A = sum(sum(P))/nr;               % arithmetic mean of the power spectrum
G = prod(prod(nthroot(P, nr)));   % geometric mean, taking the nth root first to avoid underflow
W = G/A;                          % Wiener entropy: geometric mean divided by arithmetic mean
disp(W)

wavread( ) is the audio equivalent of imread( ), and the subsequent fft( ) performs the Fast Fourier Transform, an algorithm that decomposes the sound wave into trigonometric components (with real and imaginary parts). Given a complex component $a + bi$, abs(F).^2 is the square of its modulus, namely $a^2 + b^2$. Next, size( ) returns the number of rows (nr) and number of columns (nc) of the inputted matrix, which are then used to get the arithmetic and geometric means of the power spectrum, as the formula for the Wiener entropy is

$$W = \frac{\sqrt[N]{\prod_{k=1}^{N} P_k}}{\dfrac{1}{N}\sum_{k=1}^{N} P_k},$$

where the $P_k$ are the entries of the power spectrum and $N$ is their number. Since P is a vertical matrix, nc is 1, so the total number of elements inside P is merely nr. The sum(sum( )) is used to get the summation of all elements of matrix P; division by nr then converts this summation into the arithmetic mean. The same is true for the geometric mean, except that nthroot(P, nr) is applied before prod(prod( )): the extremely small values of P grow toward 1 when the nth root is taken, thereby decreasing the chance of underflow errors caused by multiplying many infinitesimally small numbers. Finally, the quotient of the GM and AM is obtained.


D. Interpretation of Wiener Entropy

The behavior of the Wiener entropy depends entirely on the well-known AM-GM inequality, part of the longer QM-AM-GM-HM chain of inequalities, which states that for nonnegative numbers $x_1, x_2, \ldots, x_n$,

$$\frac{x_1 + x_2 + \cdots + x_n}{n} \ge \sqrt[n]{x_1 x_2 \cdots x_n},$$

with equality if and only if all the $x_i$ are equal.

Hence, the following properties:

i. $0 \le W \le 1$.

The Wiener entropy W is essentially the quotient of the geometric mean and the arithmetic mean, and by the AM-GM inequality, the upper bound of 1 is established. On the other hand, the power spectrum denotes how much of the sound wave is at a given frequency. With a minimum of zero, the power spectrum is nonnegative; consequently, so is the Wiener entropy, and the lower bound is verified as well.

ii. $W = 0$ or $W = 1$ if the sound is a pure tone or white noise, respectively.

Pure tone describes a sound with a single fixed, smooth frequency, with a graphical representation much like the sines and cosines drawn in trigonometry class. The power spectrum of a pure tone would then be a flat line along the x-axis with a single extremely high spike at the fixed frequency. White noise, on the other hand, is the complete opposite; it describes a sound with sporadic, unpredictable content in which every frequency appears in equal amounts. The power spectrum of white noise would then be a flat horizontal line lying above the x-axis.

Since the power spectrum of a pure tone contains multiple zero values and a single high value, the set of $P_k$ inevitably contains at least one zero, the exact condition which yields a geometric mean, and hence a Wiener entropy, of zero. On the other hand, the power spectrum of white noise has all its values equal, which yields equality of the arithmetic and geometric means and hence a Wiener entropy of 1.
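A minimal Octave sketch of these two limiting cases, using synthetic signals in place of the recorded audio from the experiment (the sampling rate and tone frequency are arbitrary choices):

fs = 8000; t = (0:fs-1)/fs;
tone  = sin(2*pi*440*t)';            % pure tone: all of the energy at a single frequency
noise = randn(fs, 1);                % white-noise-like signal: energy spread across frequencies

function W = wiener_entropy(x)
  P  = abs(fft(x)).^2;               % power spectrum of the signal
  nr = numel(P);
  A  = sum(P) / nr;                  % arithmetic mean of the power spectrum
  G  = prod(nthroot(P + eps, nr));   % geometric mean (eps guards against exactly zero bins)
  W  = G / A;                        % Wiener entropy: geometric mean over arithmetic mean
end

disp(wiener_entropy(tone))    % essentially 0: the spectrum is dominated by a single spike
disp(wiener_entropy(noise))   % well above 0 (roughly 0.5-0.6 for a finite random record); an ideally flat spectrum would give exactly 1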


V. Results

After the painstaking analysis of each set of data through Octave programming, the results were tabulated and are presented below.

Note that in order to emphasize the very small differences between the Wiener entropies of the audio samples, a logarithmic scale was used. Thus, the value displayed is $\log_{10} W$ instead of the original W. To convert such a logarithm back to the original value, one can raise ten to the value of the logarithm; for instance, a displayed value of -3.18 corresponds to $W = 10^{-3.18} \approx 6.6 \times 10^{-4}$.


Summary of results (images ordered from highest to lowest image entropy):

Image   Image Entropy   Mean Wiener Entropy (log10 W)   Standard Deviation
1       0.13352         -3.1802                         0.1127
4       0.048451        -3.2508                         0.0732
2       0.0371375       -3.2587                         0.0602
3       0.033517        -3.2814                         0.0972
5       0.014566        -3.5978                         0.1544

Wiener entropy logarithms (log10 W) recorded in the trials where each image was chosen:

Image 1 (entropy 0.13352):   -3.1538, -3.3944, -3.1576, -3.0565, -3.1482, -3.1708
Image 4 (entropy 0.048451):  -3.1567, -3.2184, -3.2695, -3.2110, -3.3065, -3.2209, -3.3960, -3.2275
Image 2 (entropy 0.0371375): -3.3062, -3.1592, -3.2538, -3.3060, -3.2683
Image 3 (entropy 0.033517):  -3.2627, -3.3529, -3.1817, -3.3307, -3.4015, -3.1590
Image 5 (entropy 0.014566):  -3.4983, -3.8295, -3.6625, -3.4355, -3.5633


Now, to analyze the predictability of human decisions, we use statistics. Let P(X | Y) denote the probability that a person selects image Y when the optimum image, given the Wiener entropy, is X. We will be using the confidence interval formula, as it can be used to find the probability of a value lying a certain amount above or below the arithmetic mean.

Figure 5.1. Two normal curves intersecting.

Take figure 5.1 above as an example. With the confidence interval formula, not only is it possible to ascertain the intersection point of the two curves, but also to calculate the area under the intersection, which is the probability itself. X = 1, 2, 3, 4, 5 denotes the image that was supposed to be chosen, i.e. the image that corresponds to the background audio entropy. Y = A, B, C, D, E denotes the image actually chosen by the respondent, with A equivalent to image 1, B to image 2, and so on and so forth. P(1 | B) will be used as an exemplar.

First of all, the confidence interval formula was used to determine the value of t at which the two curves take equal values. Afterwards, a t-test calculator was used to determine the value of the area P under the curve, which depends heavily on the degrees of freedom df, an expression equivalent to n-1. Two values were obtained, mainly because the two components each had a different value of df. However, their sum corresponds to the area under the intersection of the two curves, which is the probability of a respondent being torn between the two images. Since we assume that respondents falling under this shared region choose between the two images randomly (i.e., with probability 1/2 each), the value of P(1 | B) is 0.3368/2 = 0.1684. However, one must note that P(1 | C), P(1 | D) and P(1 | E) are also subsets of the curve and have distinct probabilities as well. Hence, by the principle of inclusion and exclusion, one can deduce the final value of P(1 | B): 0.1012.
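The researchers carried this out with a t-test calculator; purely as an illustration of the same idea, the sketch below approximates the two sampling distributions with normal curves, using the means and standard deviations tabulated in the Results section for images 1 and 2, and estimates their shared area numerically. This is a normal-approximation sketch, not the exact t-based computation used in the study, so its output will differ somewhat from the tabulated probabilities.

mu1 = -3.1802; s1 = 0.1127;          % image 1: mean and standard deviation of log10 Wiener entropy
mu2 = -3.2587; s2 = 0.0602;          % image 2: mean and standard deviation of log10 Wiener entropy
x  = linspace(-4, -2.5, 10000);      % a grid covering both curves
f1 = exp(-(x - mu1).^2 / (2*s1^2)) / (s1*sqrt(2*pi));   % normal density for image 1
f2 = exp(-(x - mu2).^2 / (2*s2^2)) / (s2*sqrt(2*pi));   % normal density for image 2
overlap = trapz(x, min(f1, f2));     % area shared by the two curves
printf("shared area: %.4f\n", overlap);
% Under the assumption that a respondent whose background noise falls in the
% shared region picks between the two images at random, half of this area is
% the chance of picking the other image of the pair.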

Likewise, the following values can be determined:

P(1 | B) = 0.1012    P(2 | A) = 0.1012
P(1 | C) = 0.0871    P(3 | A) = 0.0871
P(1 | D) = 0.0718    P(4 | A) = 0.0718
P(1 | E) = 0.0037    P(5 | A) = 0.0037
P(2 | C) = 0.2383    P(3 | B) = 0.2383
P(2 | D) = 0.3013    P(4 | B) = 0.3013
P(2 | E) = 0.0075    P(5 | B) = 0.0075
P(3 | D) = 0.1953    P(4 | C) = 0.1953
P(3 | E) = 0.0144    P(5 | C) = 0.0144
P(4 | E) = 0.0028    P(5 | D) = 0.0028

Furthermore, the following values can also be found within the given data:

P(1) = 0.3333    P(2) = 0.3    P(3) = 0.1    P(4) = 0.1333    P(5) = 0.1333

These values are obtained by counting the number of recorded Wiener entropy values that lie within the intervals whose boundaries are the arithmetic means of consecutive image means, divided by the total of 30 trials. Now, with all these values calculated, we can obtain the ones with actual consequence, i.e. P(1 | A), P(2 | B), ..., P(5 | E). The solution for calculating P(1 | A) is explained below.

Interpretation of the Calculation

Consider an imaginary setting where each person must choose a correct answer from among five different choices. The correct answer varies from person to person, but remains unknown to all until afterwards. The goal is to determine how many of those who were supposed to pick the first choice actually did. One must take into account those who were supposed to pick the first but chose the second, third, fourth or fifth option instead, which corresponds to the probabilities P(1 | B) through P(1 | E). One must also take into account those who were not supposed to pick the first choice, but did so instead, which involves the probabilities P(2 | A) through P(5 | A). Finally, the value of P(1 | A) is attained: 0.6629. Likewise, the other values are tabulated as follows:

P(1 | A) = 0.6629
P(2 | B) = 0.7042
P(3 | C) = 0.6459
P(4 | D) = 0.5896
P(5 | E) = 0.7909

Finally, given that X is the event that the chosen image corresponds to the background audio entropy level, the law of total probability gives

$$P(X) = P(1 \mid A)P(1) + P(2 \mid B)P(2) + P(3 \mid C)P(3) + P(4 \mid D)P(4) + P(5 \mid E)P(5) = 0.6808.$$
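The weighted sum above can be checked directly in Octave using the tabulated values; this merely re-does the arithmetic and introduces no new data:

P_correct = [0.6629, 0.7042, 0.6459, 0.5896, 0.7909];   % P(i | matching image), from the table above
P_class   = [0.3333, 0.3000, 0.1000, 0.1333, 0.1333];   % P(i): share of trials whose audio entropy fell in interval i
disp(sum(P_correct .* P_class))                         % prints approximately 0.6808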


VI. Conclusion

After conducting the experiment, analyzing the data, and interpreting the results, the researchers have come to the conclusion that, despite the chaos within the complex dynamic system that is our mind, decisions are in fact influenced by surrounding noise levels. Specifically, in our experiment, given the entropy of the array of images, one can hypothesize which image will be chosen based on the background audio entropy. The value of P(X) attained, 0.6808, embodies the predictability of human decisions: given that the background audio entropy falls within a fixed interval, 68.08% of the respondents would choose the corresponding image. This is a large difference from the 20% expected from simply choosing an image at random from a set of five. All these data substantiate the claim that increasingly erratic background noise may cause people to make increasingly chaotic decisions, a shift they register only subconsciously, if at all. On a different note, the researchers were able to fulfill their other objectives as well, such as devising a means of measuring entropy and gathering empirical evidence that corroborates the Broken Windows Theory.


VII. Recommendations

First and foremost, future researchers may opt to use computers with higher processing power. As the sampling rate, especially in the audio analysis, increases, so do both the accuracy of the Wiener entropy obtained and the memory the program must allocate; hence, more processing power may yield more accurate results. Furthermore, due to time constraints, the researchers were only able to conduct 30 trials, a number future researchers can increase to their liking; benefits of more trials include increased accuracy and precision. Lastly, future researchers may select different forms of entropy for analysis. Audio and visual inputs are not the only manifestations of entropy; fractals are potentially interesting objects to which entropy measures could be applied. Due to their ubiquity in nature, these pretty patterns may hide a deeper meaning behind their aesthetic façade that merits a closer examination. Lyapunov exponents, though hard to calculate, can also be used as indicators of the sensitivity of a dynamic system to its initial conditions. In general, there is a wealth of additional information future researchers can discover in the broad field of chaos theory.


VIII. Bibliography

Azad, K. (2012). An Interactive Guide To The Fourier Transform. Retrieved from http://betterexplained.com/articles/an-interactive-guide-to-the-fourier-transform/

Bishop, R. (2008). Chaos, The Stanford Encyclopedia of Philosophy (Fall 2009 Edition), Stanford, CA: The Metaphysics Research Lab. Retrieved from http://plato.stanford.edu/archives/fall2009/entries/chaos/

California Institute of Technology. Chaos Course Chapter 6. Retrieved from http://www.cmp.caltech.edu/~mcc/Chaos_Course/Lesson6/Power.pdf

Ellis-Christensen, T. (2013). What is the Butterfly Effect? Retrieved from http://www.wisegeek.org/what-is-the-butterfly-effect.htm

Madhu, N. (2009). Note on measures for spectral flatness. Electronics Letters, 45(23), 1195-1196. doi:10.1049/el.2009.1977

Marshall, G. (1998). Broken windows thesis. In A Dictionary of Sociology. Retrieved from http://www.encyclopedia.com

Storey, B. Computing fourier series and power spectrum through Matlab. Retrieved from http://faculty.olin.edu/bstorey/Notes/Fourier.pdf

Toh, A.M., Togneri, R., & Nordholm, S. Spectral Entropy as Speech Features for Speech Recognition. Retrieved from http://www.ee.uwa.edu.au/~roberto/research/theses/tr05-01.pdf


IX. Appendices

A. Entropy Formulae

Shannon Entropy
$$H(X) = -\sum_{i=1}^{n} p(x_i)\log_2 p(x_i)$$

Fourier Transform (discrete form, as computed by fft)
$$F(k) = \sum_{j=0}^{N-1} x(j)\, e^{-2\pi i\, jk/N}$$

Power Spectrum
$$P(k) = |F(k)|^2$$

Arithmetic Mean
$$\text{AM} = \frac{1}{n}\sum_{k=1}^{n} x_k$$

Geometric Mean
$$\text{GM} = \sqrt[n]{\prod_{k=1}^{n} x_k}$$

Wiener Entropy
$$W = \frac{\text{GM}(P)}{\text{AM}(P)}$$

AM-GM Inequality
$$\frac{x_1 + x_2 + \cdots + x_n}{n} \ge \sqrt[n]{x_1 x_2 \cdots x_n}$$


B. Statistical Formulae

Sample Standard Deviation
$$s = \sqrt{\frac{1}{n-1}\sum_{k=1}^{n}\left(x_k - \bar{x}\right)^2}$$

Confidence Interval
$$\bar{x} \pm t\,\frac{s}{\sqrt{n}}$$

Principle of Inclusion and Exclusion
$$P(A \cup B) = P(A) + P(B) - P(A \cap B)$$

Bayes' Theorem
$$P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}$$

