Definition 1. A point is that which has no part.
Definition 2. A line is breadth-less length.
Definition 3. The ends of a line are points.
Definition 4. A straight line is a line which lies evenly with the points on itself.
(Euclid's Elements, Book I)
Table of Contents
1. Summary
2. Introduction
3. Three types of straight lines
4. Three examples of use of y/x ratios and underlying linear law
   4.1 Profits-Revenues data for a company
   4.2 US Traffic Fatality data
   4.3 The Ohio Unemployment Puzzle
5. Concluding Remarks: Einstein's work function
6. Appendix I: The Olympic Long Jump Record, Work Function
7. Appendix II: Derivative for the Generalized Planck law
8. Appendix III: The US teen pregnancy problem
9. Appendix IV: Bibliography list of related articles
Figure 3 in Section 3 provides a graphical illustration of how the ratio y/x can either increase or decrease as x increases, even on a PERFECT straight line, if the straight line does NOT pass through the origin. Hence, using the ratio y/x to determine a rate can yield misleading conclusions. Everyone should know about this important mathematical property of a straight line, and all of its implications, before graduating from middle school.
1. Summary
Based on discussions commonly encountered in many articles, some written even by those holding advanced degrees, it appears that a very important mathematical property of a straight line, and its implications, have NOT been widely appreciated. Very briefly, the ratio y/x is a constant at all points on a straight line if and only if the straight line passes through the origin. More generally, if the straight line y = hx + c does NOT pass through the origin, the nonzero intercept c means that the ratio y/x = m = h + (c/x) will either increase or decrease as x increases, even on a PERFECT straight line. Hence, the common practice of using simple ratios (converted to a percentage), or so-called rates (a misleading term for a simple y/x ratio), to make comparisons can create a lot of confusion. Three examples of the use of such ratios, while overlooking the underlying linear law, are discussed here.

There are literally hundreds, even thousands, of such ratios, or rates, used in economics, business, finance, management sciences, and in the social and political sciences to quantify empirical observations: profit margin, earnings per share, labor productivity, unemployment rate, traffic fatality rate, teen pregnancy rate, cancer rates, medical error rates, suicide rates, and the list goes on.

The simple linear law y = hx + c = h(x − x0) can be compared to Einstein's photoelectric law. The nonzero intercept c, emphasized here, is exactly analogous to the work function W in Einstein's law, K = E − W = hf − W = h(f − f0). Here E = hf is the elementary energy quantum introduced by Planck when he laid the foundations of quantum physics, and K is the maximum kinetic energy of the electron produced when a photon (a particle of light with the energy E = hf) strikes the surface of a metal to eject an electron. However, in this interaction, some energy W, which Einstein refers to as the work function of the metal, must be given up.
The cut-off frequency f0 = W/h is exactly analogous to the cut-off value, x0 = −c/h, of the independent variable, or the stimulus function, x before a response y is observed. It appears that Einstein's idea of a work function (and Planck's ideas) can thus be generalized and extended well beyond physics to explain many other empirical observations where a simple linear law is often observed.
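The photoelectric relation above can be sketched numerically. A minimal Python illustration; the work function value of 2.30 eV (roughly that of sodium) is an illustrative assumption, not a value taken from this article:

```python
# Einstein's photoelectric law K = hf - W = h(f - f0), viewed as a linear
# law y = hx + c with c = -W and cutoff x0 = -c/h = W/h.

H = 6.626e-34          # Planck constant, J*s
EV = 1.602e-19         # joules per electronvolt

def kinetic_energy(f, W):
    """Max kinetic energy K = hf - W (zero below the cutoff frequency)."""
    return max(H * f - W, 0.0)

W = 2.30 * EV                        # assumed work function (illustrative)
f0 = W / H                           # cutoff frequency f0 = W/h
print(f"cutoff frequency f0 = {f0:.3e} Hz")
print(kinetic_energy(0.5 * f0, W))   # below cutoff: no electron, K = 0
print(kinetic_energy(2.0 * f0, W))   # above cutoff: K = h(f - f0) > 0
```

Below the cutoff the response is simply zero, which is the point of the analogy: no response y until the stimulus x exceeds x0.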
2. Introduction
As we all know, the straight line is the shortest distance between two points. Any two points on an x-y graph can always be joined by a straight line. Of course, we can join them using a fancy curve, or even a squiggle, but it is the straight line that we prefer over all else. What else is there to know about a straight line? Can anyone actually write an article about some little known mathematical property of a straight line? Well then, let us take a look at the graph in Figure 1.
[Figure 1 plot: Dependent variable, y (vertical axis), versus independent variable, x (horizontal axis).]
Figure 1: Three types of straight lines that do NOT pass through the origin.

This graph illustrates three types of straight lines. Actually, there is a fourth type of straight line, which can be treated as a special case of the three lines here. This fourth straight line is the line that we tend to implicitly assume when we make a
number of common calculations based on ratios or percentages to determine what we call rates. This is illustrated in Figure 2. What is the difference between these four straight lines?
[Figure 2 plot: Dependent variable, y, versus independent variable, x, for a line through the origin.]
Figure 2: The straight line y = mx passing through the origin. The ratio y/x = m = constant at all points along this line.
This is a very important property of a straight line and one that we take for granted. Does it apply to every straight line? NO. Actually, this property is ONLY exhibited by a straight line passing through the origin (0, 0).
[Figure 3a plot: The ratio y/x versus the independent variable, x, for three lines, labeled A, B, and C; curve C is annotated y = 0.5x − 2, y/x = 0.5 − (2/x).]
Figure 3a: The ratio y/x is NOT a constant and varies at different points on a straight line, if the straight line does NOT pass through the origin.

The ratio y/x is a constant at all points on a straight line only if it passes through the origin. If the straight line does not pass through the origin, the general equation of the line is y = hx + c. Hence, the ratio y/x = m = h + (c/x) can either increase or decrease as x increases, depending on the numerical values of the slope h and the intercept c. This gives rise to the three types of straight lines illustrated in Figure 1. The graphs in Figures 3a and 3b illustrate the varying nature of the ratio y/x as x increases, even on a PERFECT straight line. We consider three straight lines: y = 0.5x, with a zero intercept c = 0 (Figure 2); y = 0.5x − 2, the Type I line in Figure 1, with a negative intercept c = −2; and y = 0.5x + 2, the Type II line in Figure 1, with a positive intercept c = +2.

With reference to Figure 3a, the upper curve A is the graph of y/x = m = 0.5 + (2/x) for the case c = +2. The ratio y/x keeps decreasing as x increases and becomes equal to the slope h = 0.5 for very large values of x. The graph of y/x is a falling hyperbola. The horizontal line B is for the case of zero intercept: the ratio y/x = m = h = 0.5 = constant. The lower curve C is the graph of y/x = m = 0.5 − (2/x) for the case c = −2. The ratio y/x keeps increasing and becomes equal to the slope h = 0.5 for very large x. The graph of y/x is a rising hyperbola.
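The behavior of the three curves can be checked numerically. A short sketch using the three lines just listed:

```python
# The three lines of Figure 3a: A: y = 0.5x + 2 (c > 0), B: y = 0.5x
# (c = 0), C: y = 0.5x - 2 (c < 0).
def ratio(h, c, x):
    """y/x = h + c/x for the line y = hx + c."""
    return h + c / x

xs = [10, 100, 1000]
print([round(ratio(0.5,  2, x), 3) for x in xs])  # A falls toward h = 0.5
print([round(ratio(0.5,  0, x), 3) for x in xs])  # B is constant at 0.5
print([round(ratio(0.5, -2, x), 3) for x in xs])  # C rises toward h = 0.5
```

For every line, y/x approaches the slope h as x grows; only the line through the origin has y/x equal to h everywhere.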
[Figure 3b plot: The ratio y/x versus the independent variable, x.]
Figure 3b: Variation of the ratio y/x on the Type III straight line, y = −0.5x + 20. As x increases, the ratio y/x = −0.5 + (20/x) decreases continuously and becomes zero when x = 40. For x > 40, the ratio y/x becomes negative and approaches the limiting value of h = −0.5, the slope of the line, at very large values of x. The graph is a falling hyperbola. (Mathematically speaking, y/x → h as x → ∞, for all three cases. This is called the asymptotic value of the ratio y/x for very large x.)
Quite surprisingly, this rather important and fundamental mathematical property of a straight line (viz., the variation of y/x with a nonzero intercept, as opposed to the absolute constancy of the ratio y/x for the straight line passing through the origin) does NOT seem to have been widely appreciated, based on:

a) conversations I have had with many who have advanced degrees,
b) the review of several websites that discuss the basic mathematical properties of a straight line, and
c) several news items that have caught my attention over the years (a couple of examples will be provided shortly).

What are the implications of this fundamental property of a straight line?
4. Three examples of use of y/x ratios and the overlooked underlying Linear Law
We often use simple ratios y/x to make sense of many different (and very complex) situations such as financial performance of a company, the unemployment rate, the teenage pregnancy rate, traffic-related fatality rates, and so on.
Table 1: Profits-revenues data and year-to-year changes

Revenues, x   Profits, y   Margin, y/x   Change Δx   Change Δy   Slope Δy/Δx
60            6            0.10          --          --          --
80            16           0.20          20          10          0.5
120           36           0.30          40          20          0.5

[Figure 4a plot: Profits, y, versus revenues, x, with the three rays y = mx = 0.1x, y = mx = 0.2x, and y = mx = 0.3x joining each data point back to the origin.]
However, a careful examination of the graphs reveals that the three data points lie on a Type I line envisioned in Figure 1, with the equation y = 0.5x − 24. The slope h = 0.50 > 0 and the intercept c = −24 < 0. This means the profit margin, the ratio y/x = m = 0.5 − (24/x), will keep increasing as revenues x increase. The slope h is fixed by considering the changes in revenues and profits (see Table 1) between consecutive years: h = Δy/Δx = 10/20 = 20/40 = 0.50. Once h is fixed, the intercept c = (y − hx) is readily determined, since the Type I straight line passes through all three points and we know the values of h, x, and y. For example, c = 6 − (0.50 × 60) = −24, for the pair (60, 6). The same value is obtained with all three (x, y) pairs. Only two (x, y) pairs are needed to fix the slope h and intercept c of a straight line. In this case, the third pair also lies on the extension of exactly the same straight line.
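The slope-and-intercept calculation above can be expressed as a short sketch. The pair (60, 6) is given in the text; the other two pairs below are assumptions chosen to be consistent with the stated line y = 0.5x − 24 and margins of 0.1 to 0.3:

```python
# Recovering h, c, and the breakeven revenue x0 from (revenues, profits)
# pairs on a Type I line. Only (60, 6) is quoted in the article; the other
# two pairs are illustrative points on the same stated line y = 0.5x - 24.
points = [(60, 6), (80, 16), (120, 36)]   # (revenues x, profits y)

(x1, y1), (x2, y2) = points[0], points[-1]
h = (y2 - y1) / (x2 - x1)        # slope from the changes, dy/dx
c = y1 - h * x1                  # intercept from any single point
x0 = -c / h                      # breakeven revenue: y = 0 at x = x0

print(h, c, x0)                  # 0.5 -24.0 48.0
for x, y in points:
    print(x, y, round(y / x, 3))  # margin y/x rises with revenues (Type I)
```

Note that the slope comes from differences alone, so it is insensitive to the intercept; the intercept then follows from any one point on the line.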
[Figure 4b plot: The Type II line, y = hx + c = h(x − x0), with y = 0.1x + 2 = 0.1(x + 20), passing through the points (5, 2.5), (10, 3), and (20, 4). The rays y = mx = 0.5x, y = mx = 0.3x, and y = mx = 0.2x join the points back to the origin with decreasing slopes.]
The three rays, shown as dashed lines in Figure 4a, are of course imaginary. They serve merely as an aid to understanding. There are no actual data points for profits and revenues that fall along these rays. The company is actually operating along the Type I line.

What is the significance of the slope h, as opposed to the profit margin y/x = m? As seen already, the data reveals that when revenues increase by an amount Δx, the profits increase by an amount Δy = hΔx. The strict proportionality between Δx and Δy will be maintained as long as the profits and revenues data fall exactly (or approximately) along this line.

Notice also that the Type I line makes a finite positive intercept x0 = −c/h = 48 on the x-axis, or the revenues axis. The profits y go to zero when x = x0. For x < x0, there are no profits. Thus, x0 is the minimum, or cut-off, revenue that must be exceeded before the company can report a profit. Once x > x0, the additional revenues are converted into profits, which then increase at the fixed rate h, the slope of the Type I line.

Indeed, exactly the scenario described here is observed when we analyze the financial data for several companies, big and small. Apple, Google, and Microsoft are some examples of companies that reveal this Type I behavior. For such companies, both profits and profit margins increase with increasing revenues. All three types of behavior conceived in Figure 1, Type I, Type II, and Type III, are observed in the real world. In the Type II mode, profits increase as revenues increase but, unlike the Type I mode, profit margins decrease with increasing revenues, as illustrated by the three rays with decreasing slopes in Figure 4b. These points have been discussed in several recent articles posted on this website. The reader is referred to the bibliography list at the end of this article. Specifically, the significance of x0 and its relation to the classical breakeven model for the profitability of a company has been discussed.
Also, in the real world, the transitions from Type I to Type II and Type III behavior are to be viewed as local straight line segments of a smooth profits-revenues curve with a maximum point. The rising portion of the curve is the Type I segment. The falling portion of the curve is the Type III segment. These are often joined by the transitional Type II segment. Sometimes, as in the case of Microsoft, a company makes a transition from Type I to Type II and then back again to Type I behavior.
The implications of a maximum point on the profits-revenues graph (seen with several companies, Ford Motor Company, General Motors, Yahoo, Verizon Communications, Kroger, Southwest Airlines, Air Tran, RIM) should also be carefully considered. The appearance of the maximum point and the transition to Type III behavior often seems to precede a critical negative event for the company, such as a bankruptcy (General Motors) or a merger (Air Tran with Southwest). Ford Motor Company and Southwest Airlines are examples of companies that are now operating past their maximum points. It remains to be seen what the future holds for these two, thus far, highly successful companies. This brings us to our second example.
historically established negative trend (from 1976, after the NMSL went into effect) continued; see further discussion at:

1. http://www.scribd.com/doc/101982715/Does-Speed-Kill-Forgotten-USHighway-Deaths-in-1950s-and-1960s
2. http://www.scribd.com/doc/101983375/Effect-of-Speed-Limits-onFatalities-Texas-Proofing-of-Vehciles

The traffic fatality rate is the ratio y/x, where the numerator y is the number of traffic-related fatalities and the denominator x is the VMT (units of 100 million miles). The x-y graph can be shown to reveal a Type III straight line. The following linear regression equation, y = −5.601x + 56,075, where x is in billions of miles, can be deduced from the (x, y) data given for the three years 1998, 1996, and 1966. Because of the negative slope, as the VMT, x, increased between 1966 and 1998, the number of fatalities y decreased.
A graphical representation of this data, see Figure 5, reveals a Type III behavior. The slope of the two rays joining the (x, y) pairs back to the origin is defined as the fatality rate. The slope y/x = m has been decreasing year after year, since the data follows a Type III line with a negative slope h. The equation of the line joining the 1966 and 1998 data is y = −0.558x + 56,148, where x is in units of 100 million miles. A complete review of the historical data on traffic-related fatalities, going back to the earliest days of the automobile, may be found in Ref. [1].

Consider now the implications of this Type III behavior. If we extrapolate backwards to lower VMT, this leads us to the ridiculous conclusion that when VMT goes to zero the number of traffic-related fatalities will increase to the
maximum value given by the intercept c = 56,148 fatalities. This is clearly impossible and means that the Type III behavior is only applicable for a limited range of VMT. Indeed, at lower VMT, it can be shown that there is a transition to Type I (and also Type II) behavior. However, Type I behavior is NOT desirable in the problem being studied here since Type I behavior means increasing fatalities with increasing VMT. Indeed, this was observed historically in the US.
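The falling fatality rate along this Type III line can be checked numerically. A minimal sketch, assuming the line y = −0.558x + 56,148 quoted above; the two x values are back-computed from the rates 5.51 and 1.58 shown in Figure 5, so they are approximations, not official data:

```python
# Along a Type III line (negative slope, positive intercept), the ratio
# y/x falls steadily as x grows. Line from the text: y = -0.558x + 56148,
# x in units of 100 million vehicle-miles.
h, c = -0.558, 56148

def fatalities(x):
    return h * x + c

for x in (9253, 26262):          # roughly the 1966 and 1998 travel levels
    print(round(fatalities(x) / x, 2))   # 5.51, then 1.58
```

The same arithmetic shows why backward extrapolation fails: at x = 0 the line returns the intercept, 56,148 fatalities with no driving at all.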
[Figure 5 plot: Traffic fatalities, y, versus VMT, x, with the rays y/x = 5.51 (1966) and y/x = 1.58 (1998) joining the data points back to the origin.]
GM, and Chrysler. And a young Ralph Nader, who would become a leading consumer safety advocate, published his famous book Unsafe at Any Speed in November 1965, condemning the US auto industry for neglecting safety issues. All of this forced the US Congress to hold highly publicized hearings in 1966 to improve the safety of cars. The Highway Safety Act was signed into law by President Lyndon Johnson in September 1966. Nonetheless, highway deaths continued to increase between 1966 and 1973, and the transition from Type I to Type III (with a short intervening Type II) behavior actually occurred only after the NMSL went into effect on January 1, 1974. Suddenly, everyone, all over the US, even in Texas and Montana, had to slow down to 55 mph. Ultimately, those who were lobbying in favor of repealing the NMSL prevailed! The fatality rate y/x has continued to decrease and was down to 1.13 in 2009.

Will the carnage begin again with the repeal of the NMSL and the progressive upping of the speed limit by states like Texas? Technology has fundamentally changed and cars are safer. But when accidents do occur, especially at high speed, there is no escaping the laws of physics. Speed kills for a reason. The kinetic energy K of a vehicle moving at high speed must be absorbed in the crash: K = ½mv², where m is the mass of the vehicle and v its speed. Unfortunately, in a high-speed crash (K increases as the square of the speed), the occupant is forced to become the energy absorber. And highway fatality studies indicate that SEAT BELTS are still not being used by many drivers.

This brings us to our third example.
With a little reflection it becomes obvious that we are dealing with a Type II behavior. Indeed, this is confirmed by the unemployment data for several Ohio counties, especially the largest counties (with the highest number of unemployed), taken for the month of January 2000 (before the above WSJ article was written).

Table 3: Unemployment data for several large Ohio counties for January 2000

County        Labor force, x   Unemployed, y   Unemployment rate, y/x (%)
Wayne         58,600           2,300           3.92
Lorain        149,200          9,500           6.37
Lucas         225,300          11,300          5.02
Montgomery    277,000          12,400          4.48
Summit        277,800          14,100          5.08
Hamilton      424,000          17,600          4.15
Franklin      585,400          20,500          3.50
Cuyahoga      681,200          29,700          4.36

Source: http://ohiolmi.com/asp/laus/LAUS.asp

The data compiled in Table 3 was obtained from the website of the Bureau of Labor Statistics (see link given above). If we consider the data for Lorain, Summit, and Cuyahoga counties, we find the pattern mentioned in the 2001 WSJ article. Cuyahoga County had the lowest unemployment rate but the highest number of unemployed! Lorain County, with the lowest number of unemployed, had the highest unemployment rate. Amazing, indeed! In Ohio, the lower the unemployment rate, the higher the number of unemployed!

How can this data be rationalized? Indeed, with some reflection, it becomes obvious that this is a classic example of the Type II behavior envisioned in Figure 1 and mentioned also (Figure 4b) while discussing the profits-revenues data for a company. The idea of decreasing profit margins with increasing profits and revenues does NOT seem odd enough to merit serious discussion. However, when the same variables x and y are associated with the unemployment problem, the Type II behavior seems rather odd and worthy of serious discussion. The same pattern (of increasing numbers of unemployed with decreasing unemployment rates) is confirmed by the data plotted in Figure 6 for 19 Ohio counties, covering the entire range of x and y values; see also Figure 7.
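The puzzle can be verified directly from the Table 3 counts; a minimal sketch:

```python
# Unemployment rates recomputed from the Table 3 counts (x = labor force,
# y = unemployed) for January 2000. The highest-rate county (Lorain) has
# far fewer unemployed than the largest county (Cuyahoga).
counties = {
    "Wayne":      (58_600,  2_300),
    "Lorain":     (149_200, 9_500),
    "Lucas":      (225_300, 11_300),
    "Montgomery": (277_000, 12_400),
    "Summit":     (277_800, 14_100),
    "Hamilton":   (424_000, 17_600),
    "Franklin":   (585_400, 20_500),
    "Cuyahoga":   (681_200, 29_700),
}

rates = {name: 100 * y / x for name, (x, y) in counties.items()}
print(max(rates, key=rates.get))                     # Lorain
print(max(counties, key=lambda n: counties[n][1]))   # Cuyahoga
```

Recomputing the ratio y/x from the raw counts reproduces the published rates, so the paradox is entirely a property of the ratio, not of the data.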
[Figure 6 plot: Unemployed, y (thousands), versus labor force, x (thousands), for 19 Ohio counties.]
[Figure 7 plot: Unemployed, y, versus the unemployment rate, y/x (%), for the Ohio counties, with Lorain, Franklin, and Cuyahoga labeled.]
only 4.6%. This is a clear size effect. The unemployment law is y = hx + c, where the numerical value of h should be fixed, as we have done here, by considering the highest unemployment levels. This same argument was also used to determine the slope h from the US, Canadian, and Japanese unemployment data. Indeed, it appears that a single universal value of h = 0.0956, deduced by considering the highest and lowest unemployment levels over a few decades, can be used to describe the data for the three countries. In other words, the various transitions from Type I to Type II or Type III observed in all three examples discussed here are a reflection of the size effect associated with larger and larger values of the independent (or stimulus) variable x. This stimulus, or independent variable, is revenues when we consider the finances of a company, the VMT when we consider traffic fatality data, and the labor force when we consider unemployment statistics. This size effect is also illustrated in Figure 7. As the unemployment rate y/x decreases, the number of unemployed y increases because of the Type II behavior revealed in Figure 6.
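The size effect can be illustrated with the quoted slope h = 0.0956. The intercept c = 500 below is purely an assumption (the article does not give one for this data), chosen positive so that the line is Type II:

```python
# Size effect on a Type II unemployment line y = hx + c: as the labor
# force x grows, the number of unemployed y rises while the rate y/x
# falls. h is the slope quoted in the text; c is illustrative.
h, c = 0.0956, 500.0

prev_rate = None
for x in (50_000, 200_000, 800_000):     # growing labor force
    y = h * x + c                        # number of unemployed
    rate = 100 * y / x                   # unemployment rate, percent
    assert prev_rate is None or rate < prev_rate  # rate falls as y grows
    prev_rate = rate
    print(round(y), round(rate, 2))
```

Any positive intercept produces the same qualitative pattern; only the speed at which the rate approaches h depends on c.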
5. Concluding Remarks
The noted political columnist Jeff Greenfield made an interesting remark recently. He writes, "I got into writing and thinking about politics because I was told there would be no math. Boy, was I misled. It's not just the torrent of polls that we have to deal with, but the numbers that supposedly forecast Presidential elections with uncanny accuracy."

"Add it up: The prediction models look dismal for Obama. Can he still win?" Yahoo! News, Tue, Jul 31, 2012.
http://news.yahoo.com/add-it-up--the-prediction-models-lookdismal-for-obama--can-he-still-win-.html
Welcome to the real world of data analysis! All of the confusion with numbers, ratios, rates, and percentages is due to the fact that somewhere along the way we forgot the significance of the important mathematical property of a straight line that has been discussed here. There are at least three
types of straight lines that we encounter in the real world, when we analyze the large volumes of (x, y) data being compiled almost daily in a variety of economic, social, political, business, and finance related problems.

The three types of straight lines, and the maximum point (in the profits-revenues data and also in the traffic fatality data), lead us to the (grand??) conception of a (bold!) generalization of the ideas of Planck and Einstein, well beyond physics. I am referring here to the fundamental idea of an elementary energy quantum E proposed by Planck to derive his blackbody radiation law. (E = hf, where f is the frequency of light and h is the Planck constant.) This, in turn, led Einstein to the conclusion that light can be viewed as a stream of particles, each having Planck's energy quantum. (Einstein considers a property called the entropy of light to arrive at the particle conception of light. A simplified version of Planck's radiation law is used in these mathematical deliberations.)

More importantly, Einstein also introduced the far-reaching idea of a work function W in his photoelectric law, K = E − W = hf − W = h(f − f0). Here K is the maximum kinetic energy of the electron that is produced when photons, with energy E, strike the surface of a metal. Some of the energy must be given up and does not appear as the energy K. The energy that must be given up is called W, an unknown quantity that must be determined experimentally for each new metal. Einstein's law is also a linear law. The K-f relation is linear and is exactly analogous to y = hx + c = h(x − x0). The nonzero intercept c, which has been emphasized here and leads us to the three types of straight lines, is really a generalization of Einstein's work function W, well beyond physics, to many problems, three of which have been discussed here.

How do we resolve the Ohio unemployment puzzle? Let us consider just the (x, y) data for three of the larger Ohio counties: Lorain, Summit, and Cuyahoga.
This is plotted again in Figure 8. The three data points lie on rays with slope y/x = m = unemployment rate, joining each individual point back to the origin. From the changing slopes it is clear that as the unemployment rate y/x decreases, the number of unemployed y increases. What is unique about Ohio that makes the number
of unemployed y increase as the unemployment rate y/x goes down? How do we rationalize this empirical finding?
[Figure 8 plot: Unemployed, y (thousands), versus labor force, x (thousands), for Lorain, Summit, and Cuyahoga counties. Annotations note the counterintuitive observations: highest unemployment rate with the lowest number of unemployed (Lorain); lowest unemployment rate with the highest number of unemployed (Cuyahoga).]
Indeed, the labor force is akin to the energy of the photon. It represents the human energy, or the human potential, within the economy. Under the right conditions, as when E < W in Einstein's law, there will be no unemployment (x < x0). Once the labor force exceeds this cut-off, just like the appearance of an electron, there will be some unemployed. The labor force, by definition, is the sum of the employed plus the unemployed, just as E = K + W in Einstein's law. The analogy being drawn here with Einstein's photoelectric law, and the idea of a work function, thus seems very fitting.

When we compile empirical observations on profits and revenues, for example, why does the profit margin increase with increasing profits for Company A and decrease with increasing profits for Company B? Or, when we compile empirical observations on the unemployment data, why does the number of unemployed increase with decreasing unemployment rates, or vice versa?

Or, imagine making observations on a moving vehicle. We determine its speed v as a function of time t. Vehicle A is seen to be accelerating. Vehicle B is decelerating. Vehicle C is moving at a constant speed. This does not surprise us. We know the reason why: it has to do with the idea of a force acting on the vehicle. Likewise, do we really understand why Company A reports increasing profits and profit margins while Company B reports decreasing profits and profit margins? And so on for the other problems discussed here, and many more that have not been discussed, where we must deal with tables of x and y values.

Further discussion of some of these ideas may be found in the articles cited in the bibliography list. The articles on Microsoft, Google, Apple, and Kia provide a good summary. The unemployment problem, the teen pregnancy problem, and the traffic fatality problem provide additional examples.
http://www.dailymail.co.uk/news/article-2183904/Breathtaking-photo-showsmoon-forming-sixth-ring-Olympic-display-Londons-Tower-Bridge.html

Olympians seem to be running and swimming faster, and throwing further, than their predecessors, but when it comes to jumping (both the men and the women) they seem to be regressing. Great Britain's Greg Rutherford won the gold with the shortest jump in 40 years, lamented Daniel Lametti in Slate magazine (see link below), citing the stats from 1968, 1988, and 2008. After the Olympic record set by
Bob Beamon in 1968 (at 8.90 meters), only once has this mark been exceeded (by Mike Powell, at Tokyo in 1991, with 8.95 meters). If we extrapolate forward using the negative trend, the Olympic long jump gold may soon be for the taking at 8 meters or less, by 2028 or 2032; see Figure 9.
http://www.slate.com/blogs/five_ring_circus/2012/08/03/long_jump_olympics_why_do_the_best_long_jumpers_in_the_world_seem_to_be_jumping_shorter_distances_.html

http://en.wikipedia.org/wiki/Athletics_at_the_2004_Summer_Olympics_%E2%80%93_Men%27s_long_jump

This discussion of the Olympic long jump records is being included here for two interesting reasons. First, as with the traffic fatality data, we see the emergence of a classic Type III behavior. With a little Internet research, the Olympic long jump records for the other intervening years, not included in the Slate article, can be shown to confirm the negative trend. The American athletic hero Carl Lewis, who won this event in 1984, 1988, 1992, and 1996, won the gold in 1996 with an 8.50 meter jump, 22 cm less than his own gold winning jump of 8.72 m in 1988. The gold mark has thus clearly been lowered in this event in recent years. The data for all of the Olympic gold winning jumps, going back to 1896, may be found in the Wikipedia article. Only the recent trends, going back to 1956, preceding and following the record jumps by
Bob Beamon (1968) and Mike Powell (1991), are considered here in Figure 9. It is of interest to note that the slope h = −0.01429, if we use the winning data for 1968 and 1996, is virtually the same as h = −0.01425 for the 1968 and 2008 data. However, as discussed in the context of the traffic fatality data, the appearance of a Type III trend usually signifies the existence of an earlier Type I (or Type II) behavior. We cannot extrapolate the Type III trends backwards indefinitely, or even forwards. The Type III equation, D = −0.014t + 37.01 (deduced from the 1968 and 2008 data), implies that if we extrapolate to earlier years, at time t = 0, the gold medal winning jump distance D would be a ridiculously high 37 m. Or, in 2592, anyone can show up to claim the gold, since the winning jump distance D = 0 in that landmark year!
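The quoted slope and intercept can be recomputed from the winning marks given in the text (8.90 m in 1968, 8.50 m in 1996); a minimal sketch:

```python
# Two-point Type III trend line D = ht + c for the long jump, from the
# winning marks quoted in the text: 8.90 m (1968, Beamon) and
# 8.50 m (1996, Lewis).
t1, D1 = 1968, 8.90
t2, D2 = 1996, 8.50

h = (D2 - D1) / (t2 - t1)     # about -0.01429 m per year
c = D1 - h * t1               # about 37.01 m at t = 0

print(round(h, 5), round(c, 2))
print(round(c / -h))          # year where the trend line hits D = 0 (~2591)
```

Small differences in how the slope is rounded shift the zero-crossing year by a year or two, which is why such extrapolations should not be taken literally.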
[Figure 9 plot: Winning jump distance, D (meters), versus year, 1952-2032, showing the Type I trend through the earlier data and the recent Type III trend.]
This Type I equation was deduced using the 1956 and the record 1968 data. A smaller Type I slope can be deduced using the 1956 and 1991 data. The negative intercept of the Type I trend means that the ratio D/t = 0.089 − (166.6/t) was increasing with each succeeding year. In other words, Olympians were indeed making the effort to beat the records held by their predecessors. Why then the recent Type III trend?

This brings us to the second reason why this data is being highlighted here. As discussed in the Slate magazine article, the reason Olympians are NOT jumping as long may be the lack of lucrative post-Olympic monetary rewards. The long jump is not in the same league as other athletic events. The key to being a successful long jumper (running long jump, as opposed to standing) is to have world class speed (to gain momentum before jumping), but the athlete can make more money being a world class 100-meter runner than training for the long jump. Usain Bolt, whose 100-meter race is eagerly awaited as of this writing (on August 5, 2012), signed a lucrative three-year contract with Puma, rumored at $32 million. Being the world's fastest man apparently has greater commercial value than being the world's longest jumper! And so, it is argued that Olympians are just NOT making the effort, in other words working hard, to improve the record held by their predecessors!

Work done, effort made: this is exactly what we mean by the work function W, or the nonzero intercept c in the law y = hx + c. The transition from Type I to Type III behavior that we see in the Olympic long jump records (which incidentally implies the existence of a maximum point on this graph) is a manifestation of the nonzero intercept c, or the generalization of Einstein's idea of a work function W, well beyond physics. As noted earlier (see also the discussion in Refs.
[5, 8] cited in the bibliography), Einstein uses a simplified version of Max Planck's radiation law, which can be written in its most generalized form as:

y = [ mx^n e^(−ax) / (1 + be^(−ax)) ] + c    (1)
This is a power-exponential law, with the power-law term x^n multiplying the exponential term e^(−ax). Hence, the x-y graph reveals a maximum point. In Planck's law, b = −1 and c = 0, i.e., the intercept is taken to be zero. Einstein uses the simplified version (with b = 0, c = 0), y = mx^n e^(−ax), which also reveals a maximum point (the maximum point occurs when n = ax, or x = n/a), to deliberate on the property called the entropy of light. Indeed, entropy is the starting point of Planck's discussion in developing quantum physics. The reader is referred to the references cited. Of interest to us here is the following expression for entropy S, which is the very first step taken by Planck in his history-making December 1900 paper. Planck writes (following Boltzmann's statistical arguments about the entropy of a system of N particles):

S = k ln Ω + unknown constant    (2)
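Setting the derivative of the simplified law y = mx^n e^(−ax) to zero gives the maximum at x = n/a. A quick numerical check, using the illustrative values n = 3 and a = 0.5 (not values from the article):

```python
# Numerical check that y = m * x**n * exp(-a*x) peaks at x = n/a.
import math

def y(x, m=1.0, n=3, a=0.5):
    return m * x**n * math.exp(-a * x)

xs = [i / 1000 for i in range(1, 20001)]   # grid over (0, 20]
x_peak = max(xs, key=y)
print(round(x_peak, 2))                    # 6.0 = n/a
```

The scale factor m does not move the peak; only the competition between the rising power term and the falling exponential term does.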
Planck was interested in the problem of how a fixed total energy U_N = NU can be distributed among N particles (which he envisioned as being oscillators: charged particles which vibrate about a fixed position, radiating electromagnetic energy in the process). The expression for the average energy U derived by Planck marks the beginning of quantum physics. There are many different ways in which a fixed total energy can be distributed between N particles. This gives rise to the entropy S, which is a measure of the extent of disorder, or chaos, in the system. The parameter Ω in equation 2 above is the number of ways and can be determined using the laws of permutations and combinations. This involves factorials of large numbers. Hence, instead of a linear law, we now have a logarithmic relation between S and Ω. The proportionality constant in this relation is k, which Planck refers to as the Boltzmann constant, in honor of Ludwig Boltzmann, who spent all of his professional life developing the field that we now call statistical mechanics. In fact, we find the above entropy equation carved on Boltzmann's tombstone. (Sadly, Boltzmann's ideas were not widely appreciated by his peers. He suffered from bouts of severe depression and ultimately committed suicide, just before he was about to be vindicated, such as by Planck's use of the above entropy equation to develop quantum physics.)
Notice how Planck is careful to introduce an unknown constant into equation 2. This is the nonzero intercept made by the S-Ω graph. We can rewrite this as S = k ln Ω + S0. When Ω = 1, i.e., when there is only one way to distribute the energy (as when there is only one particle, or when only one particle has all the energy), the natural logarithm ln Ω = 0 and the entropy S = S0. What is S0? This is a question that was later settled by physicists by formulating a new law of thermodynamics, the Third law, which states that the entropy of a PERFECT crystal, at the Absolute Zero temperature, will be exactly equal to ZERO. This is NOT a proof. It is more like a postulate. Planck recognizes the importance of the nonzero intercept S0 when he takes the first steps to develop quantum physics. Likewise, Einstein recognizes the importance of the nonzero intercept in the photoelectric law K = E - W = hf - W = h(f - f0). The cut-off frequency f0 = W/h observed by experimental researchers before Einstein cannot be explained if the work function W is zero. The cut-off frequency is actually a manifestation of the nonzero intercept, or the work function W. In Einstein's law, W represents the work that must be done to overcome the forces that bind the electron within the metal. This work, or energy given up to produce the electron, cannot be calculated a priori and will depend on the metal. Einstein calls it W, and it must be deduced for each metal experimentally. The purpose of the discussion here is to highlight the importance of the nonzero intercept in the real world, using the Type III behavior observed in the Olympic long jump record as an interesting example. There is a maximum point on this graph. It is the effort, or the work that must be done by the Olympian, that is subtly manifested in the nonzero intercept, and hence also in the maximum point, since Type III must give way to Type I at earlier times.
Like Planck and Einstein, we must recognize the importance of this nonzero intercept whenever we analyze (x, y) data, as discussed here. We make observations and use numbers to quantify them. One of the variables, x, is usually taken as the independent variable, or the stimulus function. This gives
rise to the second observation, the dependent variable y, or the response function. The most general relationship between x and y is y = hx + c, not y = mx. This nonzero intercept also affects the unemployment problem (one that engages our attention because of the severe jobs crisis now faced in the USA) and the contentious problem of labor productivity:

Labor productivity = y/x = Number of units produced / Number of labor hours

Is there a nonzero intercept c that affects labor productivity? The potential existence of a nonzero intercept c means we must be careful when we use the ratio y/x = m to draw conclusions and formulate policies (as is done routinely by management using labor productivity data for various manufacturing plants, or per-store statistics in the retail industry to decide which stores to close). The ratio y/x = m = h + (c/x) does not tell us anything about the rate of change of y as x increases or decreases. The slope h is the rate of change, and h = m if and only if the intercept c = 0. If not, we must be careful to consider what may be called the size effect: the dependence of the ratio y/x on the value of x.

The implications of the nonzero c have been discussed for the unemployment problem, for the profits-revenues problem, for the traffic-fatality problem, and for the teenage pregnancy problem (see Ref. [26]). The nonzero c is Einstein's work function outside physics. Planck's idea about entropy and the radiation law, generalized as equation 1, can also be applied well beyond physics. We have just found a maximum point in the most unlikely of places, the Olympic long jump record, this morning, August 5, 2012! Quantum physics was conceived to explain the appearance of such a maximum point on the radiation curve for a heated body. Einstein's law, and the expression relating the average entropy S and the average energy U derived by Planck, can be generalized and applied beyond physics.
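The size effect is easy to demonstrate numerically. A minimal Python sketch (the slope h and intercept c below are hypothetical values chosen for illustration) shows the ratio y/x = h + c/x drifting toward h as x grows, even though every point lies on a perfect straight line:

```python
# Size effect on y = hx + c: the ratio y/x changes with x even though
# the slope h is fixed. The values of h and c are hypothetical.
h, c = 0.5, 10.0

def y(x):
    return h * x + c

for x in (10, 100, 1000):
    print(x, y(x) / x)   # 1.5, 0.6, 0.51 -- falling toward h = 0.5
```

A small plant and a large plant operating on the same line y = hx + c would thus report different "productivities" y/x, with neither ratio equal to the true rate of change h.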
Finally, Einstein's photoelectric law K = E - W = hf - W implies that the K-f graph is a series of parallel lines if we perform experiments with different metals, each having its own work function W. Examples of such movement along parallels can be found
in financial data (e.g., the articles on Microsoft, Refs. [19,20], and on Kia, Ref. [17]). We see a similar movement along essentially parallel lines when we consider all of the earlier Olympic long jump records, going back to 1896. This is illustrated in Figure 10. The historical data seem to segregate along three parallel Type I lines.
[Figure 10: The Olympic long jump record (in meters, from 6.00 to 9.50) plotted against year (1860 to 2060). The historical data segregate along three parallel Type I lines, labeled A, B, and C.]
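The "movement along parallels" implied by K = hf - W can also be sketched numerically. In the snippet below the Planck constant h is the real physical value, but the two work-function values are hypothetical placeholders: each metal yields a K-f line with the same slope h and its own cut-off frequency f0 = W/h.

```python
# Sketch of "movement along parallels": K = hf - W for several metals.
# The work-function values below are HYPOTHETICAL, for illustration only.
h = 6.626e-34                                        # Planck constant, J*s
metals = {"metal A": 3.2e-19, "metal B": 4.0e-19}    # W in joules (assumed)

def K(f, W):
    """Maximum kinetic energy of the ejected electron (photoelectric law)."""
    return h * f - W

for name, W in metals.items():
    f0 = W / h          # cut-off frequency: K = 0 at f = f0
    print(name, f"f0 = {f0:.3e} Hz")
    # For f > f0, the K-f lines of the two metals share the slope h and
    # differ only by the constant offset W, i.e., they are parallel lines.
```

The vertical separation between any two of these lines is just the difference in work functions, independent of f, which is exactly what "parallel" means here.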
This is a power-exponential law. Hence, the x-y graph exhibits a maximum point at a finite value of x: the derivative dy/dx > 0 for small values of x, up to the maximum point x = xm, and dy/dx < 0 for larger values. For the moment, let the nonzero intercept c = 0. The expression for the derivative dy/dx can be readily deduced for various special cases, such as b = 0 (the simplified radiation law used by Einstein, also called Wien's law), a = 0 and b = 0 (the power law, also called the Rayleigh-Jeans law), and a = 0, b = 0, n = 1 (the simple linear law). For a = 0, b = 0, c = 0, y = mx^n and there is no maximum point. In his December 1900 paper, Planck derives the expression for the average energy U of N oscillators (by invoking the statistical arguments of Boltzmann) and thus provides a theoretical justification for the expression [e^(-ax)/(1 + b e^(-ax))], which appears within the square brackets in equation 1. It is indeed a simple exercise to derive the expression for dy/dx for the most general case. One only needs to apply the rule for the derivative of a product of several simple functions. Thus,

dy/dx = [(y - c)/x][n - ax + ax(1 - g)]   (3)

where g = 1/(1 + b e^(-ax))   (4)
The function g is defined for convenience of differentiation and appears in the denominator of the expression for y. Now let us consider the special cases. For b = 0, g = 1 and

dy/dx = [(y - c)/x][n - ax]   (5)
Now it is easy to see that there is a maximum point when n = ax, or x = n/a. Also, the slope of the graph goes to zero as y approaches c, the nonzero intercept. For b = 0 and a = 0,

dy/dx = n(y - c)/x   (6)
This is the power law case, and there is no maximum point: the derivative dy/dx is not equal to zero for any finite value of x. The slope of the graph goes to zero, as before, as y approaches c. For b = 0, a = 0, n = 1,

dy/dx = (y - c)/x = m   (7)
This is the simple case of the linear law, y = mx + c. Here the slope dy/dx = m everywhere, but the ratio y/x equals the slope m if and only if the intercept c = 0; for nonzero c, the ratio y/x depends on the numerical value of c. The purpose here is NOT to provide an expression for the location of the maximum point, x = xm, on the generalized Planck curve. A graph can readily be prepared to find this maximum point. Rather, the purpose is to show (once again, using the general expression for the derivative of the Planck curve) that, just as dy/dx varies at different points along a curve, the ratio y/x varies at different points along a straight line if the straight line does NOT pass through the origin.
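As a cross-check on the algebra, equation (3) can be verified against a finite-difference estimate, and the maximum point x = xm can be located numerically from the condition dy/dx = 0 (which, from equation 3, reduces to n = axg). The sketch below uses illustrative parameter values; with b = -1 (Planck's case), n = 3, and a = 1, the condition becomes x = 3(1 - e^(-x)), a Wien-displacement-type equation with root near 2.82.

```python
import math

m, n, a, b, c = 2.0, 3.0, 1.0, -1.0, 5.0   # illustrative values; b = -1 is Planck's case

def g(x):
    return 1.0 / (1.0 + b * math.exp(-a * x))

def y(x):
    # generalized law: y = m x^n [e^(-ax) / (1 + b e^(-ax))] + c
    return m * x**n * math.exp(-a * x) * g(x) + c

def dydx(x):
    # equation (3): dy/dx = [(y - c)/x][n - ax + ax(1 - g)]
    return (y(x) - c) / x * (n - a * x + a * x * (1.0 - g(x)))

# 1. check equation (3) against a central finite difference
x0, eps = 2.0, 1e-6
numeric = (y(x0 + eps) - y(x0 - eps)) / (2 * eps)
assert abs(dydx(x0) - numeric) < 1e-5

# 2. locate the maximum point x_m by bisection on dy/dx = 0, i.e. n = a*x*g(x)
lo, hi = 0.5, 10.0                  # dy/dx > 0 at lo, dy/dx < 0 at hi
for _ in range(60):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if dydx(mid) > 0 else (lo, mid)
x_m = 0.5 * (lo + hi)
print(round(x_m, 4))                # root of x = 3(1 - e^(-x)), about 2.8214
```

Note that the located maximum lies below n/a = 3: the b = 0 result x = n/a is recovered only when the denominator term is switched off.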
2. http://www.scribd.com/doc/102384514/A-Second-Look-at-the-US-TeenagePregnancy-Rates-Evidence-for-a-Predominant-Natural-Law Published August 8, 2012. Of particular interest is a rather unique Type I relation observed in this problem, for the years 1980 to 1987, see Figure 11 below.
[Figure 11: US teen pregnancies y (y-axis from about 800 to 1300, in thousands) plotted against the female population x aged 15-19 (x-axis from 8,500 to 11,500, in thousands), revealing the Type I relation for the years 1980 to 1987.]
the slope of the line joining these two points on the x-y graph, h = Δy/Δx = (-177,270)/(-1,242,000) = 0.143 > 0, is positive (equivalently, h = 142.73 with x expressed in thousands, as in Table 4). Furthermore, the pregnancy rate is decreasing, not increasing, on this Type I line. As the female population x decreases, the pregnancies y decrease, and the ratio y/x also decreases, with x and y maintaining the Type I relation with a positive slope. This is illustrated by the calculations presented in Table 4 for the years 1980 to 1987 (the exceptions are the years 1985 and 1986).

Table 4: US Teen Pregnancy Data revealing an INVERSE Type I relation

Year | Female population x (in 000s) | Total pregnancies y | Pregnancy rate 1000(y/x) | Change Δx | Change Δy | Slope h = Δy/Δx
1980 | 10,381 | 1,151,850 | 110.96 |        |          |
1981 | 10,096 | 1,109,540 | 109.90 |   -285 |  -42,310 | 148.46
1982 |  9,809 | 1,077,120 | 109.81 |   -287 |  -32,420 | 112.96
1983 |  9,515 | 1,039,600 | 109.26 |   -294 |  -37,520 | 127.62
1984 |  9,287 | 1,002,370 | 107.93 |   -228 |  -37,230 | 163.29
1985 |  9,174 | 1,000,110 | 109.02 |        |          |
1986 |  9,206 |   982,450 | 106.72 |        |          |
1987 |  9,139 |   974,580 | 106.64 | -1,242 | -177,270 | 142.73

(The changes in the last row are the overall changes from 1980 to 1987.)
Data source: US Teen Pregnancy Trends since 1972, Table 2.1, Ages 15-19, http://www.guttmacher.org/pubs/USTPtrends08.pdf. The average of the five slope values of h listed in Table 4 is about 139, consistent with the overall slope h = 142.73 (i.e., h = 0.143 when x and y are expressed in the same units) for the line joining 1980 to 1987. The data for 1985 and 1986 were ignored as representing a fluctuation from the linear trend.
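The slope calculations in Table 4 are easy to reproduce; the snippet below recomputes the year-to-year and overall slopes directly from the tabulated data:

```python
# Recomputing Table 4: x = female population aged 15-19 (in 000s),
# y = total teen pregnancies, for the years 1980-1987.
years = [1980, 1981, 1982, 1983, 1984, 1985, 1986, 1987]
x = [10381, 10096, 9809, 9515, 9287, 9174, 9206, 9139]
y = [1151850, 1109540, 1077120, 1039600, 1002370, 1000110, 982450, 974580]

# Overall slope for the line joining 1980 and 1987 (x in thousands):
# positive, even though x and y both FELL -- the INVERSE Type I trend.
h_overall = (y[-1] - y[0]) / (x[-1] - x[0])
print(round(h_overall, 2))              # 142.73

# Year-to-year slopes for 1981-1984 (1985 and 1986 treated as fluctuations)
h_yearly = [(y[i] - y[i - 1]) / (x[i] - x[i - 1]) for i in range(1, 5)]
print([round(h, 2) for h in h_yearly])  # [148.46, 112.96, 127.62, 163.29]
```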
Although one might usually associate a Type I trend with increasing x and y values, here we witness an interesting INVERSE Type I trend, with decreasing x and y values and a positive slope h. The normal Type I is observed in the teen pregnancy data between 1972 and 1980. Note: When I first looked at the data for the 1980-1987 period, with the decreasing pregnancy rates 1000(y/x) and decreasing x (the year 1980 represents a minor peak in the pregnancy rate when plotted versus years), I was expecting a Type II trend, similar to that observed in the Ohio unemployment problem. The INVERSE Type I relation, yielding the decreasing pregnancy rates, is thus quite unique.
11. The Future of Southwest Airlines, Completed June 14, 2012 (to be published).
12. The Air Tran Story: An Important Link to the Future of Southwest Airlines, Completed June 27, 2012 (to be published).
13. Annie's Inc. A Single-Product Company Analyzed using a New Methodology, http://www.scribd.com/doc/98652561/Annie-s-Inc-A-SingleProduct-Company-Analyzed-Using-a-New-Methodology, Published June 29, 2012.
14. Google Inc. A Lovable One-Trick Pony: Another Single-Product Company Analyzed using the New Methodology, http://www.scribd.com/doc/98825141/Google-A-Lovable-One-Trick-PonyAnother-Single-Product-Company-Analyzed-Using-the-New-Methodology, Published July 1, 2012.
15. GT Advanced Technologies, Inc. Analysis of Recent Financial Data, Completed on July 4, 2012 (to be published).
16. Disappearing Brands: Research in Motion Limited. An Interesting Type of Maximum Point on the Profits-Revenues Graph, http://www.scribd.com/doc/99181402/Research-in-Motion-RIM-Limited-WillDisappear-in-2013, Published July 5, 2012.
17. Kia Motor Company: A Disappearing Brand, http://www.scribd.com/doc/99333764/Kia-Motor-Company-A-DisppearingBrand, Published July 6, 2012.
18. The Perfect Apple-II: Taking a Second Bite: A Simple Methodology for Revenues Predictions (Completed July 8, 2012), http://www.scribd.com/doc/101503988/The-Perfect-Apple-II, Published July 30, 2012.
19. A Fresh Look at Microsoft After its Historic Quarterly Loss, http://www.scribd.com/doc/101062823/A-Fresh-Look-at-Microsoft-After-itsHistoric-Quarterly-Loss, Published July 25, 2012.
20. A Second Look at Microsoft After the Historic Quarterly Loss, http://www.scribd.com/doc/101518117/A-Second-Look-at-Microsoft-After-theHistoric-Quarterly-Loss, Published July 30, 2012.
****************************************************************
21. Further Empirical Evidence for the Universal Constant h and the Economic Work Function: Analysis of Historical Unemployment Data for Japan, 1953-2011 (single universal value of h for US, Canada and Japan in the unemployment law y = hx + c), http://www.scribd.com/doc/100984613/Further-Empirical-Evidence-for-theUniversal-Constant-h-and-the-Economic-Work-Function-Analysis-ofHistorical-Unemployment-data-for-Japan-1953-2011, Published July 24, 2012.
22. An Economy Under Stress: Preliminary Analysis of Historical Unemployment Data for Japan, http://www.scribd.com/doc/100939758/An-Economy-Under-StressPreliminary-Analysis-of-Historical-Unemployment-Data-for-Japan, Published July 24, 2012.
23. Further Evidence for a Universal Constant h and the Economic Work Function: Analysis of US (1941-2011) and Canadian (1976-2011) Unemployment Data, http://www.scribd.com/doc/100910302/Further-Evidence-for-a-UniversalConstant-h-and-the-Economic-Work-Function-Analysis-of-US-1941-2011-andCanadian-1976-2011-Unemployment-Data, Published July 24, 2012.
24. A Second Look at Australian 2012 Unemployment Data, http://www.scribd.com/doc/100720086/A-Second-Look-at-Australian-2012Unemployment-Data, Published July 22, 2012.
25. A First Look at Australian Unemployment Statistics: A New Methodology for Analyzing Unemployment Data, http://www.scribd.com/doc/100500017/A-First-Look-at-AustralianUnemployment-Statistics-A-New-Methodology-for-Analyzing-UnemploymentData, Published July 19, 2012.
26. The Highest US Unemployment Rates: Obama Years Compared with Historic Highs in Unemployment Levels, http://www.scribd.com/doc/99857981/The-Highest-US-Unemployment-RatesObama-years-compared-with-historic-highs-in-Unemployment-levels, Published July 12, 2012.
27. The US Unemployment Rate: What Happened in the Obama Years, http://www.scribd.com/doc/99647215/The-US-Unemployment-Rate-Whathappened-in-the-Obama-years, Published July 10, 2012.
****************************************************************
28. The US Teenage Pregnancy Rates, http://www.scribd.com/doc/101828233/The-US-Teenage-Pregnancy-Rates-1, Published August 2, 2012.
29. Does Speed Kill? Forgotten US Highway Deaths in the 1950s and 1960s, http://www.scribd.com/doc/101982715/Does-Speed-Kill-Forgotten-USHighway-Deaths-in-1950s-and-1960s, Published August 4, 2012.
30. Effect of Speed Limits on Fatalities: Texas-Proofing of Vehicles, http://www.scribd.com/doc/101983375/Effect-of-Speed-Limits-on-FatalitiesTexas-Proofing-of-Vehciles, Published August 4, 2012.
31. A Little Known Mathematical Property of a Straight Line: Strange but True, There is One, http://www.scribd.com/doc/102000311/A-Little-Known-MathematicalProperty-of-a-Straight-Line-Strange-but-true-there-is-one, Published August 4, 2012.
******************************************************************
Quantum Business Model (QBM). This extends (to financial and economic systems) the mathematical arguments used by Max Planck to develop quantum physics, using the analogy Energy = Money, i.e., energy in physics is like money in economics. Einstein applied Planck's ideas to describe the photoelectric effect (by treating light as being composed of particles called photons, each with the fixed quantum of energy conceived by Planck). The mathematical law deduced by Planck, referred to here as the generalized power-exponential law, might actually have many applications far beyond the blackbody radiation studies where it was first conceived. Einstein's photoelectric law is a simple linear law, as we see here, and was deduced from Planck's non-linear law for describing blackbody radiation. It appears that financial and economic systems can be modeled using a similar approach. Finance, business, economics, and the management sciences now essentially seem to operate like astronomy and physics before the advent of Kepler and Newton.