
Mathematics



Hammack Home

This book is an introduction to the standard methods of proving mathematical theorems.

It has been approved by the American Institute of Mathematics' Open Textbook Initiative. Also see the Mathematical Association of America Math DL review (of the 1st edition), and the Amazon reviews. The second edition is identical to the first edition, except that some mistakes have been corrected, new exercises have been added, and Chapter 13 has been extended (the Cantor-Bernstein-Schröder theorem has been added). The two editions can be used interchangeably, except for the last few pages of Chapter 13. Order a copy from Amazon or Barnes & Noble for $13.75, or download a pdf for free here.

Part I: Fundamentals
Part II: How to Prove Conditional Statements
Part III: More on Proof
Part IV: Relations, Functions and Cardinality

Thanks to readers around the world who wrote to report mistakes and typos! Instructors: Click here for my page for VCU's MATH 300, a course based on this book.

e^(i theta)

Consider the function on the right-hand side (RHS) of the identity e^(ix) = cos(x) + i sin(x) that we want to justify:

f(x) = cos(x) + i sin(x)

Differentiate this function:

f'(x) = -sin(x) + i cos(x) = i f(x)

So this function has the property that its derivative is i times the original function. What other type of function has this property? A function g(x) will have this property if dg/dx = i g. This is a differential equation that can be solved with separation of variables:

(1/g) dg = i dx
∫ (1/g) dg = ∫ i dx
ln|g| = ix + C
|g| = e^(ix + C) = e^C e^(ix)
|g| = C2 e^(ix)
g = C3 e^(ix)

So we need to determine what value (if any) of the constant C3 makes g(x) = f(x). If we set x = 0 and evaluate both functions, we get

f(0) = cos(0) + i sin(0) = 1
g(0) = C3 e^(i 0) = C3

These functions are equal when C3 = 1. Therefore,

cos(x) + i sin(x) = e^(ix)

(This is the usual justification given in textbooks.) By use of Taylor's Theorem, we can show the following to be true for all real numbers:

sin(x) = x - x^3/3! + x^5/5! - x^7/7! + ...
cos(x) = 1 - x^2/2! + x^4/4! - x^6/6! + ...
e^x = 1 + x + x^2/2! + x^3/3! + x^4/4! + ...

Substituting ix for x in the series for e^x and separating the real and imaginary parts recovers the series for cos(x) and sin(x), giving a second justification of the same identity.
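Since the identity is argued two ways above, a quick numerical check is easy. The sketch below is not from the original page, just an added illustration; it uses only Python's standard cmath and math modules to compare e^(ix) with cos(x) + i sin(x) at a few sample angles:

```python
import cmath
import math

# Euler's formula: e^(ix) should equal cos(x) + i*sin(x)
# up to floating-point rounding error.
for x in (0.0, 1.0, math.pi / 3, math.pi):
    lhs = cmath.exp(1j * x)                  # e^(ix)
    rhs = complex(math.cos(x), math.sin(x))  # cos(x) + i sin(x)
    assert abs(lhs - rhs) < 1e-12
    print(f"x = {x:.4f}:  e^(ix) = {lhs:.6f}  cos x + i sin x = {rhs:.6f}")
```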

[Figure: 2 bits of entropy.]

Entropy (information theory)

Entropy is a measure of the unpredictability of information content. This definition of "entropy" was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication". Consider the example of a coin toss: a single toss of a fair coin has an entropy of one bit, a series of two fair coin tosses has an entropy of two bits, and in general a sequence of n fair coin tosses has an entropy of n bits. A random selection between two outcomes in a sequence over time, whether the outcomes are equally probable or not, is often referred to as a Bernoulli process, and the entropy of such a process is given by the binary entropy function. English text, being highly predictable, has fairly low entropy. If a compression scheme is lossless—that is, you can always recover the entire original message by decompressing—then a compressed message has the same quantity of information as the original, but communicated in fewer characters.
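As a minimal sketch of these numbers (added for illustration, not part of the original article), the entropy H = -sum of p log2(p) over the outcomes can be evaluated directly in Python:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A single fair coin toss: two equally likely outcomes -> 1 bit.
print(entropy([0.5, 0.5]))   # 1.0

# Two independent fair tosses: four equally likely outcomes -> 2 bits.
print(entropy([0.25] * 4))   # 2.0

# A biased coin (a Bernoulli process with p = 0.9) is more predictable, so its
# binary entropy H(p) = -p*log2(p) - (1-p)*log2(1-p) is below 1 bit.
print(entropy([0.9, 0.1]))   # ~0.469
```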

Shannon's theorem also implies that no lossless compression scheme can shorten all messages: if some messages come out shorter after compression, a counting (pigeonhole) argument shows that others must come out longer. The average uncertainty of a source is H = -sum_i p(i) log2 p(i), with p(i) the probability of message i and the sum taken over all possible messages.
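A short illustration of both points (again an added sketch, assuming only the standard zlib and os modules): a lossless round trip recovers the original message exactly, while high-entropy random input typically does not shrink, consistent with the counting argument above.

```python
import os
import zlib

# Lossless round trip: decompressing recovers the original message exactly,
# so the information content is unchanged even though fewer bytes are sent.
text = b"low-entropy English text compresses well " * 50
packed = zlib.compress(text)
assert zlib.decompress(packed) == text
print(len(text), "->", len(packed))   # repetitive text shrinks dramatically

# No scheme shrinks everything: random bytes typically come out no smaller.
noise = os.urandom(len(text))
print(len(noise), "->", len(zlib.compress(noise)))
```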
