Sunday 25 August 2013

Entropy


Entropy is a measure of the number of specific ways in which a system may be arranged, often taken to be a measure of disorder. The entropy of an isolated system never decreases, because isolated systems spontaneously evolve towards thermodynamic equilibrium, which is the state of maximum entropy. 
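The "number of ways a system may be arranged" idea is captured by Boltzmann's formula S = k ln W, where W counts the equally likely microstates. A minimal sketch in Python (the example state counts are illustrative, not from any particular physical system):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant in J/K (2019 SI exact value)

def boltzmann_entropy(num_microstates):
    """Entropy S = k * ln(W) for W equally likely microstates."""
    return K_B * math.log(num_microstates)

# A single possible arrangement means zero entropy...
print(boltzmann_entropy(1))    # 0.0
# ...and more possible arrangements mean higher entropy.
print(boltzmann_entropy(100) > boltzmann_entropy(10))  # True
```

This is also why an isolated system drifts toward equilibrium: the equilibrium macrostate is the one with the most microstates, so it has the highest entropy.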

Entropy was discovered when it was noticed to be a quantity that behaves as a state function. Entropy is an extensive property, but it is often expressed as an intensive property, the specific entropy: entropy per unit mass or entropy per mole.
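The extensive/intensive distinction can be made concrete: total entropy scales with the amount of substance, while specific entropy does not. A small illustration (the numbers here are made up for the example):

```python
def specific_entropy(total_entropy_j_per_k, mass_kg):
    """Intensive specific entropy s = S / m, in J/(kg*K)."""
    return total_entropy_j_per_k / mass_kg

# Doubling the amount of substance doubles S (extensive)...
s_small = specific_entropy(100.0, 2.0)
s_large = specific_entropy(200.0, 4.0)
# ...but leaves the specific entropy s = S/m unchanged (intensive).
print(s_small, s_large)  # 50.0 50.0
```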

It is often said that entropy expresses the disorder or randomness of a system, or our lack of information about it (which, on some views of probability, amounts to the same thing as randomness).

Friday 23 August 2013


Ghost of a Moment


Image by Mark Andrew Webber  


I have this project. It is complicated and sprawling, and is starting to take over my life. So I decided to do what any self-respecting obsessive does, and blog about it. 

This blog is a journey, and a reflection. I am using it to document a creative process. It won't be neat, it won't be fangly-dangly. But it will be honest, and it will have my voice. Which is to say that it will have many voices, as there are many versions of me. 
