I called this blog intropy because it was a reasonably short word with links to 'information' that was still available as a short URL...
I'd come across the word in a paper by Collier, though at the time I'd only scanned the paper and didn't really understand what he meant by it. I had another go at Collier's paper while on holiday last week and I still don't understand it... Well, not entirely, but here are some thoughts about it.
Collier says "The order [of a system] is sometimes called the 'intropy' of the system (in contrast to its entropy)." Previously he has defined order as:
Order = S_max - S_act
S_max = k log P
S_act = -k Σ p(m_i) log p(m_i)
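To make the definition concrete, here's a minimal sketch in Python of Collier's order for a discrete distribution, where P is the number of possible states and the p(m_i) are their probabilities (the function name and k=1 convention are my own, not Collier's):

```python
import math

def order(probs, k=1.0):
    """Order = S_max - S_act for a distribution over len(probs) states.

    S_max = k log P, with P the number of possible states.
    S_act = -k * sum of p_i log p_i (terms with p_i = 0 contribute nothing).
    """
    P = len(probs)
    s_max = k * math.log(P)
    s_act = -k * sum(p * math.log(p) for p in probs if p > 0)
    return s_max - s_act

# A uniform distribution has maximal entropy, so zero order:
print(order([0.25, 0.25, 0.25, 0.25]))  # -> 0.0
# A fully determined state has zero entropy, so order = log 4:
print(order([1.0, 0.0, 0.0, 0.0]))
```

So order is largest when the system is far from its maximum-entropy state, which fits the reading of intropy as a kind of negentropy.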
Intropy is not a widely used term, and some authors imply it is a synonym for negentropy, but in another paper (John Collier, "Hierarchical Dynamical Information Systems With a Focus on Biology", Entropy v5, pp 100-124, 26 June 2003) Collier says that intropy is one type of negentropy (enformation is another).
All these terms, like Gibbs free energy, have to do with the amount of order available to do something useful.