
Journal of Alias (5735)

Sunday August 27, 2006
09:17 PM

Date::Tiny 0.01 - Expanding my ::Tiny empire (and evolution)

[ #30772 ]

I've just uploaded Date::Tiny 0.01 to the CPAN. It implements a very small date object, in as little code as possible, for use in log file parsing and other light duties where you won't need to manipulate the date. If you do need to do anything serious with the date object, it can be inflated into a full DateTime object as needed via a ->DateTime method.
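Roughly speaking (this is just a sketch, not the full documentation, and the values are invented for illustration), usage looks like this:

    use Date::Tiny;

    # Create a date object directly from its parts
    my $date = Date::Tiny->new(
        year  => 2006,
        month => 8,
        day   => 27,
    );

    # Trivial read-only accessors, no validation, no date maths
    print $date->year, "\n";    # 2006

    # When you do need real date handling, inflate to a full
    # DateTime object (the idea being that DateTime.pm only gets
    # loaded if and when you ask for this)
    my $heavy = $date->DateTime;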

This is the fourth module in the ::Tiny series, and I plan to release its companions Time::Tiny and DateTime::Tiny shortly.

When I wrote Config::Tiny, the first of the ::Tiny modules, it was something of a rebellion. I was simply annoyed at the size to which modules to do apparently simple tasks had grown.

The uptake and popularity of Config::Tiny has continued to astonish me.

Obviously there is something very attractive about a small, concise, zero-dependency module that is fast and takes up almost no memory, even if it is a bit hacky and not quite as "pure" as it could be. Config::Tiny is implemented as little more than a hash of hashes, blessed as a convenience, so it's not strictly OO.
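For anyone who hasn't used it, the interface really is just that structure. A rough sketch (the file name and keys are made up for illustration):

    use Config::Tiny;

    # Read an .ini-style file into a blessed hash of hashes
    my $config = Config::Tiny->read( 'myapp.conf' );

    # [section] names are the top-level keys,
    # properties within a section are the second-level keys
    my $host = $config->{database}->{host};

    # Properties above the first [section] live in the '_' root section
    my $title = $config->{_}->{title};

    # Change a value and write the whole thing back out
    $config->{database}->{port} = 3306;
    $config->write( 'myapp.conf' );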

It's worth noting I get more positive feedback, both by email and in person, for Config::Tiny than I do for anything else I've written. It even ended up in Perl Best Practices as the config module of choice for basic config file functionality.

Obviously the concept has struck a chord with the general userbase.

After realising the data structures were almost identical (largely just a hash of hashes), I cloned it to make CSS::Tiny. And although it's needed less often than a module for .ini config files, I get the odd email about CSS::Tiny as well, certainly more than I'd expect for a less-used area like CSS.
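The API is deliberately near-identical, just with selectors and properties in place of sections and keys. Again, a rough sketch with made-up values:

    use CSS::Tiny;

    # Read a stylesheet into the same style of hash of hashes
    my $css = CSS::Tiny->read( 'style.css' );

    # Selectors are the top-level keys, properties the second level
    $css->{h1}->{color}         = '#336699';
    $css->{h1}->{'font-family'} = 'sans-serif';

    # Write it back out
    $css->write( 'style.css' );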

Surprisingly, early and speculative concepts from the world of non-biological evolution may back up my tactic of shrinking down larger modules in this way. Scientists in this area are trying to answer one very large question.

"What are the general characteristics of evolution, regardless of form, and how do we measure them?"

The holy grail for this area would be one equation to express how "evolved" something (anything) is. The same equation should be able to demonstrate why a human is more evolved than bacteria, why a pre-nova star is more evolved than a new star, and how evolved a human is relative to a star.

One of the early versions of this equation (provided as an example for demonstration purposes, but by no means considered rigorous) would be something like this.

e = i / s / m^3

e - Evolutionary level
i - Unit of effective information processing
s - second
m - metre

That is, the general evolutionary level of something is based on the amount of information it processes per second, per cubic metre.

Since information processing in all forms requires an expenditure of energy, some of which is invariably lost in the form of heat, this also means (loosely of course) that you can use per-volume heat output as a metric to compare two entities for their evolutionary state.

For example, by observing that the human brain generates more heat per cubic centimetre than the Sun (it really does, there's just a lot MORE of the Sun) we can state that the human brain is more evolved than the Sun. Humans are also more evolved than a rock or a stapler.
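To put some very rough numbers on that (order-of-magnitude figures only, pulled from standard references): the Sun radiates around 3.8e26 watts from roughly 1.4e27 cubic metres, while a brain dissipates around 20 watts from a bit over a litre. In Perl terms:

    # Back-of-the-envelope comparison of heat output per volume.
    # All figures are rough approximations only.
    my %power  = ( sun => 3.8e26, brain => 20     );   # watts
    my %volume = ( sun => 1.4e27, brain => 1.2e-3 );   # cubic metres

    foreach my $thing ( sort keys %power ) {
        printf "%-5s ~%.2g W/m^3\n", $thing, $power{$thing} / $volume{$thing};
    }

    # Prints something like 0.27 W/m^3 for the Sun,
    # and around 17,000 W/m^3 (1.7e+04) for the brain.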

Whether we are more evolved than a computer is trickier, due to the different natures of the information processing done by our brain and a computer. Things are also complicated by volume constraints. Our brain can't get much bigger without running into heat issues, and chips have similar issues of heat management.

But based on its higher capacity for heat dispersal, the physical medium of the computer chip certainly looks dangerous as a candidate to ultimately out-evolve biological life in general, if its processing efficiency can be raised far enough. As long as our brains continue to process information more efficiently per cubic metre, however, our future should be safe.

Now I'm doing an awful lot of hand-waving here, but like I said, it's early days for the field of non-biological evolution.

But if we take this concept to software, it allows us to make some interesting predictions about the long-term evolution of software.

Software has gotten something of a free ride due to the ever-increasing power of hardware, and there has been very little short-term pressure to evolve in the long-term direction of efficiency.

But looking at our e = i / s / m^3 equation, we can fit software into it quite nicely. Assuming equivalent hardware, our per-second becomes per-clock-tick computational efficiency (speed) and our volume becomes memory size.

Thus for any software problem we could well observe that the long-term trend is towards software (in compiled and running form) that is fast and uses very little memory, which is exactly the mandate of ::Tiny (with the caveat that it only implements a subset of functionality).

Doesn't look good for Perl though, with its large memory usage.

But of course, these evolutionary trends typically take place over very, very long time scales, so they have little relevance to us in the here and now, except perhaps at the level of decades.

The second part of the trend noted above is also interesting, however: one of the indications of more evolved software is heat output per volume. And we can actually translate this as well.

If we assume the software is programmed into an FPGA, then heat output per volume can be taken (at least in part) as the execution density of the code.

That is, how often the average command within a program runs per second; the higher, the better. So the first of the two equivalent code blocks below is more evolved, because each command fires more often on average than in the second.

    foreach ( 0 .. 10 ) {
        print "Hello World!\n";
    }

    foreach ( 0 .. 10 ) {
        print "Hello ";
        print "World!\n";
    }

Looking at the sort of code we write on a daily basis, this strongly encourages the use of CPAN modules. This is particularly the case where you can get two completely different pieces of code using the same utility module, since those commands will fire twice as often as if they were written separately, increasing your average execution density.
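As a contrived sketch of what I mean (the module and subroutine here are made up purely for illustration), two unrelated callers sharing one utility routine means that routine's statements fire for both of them:

    package My::Util;    # a made-up utility module, purely for illustration

    # One shared copy of the code, executed by every caller
    sub trim {
        my $string = shift;
        $string =~ s/^\s+|\s+$//g;
        return $string;
    }

    package main;

    # Two completely unrelated parts of the program reuse it...
    my $port = My::Util::trim(" 8080 ");      # cleaning a config value
    my $name = My::Util::trim(" Alias \n");   # cleaning user input

    # ...so trim()'s statements fire for both callers. Paste a private
    # copy of the regex into each caller instead and you get twice the
    # code, with each line running half as often on average.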

For a large module or framework, where you are less likely to use it twice in two different places, the trend is greatly reduced.

If I were to go out on a very long limb, I'd ask if this might help explain why utility and often-reused components like URI trend towards a single dominant module, while frameworks tend not to survive as long and eventually get replaced.

It's food for thought at least...

  • Okay, you're trying to explain the success of modules on CPAN; that is, why certain modules become more popular.

    What isn't clear to me is: since hardware is increasingly powerful, why is the size in memory of a module important? It's not as if different "species" of a module are competing for limited memory in a literal sense. Something like DBI is quite large, but it's undeniably more successful than database-specific modules were (you're not going to use an XS wrapper around libpq, even if it was t

    • The reason we've gotten away with using a lot of memory for so long is exactly because hardware is getting more powerful. In times of plenty, waste of resources is not a selection factor.

      But the time frames I'm talking about here are quite long. I'm looking at evolution of module usage over 3-10 year periods.

      Memory does eventually become important, if only for a subset of people. (Think mobile phones)

      I have one monstrous private application that uses 80-90 meg of RAM to load, before doing any work o
  • The holy grail for this area would be one equation to express how "evolved" something (anything) is. The same equation should be able to demonstrate why a human is more evolved than bacteria, why a pre-nova star is more evolved than a new star, and how evolved a human is relative to a star.

    I think you will have a hard time finding such an equation, because I don't think there is a sense in which a human is more evolved than bacteria.

    I'm not just saying this to be cute. I'm a biology student, and my un

    • It may well be that if you can come up with an equation to compare the "evolvedness" (or some more sophisticated concept), a bacteria and a human are simply too close, and the difference (while existing) may just be too small to be noticeable.

      But what if you tried to compare the "evolvedness" of a bacteria and a rock?

      Or instead of a rock, how about a simple non-living self-replicating molecule?

      If some metric can be found, then perhaps it can be applied to the difference between bacteria and humans. And m
      • It may well be that if you can come up with an equation to compare the "evolvedness" (or some more sophisticated concept), a bacteria and a human are simply too close, and the difference (while existing) may just be too small to be noticeable.

        Your reply to my "you won't find any, since it's not there" seems to be "maybe it will be too small to notice".

        FWIW, I believe we won't find a metric or a scale along which a human would come out more evolved than bacteria; not because the differences will be to

        • I certainly agree that evolution is a process of adjustment, rather than a progression.

          But if it can be established that there are consistent long-term trends in evolution, where we progress (on the large scale) from state A to state B, then perhaps that change can be expressed as a metric.

          Let's not call that evolution; how about we call it futureification. :)