Consumption is one of those things that is on my mind a lot, both economically, as I aim to live debt free and with as little "stuff" as I really need, and in terms of energy in its other forms. Buy local, buy less packaging, drive less, eat less! It goes on and on.
One of the really interesting things about looking at Google App Engine (GAE) is the metering and logging it does of your application. GAE limits how much disk, CPU and bandwidth your application can consume before you have to pay for those resources.
(some stats on a very infrequently used blogquotes app)
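To make that concrete, here is a rough sketch of the kind of handler those dashboard numbers come from, assuming the Python webapp framework and the quota API from the early GAE SDK. The handler name and the logging are mine, purely for illustration, not how blogquotes is actually written.

```python
# Hypothetical handler, just to show what GAE is metering on every request.
import logging

from google.appengine.api import quota
from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app


class QuotesHandler(webapp.RequestHandler):
    def get(self):
        # CPU charged to this request so far, in megacycles (early SDK quota API).
        start = quota.get_request_cpu_usage()
        self.response.out.write('hello, metered world')
        # Whatever this handler burns shows up in the dashboard stats above.
        logging.info('request cost: %d megacycles',
                     quota.get_request_cpu_usage() - start)


application = webapp.WSGIApplication([('/', QuotesHandler)], debug=True)

if __name__ == '__main__':
    run_wsgi_app(application)
```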
This model of utility computing has been kicked around and toyed with for literally four or five decades. In the beginning the vision was that computing would be just like the electrical grid, and you would pay for computing resources much the way you pay for electricity now. This was back when no one believed a household could use, or would ever need, a computer that took up an entire room.
That of course all changed, and we drifted to the current state where everyone has their own computer (or three). Are we drifting back to a utility model now that it is so useful to have your data live in the cloud? I know I care less and less about the machine I happen to be using when I access my data. If the data is in the cloud, and therefore accessible anywhere you go and, more importantly, from any device you choose (iPhone!), then it just makes sense to perform operations on that data within the cloud as well. Why bring it all down to the client to compute values? Why own multiple computers with idle processors and half-empty disks? (Wish I had that problem, actually.) I think it's a bit early to sound the death knell for the personal computer, far from it, but it certainly gets you thinking.
In terms of energy consumption, all of this has a significant impact on the overall picture. There is still an incredible amount of power on the client machine going largely untapped as clients get thinner (though of course that trend is reversing now too), and that power sits under-utilized while servers perform the same poorly crafted functions millions of times over, once for every page view.
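As a toy example of what I mean, and not anything the GAE docs prescribe, memcache makes it trivial to stop recomputing the same result over and over; build_quote_page() here is a made-up stand-in for whatever the expensive work actually is:

```python
# Sketch only: cache an expensive, frequently requested result so the server
# isn't redoing identical work for every single page view.
from google.appengine.api import memcache


def get_quote_page():
    page = memcache.get('quote_page')
    if page is None:
        page = build_quote_page()              # hypothetical expensive work
        memcache.set('quote_page', page, 60)   # keep it around for 60 seconds
    return page
```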
There was a blog post in January 2007 about how much energy would be saved if Google switched from white to black. That post evolved into a full-on article on the topic, and a website (Blackle) with a counter for energy saved.
This is all very interesting to me, but to get to my point... Using GAE and looking at very precise measurements of the resources my code and application use was an incredible moment of perspective for me. Here I am, looking at a direct correlation between the algorithm I choose and a measurable amount of resources consumed by that decision. Amazing, really. This is just profiling in the aggregate, but it feels profound. Somehow, being in the utility-computing frame of mind and looking at my "bill", I am compelled to rethink every aspect of my design to find ways to use fewer resources. This can only be a good thing.