Monday, October 19, 2009

The importance of logs.

I'm a history buff... well, kind of. I don't have the date recollection that many do, but I love history. It's easy to look through history and see villains and heroes, or just normal people living through extraordinary times. You can look through the diaries of John and Abigail Adams and see two brilliant people who, in the middle of one of the greatest changes in history, still had normal life events happening. It lets you know that John Adams, though brilliant, was still a man. Sure, he was better educated than most today, and don't get me started on Ben Franklin, but needless to say he was a self-made man whom people should look at and not say "well, he was a genius, so of course things worked well for him," but instead realize he was a genius who made something of his life.

But how does this all fit in with "Tech"? Simple: it's evaluation time, and I must sit back and recount what in the name of St. Torvalds I did this year. Logs, diaries, journals: all things I used to shun, and now something I'm glad my manager wants us to do. The process becomes a lot easier when I can look back and see what I have done at work. I may have to start one at home as well so I know what I did there. If anything it will be helpful to the historians who look back on my life and see that even though I lived in extraordinary times, I was still just a normal guy.

But enough of that; how does logging help you? Well, unless you have a photographic memory and instant recall of information, you're going to forget what you did, when you did it, and why you did it. Here is an anecdote to help make my case.

I was writing a file compression utility to help with about a terabyte of data on an SFTP server. I have to admit I was being a little lazy: having worked on the code the day before, I made an undocumented change. It was a smart change, but upon looking at it the day after and forgetting why I made it, I was confronted with the problem "What was I thinking when I did this?" Fortunately the code was still in development and my test environment was easy to duplicate. I say "fortunately" because when I started working on it the next day, I changed something that started putting my compressed files into a folder that was not self-incrementing. Oops!

If I had made the notes I should have, I wouldn't have had a self-writing compression script that went on forever, overwriting its own data. My change from the day before was a smart one that kept this from happening, and I had broken it by undoing my own brilliance.

So, two lessons.
  1. What you did the day before had a good reason.
  2. Log what you did and the process behind it, or you will surely cause yourself double the work.
Logs, notes, journals: all are things that can help us deal with the massive amounts of data we ingest and create. Unfortunately they are also time consuming. But is the time spent making the notes a good trade against the time lost breaking something? That all depends. Set processes already have notes made and should be easy to backtrack to see where a step was skipped or missed. Thus you are covered. It's the undocumented processes that bite you and can throw off an entire day.
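Making notes doesn't have to be time consuming, either; a few lines of script make it nearly free. Here's a minimal sketch (the file name and entry format are my own invention, not any standard) that appends a timestamped what/why note:

```python
import datetime
import pathlib

# Hypothetical log file; put it wherever suits you.
LOG = pathlib.Path("worklog.txt")

def log_entry(what, why):
    """Append a timestamped note recording what you changed and why."""
    stamp = datetime.datetime.now().strftime("%Y-%m-%d %H:%M")
    with LOG.open("a") as f:
        f.write(f"{stamp}  {what} -- {why}\n")
```

Five seconds at the keyboard versus a day chasing a forgotten change is an easy trade.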

There are also those points in your career where you have to sit and wonder "What did I do this year?", and in most cases you remember one or two big projects, plus the last couple of projects at the end of the year, but forget everything that happened in the beginning.

Just like a server logs its events, log your own. You may not be a John Adams partaking in a world-changing revolution, but you might just save your brain from being taxed on what you did with your life.

Friday, September 11, 2009

Ultra-low PUE??? KC Mares seems to think so.

KC Mares of MegaWatt Consulting has managed to get some really low PUE numbers with today's technology. Color me skeptical, but KC says his math is holding up. He writes:

Now, you ask, how did we get to a PUE of 1.05? Let me hopefully answer a few of your questions: 1) yes, based on annual hourly site weather data; 2) all three have densities of 400-500 watts/sf; 3) all three are roughly Tier III to Tier III+, so all have roughly N+1 (I explain a little more below); 4) all three are in climates that exceed 90F in summer; 5) none use a body of water to transfer heat (i.e. lake, river, etc); 6) all are roughly 10 MWs of IT load, so pretty normal size; 7) all operate within TC9.9 recommended ranges except for a few hours a year within the allowable range; and most importantly, 8) all have construction budgets equal to or LESS than standard data center construction. Oh, and one more thing: even though each of these sites have some renewable energy generation, this is not counted in the PUE to reduce it; I don’t believe that is in the spirit of the metric.
Google has reported 1.2 and 1.10, but if KC is right then they could possibly do even better. That all said, I look forward to seeing if the test of time bears it out. That is one problem I have with PUE: at this point it is all theory and short-term testing, at least as far as I have seen.
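For anyone new to the metric, PUE is simply total facility power divided by IT equipment power, so 1.05 means only 5% overhead going to cooling, power distribution, and the rest. A quick sketch (the wattages below are illustrative, not KC's actual figures):

```python
def pue(total_facility_mw, it_load_mw):
    """Power Usage Effectiveness: total facility power over IT equipment power."""
    return total_facility_mw / it_load_mw

# A ~10 MW IT load hitting PUE 1.05 implies only ~0.5 MW of total overhead.
```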

The question "Did all the IT load really NEED to be on UPS?" raises some very interesting ideas. Yes, it comes down to risk, but it is a very serious question that should be asked. In most cases the UPS is simply there to carry your load long enough to transfer power to the generator. So why do you need both of your power supplies on the UPS in preparation for a 15-minute power outage? Why have two when one would carry you through that time?

Of course you could point out, "Well, what if the power supply on a critical server fails while your load is being transferred?" To which you must ask, "How 'critical' is this system? And if it is that critical, why is it not clustered with a failover server as well? Or do you like your single points of failure on one piece of hardware?"
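To put rough numbers on that risk argument: suppose each power supply independently has some small chance of failing during the 15-minute transfer window (the probability below is made up purely for illustration). Putting both feeds on UPS squares the risk; one feed leaves it linear:

```python
def outage_risk(p_supply_fail, feeds_on_ups):
    """Chance of dropping the server during the transfer window, assuming
    independent supply failures with probability p_supply_fail each."""
    return p_supply_fail ** feeds_on_ups
```

Whether shrinking the risk from, say, 1e-4 to 1e-8 justifies doubling your UPS capacity is exactly the question worth asking; for a clustered system, the failover node already buys you that margin.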

I shall ponder this more. As they say, the more directly you can get power to the equipment, the less power you lose. That cuts out one big middleman. Not sure APC would be all that happy...

Wednesday, July 1, 2009

Revolutionary may be an understatement. Meet Gaikai.

This just popped up on Slashdot, and frankly, to say it is revolutionary is an understatement. Gaikai is seeking to let high-quality games run in the cloud and free you, the gamer, from where you play them. I was skeptical at first, but after seeing the video I am thoroughly impressed.

Gaikai is claiming you can play pretty much any game online, anywhere, whether a PC title or a console title. Bandwidth is a necessary part of the equation, but they try to keep it down in the 1 Mbps range. They showed World of Warcraft, EVE, Mario Kart, and some others, but this is just crazy given the implications.

  • The distribution channel has shifted from buying the game in the store, to downloading it online, to now buying your account and starting to play. No install, no patching; it is there and ready for you.
  • Store fronts may be a little pissed.
  • The operating system is neutral. This is a major deal for Mac and Linux folks, as this runs in their browser.
  • Huge win for the game provider, as they don't have to code for specific hardware. Since they manage the hardware, they can do the upgrades and patching themselves. No worries about the customer screwing it up.
  • Piracy is pretty much moot. No pay, no account, no access.
That all said I do have some questions:
  • What about saved game data? Or game-allowed add-ons? World of Warcraft, for example, has add-ons you can put on. Is there some upload mechanism to put them on your account?
  • Can you change screen size?
  • Is there any way to have the game play while not connected to the network?
  • Would there be an extra fee for the service on top of the game price or is it all rolled into one?
  • What does the server footprint look like to host a game and its users? In the case of games like WoW, are you having 10, 20, 30 users connect to a client server that then in turn connects to the game server?
  • Game retail stores are not going to like this setup, as they are essentially cut out of the service.
  • How much is Gaikai talking to the telecommunication industry to help spread broadband service to cover all of America?
Hopefully we will see good things from this advancement in gaming. So far it is looking good.

Monday, June 8, 2009

Well duh! Most blogs are created on a whim.

My beloved Slashdot has directed me to a NYTimes article by Douglas Quenqua (published June 5, 2009). Douglas points to a 2008 survey by Technorati that essentially states that many (95%) of blogs go unattended. Left to the void of time and soon forgotten, only to show up in useless Google searches...

The op-ed goes on giving various examples of blogs come and gone, and quite frankly, are we surprised by this? Many people rant for a short time and then fade due to hopelessness or just sheer lack of comments back. I've done various searches for obscure topics and found hundreds of blogs from people with limited feedback, bad spelling (I'm included in that one), or what turned out to be a shameless attempt to use Google AdSense to make a buck.

I know I have personally started several blogs, and sometimes life just gets too busy to blog. Add in new technologies like Twitter and micro-blogging, and it makes sense that typical blogs do not seem as active. Personally, if I find myself posting a blog that is less than a paragraph, then why bother? Twitter allows for the micro comments that make it much easier for people to give a comment without using up lots of time.

Speaking of time, I think I've said enough. Blogging will stay for some time, but look for many to pursue micro-blogging.

UPDATE: 2009.06.09

John Scalzi has a different take on the NYT's article. In "The New York Times: We May Slide Into Irrelevancy But At Least We Update Daily" he notes that the NYT has a slight grudge against the new media and is a decade late in noting that blogs come and go.

Well, I agree that the old media simply missed the boat when it comes to the net. The old-school newspapers were relegated to irrelevance by Craigslist, eBay, blogs, and Twitter. The cable news stations are doing a little better at adapting to the changing times, but in many respects newsprint is dead in the water. This isn't to say that some papers haven't figured it out, but many of the old hats will pass in the sands of time, becoming afterthoughts or historical footnotes.

Thursday, May 14, 2009

What do a toaster and an HP DL360g5 have in common?

Why, both can melt plastic very well.

Today one of the sysadmins came to me with the woeful tale of not being able to PXE boot a DL360 so he could load an OS on it. After using iLO to look at the system remotely, we saw that the system never sees any drives. So off we went on a short walk to the data center to see what was going on. It should be noted that iLO gave no health warnings. It was all peachy keen, save the hard drives not showing up.

We arrived in the data center, plugged in the monitor, and saw that yes, the drives are not showing up. The lack of blinking lights on the drives should have been our first clue, but we skipped that step, as it is rare for two out of two drives to fail. But we were obviously wrong.

First I pulled out the drive in bay one and all was well. Then I pulled out drive number two.

Well, that is a sure sign of a problem. So I pulled the DL360g5 from the rack and took it back to my desk. Fortunately we had another server on hand, so the admin was able to get back to work rebuilding his system. I started to crack the failed one open to see what other damage there might be.

Here is a view of the burned out fan.

And the burned drive controller.

It was fun talking to the support people and explaining that "no, it was not in a fire or struck by lightning; it caused a fire, or at least an electrical arc." The engineer who will be coming out to document it kind of laughed when his boss told him they actually have a procedure for this, but it is rarely used.

To be fair to HP, this is the first I've ever heard of a server actually starting a fire. We have around 100 DL360s in service, and this is the only one this has happened to. I like the DL360 line and wish I could get the DL360g6, as they use the new Intel Xeon 5500 processors and bring huge energy savings. However, I will be interested to see what the HP engineer says when he comes to look at the system.


I have some more photos for when we moved the fans out of the way.

First up some melted fans.

And here is a series of shots for the drive controller.

I cannot wait to hear back from the HP engineers to see what the failure was.

Friday, April 24, 2009

The Jackalope jumps

My work lappy now has Jaunty Jackalope gracing its hard drive.

First, let me say I am impressed that the download went as fast as it did. Between using DownThemAll and a good network connection, I averaged about 200 KB/s on the download, so it took about an hour and a half.
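The math roughly checks out; assuming a CD-sized image of about 700 MB (my guess at the ISO size) at a sustained 200 KB/s:

```python
def download_minutes(size_mb, rate_kb_per_s):
    """Estimated pure transfer time for a file of size_mb megabytes."""
    return size_mb * 1024 / rate_kb_per_s / 60

# ~700 MB at 200 KB/s works out to about an hour of pure transfer time;
# rate dips and retries easily stretch that toward an hour and a half.
```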

The upgrade from Intrepid to Jaunty was mostly painless. I did have a slight issue when it forgot that there was a Windows install on hd0,0, but I quickly modified /boot/grub/menu.lst and it was working as intended.
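For anyone who hits the same thing, the fix is just re-adding a Windows chainload stanza to menu.lst. Something like the following GRUB-legacy entry (the device numbers depend on your disk layout; hd0,0 matches my setup):

```
title           Windows
root            (hd0,0)
savedefault
makeactive
chainloader     +1
```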

I am still poking around with some of the settings. It did remember to keep all my Compiz settings, and I do think the cube is responding better and more smoothly. I'll have to update more as I continue to use it. Overall, so far so good.

One more thing: wireless and sound seem to be working fine so far, but I did hear one report of a co-worker having an issue with the wireless driver, though I cannot confirm it.

Wednesday, April 22, 2009

1 Day till the Jackalope is on the loose

One more day till Ubuntu's Jaunty Jackalope is released. A release that I look forward to.

The new GNOME is supposed to have better dual-monitor support, which is necessary given my old laptop that I often hook up to my Samsung 42" LCD TV in order to watch Hulu. In the past it has been a bit of a hassle switching between the laptop and laptop + TV. Between GNOME 2.26 and X.Org 1.6, I'm hoping this becomes a smoother process until I can build my MythTV box.

I'm not sure if I'll update to the ext4 file system, though. If I do, it will be on my older laptop, which is more suitable for being wiped and reinstalled. If all goes well I may go ahead and update my aging Ubuntu 7.10 system. However, it is still working flawlessly, so I may not break what's not broken.

Overall I'm looking forward to the release, but I don't think it's a huge game changer. My one hope is that more people will try it and see if it works well enough for their needs. So far my wife has had little problem with her laptop after I deleted her corrupted Vista install (the SP1 update killed it), and she has only complained to me when it couldn't get on the network, which was really a problem with the access point.

A cautionary note:

For you die-hards who are itching to grab this update, you may want to hold off a couple of days. As usual, the Ubuntu release date is going to be flooded with download requests and the mirrors will probably be very slow. I know that last time, when I tried to download 8.10 on release day from home, it took a very long time.

Tuesday, April 21, 2009

Oracle puts on asbestos gloves and grabs the Sun

On Monday, the 20th of April, Oracle put on a flame-retardant suit and some asbestos gloves, and made sure a fire extinguisher was nearby as it grabbed hold of Sun. For you non-tech readers out there, this is a big deal.

So the question becomes: what does this mean for Sun's holdings, many of which have become foundational to the web world? The two big ones are MySQL and Java. Now, some analysts seem to think Oracle is buying into the server space. This has some merit, as the cloud is starting to take off. Oracle may want some of that pie, and Sun may just be the vehicle to get it.

Then again, with Sun's financial issues this also became a cheap buying opportunity to get Java, which Oracle uses extensively. Many in the open-source world are wondering what this might do to them when Oracle takes control of MySQL.

MySQL was acquired by Sun back in 2008 for the nice sum of $1 billion. However, there were noted issues between some of the original MySQL founders and Sun, leading some, like Monty, to leave. Now MySQL will change hands yet again, and not just any hands. With Oracle having its own database, it begs the question of what Oracle plans to do with MySQL's business. For some time MySQL was nipping at Oracle's heels to become a full-fledged rival. MySQL already dominates the web space, so this change leaves some very valid questions.

One other thing I have yet to see anyone talk about is Open Office. Sun created Star Office and its open-source brother Open Office. Will Oracle continue this development or just cut ties with Open Office? Then again, they could put more backing behind it in the hopes of competing with Microsoft Office.

Either way, this makes for an interesting time in tech.