Miguel de Icaza, creator of the GNOME desktop, is a little down on the Linux desktop, and I cannot say I really disagree with his assessment. All the "great" desktop apps for Linux are really niche apps or apps that are cross-platform. Chrome, Firefox, Thunderbird, even LibreOffice are all cross-platform, so there's nothing there that says Linux only. As a sysadmin I would cry without ClusterSSH, but that is really something for the sysadmin niche. Adobe's Photoshop works on Mac OS X and Windows, but it is also a niche app (and rumor has it it runs fine under Wine) that will be used by graphic designers, not Joe and Jane Smith.
As much as Miguel has a point, the problem goes much broader than he realizes. It's not just Linux; Mac and Windows are going to hit the same problem for different reasons. The 80% to 90% of computer users out there don't need the desktop. Killer apps or not, their consumption habits are going to change and the PC will revert to niche areas. The killer apps they do use (the Internet and productivity software) are going mobile.
Get a good tablet with a keyboard (Bluetooth, USB, whatever) and suddenly the non-niche market has no need for the desktop. The PC of old will essentially be used by power users (gamers, programmers, developers, etc.). Office apps will be in the "cloud," or at least will not need a huge desktop to do the work. All the tablet needs is productivity software and the ability to connect to one or two larger displays, and it will have wiped out the business PC as the productivity platform. All those office workers can start their spreadsheet or PowerPoint in a meeting and complete it at their desk in dual-monitor goodness without skipping a beat. I have one Linux admin in my office who has already said that if he could get ClusterSSH and a good shell client on his iPad he would have no need for a desktop.
Miguel's complaint about GNOME/KDE and the various window managers is frankly irrelevant to most people. If anything, the Linux field needs to start thinking about the next display technology and how it can jump in on the up-and-coming computer input technologies beyond today's multi-touch. (Think of that computer desk from "The Island.")
Apps come and go. They are what bring people to a computing platform, but form factors also shape their usage. The power in a tablet computer is sufficient for the majority of people's needs. We are just waiting for the productivity apps to catch on and for the tablet designers to realize how they can supplant the desktop PC. Yes, niche players will always go for their desktop of choice, but most of them will probably have a productivity device (laptop, tablet, or mobile device) on hand as well.
Friday, September 30, 2011
Wednesday, June 15, 2011
I use Linux in a corporate environment and proxy support sucks.
I have two workstations, a Windows XP desktop and a Linux workstation. The bulk of my day-to-day work is done from the Linux workstation. This does not come without some pitfalls, the most aggravating being proxy server authentication. Windows NTLM and its ability to pass along your authentication is rather nice and simplifies your world. Linux, on the other hand, can be a little daunting at times.
My workstation currently runs Ubuntu 11.04. I had been running Fedora 13 and then 14, but frankly Fedora is not that user friendly as a desktop. SELinux borders on the insane if you use it and is mildly annoying if you put it in permissive mode. Yes, you could disable SELinux, which is what many people do, but given how many times you have to log in as root to do something useful, it really seems counterproductive. Ubuntu/Debian's sudo setup is far superior in this regard. And in my opinion APT is far easier to work with than YUM; just managing repositories seems faster. Which takes me back to my previous issue: proxies.
Proxies, simply put, need help in Linux. Proxies are standard in the corporate world, which should tell you why this is so important. When Chrome first came out it relied on the OS proxy settings, which caused problems if you needed to authenticate to the proxy. It was quickly updated, which partially resolved the issue, but authentication is still a bear. Some pages may prompt you several times to authenticate; others may only prompt you once. Note that much of this is due to the application: Firefox is prone to the multi-authentication issue, whereas Chrome will prompt once and you're good for that session... most of the time. Other apps are not so forgiving or feature rich. Banshee authenticates with basic auth. Many productivity apps depend entirely on the OS-provided proxy config, but they don't even utilize it fully, as many will ignore the authentication piece and just time out.
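For what it's worth, most Linux command-line tools (and any script you write yourself) will at least honor the http_proxy/https_proxy environment variables. Here is a minimal Python sketch of that convention, with a made-up proxy host and Basic-auth credentials; NTLM would need extra libraries and is not covered here:

```python
import os
import urllib.request

# The host, port, and credentials below are placeholders, not a real proxy.
os.environ.setdefault("http_proxy", "http://jsmith:secret@proxy.example.com:8080")
os.environ.setdefault("https_proxy", os.environ["http_proxy"])

# urllib reads the environment variables automatically...
print(urllib.request.getproxies())

# ...or you can build an explicit opener; embedding user:pass in the URL
# only covers Basic auth, which is the same limitation Banshee has.
opener = urllib.request.build_opener(
    urllib.request.ProxyHandler(urllib.request.getproxies())
)
# response = opener.open("http://example.com/")  # would route through the proxy
```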
Now, some apps will allow you to save your password. GREAT! However, corporate password policy may not be so nice. If you have to change your password every 30 to 45 days, trying to remember which app stored your password can be hazardous to your login attempts. I used to have Thunderbird remember my proxy password, and this worked great until the password changed. Add to it that I was in a rush and opened up multiple applications, all failing their initial authentication, and wham! "Your account is locked out."
With my head bowed I schlepped over to the domain admin, as my account can only be reset by a domain admin, and requested they reset my password. After a long story as to why my account was locked, I was greeted with the typical Windows admin jest about getting a real OS. To which I countered, "They won't get me a UNIX workstation," and then walked back to my desk to type in the new password.
What makes things more frustrating is that the Linux proxy tool gives you the option to put in your username and password for the proxy. However, it rarely if ever works. Add to that, I don't know how secure it is. I'd love to have the Linux proxy tool updated so that it works with authenticating proxies, stores your password securely, or just uses your local authentication. I think this alone would help the corporate adoption rate, or at least make my life a little easier.
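In the meantime, scripts can at least keep the proxy password out of plain-text config files by leaning on the desktop keyring. A rough sketch using the third-party python-keyring module; the service name and prompt here are my own placeholders:

```python
import getpass
import keyring  # third-party python-keyring; talks to GNOME Keyring, KWallet, etc.

SERVICE = "corporate-proxy"   # arbitrary label for this credential
USER = getpass.getuser()

# Store the proxy password once, encrypted by the desktop keyring...
if keyring.get_password(SERVICE, USER) is None:
    keyring.set_password(SERVICE, USER, getpass.getpass("Proxy password: "))

# ...and let scripts fetch it instead of hard-coding it anywhere.
proxy_password = keyring.get_password(SERVICE, USER)
proxy_url = "http://%s:%s@proxy.example.com:8080" % (USER, proxy_password)
```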
Friday, April 29, 2011
Microsoft's $5.2 billion vs. Apple's $5.99 billion
The Wall Street Journal has the details in "Microsoft 3Q Net Jumps 31%, But Windows Decline Dims Outlook" (WSJ.com).
Short answer: Microsoft did better than it has been, but not enough to beat out its rival Apple. Which is funny given how Microsoft bailed out Apple back in 1997. Some theorize that without Gates, Microsoft is just moving toward a slow death. Apple suffered without Steve Jobs for several years, and look what happened when he returned!
Ars commented on the same story yesterday, and the comments section went the way of the troll. But there were some good points made. One, Apple is a hardware company, not a software company like Microsoft. Yes, they do have some software, but they have dominated with their hardware, and they have capitalized on the App Store, which is a constant stream of revenue for Apple, to the tune of almost $2 billion. That's a huge sum for something that requires relatively little effort on Apple's part. Microsoft has nothing to compare with or compete against the App Store.
Microsoft is trying its hand at the mobile game, but let's be real: the hardware part is nothing compared to the application end, and Apple gets a nice slice from every pie that goes through its store.
Apple gets 30% of every dollar that goes through the App Store, and that is almost free money. Add in the price of the phone and what Apple gets in kickbacks from the phone providers, and you make lots of money. Microsoft doesn't compete. Apple has mobile providers nearly frothing at the mouth to get access, whereas with Microsoft it's "meh, I guess we can carry your OS."
My personal opinion is that Microsoft vs. Apple is a bad comparison. They do have some overlap, but they largely compete in vastly different areas. I'd be more interested to see analysis of the Android impact, as the Android of today is the PC of yesteryear, and Apple lost that battle in the '80s and '90s.
Android is not making more money than Apple, but it has captured market share. Microsoft is not significant in the phone market, but Google is, and Android vs. iPhone is a far more realistic comparison.
Wednesday, March 9, 2011
Apotheker and the future of HP
On November 1st, 2010, Léo Apotheker took the reins at HP. The man has some big plans: webOS on every HP device, raising HP's software profile, and bringing back the innovation that Mark Hurd chucked. But not all is sunny in HP land; today Bloomberg's Carol Hymowitz and Douglas MacMillan note Apotheker's involvement in some shady dealings regarding HP's board.
Overall I hope the best for HP. Hurd helped the bottom line but only helped in bringing down the business. The HP employees started to remind me of Sprint folks because they didn't know when the axe would fall. Mark Hurd took to acquiring new things but cut off that which made HP great, namely its people. If Apotheker's rhetoric can be believed, then he is at least looking to bring back HP's innovative spirit. Which they will need. Dell has been making inroads in the server space, and personally I like Dell enclosures and servers better than HP's. We have notorious firmware issues with HP, whereas our Dell systems don't have to be patched unless something is broken. HP blades are a different beast.
NOTE: Word to the wise, if you buy a C7000, fill the bays and then don't touch it if at all possible. Otherwise a firmware update on one blade may force you to update everything on the other blades and the enclosure.
On the software end, Oracle and HP have been beating on each other for a while, but HP is falling behind. Frankly, I think HP needs to capitalize on Oracle's bad blood in the FOSS community. Apotheker would be wise to have HP back the disgruntled players in the MySQL and Java spaces.
In the end, only time will tell. Apotheker has said he has learned from his mistakes at SAP: "The one thing I've learned is to try to manage my temper better and get rid of cynics sooner" (see "Apotheker seeks to save HP's lost soul with software").
Sunday, November 21, 2010
Inventory, Data, and the Unicorn.
Managing the data center inventory for a large corporation is no small task. There are many questions that have been asked, are asked, and will be asked regarding the equipment, and many times the person asking will throw you a curve. Some common questions that should be easy to answer are:
- How many systems do you have?
- How many are deployed and how many in inventory?
- What OSes are deployed?
- What is the breakdown by manufacturer?
- What is the warranty on the systems?
- Who is responsible for the systems?
- How much is the equipment worth?
Now there are many more but these are very common and something any inventory system should be able to answer.
So what is missing from this? It has been my experience that there is a once-a-year, or maybe twice-a-year, question that gets asked. This question seems to change every year, so it's a little hard to predict exactly what they will ask for. One year Finance may ask about the value of the equipment. The next year they may ask for the value and depreciation of the equipment. Other years they may just ask for the list of equipment and the purchase orders it came in on.
Now you might reasonably ask, "Why are they asking me? They sign the PO and track the orders, don't they?" Well, first smack yourself and realize you don't think like a Finance person. They have pieces of information, not all the information. In many cases they lack the expertise to know the difference between the Linksys router you use in a branch office and the Cisco 4000 you use at your HQ. They simply know how much each cost and when it was purchased. Add to this that they may have some arbitrary amount above which they consider something a capital asset, whereas your use for an item is a little more laissez-faire.
Take the GBIC, for example. You will most likely use it in one system and probably purchased it with that switch. However, three years later the switch has lived its glorious life and is being decommissioned. Its 10Gb GBIC is still good, though, and you need it for another machine. Now you may quite reasonably think you can just take it and move it to another system. It's still a usable part and can fix a problem you currently have. Finance, on the other hand, has different ideas. For them the GBIC may still have monetary value and still be considered a capital good. Its whereabouts have an impact on them, and they only have a record of the equipment it was first purchased with. The equipment you are putting it in has a value in their records. By moving that GBIC from equipment A to equipment B you have now devalued A and increased the value of B. Your little equipment move no longer seems so simple.
Finance isn't the only group with this problem. Your operations team has a set of information they want to know. Managers have information they want, and there may be other teams that need information too. In my organization, Networks and Systems used to be all together, but now they are distinct groups. Each group has different bits of information it wants to tie to an asset, and that is where all the fun begins.
Incoming!
Quote: "Captain, we're receiving two hundred and eighty-five thousand hails" -- Lt. Wesley Crusher (Parallels)
For every group there could be 5 requests, and each of the 5 requests may require data from 5 different places: 5 groups x 5 requests x 5 sources = 125 unique queries. In short, that is a lot of data to juggle. That, of course, is a low number, considering I have already listed more than five questions above. Add in the variants on those questions, plus new questions, and your queries begin to sprawl. So what are you to do?
ITIL offers us some hope with the Configuration Management Database (CMDB); however, it is my understanding that many people mistake the CMDB for the holder of all good information in all its glory. The truth is that ITIL calls for federating the data in large data sets. To put it simply, there is too much data for one system to adequately hold it all. Instead you need to link multiple databases, where the CMDB contains some data but not all data.
This is done to simplify the data for its respective consumers. The idea is that Finance has a record of its data and can easily go out and gather additional information from other systems, either by finding it in the CMDB or by having the CMDB tell them where to find more. Meanwhile, the Service Desk can retrieve information from different information pools in the same manner. The Service Desk may not care what the equipment cost or its depreciation rate, but they may care about when it was purchased and received.
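To make the federation idea concrete, here is a toy sketch (entirely made-up systems, records, and field names) of a CMDB that holds only core attributes plus pointers into the systems of record owned by Finance and the Service Desk:

```python
# Stand-ins for the separate systems of record; all data here is invented.
FINANCE = {"PO-1234": {"cost": 5200.00, "purchased": "2009-04-01"}}
SERVICE_DESK = {"srv-web-01": {"open_tickets": 2, "received": "2009-04-15"}}

# The CMDB keeps the core attributes and references, not every detail.
CMDB = {
    "srv-web-01": {
        "model": "ProLiant DL380",
        "owner": "Systems team",
        "finance_ref": "PO-1234",      # cost/depreciation live in Finance
        "service_ref": "srv-web-01",   # ticket history lives in the Service Desk
    }
}

def asset_view(ci_name):
    """Assemble one answer by following the CMDB's pointers to other systems."""
    ci = dict(CMDB[ci_name])
    ci["finance"] = FINANCE.get(ci.pop("finance_ref"), {})
    ci["service_desk"] = SERVICE_DESK.get(ci.pop("service_ref"), {})
    return ci

print(asset_view("srv-web-01"))
```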
Great in Theory
Quote: Now I will believe that there are unicorns...--William Shakespeare--(The Tempest)
A federated database that tells you where everything is, everything you wanted to know and things you didn't want to know: why, this is a grand idea! And then I woke up. I have seen many vendors attempt to make this grandiose dream of data jubilee come true, but I have never seen such a wonder. HP and IBM get pretty close; if you are an HP or IBM shop and drink their Kool-Aid you can come very close to this dream. But we are not a pure HP, IBM, Dell, Oracle, or ACME shop. Many places are not bound to one vendor, and therein lies the problem. How does one federate what so many applications keep hidden?
I would like to think that the open source community can tackle this, but I'm not sure the heart, or even the thought, is there. Various asset tools try to gather information about hardware, but you still run into the same issue of linking that data with Finance and Service Desk applications (or other groups, for that matter). You need to have your data accessible, and as it stands every vendor has its own idea of accessible. Where is the W3C of data interoperability? Where is the IEEE of data transport?
I'll tell you where! It's right next to the unicorn and the pink elephant.
Monday, October 19, 2009
The importance of logs.
I'm a history buff... well, kind of. I don't have the date recollection that many do, but I love history. It's easy to look through history and see villains and heroes, or just normal people living through extraordinary times. You can look through the diaries of John and Abigail Adams and see two brilliant people who, in the middle of one of the greatest changes in history, still had normal life events happening. It lets you know that John Adams, though brilliant, was still a man. Sure, he was better educated than most today (and don't get me started on Ben Franklin), but needless to say he was a self-made man whom people should look at and, instead of saying "well, he was a genius, so of course things worked out for him," realize he was a genius who made something of his life.
But how does this all fit in with "tech"? Simple: it's evaluation time, and I must sit back and recount what in the name of St. Torvalds I did this year. Logs, diaries, journals: all things I used to shun and now something I'm glad my manager wants us to do. This process becomes a lot easier when I can look back and see what I have done at work. I may have to start one at home as well so I know what I did there. If anything, it will be helpful to the historians who look back on my life and see that even though I lived in extraordinary times I was still just a normal guy.
But enough of that; how does logging help you? Well, unless you have a photographic memory and instant recall of information, you're going to forget what you did, when you did it, and why you did it. Here is an anecdote to help my case.
I was writing a file compression utility to help with about a terabyte of data on an SFTP server. I have to admit I was being a little lazy: having worked on the code the day before, I had made an undocumented change. It was a smart change, but looking at it the day after and forgetting why I made it, I was confronted with the problem of "What was I thinking when I did this?" Fortunately the code was still being developed and my test environment was easy to duplicate. I say "fortunately" because when I started working on it the next day I changed something that started putting my compressed files into a folder that was not self-incrementing. Oops!
If I had made the notes I should have, I wouldn't have ended up with a runaway compression script that went on forever overwriting its own data. My change from the day before was a smart one that kept exactly this from happening, and then I broke it by undoing my own brilliance.
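For illustration, the safeguard I undid amounted to something like the sketch below: each run gets a fresh, numbered output folder so a rerun can never overwrite an earlier archive. The paths and naming scheme here are invented, not my original script:

```python
import tarfile
from pathlib import Path

def next_run_dir(base):
    """Create a new numbered output directory (run-001, run-002, ...)."""
    base = Path(base)
    base.mkdir(parents=True, exist_ok=True)
    n = sum(1 for p in base.glob("run-*") if p.is_dir()) + 1
    out = base / ("run-%03d" % n)
    out.mkdir()
    return out

def compress(src, base="archive"):
    """Tar and gzip src into its own run directory instead of a fixed path."""
    out = next_run_dir(base) / (Path(src).name + ".tar.gz")
    with tarfile.open(out, "w:gz") as tar:
        tar.add(src, arcname=Path(src).name)
    return out

# compress("/srv/sftp/incoming")  # each call lands in a fresh run-NNN folder
```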
So two lessons.
- What you did the day before had a good reason.
- Log what you did and the process behind it, or you will surely cause yourself double the work.
Logs, notes, journals: all things that can help us deal with the massive amounts of data we ingest and create. Unfortunately, they are also time consuming. But is the time spent making the notes a good trade against the time lost when something breaks? That all depends. Set processes already have notes, and it should be easy to backtrack and see where a step was skipped or missed; thus you are covered. It's the undocumented processes that bite you and can throw off an entire day.
There are also those points in your career where you have to sit and wonder, "What did I do this year?" In most cases you remember one or two big projects and the last couple of projects at the end of the year, but forget everything that happened in the beginning.
Just like a server logs its events, log your own. You may not be a John Adams partaking in a world-changing revolution, but you might just save your brain from being taxed on what you did with your life.

Friday, September 11, 2009
Ultra-low PUE??? KC Mares seems to think so.
KC Mares of MegaWatt Consulting has managed to get some really low PUE numbers with today's technology. Color me skeptical, but KC says his math is holding up.
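For context, PUE is just the ratio of total facility power to the power delivered to the IT equipment, so at the roughly 10 MW of IT load KC describes below, the gap between his 1.05 and Google's reported 1.10 works out roughly like this (my own back-of-the-envelope arithmetic):

```latex
\mathrm{PUE} = \frac{P_{\text{total facility}}}{P_{\text{IT equipment}}}
\qquad
\begin{aligned}
\mathrm{PUE}=1.05:&\quad 1.05 \times 10\,\mathrm{MW} = 10.5\,\mathrm{MW}\ \text{total (0.5 MW overhead)}\\
\mathrm{PUE}=1.10:&\quad 1.10 \times 10\,\mathrm{MW} = 11.0\,\mathrm{MW}\ \text{total (1.0 MW overhead)}
\end{aligned}
```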
The question "Did all the IT load really NEED to be on UPS? " has some very interesting ideas. Yes, it comes down to risk but it is a very serious question that should be asked. In most cases the UPS is simply there to carry your load long enough to transfer power to the generator. Well why do you need both of your power supplies on the UPS for the preparation of a 15 minute power outage? Why have two when one would carry you through that time?
Of course you could point out, "Well, what if the power supply on a critical server fails while your load is being transferred?" To which you must ask, "How 'critical' is this system, and if it is that super critical, why is it not clustered with a failover server as well? Or do you like your single points of failure on one piece of hardware?"
I shall ponder this more. As they say, the more directly you can get your power to the equipment, the less power you lose. That cuts out one big middleman. Not sure APC would be all that happy...

"Now, you ask, how did we get to a PUE of 1.05? Let me hopefully answer a few of your questions: 1) yes, based on annual hourly site weather data; 2) all three have densities of 400-500 watts/sf; 3) all three are roughly Tier III to Tier III+, so all have roughly N+1 (I explain a little more below); 4) all three are in climates that exceed 90F in summer; 5) none use a body of water to transfer heat (i.e. lake, river, etc); 6) all are roughly 10 MWs of IT load, so pretty normal size; 7) all operate within TC9.9 recommended ranges except for a few hours a year within the allowable range; and most importantly, all have construction budgets equal to or LESS than standard data center construction. Oh, and one more thing: even though each of these sites have some renewable energy generation, this is not counted in the PUE to reduce it; I don't believe that is in the spirit of the metric."

Google has reported 1.2 and 1.10, but if KC is right then they could possibly do even better. That all said, I look forward to seeing whether the test of time bears it out. That is one problem I have with PUE: at this point it is all theory and short-term testing, at least as far as I have seen.
The question "Did all the IT load really NEED to be on UPS? " has some very interesting ideas. Yes, it comes down to risk but it is a very serious question that should be asked. In most cases the UPS is simply there to carry your load long enough to transfer power to the generator. Well why do you need both of your power supplies on the UPS for the preparation of a 15 minute power outage? Why have two when one would carry you through that time?
Of course you could point out "well what if your power supply on a critical server fails while your load is being transfered?" To which you must ask, "how 'critical' is this system and if it is that super critical why is it not clustered and have a failover server as well? Or do you like your single points of failure on one piece of hardware?"
I shall ponder this more. As they say the more direct you can get your power to the equipment the less power you lose. That cuts out one big middle man. Not sure APC would be all that happy...
