Saturday, January 28, 2012

Why Blackberry's fall could be bad for business communications

The general consensus is that Blackberry is on its way down. There is some cheering of this amongst the tech savvy because of what Blackberry has become: slow to innovate and expensive to use for so few features. I personally hate my Blackberry. It has a large touchscreen which freezes on me regularly and no WiFi built in, because when it came out, phone carriers were afraid of it turning into a VOIP device. So it was neutered. Or so the conspiracy theory goes.
But if Blackberry ceases to exist, we would be losing a key figure in business communications over the Internet. Like it or not, BlackBerry systems have made Internet mail acceptable for serious business communications.
E-mail is so central to businesses now that it is easy to forget how fragile Internet mail is. Until the advent of the Blackberry, I could not bring myself to recommend Internet e-mail as a primary means of business communication. The Internet was playing a role in e-mail, but mostly as a cheap way to build VPNs so that proprietary e-mail systems could talk to each other. Companies did have Internet mail addresses, but these were either untrustworthy or of limited use.
There are two big problems with Internet mail: identity and delivery. You can never be sure anybody is who they claim to be unless you verify it through another medium. It is also notoriously difficult to figure out whether Internet mail has been delivered. Even the standards back then were mainly about confirming that mail had been read, and that confirmation was offered only voluntarily.
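That voluntary standard is the read-receipt request: the sender adds a header asking for a disposition notification, and the recipient's mail client is free to ignore it. A minimal sketch of such a message (the addresses are made up):

```shell
# A message requesting a read receipt (a disposition notification); the
# recipient's mail client may honor the request or silently ignore it.
msg=$(cat <<'EOF'
From: sender@example.com
To: recipient@example.com
Subject: Quarterly figures
Disposition-Notification-To: sender@example.com

Please confirm that you have read this.
EOF
)
printf '%s\n' "$msg"
```

Note that even a returned receipt only says the message was displayed, not that it was acted on; and a silent recipient tells you nothing at all.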
The Blackberry addresses both concerns. First is the notion that a sender sending a message from a BB really is a real person using a phone. Blackberry and the phone company essentially verify the person using the phone, because he is both registered with the BB service and paying a phone bill. The identity of the sender can be made certain.
The Blackberry service also unifies the mail client and the mail server, technically known as the mail user agent (MUA) and the mail transfer agent (MTA). In standard Internet mail, the mail client and the mail server are two separate entities whose interactions are largely one-way. You either send mail or receive mail at a given time. The MUA can only confirm that a message has been handed to the MTA; the MTAs then talk to each other to deliver it. Blackberry unifies this and gives its customers a way to confirm that a message has been delivered. On BB, users know when the MTA has sent the message to the recipient because it is one integrated system. So it solves the second problem by providing the sender with a way to know that the message has been delivered.

Wednesday, January 18, 2012

What the new WebOS foundation can do to make WebOS a success

HP has done the right thing and is contributing WebOS to the community by making it open source. This alone won't guarantee its longevity. But WebOS fans can take heart at the success of Firefox as proof of how a closed source product can live on as an open source project. The success of Firefox is the success of the Mozilla Foundation. The history lesson is best left to Wikipedia, but the model of success is there and should be emulated where possible. Namely:
  • establishing a source of income to keep operating in good times and bad, and to fund advocacy and publicity. For Mozilla, the deal with Google is where the largest chunk of money comes from
  • creating a framework where all forms of contribution are accepted - too radical or too specific to be rolled into the general application? Create a plug-in!
  • knowing when to rethink things. Firefox was born when they realized that the Mozilla Suite, which included the browser, was too big and slow. By rethinking their goals for the browser and making difficult changes, they made a browser that could compete with Internet Explorer and brought about a large change in the browser market.
Unfortunately, the tablet OS market is more complex and challenging. While the Mozilla Foundation dealt with software only, the WebOS Foundation has to deal with both software and hardware. The first thing it has to do is establish a reference hardware design. The main purpose here is to stay relevant. By having a reference design that can be copied by the massive factories in China for cheap tablets, it increases WebOS's market share. This in turn will reward the developers who have stuck with WebOS with more potential customers.
Which brings me to the next thing the WebOS Foundation has to do: run an official marketplace. This is where those developers will find customers for their apps. It can also serve, in the future, as a source of income. Right now, it should be run for a very low cut, say 5% for paid apps, so as not to put off developers. A key to remaining relevant in the consumer space is having a diverse set of free or cheap apps. Developers are already focused on iPhone and Android apps, and Windows 8 is on the horizon. While WebOS may not be able to compete on the total number of apps, there should be some effort to ensure key apps appear on WebOS too. Other than Angry Birds.
Following on from the above, it should also establish working relationships with phone manufacturers to build WebOS-based phones. Phone manufacturers will be interested in the royalty-free use of WebOS but will realize they need expertise in developing the phone. Initially, this will have to be taken up by the foundation or a subsidiary to guide the development process. This could be another source of income for the foundation. But it should be the phone manufacturer's responsibility to develop updates to match updated versions of WebOS. This way, the manufacturers will determine how long they need to support a particular model. The foundation should also act as matchmaker with WebOS developers to provide the key apps that are bundled with the phone.
At a later stage, the foundation can take a step back, allowing developers to work directly with phone and tablet manufacturers. This would mimic the RedHat model, where developers contribute code to the main OS while working with their customers to build solutions for them. The foundation will maintain a reference development platform or reference virtual machines for testing and accreditation. This would not be unprecedented, as it is much like what Android is doing and Palm did in the past.

Thursday, December 22, 2011

Blackberry SMTP servers identified as sending Spam?


Spamhaus.org servers were marking some Blackberry SMTP servers as being spam relays or sending out spam. The servers were added to the XBL and CBL lists and, as a result, many messages from BB did not reach their recipients. Problem is, they appeared as delivered on the BB device. The messages were probably delivered to the BB SMTP servers, but mail from those servers was rejected by other SMTP servers using DNSBL or similar services that rely on data from Spamhaus.
The only recourse is to resend the messages from the device.
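For admins wondering whether one of their own addresses is listed, a DNSBL check is simple: reverse the IPv4 octets and look the resulting name up in the list's zone. A rough sketch (the address below is a documentation address, not a real BB server; the actual lookup is the commented `host` line):

```shell
# Build a DNSBL lookup name: reverse the IPv4 octets, append the zone.
reverse_ip() {
  local IFS=.
  set -- $1
  echo "$4.$3.$2.$1"
}
query="$(reverse_ip 203.0.113.7).zen.spamhaus.org"
echo "$query"
# host "$query"   # an answer in 127.0.0.0/8 means listed; NXDOMAIN means clean
```

Delisting, of course, is a separate and slower conversation with the list operator.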
It appears to affect BIS users in Malaysia between 4.00 GMT and 7.30 GMT, sporadically.
Details to follow.

Tuesday, December 13, 2011

What to do with WebOS now that it's Open Source

Never stop hoping against hope. WebOS, whose pedigree can be traced back to PalmOS and the popular Palm Pilot series of PDAs (they were mini tablets, for you young'uns), will be open source.
I've already had my say on what could be done with it here and here so I'm not going to repeat myself. So the question will be who will do what with it.


Will it be the next phone OS? Suddenly the open source world is full of phone and tablet OSes. We have Tizen (the resurrected MeeGo), Meltemi (the OS for low-end Nokia phones) and now WebOS. Not to mention the lower profile GridOS, LiMo and SHR. This is all to the benefit of manufacturers, who will be free to build on top of these platforms. The best thing that could happen is that phones go the way of the PC: hardware independent of software. This means you could load any OS you want onto your phone. Or customize it to the hilt. Skinning would no longer be the realm of the manufacturers and mobile service providers. Ah, the prospect of high-school-graduation-themed phones (complete with school-spirit colors and logo) warms the hearts of retailers everywhere.


Will it be on your car dashboard? Toyota is on board. Ford has flirted with it in the past. Don't know how they are going to get it past their partner, Microsoft, though. But a WebOS-powered in-car system is a great fit. It already does the touch interface well enough. Component and accessory manufacturers could just provide an interface, and it would be the Linux kernel's job to hook them up to WebOS. WebOS would then provide the human interface. That is a long way from a knob to turn on the air-conditioner.


Will it be on the TV? Finally TV manufacturers can provide an interface worthy of the Linux kernel already running in most new TVs. If your TV does YouTube or Netflix, chances are that you threw out a printed GPL license together with the FCC notices that came along with your TV. The TV doesn't need a touchscreen for this to work, but a remote with an accelerometer would be nice. This could be the start of a shift in how we consume entertainment. Think of the ability to buy channel apps, where a channel is an app. Or packs of programs as apps. Better still, an app to keep kids from watching too much TV.

What does this mean to Linux users out there? Your knowledge just got more valuable.

Wednesday, November 16, 2011

Recover from a missing kernel : The Problem

You read right. A missing kernel. Although this sounds terminal, the fix was fortunately simple enough. If you are in a jam, the solution is here. But the journey of how it got to this is a cautionary tale of "a little knowledge is a dangerous thing". This is a long post.
A novice sysadmin in a small company had problems with CentOS VMs on VMware ESX version 3. I had set it up for them a few years ago and had been maintaining it for a while after that. I recommended to the management that they send their sysadmin for training on VMware administration, even if it wasn't for certification. They agreed in principle but never did anything about it. Don't get me wrong. The guy was smart. He shadowed my work, understood what I was doing and knew to ask questions when he didn't. Not formally trained, but experienced in administering services (e.g. Samba, printing), I think it was a natural progression for him to take on more work related to installation and configuration.

The VMware config - Linux Kernel Dance
I had not heard from them for a while when I began getting calls about "network problems". A quick look and I figured out that the VMs running their DHCP and DNS servers had frozen up (if you are wondering, the magic SysRq key, a.k.a. Alt-SysRq with the "BUSIER" sequence in reverse, works in the VMI console). Apparently the VMs were running out of resources, with the CPU hitting and sustaining 100% average utilization. There weren't many VMs on the server and, being Linux, I knew I could cram in more than they were running currently. A closer look revealed that it was caused by vmware-tools not being loaded. It wasn't being loaded because the sysadmin had updated the kernel but not reconfigured vmware-tools. This had been happening for some time, despite the message during bootup warning him about it.
I call this the Linux vmware-config dance. For reasons known to VMware, Linux is a second-class citizen. Even though VMware ESX and ESXi, and their flagship product vSphere, run on a Linux kernel, Linux support comes second in everything. The all-powerful VMware Machine Interface (VMI) client is Windows-only. Don't point to that pathetic web-based management system. On Linux, we could start and stop servers, but console access is broken or, at best, works sporadically. We can't even create a VM using it. It's better with VMware Server Free (previously GSX). The web interface provides full access, but it requires SSL 1.0 support, which is insecure and requires manual parameter configuration in Firefox to work.
The vmware-tools service provides the kernel with optimized access to memory, disk and network. If it's not running, the VM can't do things like share memory with other VMs. Basically, it'll run slower and eat up more memory. And apparently, run it long enough and some resource gets gobbled up bit by bit without being properly released. The kicker is that since re-configuring vmware-tools affects network access, you can't do it remotely via ssh. It must be done via the console, either through the web interface (VMware Server Free only) or the VMI for the paid stuff.
VMware requires vmware-tools to be reconfigured every time there is a kernel update. The updated kernel needs to be loaded first, so a reboot is required. If you use a stock RedHat or SuSE kernel, the related vmware-tools modules will load fine. If not, it'll recompile the modules, so you will need glibc and at least the kernel headers. Depending on the distro, you may need to load the kernel sources to get the headers. It's also good to restart the server after reconfiguring and reloading vmware-tools, to test whether there are knock-on effects on other modules. So to recap: restart the machine to load the new kernel, reconfigure vmware-tools, and restart again to cleanly load and test vmware-tools and the other modules. Now multiply that by the number of VMs you have and look at spending a lot of time doing this.
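To make the recap concrete, here is the cycle as a dry-run sketch. The `run` helper just prints each step instead of executing it, and `vmware-config-tools.pl` is the configuration script name as I remember it from that era, so treat this as a memo, not a script to paste into a root shell:

```shell
# Dry-run of the vmware-config dance: run() prints the step, nothing executes.
run() { echo "+ $*"; }
run reboot                       # 1. boot into the newly installed kernel
run vmware-config-tools.pl -d    # 2. rebuild the modules against the new headers
run reboot                       # 3. reboot again to cleanly load and test them
```

Drop the `run` wrapper on the console of each VM and you have the real thing, times however many VMs you maintain.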

Recover from a missing kernel : The Solution

This is part two of two parts. You can read about the problem here.
The Solution
The solution was simple. I needed to install a new kernel.
I found that the sysadmin had an ISO of the CentOS installation DVD on the VMware server's datastore. The beauty of most modern distros is that their installation CDs or DVDs come with a Repair Mode boot option. I modified the VM's settings to mount the ISO as a CD-ROM for the VM. You may also have to change the VM's BIOS boot options to boot the CD-ROM drive before the hard disk. The VM's settings under Boot have an option to boot straight into the VM's BIOS settings. By default, the wait is too short for you to press the F2 key to enter the BIOS.
So I booted into the installation DVD's repair mode. It was all automatic. That is one of the nice things about using a VM environment: no hardware issues. Either your distro supports the virtual hardware on bootup or it doesn't (commonly the network interface driver). CentOS found the network interface and configured it, found and mounted the volumes, and offered advice on how to chroot to the mounted disks. Which I took. This makes the system think the root directory is the one mounted from the hard disk, not the DVD. Basically, it boots your system from the DVD and then makes it think it booted from the hard disk. Other than the running kernel, everything else is loaded from the hard disk. /lib, /usr and /etc were where they should be. If there is no major incompatibility with the kernel, the existing utilities should run fine. I found yum was running ok. Why not? All the rpm databases and config files were right where it expected them to be. I installed the latest kernel with yum. No problems, because the network card was detected and up. Once it was installed, I shut down the VM, removed the ISO from the CD-ROM settings and restarted a-okay.
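Condensed, the in-rescue-mode part of the fix looks like this, printed as a transcript rather than executed, since it only makes sense from the rescue prompt (the /mnt/sysimage mount point is what CentOS rescue mode uses):

```shell
# The rescue-mode steps, printed as a transcript for reference.
steps=$(cat <<'EOF'
chroot /mnt/sysimage   # treat the installed system as the root directory
yum install kernel     # the rpm database and repos are where yum expects them
exit                   # leave the chroot; shut down, detach the ISO, reboot
EOF
)
printf '%s\n' "$steps"
```

Three commands. The hard part was realizing that nothing more exotic was needed.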

Tuesday, November 01, 2011

The introduction to the comments section on http://www.ritholtz.com :
Please use the comments to demonstrate your own ignorance, unfamiliarity with empirical data, ability to repeat discredited memes, and lack of respect for scientific knowledge. Also, be sure to create straw men and argue against things I have neither said nor even implied. Any irrelevancies you can mention will also be appreciated. Lastly, kindly forgo all civility in your discourse . . . you are, after all, anonymous.
It goes without saying..

Tuesday, October 25, 2011

Windows vs Linux: Are we still fighting yesterday's war?

Linux-cluttered-desktop (Photo credit: Wikipedia)
A similarly cluttered Windows desktop, featuring the Energy Blue / Royale theme. (Photo credit: Wikipedia)
I use Linux on my desktop daily, both at work and at home. I avoid Windows as much as possible, but I also use it every day, because everybody else is using Windows and my job is to help people make the most of their PCs. Do I wish they were on Linux? Yes. Have I tried to convert them to Linux? No. That may seem odd, but I have done the opposite in the past. I have converted some of my users to Linux. They were happy, productive and caused me few problems. But most of them have gone back to Windows. They admit missing Linux and its stability and speed of start-up. They miss having things that just work and keep on working. But they all went back for the same reason: it's not Windows.
I ask myself why I use Linux instead of following the crowd with Windows. The biggest reason is that it's free and I can do so much with free software. The reason most people use Windows is that it's what they are used to. I realize that I am just like them: I am used to Linux. And changing what we are used to is the biggest hurdle in moving from Windows to Linux.
For many years, Linux advocates, yours truly included, have been declaring that next year will be the year of the Linux desktop. But even with its plethora of quality free software and increasing ease of use, the Linux desktop has still not grown. The effort seemed to be going somewhere when Linux-powered netbooks were gaining global popularity. People loved the first netbooks so much that they didn't care about the OS. As long as it served their purpose, whether keeping files on the go, quick editing of photos or portable Internet access, people didn't care. They were cheap, portable and could be used longer than most notebooks. The popularity of these netbooks forced Microsoft to do the unthinkable: backtrack on Vista and extend Windows XP's life (that, and the fact that businesses were not budging either). Now, given a choice between something familiar (Windows XP, for most people) and something strange (a simplified version of Linux), guess which one people would choose, even for a little bit more money.
So the dominance of Windows was extended. Whatever ground Windows lost was regained by the time the early adopters got their second netbook, which was as soon as they got tired of dealing with the 8GB solid state hard disk. Really soon. To add insult to injury, Windows 7 looked more like KDE the more you looked at it.
So is all lost? Is Linux on the desktop going to be within the realm of the technically competent and those who wish to extend the life of their old machines? Will Linux remain the distant third on the desktop (after MacOS) forever?
Today I realized that there are more users using Linux in the office than there were six months ago. Triple, in fact. But they are not using it on the desktop. They are using it on their mobile phones. A few just got the latest tablets. Some people will call this cheating, calling Android a version of Linux, or say that mobile phones don't count. It doesn't matter. Android isn't shy about its Linux roots. In fact, it points to its use of the Linux kernel as the reason for its stability.
To make people change, there has to be a driver, an impetus. A reason. Why not work on a way to introduce more people to Linux via Android? Point out that they already use Linux on the phone. Why not try it on the desktop? Doesn't Apple owe some of its popularity to the dominance of its iPhones and iPads? How many people switched to MacOS because they liked their iPhones or iPads? How many people bought Macs because of their exclusive image, to be part of the in-crowd? These are all reasons for change, and people are changing.
I love open source and my wish is to have more people use it. The more people use open source, the more other people will want to contribute to it. The more, the merrier.

Tuesday, October 18, 2011

Linux to the Rescue... Again

The least favorite job a Linux guy can get is... supporting newbie Windows users. While we live in a virus-free, relatively trojan-less environment, our Windows brethren are waist deep in shady toolbars, gotcha embedded web auto-downloads and unsafe USB drives. It tickles me to no end when a web pop-up tries to convince me that I am looking at my files in Windows Explorer... on Linux. And while we may feel smug in the knowledge that our understanding of the underlying technology and Internet services allows us to take the necessary precautions, it is these skills we are often employed for, to get Windows users back to being productive in the office.
While Linux-To-The-Rescue used to mean safe partition resizing courtesy of parted (and later libparted-powered tools), all-encompassing backups with partimage and harrowing hard disk ER with PhotoRec and TestDisk, for a long time virus and malware recovery tools were limited to ClamAV, which is itself rather limited and really designed to detect viruses in e-mail.
Boot up screen - Notice the Memtest86+ memory tester
Well, now we have a new hero on the block: AVG Rescue CD. I had thought about something along these lines some time ago. In the lawless Windows era of a few years back, between Microsoft threatening to turn its back on XP one more time (favoring Vista against users' wishes) and the business community's overwhelming rejection of Vista, viruses and trojans seemed to propagate at will, building botnets that kept being reconstituted even after their mother ship had been destroyed. They were getting wilder too, being able to (literally) evade virus scans or render installed (but not updated) virus scanners impotent. A few years ago I had a talk with friends at Panda Security about the viability of building a live CD around their command-line scanner that would mount Windows partitions and scan them. That is the best time to catch the trojans and viruses: knife them while they are asleep. I once had a problem with malware on my work PC's Windows partition. I ended up booting from the Linux partition and trying to find the offending files using ClamAV. This was before the age of writable NTFS (courtesy of ntfs-3g). So it was a cycle of scanning on Linux, copying down the locations of the infected files, booting into Windows in Safe Mode to delete the files, and then back to booting into Linux and scanning again. Repeat until ClamAV found no more, and then I would boot into Windows properly and run the updated Windows antivirus. It took the better part of two days.
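For what it's worth, with writable NTFS via ntfs-3g, that two-day cycle collapses into a single pass from the Linux side. A sketch, printed as a transcript rather than executed, since it assumes a bootable Linux with ntfs-3g and ClamAV and a Windows disk to mount (the device name is illustrative):

```shell
# The offline scan today, printed as a reference transcript.
scan=$(cat <<'EOF'
mount -t ntfs-3g /dev/sda1 /mnt/windows       # writable NTFS mount
freshclam                                     # update ClamAV signatures first
clamscan -r --infected --remove /mnt/windows  # scan and delete hits in place
EOF
)
printf '%s\n' "$scan"
```

No rebooting back and forth; the malware never gets a chance to run while you work.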
Main Menu
I heard nothing back from Panda. But a few months back, Panda also came out with a working LiveCD version that you can boot into and scan the PC with. I've used both Panda's and AVG's and, for now, I prefer AVG's LiveCD because it is light, boots quickly, makes it easy to update the signatures and the program, has a character-based menu-driven interface and is compatible with a lot of network hardware. It wasn't as compatible a few versions ago (not recognizing some on-board NICs), but even then it worked on more computers than Panda Security's version. Panda's LiveCD's GUI demands, which I think involve direct VESA / framebuffer rendering, make it incompatible with a lot of the PCs I use. AVG also bundles some tools commonly found on standard Linux recovery CDs, like PhotoRec and TestDisk. Both of these solutions read the disk in its entirety and take a long time to finish (read: hours). Users can't work on anything while the scan is running either (although one friend did marvel at what he could do with the Links text-based web browser on the AVG version, which supports console switching). Users hate to wait, but then again, it's their fault they run Windows. :)

Tuesday, October 11, 2011

Wide Pictures and Bad Packages: A Hugin tale

I love my Kodak Z1012. Specifically, I love its in-camera panorama auto-stitching feature. I've tried some of the newer cameras' click-and-sweep panorama features and I am not impressed. The quality is a bit off, like a video grab instead of a static picture. Nothing as good as the quality of the picture below, which is built from three snaps.
Interesting cloud formation in an afternoon sky
It happened that I was out on a trip and my significant other asked me to stop so that she could take in a nice panoramic view. The road was narrow and downhill, so I stayed in the car while she stood out and took a deep breath. And a few snaps, none of which used the panorama feature. When we got home, I realized what had happened and decided this was the opportunity to try out Hugin, a photo stitcher that was featured in Linux Format. There was even a tutorial for it in the following issue. It is useful both as a straightforward stitcher for creating panoramic pictures and for creating some truly interesting photos. Besides, it had to be good, because it was good enough to raise patent violation concerns.
So I started by installing Hugin. Except that the Mandriva Software Manager, powered by urpmi and rpm, said that it couldn't figure out where to get a particular library file. This not being the first time it had happened to me, I just copied the name of the file and fed it to rpm.pbone.net, THE place for RPMs. It gave me the name of the Mandriva package and a link where I could download it. I was tempted to click on the file and have Software Manager install it. But I also know that when I do that, it does not check for dependencies. And that would open a whole can of worms.
So I found the same package in Software Manager and installed it that way. I tried installing Hugin again but this time it flat out refused. It didn't complain about the missing library. It just said that it can't install. I was getting annoyed but quickly realized that there was a way. The old-fashioned way.
Rpm.pbone.net has a feature that can check for files that are required by a package but missing from your system. It does require enabling Javascript. What it does is simply look for those files on your hard disk. If it can't find one, rpm.pbone.net will provide a link to the package that offers the file. Nice.
So I settled on the following process.
  1. Select a file that rpm.pbone.net says hugin needs but I don't have.
  2. Click on the filename to search for that package that has it 
  3. Find the package in Software Manager and install it. 
  4. Re-run the check for missing required files
  5. Go to 1. if there are any missing files.
I like to work according to a process. Sure, I could bulldoze through and install hugin from the console, fixing each error that pops up. Or worse still, build from source. Been there, done that. But running through a process like the one above gives my head enough space to look at my work for flaws and deal with them. It also gives me a way to backtrack in case I do something wrong.
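The process, sketched as a loop. The helpers here are stubs standing in for rpm.pbone.net and Software Manager, and the library name is just an illustration; on Mandriva the real legwork was urpmf and urpmi:

```shell
# Sketch of the dependency loop with stubbed helpers (not the real tools).
rm -f /tmp/.dep_installed
check_missing() {                 # stub: first pass reports one file, then none
  [ -e /tmp/.dep_installed ] && return 1
  echo "libpano13.so.1"           # illustrative library name, not the real list
}
install_provider() {              # stub for "find the providing package, install"
  echo "installing package providing $1"
  touch /tmp/.dep_installed
}
while dep=$(check_missing hugin); do   # step 1: anything still missing?
  install_provider "$dep"              # steps 2-3: install the provider
done                                   # steps 4-5: re-check, repeat until clean
echo "all dependencies satisfied"
```

The loop terminates exactly when the re-check in step 4 comes back empty, which is the whole point of working to a process instead of flailing at error messages.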
Well, I did find something strange. Some of the files the package requires are provided by the package itself. That is, if I clicked on such a missing file, it would say that I need hugin installed for hugin to install. Talk about circular reasoning. It seems that whoever packaged Hugin for Mandriva got the RPM packaging info wrong. Another tell-tale sign that the packager messed up is that the Hugin package also requires /bin/sh. Seen that before on other bad packages. Smelled like a bad cut-and-paste job.
After installing, through Software Manager, all of the packages rpm.pbone.net found missing, I downloaded hugin itself from rpm.pbone.net and installed it (with Software Manager / urpmi automatically adding --no-deps).
It worked like a charm. Hugin, that is. I'm still fiddling with it to come out with a good picture. It works well but has a KDE-esque number of configuration options.
Now where is that tutorial?

Thursday, October 06, 2011

The real legacy of Steve Jobs

When people talk about Steve Jobs, they will most likely point to his most recent successes at Apple, notably the iPad and the iPhone. They will talk about these devices bringing computing and the Internet to the masses, beyond the 'computer literate' or even the 'computer interested'. They will point out how they made using a computer so natural that we have stopped talking about using a computer and just use it for something.
Some will look even further back and talk about him and the Macintosh. They will talk about how the Mac brought the GUI and the mouse to the forefront and raised the standard of how people expect to work with computers. Xerox PARC may have invented it, but it was the Macintosh that captured the imagination. People talking about Jobs will highlight the Macintosh's success in popularizing 'fringe' standards like a built-in network connection, rather than an add-on card, and the 3.5 in floppy disk.
But to truly comprehend Steve Jobs's influence on computing and personal computers, you have to go back to the beginning. You have to go back to when Steve Jobs and Steve Wozniak set out to build and sell the first pre-assembled personal computer. Somewhere then, a spark went off that convinced a young Steve Jobs that this was what he wanted to do. This was what he wanted to shape and influence. This was where he would put his stamp on this world. This was his domain.
Apple Computer, and later Apple, was an expression of Steve Jobs. Steve Wozniak was happy to be the engineer, but Steve Jobs knew he had to be the lead, the one in the driver's seat. Think about everything significant that came out of Apple, and there is Steve's stamp. Early on, he understood not only how to improve existing technologies to make them relevant to a wider audience and market them as products, but also how to manage and handle engineers so that they could produce their best. He took on the image of the creator, basking in the light of adulation but also taking the heat of failed ventures, shielding the engineers from the public's wrath (although according to some, not from his).
His passion was infectious, and with it he sold dreams. Dreams that could only be realized through the computers Apple was making and selling. He understood that the computer was just a tool, unlike other computer companies then (and a few still now) that sold the computer on how beefy the specifications were and how many features it came with. As a tool, he understood that the computer was really inconsequential. What the computer made was what really mattered. And the GUI was the first step in making the computer step out of the way and become a partner in that process of creation, the process of creating what mattered to the user.
As computers became more ubiquitous, Apple took it to the next level by looking for ways its computers would positively affect their users, enhancing that relationship, further moving the computer's technicality into the background. The computer became colorful. The computer became beautiful. The computer began doing one thing very well. Until we stopped thinking about the computer and just did things with it. On every step of that evolution was Steve Jobs's imprint, his vision. He led and others followed.
Steve Jobs should also be remembered for being the new CEO of a new generation. A CEO who understood that a company was about the quality of its people and not solely about the numbers in the balance sheet. He injected his passion into his work as CEO, changing the notion of the CEO from the topmost manager to the driver, the leader of the company, setting its path and navigating it through troubled waters. And that passion filtered down, regardless of what the company was facing. Apple was declared dead so many times (Michael Dell said, in this month in 1997, after Gil Amelio was fired as CEO, that Apple should be shut down), it is more than ironic that it is now the biggest technology company in the world.
I've done my piece on Apple after Jobs, so there is not much to add to that. But don't overlook the other significant contributions of Steve Jobs. People should also remember that it was Steve's drive and money that helped keep Pixar alive long enough to fulfill its full potential. And that is the other genius of Steve Jobs: the ability to know a good idea when he saw one. He saw the potential in the first Apple computer, the potential in the technology of the Xerox Alto, the need for a cheaper Apple Lisa (resulting in the Macintosh), the genius of the magicians at Pixar, the beauty that could inspire computer users in the iMacs, the need for an easy and cheap way to get more music onto digital players, the desire to both communicate and do simple tasks on phones, and the potential of a simple, mobile touch interface to change the way we work with computers. The legacy of Steve Jobs is not just at the birth of the personal computer; it runs from there through the evolution of the personal computer into part of who we are and what we do.
TWIT.TV special on Steve Jobs

Tuesday, October 04, 2011

New ideas for WebOS Part 2

This is Part 2 of my ideas on what to do with WebOS, for HP or whoever owns WebOS by now. Read Part 1 here.


Go vertical in Business
The Blackberry Playbook is a dud by all accounts. So why not make a proper business tablet? There is a market for it, even though Apple would like you not to think about it. A real business tablet would have a proper mail client that works with normal mail servers. Offer an ultra-high-security communications back-end service with premium features, such as guaranteed delivery and end-to-end encryption. What does that have to do with WebOS? Not much. What does that have to do with selling the tablet? Everything. Businesses love security and centralized management. Neither is available on other tablet platforms.
Especially centralized management. It is the most counter-intuitive idea for the tablet concept; the tablet is about mobility and connectivity. But fleet management is a big reason why businesses opt for Blackberry phones: remote service provisioning and data security. Bring that idea to the tablet and you have a runaway hit.
Other business-friendly features: instant messaging so simple to use that it doesn't get in the way of the message, while being secure end to end. Make it work with HP printers from the get-go. Plus whatever a businessman would expect from a tablet on the go. Edit PowerPoint on WebOS? Cool.
Business is conducted everywhere, and that is where a business tablet should work. So get the office / productivity apps to work right on the tablet. We don't like to think about it, but there are times when we just can't connect to the network. Clouds are great, but what can you do when it rains? We still need the device to work. Make office apps run on the tablet and push advanced options to the cloud. Most people use only 10% of the full capabilities of MS Office anyway. Give them the right 15% and you're good to go.

Go vertical in Education
When it comes to tablets, the education market is still wide open. iPads are there but their cost is prohibitive. So take that as an advantage and find revenues further down the chain. Think of the tablet as a service-enabler, much as the set-top box was supposed to be for cable TV. Make money on the hardware but cultivate a thriving market for educational software and services. Unlike the consumer market, when schools buy, they buy in bulk. Customize the app and service market to handle schools of all sizes as well as home-schooled kids. Sell WebOS devices to publishers as a delivery tool. Whoever starts this at schools may have to pay for companies to come on board at the beginning. Don't charge publishers too much; aim for gathering a mass market. Spend as much time building a community as you do building a market. Make it easy for people to share safely while offering paid services at every opportunity.
The killer feature would be the ability to build a child-safe browsing environment, with child-safe browsers (via a proxy connection to your subscription-based safe proxy service) or curated links at a predefined gateway. Build tools for collaborative learning for older kids, while centralized controls would be nice for the younger set. Think of the teachers. Find ways, or build products, to pull a particular tablet's screen onto the big monitor or project it on the wall. Wirelessly. Or to turn off the tablet of an unruly kid until he behaves. Throw in a lockable cart for charging and evening storage and you're golden.

    Combine a few from the above
Vertical markets are always afraid of being trapped into buying a product that has limited appeal. The idea is to invest in a product that everyone is using, to leverage consumer pricing. And pricing, as we have seen, will make or break the future of WebOS. So focus on the vertical markets but offer the product through consumer channels, and sell to the consumer market by touting its market-specific features.

I am sure my ideas are half-baked at best. I wish WebOS, the people who develop it, and those who will be supporting it all the success they deserve.

    Thursday, September 29, 2011

    Choosing an Android smartphone

    This is in response to a comment at the end of a previous post asking what smartphone to buy.
There is no point in my saying such-and-such model is good because these things change all the time. Like anything else, it depends on what you want to use it for: gaming, social media (Facebook-ing), just checking mail, or an attempt to replace your PC.
    Three elements make up a smartphone.
    • The phone
    • The apps
    • The service or carrier

The physical attributes of the phone are a very personal preference. Some like a big screen, others want a keyboard. Some are looking for HDMI output, others want a phone small enough for their purse or fanny pack. Nothing beats going to the shops and holding one in your hand. Choice is what Android users have in abundance.
The apps that can run are important, especially if you can't live without Adobe Flash and Firefox. Some don't care. What you do need to know are the quirks of the particular Android version. Version 2.1 can't run apps stored on the SD card (argh! Palm Pilot flashback!). Version 2.2 is what I use, but you may demand the latest and greatest, Gingerbread (v2.3). Any limitation to the apps is tied to the particular phone; Flash, for example, is looking for phones with an ARMv7 processor (or so I am told). Don't settle for 2.1 unless you intend to root the phone and install custom ROMs. In that case, find the cheapest and good luck. Try not to skimp on memory, because there is this quirk that even though you have tons of free space and apps on the SD card, installed apps still eat up phone memory, even when they live on the SD card.
Another aspect is what apps are available specifically for a phone. I love that my phone has both tethering (connecting a PC to the phone to use it as a modem) and a portable hotspot option. Apparently not every phone has these, and not all carriers allow them. Some apps depend on the phone's own features, like an enhanced music player. So as you are looking, notice the unique features each phone has.
Finally, you may or may not care about service and what phones are available on which service. Remember, it is a phone, so coverage is important. For Malaysia, the rule of thumb is that Digi is the cheapest but has the worst network. Don't be surprised if you're just out of the city area and left with GPRS; in town, Digi is great. Celcom has the largest network but charges an arm and a leg for unlimited data. I don't know about Maxis, though; from their website, they are about the same as Celcom. Think about where you live and what the coverage is like. But if you are connecting mainly via Wifi, you may not care as much.

    Tuesday, September 27, 2011

    HP seeking new directions for WebOS? Here's some.

Why do I care about WebOS? Mainly because it is the continuation of Palm. I want to see another personal computing pioneer, one which did so much innovation in the mobile / handheld computing space, survive and reap its rewards. Palm did so much to survive and grow in the face of challenges, including changing when change was required. Not many have survived the way they have. WebOS may not be recognizable as something from Palm to their users of a decade ago, but it is an evolution that Palm undertook to continue to innovate and move to where they wanted the technology to be. Too many people think Microsoft and Apple were the only companies that did anything to advance the personal computer. Even tech journalists, especially sloppy ones, don't acknowledge Palm's place in history as the maker of the first popular handheld consumer computing device. Even fewer know about PenPoint OS from GO Corporation, the pioneer in pen-based tablet computing. They were supposed to introduce a tablet PC in the early 90s but were FUDed out by Microsoft, which resulted in a wait-and-see attitude among developers and users. And we know how successful Windows Pen Computing was. And Windows tablets. So what did GO leave us? Think of this: without GO, we would never have had Flash.
OK, here are a few ideas...

If you have gotten this far, chances are you aren't from HP. But if you are thinking about buying or licensing WebOS, here are a few ideas I am sharing.

    Licence it out 
What is Apple's business model? They sell premium-priced hardware. Their OS is designed to take maximum advantage of the hardware, since they control exactly what that hardware is. They don't charge for the OS separately, so its development cost is fixed and one-time (sort of). Selling more Macs doesn't translate into more software sales, but the more hardware they sell (PCs, tablets, phones), the more their revenue. They also get a cut from things sold in their marketplaces, but the model is still the same: more sold means more profit. More tablets sold is like more store fronts being opened. That is their strength.
That dependency on hardware sales is also their weakness. They rely on the software to sell premium-priced hardware, yet hardware-wise they are no different from most PCs. In fact, the same hardware specs cost a lot less in the PC world. To counter Apple, take the opposite direction but with the same intention: sell more hardware and make WebOS a driver for consumers to buy more hardware. That means cheaper hardware with the same performance. Focus on blanketing the market first and then break it into low-end, mid-range and premium hardware segments. Build a low-cost, easy-to-build model for the masses and exclusive, blinged-out, celebrity-endorsed versions for the trendy.
For example, make WebOS work on a commodity platform like the numerous no-name Android-compatible tablet platforms. Then licence it to any Tom, Dick and Harry (or Chen, Wong and Lee). Those hardware manufacturers would love another OS for their platform because it will sell more of their hardware. Think of Microsoft in the 80s, licensing the OS from the IBM PC to other manufacturers.

    Friday, September 23, 2011

    Welcome to Techsplatter.com!

Looking at the past few posts (and the pile in draft), I have come to the realization that my posts are no longer limited to my linux adventures. A few commentary pieces crept in, and they are now the most read around here. To do the title justice, I've decided to rename this site to techsplatter.com. Here are some of the changes you can expect to see:

    • More commentary posts, my take on current tech news and trends that affect me
• Better tagging, specifically two sets of tags on the side, one by topic and the other by category. So if you are here for the linux stuff, there will be a linux tag for you to filter the posts.
    • More graphics and links to external posts and videos
Well, that's enough for now. The changes are going to be gradual, starting with the site name and title. Comments are always welcome.

    Thursday, September 08, 2011

    HP split is nothing like IBM's

Senior management at HP are probably wondering what all the noise is about after HP said it is interested in selling off / reviewing its PC division. It's not as if something on this scale hasn't been done before. Didn't IBM do it and walk away stronger than before?
    I think they need a reality check.
When IBM sold off its PC division there wasn't as much noise, although it was a major event. But they did it after an extremely long process. Analysts had been telling IBM to sell its PC division since the mid-90s, possibly even earlier. We all now know, from the stock market crash, how valuable analysts' advice is. IBM recognized the value of the IBM PC brand and the need to keep it front and center: in front of executives and right in the center of the table. That money-losing brand was the 'public face' of the large and profitable server, network and services business in the data center, where few are allowed to go. This was a time when computers were becoming even more prevalent in business in general, beyond the previous domain of enterprises. Having the brand on the desktop was still important. It says something about your business when you can afford to be using IBM desktop PCs instead of other brands. So it balanced out in the end. Lou Gerstner, IBM's CEO at the time, saw its value but realized the inevitable. He put in place the process to split off the PC division before he left, and it was completed only a few years afterwards.
What made IBM successful is that although they are a large corporation, they understood the process of making sales. The tech guys would come up with great products, the sales people would shake hands and push the products out to the customer, the senior executive would take the customer's CEO out for golf. That basic concept of focusing on Job Number 1 hasn't changed even though the products and technology have. In the end, this commitment to the customer is now the main IBM brand, not the PC.
And what tech! Those original IBM PCs were built like tanks. I have had IBM keyboards that lasted longer than the PCs they came with. When it came time to sell to Lenovo, IBM made sure Lenovo carried on that tradition. I still have old-timers calling their Lenovo laptops IBM laptops (talk about brand loyalty). I don't blame them; the hardware is still built tough. I don't see that kind of brand loyalty with HP. [Full disclosure: I just realized my 4-year-old PC under the table is a Lenovo. Great service: I've had two motherboards and a hard disk replaced on-site, no questions asked.]
HP is not chopped liver, but my experience with them is still a mixed bag. They make good hardware, but the term "good service" doesn't automatically come to mind. It is spotty at best, with some products better supported than others. A telltale sign? After HP Networking bought 3Com, you have to have a support contract to get access to previously free software or firmware, often the same software that came free on a CD with the equipment (and which is usually lost within the first few days). Contrast that with the experience of accessing HP printer drivers, where software, often updated drivers, is free even after the product's warranty runs out. And guess which one I had to pay more for?
Splitting up is also not alien to HP. HP used to be even more spread out in terms of technology. It used to have business units doing medical imaging, telecommunications networking, semiconductors and scientific test equipment. These were spun off into Agilent Technologies so that HP could focus on servers, storage and computing. Looking at Agilent's history could give a clue as to how HP might look down the road. Agilent began life as an 8-billion-dollar company in 1999. Ten years later, it is worth slightly less. In that time, it has sold off its medical equipment, semiconductor and network test equipment divisions, focusing on the scientific equipment market. It seems to have a policy of increasingly narrowing its focus in pursuit of sustainability and profits. Will the to-be-split HP Personal Computing unit become like this? First splitting, then shedding its low-profit computer divisions until all that remains is the printer unit? Is this the first step on that road?
Motorola also split in two to allow each company to focus on its own market, and each is doing well enough. Motorola Mobility is being purchased by Google (for reasons much more complex than purely financial ones). I am going to take a leap here and say that the Motorola split and the HP-Agilent split were about technology and focus on markets: a company with too many markets to manage decides it would be better off splitting itself up to focus its resources. This proposed HP split is more about the direction of the CEO, not about technology. HP's PC division wasn't losing money, just not making enough profit. The litmus test is whether they sell off the printer division. If the split were about technology, HP would also sell off the profitable printer division, which sits in the consumer half of a consumer-enterprise split. They may have to include it as part of the PC division to make the sale attractive. But why would you sell the goose that is laying the golden eggs? Customers, myself included, will see this as HP turning its back on us.
HP is no stranger to bad decisions. Come on, this is the company that turned down the Woz's Apple 1. But the way the announcements were made about the fate of the WebOS tablets and the possibility of selling off the PC division is very suspect. You can look at it one of two ways. Either HP was trying to appease the markets by ditching a low-performing unit whose operating profit was 5.7% (on a mere 38 billion in revenue), or it had been working on this for some time as part of a broader plan to make over the company into something the CEO better understands. HP's current CEO is from the financial and enterprise world, previously at SAP. The consumer market is probably something he just doesn't want to deal with.
    Like in my previous post, it really makes me feel old when I see things like this happen again.

    Tuesday, September 06, 2011

    Mandriva or Mageia?

    I've been putting this off long enough. I am a long time Mandriva user from the days of Mandrake. Not exclusive, of course. It's the distro I use at home and on my laptop at work. I promote it to novices and other Linux users alike. I think I use it because it appeals to the lazy part of me. I get things done with little or no hassle. No fireworks. Not too much bling. Not many surprises.
Ever since it was forked into Mageia, I have known that I would soon have to choose. But since there was nothing concrete from the fork, I waited. Then Mageia 1 came out. I waited some more. Now the new Mandriva is out. The updates are getting fewer and farther between for my Mandriva 2010 Spring. Normally, this is the time when I turn on the backports repo and feed off that while I read the forums about possible show-stoppers. There wasn't much in the last few releases, so I think there shouldn't be much in the way.
I guess the real question is: how do I choose? I use Gnome on Mandriva, so there is the Gnome 3 choice to make as well. After see-sawing back and forth, I decided to burn LiveCDs of both to kick the tires a bit. Only Mandriva doesn't have those anymore, and they only support KDE, with a new tablet-like interface.
    The Mandriva upgrade instructions look frightening, largely because the English is confusing.

    When you use --download-all option urpmi will download all the packages first and then begins to install all of them. It is strongly recommended option for migration to a new release with urpmi. It is used to provide reliable update, you need to download and update a lot of packages. If you do not use this option and during update process you face Internet connection problems, you will get a very bad situation when only part of system will be updated, that will result in problems with correct system working.
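In plainer terms, the recommended flow seems to boil down to something like the sketch below. This is my reading of the instructions, not an official recipe; the mirror URL is a placeholder, and you should check your own media list with `urpmq --list-media` before touching anything.

```shell
# Hypothetical sketch of a Mandriva release upgrade with urpmi.
# The mirror URL below is illustrative only -- substitute a real one.

# Point urpmi at the new release's repositories
urpmi.removemedia -a
urpmi.addmedia --distrib http://example-mirror/mandriva/2011/x86_64

# Download every updated package first, then install in one pass,
# so a dropped Internet connection can't leave a half-upgraded system
urpmi --auto-update --download-all --auto
```

The point of `--download-all` is exactly what the quote struggles to say: fetch everything up front so a mid-upgrade network failure can't strand you with a partially updated system.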

I'm having nightmares about my former Russian math tutor repeating that to me again and again. Signs of influence from the Russian investors?
    Not everything is going against Mandriva. There is no PLF in Mageia, so I'll need to figure out how that works out for me.
This is one of those times I want to shout, "I HAVE A GREAT SYSTEM. I CUSTOMIZED IT AND IT WORKS FOR ME. WHY DO I HAVE TO KEEP REINSTALLING AND STARTING OVER?" Especially with stuff like Wine setups lying around, upgrades mean re-configuring those all over again.
I find it funny complaining about change, especially as someone who has probed monitor refresh rates to configure X windows (look it up, kiddies; there is no more spectacular way to turn your monitor into a paperweight). Change is why I don't have to do that anymore. But I think we have reached a plateau. Hopefully my choices won't lead me off a cliff at the end of that plateau.

    Tuesday, August 30, 2011

    Apple after Jobs

    Steve Jobs and Bill Gates at the fifth D: All Things Digital conference (D5) in 2007 (Photo credit: Wikipedia)
That is the question on a lot of tech pundits' minds. I've followed Apple news since the 80s, having started on the Apple ][e. The best way to figure out what Apple could look like is to look at how other companies have moved beyond their original founding or influential founders. The stories vary a great deal.
There is that other company Jobs stepped back from, and it worked out OK: Pixar. He believed in having good people around in a company; his hiring of Sculley from Pepsi in Apple's early days is an example of that. He refined this belief further at Pixar, where he built a team that has taken it to great successes, stayed hungry and welcomed (and looked for) change. I remember John Lasseter's comment in the mid-90s on how they at Pixar loved the fact that Jobs was getting busy back at Apple. They now effectively control Disney, with John Lasseter heading Disney Animation Studios and Jobs the largest single shareholder via Disney's purchase of Pixar. Lesson: making a company great is a team sport.
Jobs had left Apple before. Contrary to the popular belief that he was fired, Jobs left of his own will. Jobs wanted control over the direction and results of his vision; the board was worried about how much it would cost. We all know what Apple's leadership went through after that. Guy Kawasaki put it best in the documentary Welcome to Macintosh when he said "everybody wanted to be somebody else". He probably meant each wanted the others' success and tried emulating them. One CEO, Michael Spindler, wanted to sell Apple to Sun or IBM. All that while, I believed that the only way to fix Apple was to bring Steve Jobs back. Not that the Apple faithful didn't dream of his return in those times. All of us were right. Lesson: Apple is an expression of Steve Jobs's vision of looking toward and creating the future, instead of just looking at the balance sheet.
Another company with iconic founders, Hewlett-Packard, began life in a garage, much like Apple. In fact, they were the original home-garage computer company. Bill Hewlett and Dave Packard built it into a multi-billion-dollar company and enshrined their beliefs about management in "The HP Way". However, I feel it was largely abandoned after the tech bubble burst in the late 90s, and HP began to lose its way as a technology leader. It became yet another computer company with many interests, with no distinctive features other than its corporate maneuverings, sales forecasts and stock price movements. The exception is probably printing, where the brand is strong and the products are respected. HP is now at a crossroads, with a CEO clearly looking for a buyer for its 'low-profit' division, whatever it says otherwise. HP is a company lost not because of itself but because of leaders who have decided to let the numbers do the leading. Their most recent move in looking to sell their PC division is purely financial, and their customers can see right through it. They are worried about their support contracts, their investments in the technology and, more importantly, in the people at HP. The question on their minds is: "Will the HP of tomorrow be the same HP I'm talking to today?" Lesson No. 1: anybody can have a vision and be visionary, but the real question is: what is that vision? Lesson No. 2: in the pursuit of profits, don't leave your customers behind.
    Apple's current management is well suited to continue Steve Jobs' legacies. But soon it will need a new visionary leader. Someone who is committed to the values of Apple and is surrounded by a team willing to follow.
You see, Jobs's deal is that he wants to change the world. He wants to change the world by changing how people feel. He affects how people feel by changing or controlling how they interact with their world, whether the experience is visual or through touch. He believes that by making the experience of interacting with Apple products "revolutionary" and "magical", it will make people feel good and thus positively affect their world and the world in general. He understood that while the computer can do useful things, it is also a useful tool to impress people. Those who are impressed will go and buy the same computers. So computers need to be useful and impressive. Now that's vision. Making a buck along the way is not bad, either.
    Apple will survive after Jobs, it's just not going to be this Apple.

    Monday, August 22, 2011

    Will WebOS be another opportunity lost?

    You know you have been in the game too long when you see things happen twice. Or more.
When I heard about what HP did with the WebOS tablets and their future 'direction' for it, I was amused and upset at the same time. Amused because the way it was announced speaks volumes about the decision makers rather than about WebOS itself. You can bet it was no knee-jerk reaction. This was planned for some time by those who opposed HP buying WebOS or did not see its value; they were just waiting for an excuse. How else to explain the suddenness of the decision? Why else would you talk smack about something you wanted to sell? "My car is crap, would you like to buy it?" What would normally happen is that they'd talk about its good points and try to get the best value from the sale. When you say bad things about what you want to sell, you want it to be valued low enough that a reluctant buyer sees it as a bargain instead. When the value is low enough, people who would not normally buy something like it may be tempted to do so.
    I am upset because WebOS represents good technology. When the tablets came out, WebOS got favorable reviews. Some reviewers did complain about some rough edges but forgave them because the tablet was a first model and bound to have growing pains. They expected that HP would work out the kinks in the next model. I was looking forward to picking one up.
But history is littered with good intentions and great technologies. The most analogous example I can think of is PC/GEOS. For the briefest of times, PC/GEOS and DR-DOS represented a strong challenge to Windows 3.0. PC/GEOS, later GeoWorks, was a graphical desktop environment that was advanced for its day. GeoWorks came with a word processor, graphics editor and communications software. It had a Motif-like UI with advanced features such as scalable fonts and PostScript support. It provided multi-tasking, tear-away menus and an advanced API. The API provided services for almost all of the basic functions of desktop software; the word processor was about 25KB because almost all of its functions were system calls. And since it was the early 1990s, it worked well with only 640KB of RAM. Yes, the early 1990s. Windows 3.0 still had bit-mapped fonts. Some credit GeoWorks with making Microsoft come out with Windows 3.1 just a year after it released 3.0, mainly to add scalable font support (it still used bit-mapped fonts for the OS).

    GeoWorks Ensemble running on DosBox on SUSE
If you want to try it yourself, you can download a basic version of it called Ensemble Lite from Breadbox. There are general instructions on how to get it running on DOSBox, or you could run it in a VM running FreeDOS. Here is a video of a working example.

GeoWorks was a lost opportunity to put forward an easy-to-use, technically superior system that worked on existing computers. It was also a lost opportunity to make using computers less about knowing computers and more about getting work done. The two biggest gripes users had about GeoWorks then were that the word processor didn't do tables and there was no spreadsheet. People had little problem using it because it was very stable. In short, it was also a lost opportunity to put applications before operating systems.
WebOS, by most accounts, is a system that could offer a choice other than Apple and Android for tablets. Competition is the key to keeping innovation humming. Apple has already chosen to litigate while it innovates. The iPads are also still not considered enterprise-ready, while Apple doesn't care about the enterprise. Android will always be struggling to keep a balance between openness and security, and to balance apps running locally against depending on the cloud to deliver productivity. WebOS could be that middle ground between flexibility and security, offering fewer apps, but apps that just work out of the box and aren't afraid to live on the box. All the while continuing to push the OS into the background. Which Microsoft can't and won't do.
GeoWorks was a great product that didn't gain prominence because it couldn't compete against Microsoft's business practices back then (which effectively made PC makers pay for Windows on every machine shipped, regardless of whether Windows was bundled or not). This was a time when people still ran other graphical operating systems on PCs and Windows was still at version 3.0. GeoWorks didn't do disk operations, so it still needed MS-DOS or DR-DOS; it wasn't as if it was cutting into existing DOS sales. It also failed because it was hard and expensive to develop for. Sales were so bad that the company behind it later looked to revenue from sales of the SDK to keep itself running, a move we now know is suicidal. This created a catch-22: people won't use it because there are no apps, and developers won't develop for it because there are few users.
At first I ran GeoWorks on my PC, but I eventually moved on to Linux 0.99pre12(?) and bought an old 640KB laptop (with a lead-acid battery!) to run GeoWorks. Printing was a snap because I printed to a file using the PostScript printer driver. I would then pipe the file to a PostScript printer for output. Sweet.
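That print-to-file trick still works with any "print to file" PostScript driver today. A rough sketch, assuming the driver has written `output.ps` (the queue name and device path below are illustrative, not from my actual setup):

```shell
# Send a PostScript file produced by a "print to file" driver
# to a PostScript-capable printer queue (queue name is hypothetical)
lpr -P ps-laser output.ps

# Or, on really old setups, push the bytes straight at the parallel port
cat output.ps > /dev/lp0
```

The beauty of the approach is that the application never needs to know anything about the printer; PostScript is the lingua franca in between.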
In the end, GeoWorks became an ultra-niche product and an unfulfilled promise of better computing. Don't believe me? Try it yourself: guess when you think the OS came out, then tell yourself it came out in 1991.

    Wednesday, August 17, 2011

    Webmin: The Unsung Hero

Webmin is probably one of the best-kept secrets of sysadmins around. Everybody uses it but rarely talks about it. Fewer still admit to using it. Why? Because it makes the difficult config jobs point-and-click easy. It turns what seems to take countless command line invocations into a few clicks of the mouse. That is probably why it's an open secret: it does take away some of the mystique of being a sysadmin, and managers, if they knew, would demand faster turnarounds. But it still needs you to know what you are doing.
    Basically, Webmin is a web-based config front-end for your system. I recommend it all around. Even if you run your own personal Linux desktop, I recommend installing it. Even if you are Mr. Security Conscious, install it and configure it so that you can only access it from localhost. Because it provides something more valuable beyond what it does superficially. I'll get to that in a minute.
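    A localhost-only setup like the one described above can be sketched in Webmin's own config file. On a typical Linux install this is /etc/webmin/miniserv.conf; the fragment below binds Webmin's built-in web server to the loopback interface and refuses connections from anywhere else. Treat the values as an illustrative sketch and verify the paths and option names against your own installation:

```
# /etc/webmin/miniserv.conf (typical location on a Linux install)
port=10000          # Webmin's default listening port
bind=127.0.0.1      # listen on the loopback interface only
allow=127.0.0.1     # accept connections only from localhost
ssl=1               # keep TLS on even for local access
```

    After editing the file, restart Webmin (e.g. via /etc/webmin/restart, or your distribution's service manager) for the settings to take effect.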
    Webmin is a collection of server-side scripts, separated into modules, that run local commands to configure your system. It serves web pages that recast the various command-line options for the commands that configure one component of your system. Each module corresponds to a particular component or service. Some offer interactive tools, like access to a Java-based file manager. It covers everything from boot-up and boot services to server services like Samba and DNS.
    It hides the nitty-gritty and allows you to focus on the decisions, both technical and managerial. I have used Webmin for a long time, I think over a decade. I've seen its growing pains. Its epic battle over configuration control with SuSE (one of the reasons I stopped using SuSE regularly) was an example of how much respect developers should give users. SuSE, at boot time, kept overwriting standard configuration files (which Webmin modifies) with values from its own config file. It chose to favor its own config files over what the user had chosen. It was the first step towards a registry-like model, and users voted otherwise.
    Some things still don't work great, like Samba, but the rest is rock solid. It hides the nitty-gritty so well that I used it briefly to manage a Sun server. The responsibility was thrust upon me when someone foolishly bought a Sun server because "it was what the vendor uses". The big deal was that it was to run a database (for which a Linux version was available). To Linux users, Sun is different and the same: it has different device naming conventions and a slightly different service startup mechanism, to name a few, but it is the same because it is Unix. So I installed Webmin for Solaris and was able to manage the machine even though I almost never went to the command line. Manage users, assign resources: Webmin did all I needed.
    But the truly valuable service Webmin gives sysadmins is the time to plan and think. When you are pressed by a deadline, or users are breathing down your neck to fix a service, Webmin offers an overall view of the command options and simplifies them to clicks, freeing you to work out solutions and make decisions. Rather than getting tripped up by command-line options, you can focus on what is possible and choose what is best, knowing that Webmin won't let you send the wrong options because of typos. Less time worrying about that means more time to think. And contrary to what some people think, thinking is a good thing.
