You can find Part 1 here and Part 2 here.
Two things you must remember about what the Windows world is about. One: instant gratification. You can do stuff immediately after the install. Or the install itself has been dumbed down to a wizard. Two: continuous tweaking. There is always something new to try or to patch. Something is always changing or being changed. And guess what, you have little control over it (you can if you have the bucks). It is so bad that many people have been conditioned to want change or updates or something new to tweak every so often. I've had users who moved to the Mac during the heady colored-iMac days ask me, "Why are there no updates?" I asked them whether something was wrong. The answer was no. Somehow these users feel that if there are no regular / weekly updates then there must be something wrong. Think about it for a moment. These Windows users are assuming that the computer will always go wrong if not taken care of regularly. Really, they expect that if there are no updates, Windows will blow up.
This is the second hardest thing to teach users who are no longer using Windows. Relax, no updates means that everything is ok. Sometimes calm waters are just that, calm waters. Stop focusing on fixing the computer and just use it. Confidently.
What is the hardest thing to teach ex-Windows users? It gets worse before it gets better .. and then stays that way. This is the opposite of the instant gratification thing. No, you won't get everything running in 10 minutes. But yes, once it is up it'll stay up and we won't have to do anything major to it. In fact, as users, they'll do even less because the model is that there is another person whose job is to take care of everything else. Unfortunately, that could also be you. And users, given the choice between something done in 5 minutes that always requires tinkering and something done in a day that never causes problems again, will probably opt for the quick fix. The remedy? Make a big fuss about it. Call a meeting to discuss the steps to be taken and the impact on the users etc etc.. If it is a big fuss, users tend to accept that it'll take time. And if you are calling that meeting, why don't you actually do it properly. Who knows, you can even try to implement Change Management.. oooo, that's a big word.
In the meeting, identify, define and clearly mark the goal. Then plan how to get from here to there. Make the transition gradual and plan for it to be so. Start with the machines that make the least impact, computers that offer limited functions or services to users. Then, back-room / supporting services. This is the area where Linux was born and shines. Finally, deal with the desktops.
Once users have made the transition and are still a bit sore about the whole change, don't end your plan there. Think "Now what?" What else can be done to make the experience of having moved better? Find ways of making things better for the users so that they can see why the journey was made in the first place. Point out open source / Linux projects that will help them or that they are interested in. Start with Gimp and move on from there.
Monday, May 21, 2007
Wednesday, January 03, 2007
Living with a Distribution: Personal Preference
I use Mandriva. Have used it almost exclusively for 'serious work' and on my home PC. That doesn't mean I don't use anything else. I have on my second harddisk OpenSUSE, which I have not booted up since I-don't-know-when. I used it to run Jahshaka, a multimedia composing tool. Cool stuff, maybe I'll post some of the stuff I've done. But when I had no more use for Jahshaka, I went back to my aging Mandrake 10.1.
I just upgraded from Mandrake 10.1 to Mandriva 2007. The most painless upgrade I've had. I took the usual precautions:
- Made a list of all the applications I had installed and prioritised them according to what I use most often.
- Backed up everything.
- Renamed my home directories.
- Made sure the installation only touched the non-home partition.

After the install I made sure again that it didn't mess up my home partition. I created new user accounts using the old names. This will make my other family members feel at home as they try out the new stuff Mandriva has to offer. I moved the user files but not the config files so that old settings don't interfere. And it was up and running.
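Those precautions translate into only a few commands. Below is a sketch that runs against a scratch directory so it is safe to try anywhere; on a real Mandriva box you would point it at /home and use `rpm -qa` for the package list (the paths, user names and package names here are made up for illustration).

```shell
# Dry run of the pre-upgrade precautions, against a scratch directory.
set -e
work=$(mktemp -d)
mkdir -p "$work/home/alice" "$work/home/bob"
echo "family photos, documents..." > "$work/home/alice/stuff.txt"

# 1. List installed applications (rpm -qa on a real system) for later triage.
printf '%s\n' kdebase OpenOffice.org gimp | sort > "$work/packages.txt"

# 2. Back up everything.
tar czf "$work/home-backup.tar.gz" -C "$work" home

# 3. Rename the home directories so the installer can't clobber them.
for d in "$work"/home/*; do
    mv "$d" "${d}.old"
done
```

On the real thing the renames mean that even a botched install leaves the old settings recoverable by hand.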

Labels:
Linux
Thursday, July 27, 2006
Living with a Distribution: Taking sides
This is not about recommending a distribution. Rather it's about how important the choice is and how to make it.
A distribution is a collection of programs that will make up what you will be installing on your PC besides the Linux kernel. A distribution is like an accent. A lot of people are saying the same thing but they are just saying it differently. Some more so than others. More importantly, choosing a distribution is like choosing a path. It will lead you to the same place, just through a different entrance into the city. Also, the road can be winding but dotted with friendly towns, or it can be 5 lanes wide, chock-full of people, leading you directly to the location. Ok, enough with the analogies.
Choosing one distribution over another will affect you in the following ways:
- How regularly the software is updated. This may translate into how long you will be exposed to a known vulnerability after it's made public. A more active distribution will have updates and patches available as soon as possible. Most of them do, because the nature of a distribution is that it is created by people who also use it. More likely, it'll translate into how long before you will be able to use that latest program or software you heard about.
- Selection of pre-packaged software. Some distributions are general purpose, others cater to a specific group of people. There are also distributions that you don't have to install. They run directly from CDs or DVDs. In this case, your choice is about use rather than having it sit on your hard disk. You might choose to use something off the CD for a while until you decide it is worthy of sitting on your hard disk.
- How long it will be supported after the version number rolls over. This is usually short, maybe about a year or so for most non-commercial distributions. This is sometimes the reason new features can't be included in the older version of a distribution.
Labels:
Linux
Monday, May 08, 2006
Minding It - Part 2
I am determined to make this work. This meaning Zoneminder (www.zoneminder.com). I had it working, sort of, on my VMWare Server-ed Mandriva 2006. But even though it ran, there were numerous errors about shmget and shared memory. Zoneminder consists of several applications, with some controlling others that do specific tasks, and each of them exchanging information. My guess is that some information exchange is happening through shared memory: program A loads up stuff into memory and then passes a pointer to program B that does something to that stuff (although other inter-process communication could be involved too). I can see why the author does it. Multiple instances of program A can load the same kind of stuff based on different configurations while program B doesn't care about where the stuff came from... it'll take it and process it. Sorta like program A makes batter and program B cuts the batter into square shapes to be made into cookies.
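You can poke at the same SysV shared-memory facility those shmget calls use straight from the shell, with the util-linux ipc tools; `ipcs -m` is also handy for spotting segments a crashed zm instance left behind. (The segment size below is arbitrary; this demonstrates the mechanism, not Zoneminder itself.)

```shell
# Create a throwaway SysV shared-memory segment, inspect it, remove it.
set -e
id=$(ipcmk -M 4096 | grep -o '[0-9]*$')   # new 4 KiB segment; capture its id
ipcs -m -i "$id"                          # details: key, owner, size, attach count
ipcrm -m "$id"                            # delete it (crashed programs leak these)
```

If zm dies mid-run, `ipcs -m` followed by `ipcrm -m` on the orphaned ids frees the memory without a reboot.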
Ok, off the food and onto business. My guess is that I am facing problems with shared memory because I am running it on a VM, VMWare Server Beta to be exact. So the next option would be to run it on a real physical machine. (Actually, the next option would be to run it on something similar like VMWare Workstation, which I-Will-Do-Soon(R).) After the couple of installs I've done, I think it is safe enough to try it on my main machine at home because I don't use Apache or MySQL extensively there and the dependent libraries probably won't mess up my main apps. I am severely tempted to use an older version of a Zoneminder (zm) RPM to weed out dependencies. This is faster than the 'configure --> find out missing lib file --> find RPM for lib file --> get and install lib file --> configure' treadmill. But I've decided to take the long way, just to find out what libraries are actually needed, as if I were any other person doing this on any other distribution. Experiencing this as any other guy is important for the overall user experience. I am not even going to install the RPM for the perl serial and X10 stuff, which isn't even in the usual Mandriva repositories, to see whether zm really needs it.
I already know from previous experience that the latest version, 1.22.1, compiles faster and with less hassle than the previous version I was using. Somehow, the mysterious "unable to create executables" error had magically disappeared. And I found the correct manual this time for this version.
Apparently they had already fixed some of my previous rants. My apologies to the developers. No bizarre config program and no defaults being set at compile time. Thank you.
To cut a long story short, the compile went ok and everything went exactly like it did previously.. that is, it broke in the same place. Same shared memory problem. A glance through the sources did no good. The forum has someone with the same problem and the thread was full of debug messages. Out of desperation, I started trying all sorts of things. Like I always tell people who have problems with computers and don't know what to do: "This is not a nuclear bomb. Push all the buttons." Changing the configuration somehow yielded a better error message. I told it that the images fed to it were 8-bit grayscale (liar) instead of the only other option, 24-bit color. I think the image I was sending was 8/16-bit color. But it had only two options. Now the error message was something like "Image captured was of wrong size, height, width or color". Image captured! It got something. Hey, it may be a Coelacanth but it's still a fish. Maybe if I set the camera to 8-bit grayscale, it'll capture correctly! Great! Problem is.. I can't remember the password for the camera. No time to do a reset and sacrifice a lamb or whatever it is I need to do to reset the password. Got to pay the bills. Later.
Saturday, May 06, 2006
Minding it
I am experimenting with Zoneminder. It looks like a promising digital video recorder (DVR) for closed-circuit cameras. My interest is primarily in IP-based cameras. In theory, this should be easy. IP-based cameras usually have a URL from which you can pick up frames or even video streams. So building DVR software should not be that difficult.
I always start with an RPM because I use Mandriva and this helps keep updates in check. Also, I respect the fact that someone invested time in building a package, so why reinvent the wheel. There were RPMs for Mandriva and I tried those. I read the instructions in the manual and strangely enough there were instructions on what to do before installing. Wait a minute, isn't an RPM supposed to handle this? That is one of the points of using an RPM. But the request wasn't demanding, just that Apache, MySQL and PHP be installed. So I did that. Installing it using urpmi should have resolved dependencies, but I wanted to choose which version to install anyway and the GUI tool was better at that. When I finally used urpmi to install it, it complained of some missing perl libraries. Great, I thought, a trek to CPAN to download the libraries and compile them by hand. I later found out that the libraries were for an optional component and not really necessary. But even after the RPM was installed, the instructions asked me to manually edit a file and run a configuration program.. which was broken. Apparently the package author decided to use MySQL-Max and put in code to detect the version of MySQL that was running. At least he made it easier by clearly marking the offending section in the script file with #FIX ME. Finally, I managed to get the components running and configured the program to capture images from my IP camera.
Or so I thought. I couldn't figure out why my settings would not work when I had read the manuals (two of them) and followed almost every instruction there is. I checked the logs and found out that the program that was picking up the images was crashing. Apparently it requested memory space incorrectly. Wait a minute, this was serious. This is a beta-level error in a 1.2++ version. Problems with memory allocation should have been dealt with a long time ago. This only means one thing.. recompile.
I got the latest STABLE sources and tried to compile them. It got worse. The configure command still required parameters to be passed in.. for the default web directory and the location of MySQL? If it can't detect that, what is the configure script actually doing? The more CLI parameters I fed it, the more it asked for. In the end there were about 10-odd CLI parameters. And it still wouldn't compile, complaining that the C compiler was unable to create executables! What?!
A little more reading led me to a more stable version, a couple of revisions back. No more C compiler complaints. Now it was the normal missing library files that had to be installed, specifically libjpeg.a (libjpeg), libz.a (zlib-devel) and libmysqlclient.a (libmysql-devel). Configure exited and asked me to run a configuration program. Which did a lot of what configure is supposed to do, especially when it came to component detection. It asked a lot of questions, nearly 30 in all. Most of them should have just been set to defaults. Too late already. Will compile tomorrow.
Lessons Learnt.
- Good practice for building packages is to separate the core and optional components.
- When building a package, make sure all dependencies are included.
- Set up DEFAULTS!!
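In RPM terms, those lessons map onto subpackages, Requires lines and shipped defaults. A hypothetical spec-file fragment (the package names, paths and subpackage below are all invented for illustration, not taken from the real Zoneminder spec):

```spec
# Core package declares its hard dependencies so urpmi can resolve them.
Name:           zoneminder
Requires:       apache, mysql, php

# Optional X10 / serial support lives in its own subpackage, so the
# perl modules are only pulled in by those who actually want it.
%package x10
Summary:        Optional X10 control support
Requires:       %{name} = %{version}-%{release}, perl-X10

%post
# Ship a working default config instead of interrogating the user.
[ -f /etc/zm.conf ] || cp /usr/share/zoneminder/zm.conf.default /etc/zm.conf
```

With a split like that, the missing-perl-library complaint would never hit someone installing only the core package.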
Wednesday, August 24, 2005
Standing on the ledge - to Office or not to Office
As you move from Windows to Linux on the desktop, you will have to cross the most difficult bridge of all: office applications. Or productivity tools. In short, you have to ask yourself, "To MSOffice or not to MSOffice, that is the question." It is entirely possible to keep MSOffice and use Linux, despite what purists say. I, for one, am still married to MSOffice, and I'll explain why at the end.
Coming back to the question at hand, your options are:
- OpenOffice - an alternative application suite. It can read and save files in MSOffice file formats. The best part is that it also runs on Windows. So as you move people across, you can have them using OpenOffice on Windows and later on Linux. However, this endeavor is so large that it is in itself as daunting a task as moving people to Linux. The true reason you can move people across is that most people do not use all of MSOffice's features all the time. They simply can't. If they do clerical work, moving across is a cinch. But if they are advanced users, it will be as painful as a root canal minus the painkillers. Heck, I have some problems with alignment when moving from OpenOffice on Windows to OpenOffice on Linux. If you share files outside the company, it gets really troublesome.
- CrossOver Office - a commercial tool that allows you to install and use MSOffice (plus some other Windows applications) on Linux. Like driving in the other lane when the road is empty. It simply works. Well, almost. You see, what they didn't tell you is that there is a reason why MSOffice is on Windows only: it uses low-level software calls. I have heard that one of the reasons Windows on Alpha was dropped was that it couldn't run MSOffice very well. Some versions of Office actually replaced OS files during installation. What other application would do that? So the result is that the major MSOffice applications work fine but some fringe and not-so-fringe applications can be tripped up (e.g. Clipart Manager).
- Like the above, Office over Wine - Wine, which is not a Windows emulator, is designed to run Windows applications on Linux by fooling the application into thinking that it is on Windows. In fact, CrossOver Office is partly Wine. So why use CrossOver when you can get Wine for free? Let's just say that I like my hair too much at this age of my life.
So how do you decide? Ask yourself:
1. Do you use macros? Do things pop up and ask you stuff when you open a template or document? If not, then your answer is most likely no.
2. Do you use outlines or the outlining feature in MSWord? If you are asking "Wha-?", then your answer would be no.
3. Do you have MSAccess databases that you use regularly? The thing about conversion is that MSAccess files are not part of the deal.
If you answered yes to any of the above, go with CrossOver Office. If not, then you are a prime candidate for switching from MSOffice to OpenOffice. You will save a ton of money later, especially as you grow, add PCs, and realize you don't have to pay for another MSOffice license.
Oh, BTW, I don't use Macros but I love the outlining feature so much, it is a deal breaker.
Labels:
Linux,
Thinking aloud
Tuesday, August 23, 2005
Standing on the ledge - Part 1
If you are thinking about making the transition from whatever to Linux, read on.
A lot of people have asked me two questions since I made the switch: 'Is it hard?' and 'Can you do everything you want to in Windows?'
The answer to the last one is a resounding yes. In fact, after switching from Windows, whenever I have to use a Windows machine, I find it very restrictive and most of my tools are gone. Linux gives you so many choices and options, you can't just make up your mind and stick to one set. I find myself switching from KDE to GNOME and back every few months, without losing access to the core programs I use.
The answer to the first one is 'Hell, yes. It was very hard.' But I was on my own and, in retrospect, could have avoided a lot of heartache if there had been someone to tell me what to do or what to avoid. This series is dedicated to those thinking about making the switch or the jump. Something to think about and do before making the leap. Most of it will sound like me talking to you as a network administrator, but even if you are switching alone, everything still applies. Think of yourself as your own administrator.
First, why are you making the jump, or at least thinking about it? The reasons have to be sound because you have to do it for the right reasons. If not, you will be disappointed, or you will find it not suited for you and switch back. Time lost once will never be regained.
If you are switching for ideological reasons (i.e. not wanting to pay the Microsoft Tax), then you are a Believer. Nothing I say will discourage you and all pain is worth suffering. Just make sure other people involved believe it too. Note to Believers: all proponents of ideologies (prophets, do-gooders) face lynch mobs. Sort of a Darwinian thing about ideologies: those that survive lynch mobs are most likely superior.
If you are thinking about saving costs, I will tell you right now it will be some time before you see significant cost savings. Unless, of course, you include licence costs for a large number of people. That is where the most savings will be. But for every cost factor you take away, you will be replacing it with another one. Training or retraining will cost. Reinstallation or upgrades of older PCs will cost. Sure, the PCs won't crash as often, but people who switch to Linux forget that while Linux may not be hard on CPU speed, it does require some amount of memory before things really fly. My suggestion is to hit 256MB as soon as you can. If you are looking at older PCs, 128MB will work. While on this issue, sometimes it's not even the RAM; getting a new video card with more memory works wonders too. Coming back to cost factors, live with the fact that cost factors are just going to be replaced, not eliminated. But if you are smart about it, it just won't cost as much. That is, each cost factor replaced will likely be lower in value.
That said, hunker down for some productivity loss and doubts (or doubting people) nagging you. Remember, it has to get worse before it gets better.
Update: Part 2 and Part 3
Labels:
Linux,
Thinking aloud
Wednesday, August 17, 2005
Buying reality
Your reality is your own perception. If it walks like a duck and quacks like a duck then.. you know. But what if all you see are ducks? Do you think you'd know a chicken if it walked by?
There is a point to this. I make a living from computers (big surprise). And I work with people fresh out of college starting their first career jobs, and people whose businesses are starting to break out of the local market. They all need computers and they all want to use the best at the least possible cost. Recommendations are a big thing for me, and my clients (and lately, even my suppliers) bring in people they know who can use my expertise. I use my own office setup to demonstrate some of the uses you can get from open source solutions and Linux in particular. The thing I am getting used to is the response, "You can do that with a computer?" or "It can work like that?"
That's what bothers me. It used to be the whiz-bang stuff that got them, then the free but high-quality stuff (Mozilla, Gimp). But now the stuff that draws these responses is downright trivial.
I pointed out to a potential client that he could set up a print queue and log all print jobs and the information of each job. He looked at me and pointed out that everybody was just printing directly to the printer. If everyone could see the printer, couldn't they just bypass the queue? I walked to the printer and turned off SMB-based sharing via the control panel. The printer disappeared from the network but I demonstrated that I could still print via the queue. He was bowled over. It seems that he has a problem with his workers printing on the expensive color laser printer after hours. At first he would disconnect the printer at about 5 but stopped that after salespeople complained of not being able to get color brochures printed for clients after hours. The growth in his company was directly the result of his sales staff being able to come in at odd hours and do work, so he couldn't deny their request. The notion of a print queue and the ability to turn off access to the printer (selectively, by network protocol) never crossed his mind.
Do you see that? The solution had little to do with open source or Linux or anything new, for that matter. Print queues have been around for ages. But what surprised me more was that when I mentioned this to a younger co-worker, he said that compared to what he saw at college (a local community college), the stuff at the office was downright revolutionary. The free-for-all mess of networks and services on the Windows network at college was a stark contrast to the controlled environment at the office, where everything just worked, and if something failed, something else was waiting to back it up.
Which brings me back to the ducks. One of the problems with computing right now is the dominance of Windows and MS. All people see is Windows. Its sheer ubiquity has blinded a lot of people. They simply don't know any other way. And if it means having to live with unoptimised working environments that often are not productive, so be it. A recent report said that Gartner research concluded that desktop Linux won't be taking off (I have issues with that, though to a certain degree I agree that Linux has problems on office desktops). Maybe the cause is that people don't know any better. Maybe it's time to look over and think, "Fried chicken sounds good."
Labels:
Commentary
Thursday, July 21, 2005
Linux Mobile: Running wirelessly
I got the Mandrake-powered notebook to work over the wireless network with the AP at home. But no luck at the office. This vexed me more than usual because I had a hand in setting up the office wireless AP and was pretty sure of its settings. Normally when you build two things that are alike, you get better the second time, not worse. But since the first time worked flawlessly, I learned nothing from the experience. That is why I don't see problems as obstacles. They are opportunities to learn.
Basically my problem boiled down to the fact that my notebook's wireless card couldn't connect to the office AP using WEP encryption. Without it, no problem. The kicker was that I was using WEP on the home AP and there it worked out of the box. No option I tried could get it done. This is the time to take a step back. The thing to do at a time like this is not to go through the things I got wrong, but rather the things I thought I got right. What was it that I did differently at the office than at home?
And there the solution was. The wireless card needed the WEP key in hex. It would not take the ASCII key. I had found that out at home, but it was fixed easily because the home AP showed the ASCII key I entered as hex, and vice versa, when I switched between ASCII and hex input. The office AP didn't have that feature. You either entered the key in ASCII or in hex, and switching between the two just blanked out any key previously entered. So I used an ASCII-to-hex converter at the command line. Apparently these things are case-sensitive. No wonder it wouldn't work. It was just the wrong key! I found that out because I finally decided to change the WEP key on the office AP. I entered it in hex and did the same on the notebook. It worked straight away. I hadn't done this earlier because other people were also using the AP. After changing the key back and more fiddling around, I learned that the office AP automatically turns the ASCII key into UPPER CASE before converting it to a hex value and using it. The AP vendor committed one of the Great Sins of Equipment Manufacturers: not telling the user about the assumption you made for them (and, in a way, about them). I was thankful, though, that they didn't do something boneheaded like configuring the AP to use two keys for every ASCII key entered (that is, converting the ASCII key into both upper and lower case, converting each into hex and using them both). It would have made my setup work immediately, but it would be Not The Right Way.
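The ASCII-to-hex conversion, and the AP's silent upper-casing, are easy to reproduce at the command line. A minimal sketch; `secret` is a stand-in passphrase, not a real key:

```shell
key='secret'   # stand-in WEP passphrase

# What the card actually wants: the raw bytes of the key, in hex.
printf '%s' "$key" | od -An -tx1 | tr -d ' \n'; echo
# -> 736563726574

# What the office AP silently did: upper-case first, then convert.
printf '%s' "$key" | tr '[:lower:]' '[:upper:]' | od -An -tx1 | tr -d ' \n'; echo
# -> 534543524554
```

Two different byte strings, hence two different keys: exactly the mismatch described above.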
Linux Mobile: An out-of-the-box experience
Recap on the installation
- LAN network card: ok. Didn't expect any problems, but who knows.
- Graphics display: Vesa only. It bombed using the i910 drivers for XFree. I heard Intel is posting its driver. Will try that. But Vesa is not really bad.
- Power management: ACPI ok, APIC crashes the system for some reason.
Linux Mobile: Introduction
Finally, I got a new notebook at work. I was a bit apprehensive about what distro to put on it. SuSE Pro was a big pull. Ubuntu even crossed my mind. But realising that this was a notebook that would not have all the pieces working with Linux, I needed all of my experience to get it up and running, and an unfamiliar distribution would have me groping in the dark. Mandrake/Mandriva it was.
Over the next few posts, I'll try to document as much as possible of what I did right and what I did wrong, in the hope it'll help someone out there.
First things first, the notebook is an MSI Megabook, rebranded as a local brand here. Centrino chipset, 512MB RAM, DVD-CDRW, 40GB HDD, 3 USB, 1 FireWire, 1 VGA, 1 PCMCIA with integrated card reader (Ricoh), built-in WiFi, network and modem. All in a nice 1.8 kg package costing slightly under 1k dollars.
The good news is that I am writing this on the notebook.
Friday, July 08, 2005
Waiting for nothing
I am looking at the Mandriva CD that came with Linux Format, the best Linux magazine for the less uppity or the pocket-protector-less. I wonder when I will get to install it. Truth be told, I have had the downloaded CDs even longer, but if you have read the past few posts, upgrades are something I dread.
It's also that I use the PC so much that I am apprehensive about all the time lost reinstalling most of what I already had on the machine being upgraded.
Labels:
Linux
Shouting obscenities and Error Messages
Mandriva is greatly enhanced by urpmi, and more so when combined with the repositories listed on EasyUrpmi. If you haven't got the plf repositories listed, you are definitely missing a lot. A side benefit of using EasyUrpmi is that you can point the main repository at an online mirror, eliminating the need to have the CDs or DVD around whenever you install stuff. As a desktop OS, you will install a lot of stuff.
Suddenly, things got slower during installs, often failing. There were no clues other than messages saying that some packages could not be installed due to missing keys and that some packages were corrupted. I checked the name of the package. Correct. Checked rpm.pbone. Correct. Tried restoring the missing keys. Trouble is, they weren't missing to begin with.
And it went on like that for some time. Sometimes I got to install. Other times I didn't. So I tried updating the repository indexes. Some updates were successful; other times, it just hung, requiring a kill.
Fortunately, I have another desktop at home. Faster. I am the only one using the Internet connection. Everything I tried installing worked. Even stuff that didn't work.. at work. Then I realised that the PC at work kept hanging when I tried updating the indexes. So I did what I normally do when trying to figure stuff out. I broke it down.
I tried a few indexes at first. Ultimately they all worked. All the indexes from repositories whose contents get updated, that is. So I tried updating the index from the repository that is not supposed to change: main. It failed spectacularly. So that was the problem. The repository was no longer where it used to be. A quick visit to EasyUrpmi fixed that. I deleted the main repository, found another mirror and added it back with the command generated by EasyUrpmi.
Which says a lot about error messages. Error messages are a must. They tell us when things are wrong. More importantly, they should tell us what is wrong so that it can be fixed. When error messages don't tell me what is wrong, they might as well just be shouting obscenities.
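For the record, the repository swap described above comes down to three urpmi commands. The mirror URL below is a placeholder; EasyUrpmi generates the real one for you:

```shell
# Drop the dead 'main' medium and point at a working mirror.
# The URL is a placeholder; use the command EasyUrpmi generates.
urpmi.removemedia main
urpmi.addmedia main \
    http://mirror.example.org/mandrake/10.1/i586/media/main \
    with media_info/hdlist.cz
urpmi.update -a    # refresh the indexes for all configured media
```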
Tuesday, June 28, 2005
Do not be afraid
I don't understand people fearing choice. Maybe it is the fear of choosing. Or at least the fear of being wrong. Is that our problem? Is that why Linux on the desktop is slow in starting, because the technical people are fearful? Fear of call-center overload?
I carry Knoppix around in my bag. Recently, I shopped around for a new laptop. I fell in love with these low-cost basic laptops with 12" screens. But the smaller a laptop is, the more customised components it'll have. Which is bad news for Linux. I wasn't deterred, though. I asked the sales guy about one and popped in the Knoppix CD. He had been making comments about how RedHat still requires command-line installation. But when Knoppix came on, he was blown away. So much so that he asked for a copy of the CD to kick around. I was tempted to give away my outdated version, but I still had a few shops to visit and their laptops to test. So I waited while he copied my Knoppix 3.7.
The moral of the story: always keep 2 copies of Knoppix around.
Labels:
Linux
Friday, May 27, 2005
Moving On Part 2
Last time, I was apprehensive about upgrading. In the past when I've done it, several things happened that I didn't like.
1. Old settings were carried over literally. There was no way to use the new settings if you kept the previous home directories. This happened during an upgrade. I am quite surprised that the newer version of the software, especially for something as important as GNOME, did not notice that the config file was from a previous version and at least offer to replace it with a new one. I understand the concern to keep all the user's settings, but shouldn't there be an updater program for that?
2. Certain software that was there previously was not replaced but simply disappeared. This happens especially to software that is not in vogue or not part of the core distribution. I understand that leaving the program in place risks a certain incompatibility, but if it was there before and there is no replacement during the upgrade, at least offer to leave it there. It happened to Nagios, a network monitoring package I use. It just disappeared. I had to reinstall and reconfigure it every time I upgraded. It is only in the contrib section.
An upgrade is an upgrade, not a re-installation and certainly not a fresh installation. Distribution packagers should respect that or lose their user base.
Saturday, April 30, 2005
Surviving Mandrake (and NVidia Drivers)
One of the easiest things to do on Mandrake is updates. Even kernel updates. But as my previous experience has shown, kernel updates with Nvidia drivers are not to be taken lightly. So the first step is to plan.
Do not enter a room without knowing how to get out.
Or something like that; De Niro's character says it at the beginning of the movie Ronin. I have to do something similar. In order to compile the Nvidia driver, you need the kernel source. So if you update the kernel, you need the matching kernel-source at the same time. Problem is, if you update with urpmi or Mandrake Update, you lose the kernel source.
What we want to avoid:
We update the kernel and the kernel source. Then either we fail to compile the Nvidia driver or the driver fails to work. We need to undo. Mandrake has kept the previous version of the kernel, but not the source. If you failed to compile the Nvidia driver, then the old one is probably still there. But if the driver was a dud, you have to recompile. If you don't have the previous version of the kernel source, you are screwed, because most sites don't keep older versions of it. Or you can use the non-Nvidia nvidia drivers (just change the XF86Config or equivalent file). Yep, drive your Ferrari only in first gear.
Moral of the story: download the kernel-source rpm and install it by hand after you have updated the kernel but before you recompile the Nvidia driver.
Before I get feedback along the lines of "how stupid it is to need to recompile in this day and age" or "Run for the hills! Recompilin's here!", let me point out that the Nvidia driver recompile process is menu-driven. Push-button.
I am so anxious that I haven't updated my kernel in ages. But I need to. Sigh. Wish me luck.
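Put as a sequence of commands, the safe ordering looks something like this. Package names and version numbers below are illustrative, not the exact ones; match them to whatever kernel update you are actually applying:

```shell
# Illustrative ordering only; versions must match your actual update.
urpmi kernel-2.6.8.1.24mdk                        # 1. install the new kernel
rpm -ivh kernel-source-2.6.8.1-24mdk.i586.rpm     # 2. matching source, installed by hand
reboot                                            # 3. boot into the new kernel
sh NVIDIA-Linux-x86-1.0-6629.run                  # 4. rebuild the Nvidia module against it
```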
Tuesday, April 05, 2005
Surviving Mandrake 2 - Post-installation blues
I have a lot to bitch about: an OpenOffice bug keeps me from saving my files correctly. Firefox is still not at 1.0 in the repositories, and from the comments in the forums, it won't be for some time. I downloaded the latest version, and it had me install it within the user's directory structure. The existing shortcuts would still use the old version, so I created a new one on the desktop and used that instead.
Wednesday, March 30, 2005
Surviving Mandrake 1 - After the install..
After running out of excuses, I upgraded my home PC to Mandrake 10.1. I took advantage of the fact that my memory burnt out and needed to be replaced. I took out the 80GB hard disk that was gathering dust and plugged it in. I had bought it a couple of weeks ago with the intention of using it as an excuse to upgrade. Well, good intentions don't always pan out. It wasn't that I was afraid of Mandrake 10.1. I have been using it on the office PC for some time now and have worked most of the kinks out. But at home, things had grown so comfortable that I felt no great need to move up. It was doing what I wanted it to do. And doing it well. The agony of using Linux: once things work, they just keep working.
But what can you do? You have to move up, for whatever reason. Mine was that eventually most of the new stuff will require libraries no longer compiled for Mandrake 10. Ok, not entirely true. I could still hunt down those libraries, but Mandrake Update has been spoiling me silly. If they don't have it, I usually don't get it. Bizarrely, this could be one of the great conundrums of Linux on the desktop. The shelf life, stability, longevity, whatever it is that makes Linux great for building systems and locking them down, is great for the server room, but not for the desktop. Desktop or end-user software moves much faster and changes more often. So the distributions have to keep up.
But can they do so without sacrificing the stability needed in the server room?
Now there is an argument for desktop-centric distributions. New or stable? Do you want to drive the latest or the one with the best mileage? Or do you have both?
Back to the story.
The installation was Mandrake-standard: quick, simple and clean. At the end of 20 minutes, I had a newly installed Mandrake 10.1 system, all ready to go and be productive. But of course, there was still a long way to go. A couple of pointers:
- Don't worry if the system doesn't detect the Internet connection during installation. It has never done that for my broadband setup, and I have plain vanilla ADSL (PPPoE, to be exact). Just configure it and it'll get it after a reboot.
- Long ago, actually a few distributions ago, I would restart the system immediately after the installation first booted into Linux, just to make sure everything would come up as it would after a normal startup. Nowadays, that is no longer needed. When Mandrake comes up after installation, it really starts afresh, not just continuing from the installation session.
- If you do have the time, when partitioning the hard disk, go into expert mode and have it run the extended test on the disk. It checks the disk thoroughly for bad sectors and the like. You'd think we'd have left that behind by now. It is optional, but if you have the time, you might as well find the errors now rather than while you are working. Linux handles this fairly well, but on the off chance a failure could be fatal, it doesn't hurt. Especially if you are not the type that makes backups.
- Don't run the update during installation. Run it after you boot up the first time, once the system has run all those post-installation and run-once scripts. Those scripts might not get updated during the update, or might fail to work with the updated packages. Besides, most of the time you can't get the Internet up during installation anyway.
The trick with MandrakeUpdate is to find a mirror that not only has a fast line but also not too many users. It is the equivalent of gazing into a crystal ball. Short of having actual statistics on the usage of these sites and their bandwidth, it is trial and error. If the site you choose is too slow, just go back to the Software Media Manager and remove the update_source repository. The next time you start MandrakeUpdate, it'll give you the list again and you can choose another one.
Other stuff you have to install:
- Flash player - because the Internet would be less pretty without it. Get it here.
- Java - it's getting prettier and more useful. The best part is that developers are finally understanding that you can use it just in the background. Get it here.
- mplayer - because Kaffeine is great but its audio is a bit too soft. Get it on plf.
- decss - hmmm? like the above
Friday, January 14, 2005
Still Alive
Just in case anyone is reading this, just dropping a note to say I am still here and back from the holiday season. I have some things I wrote but they never reached a point where I'd want to post them. So, if you like my blog, thanks and I'll be starting back up soon.
Thursday, November 25, 2004
Living with a Distribution
If Linux and Open Source were a religion (to some people it actually is), then a distribution is a sect, or to a lesser extent a denomination. Not taking everything, but picking and choosing. Exclusive membership. Devotion required and deviation frowned upon. Or at least deviation will get you in a lot of trouble. Let me explain over the next few posts with this title.
Labels:
Linux,
Thinking aloud