Thursday, November 25, 2004
Living with a Distribution
If Linux and Open Source were a religion (to some people it actually is), then a distribution is a sect, or to a lesser extent a denomination. Not taking everything, but picking and choosing. Exclusive membership. Devotion required, and deviation frowned upon. Or at least it will get you in a lot of trouble. Let me explain in the next few posts with this title.
Labels:
Linux,
Thinking aloud
Thursday, October 07, 2004
Linux Analogies: Linux as A Property
I have been using the "Linux is like a car" analogy for years. I'll write about it some other time, but essentially it is about the importance of knowing how to take care of your car instead of just driving it.
In my talks with business people who want to use Linux in their business, I found that equating Linux to property helps bring about a better understanding. The analogy goes that if you are building a business, you must have a place to do business. This place is the 'property'. So, if you are building applications to sell to your customers and you build them using Windows as your operating system, then Windows is the 'property' you do your business in. If you choose to build that application using Linux too, it would be like setting up another business in another property. The major difference between the Windows property and the Linux property is that one is a lot cheaper than the other to set up and run. You also have better control over one property than the other, akin to buying one and renting or leasing the other.
Each property brings with it its own requirements. Maintenance in relation to the OS is an example: you need one for your Windows property and another for your Linux property. Add in support, R&D and pre-sales tech, and the costs of maintaining your business in both properties add up quickly. Unless business in each property is sufficient to justify its own existence, most businesses would close one shop in favour of the other. How a business reaches that decision is another matter. What would you look at? The revenue each place of business generates, or the actual profit? Today's business or tomorrow's? Profits achieved yesterday or projected profits of tomorrow?
How about upkeep of the property? You do need to refresh your place of business, out of necessity or just for the heck of it: more space for the increased workforce, new wiring for a faster network or a fresh coat of paint to cover the aging process. Which place would you sink more money into, the place that you rent and eventually need to move out of, or the one that you bought? With Windows, MS expects you to move out of that property to another, at your cost. Or else they will abandon you in it. Linux offers a chance to stay until you are ready to move, at your own pace but eventually at your own peril.
Once these businessmen start thinking in those terms, the value of Linux, and of moving their business and applications to Linux, begins to show. You can stretch this even further by pointing out that one property is in a neighbourhood full of people trying to break in, and your landlord is reluctant to put in the grilles, better locks and reinforced doors. As master of your own property, you fully understand that the responsibility is yours. And you don't feel too bad, because it is an investment in your own property regardless of what neighbourhood it is in. (I like to think it is in a friendly community :) )
Labels:
Commentary,
Linux
Saturday, September 18, 2004
Through the looking glass
A friend asked me whether I had Knoppix. I was happy to hand over an ever-present copy. Knoppix is something I keep around in case someone wants to try Linux out. But that was not the case this time. Apparently, someone told him that a problem he had could be fixed using Knoppix. A problem with Windows XP Pro.
Intrigued, I sat down to look at it. After a while I figured that he had probably been hit by the Sasser worm, which was now preventing him from logging in and rebooting the machine every time he pressed OK at an error message box. Not good. One of the fixes mentioned on the Internet involved replacing corrupted config files with a copy made automatically by Windows.
So I had this laptop physically in front of me, and all I had to do was go in and replace a couple of files. Piece of cake. Cake on my face was more like it, as the hours passed. The first problem was getting read access to the hard disk. Knoppix gave me that capability via Captive. But Captive needed to access the Internet, and Knoppix didn't detect the laptop's network card. At this point I could have slaved away getting Knoppix to detect the card, because I had Mandrake 10 running on the exact same model elsewhere and it detected the network card. It was just a matter of figuring out which module to load.
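As an aside, if read-only access is all you need, the in-kernel NTFS driver of that era could usually manage it without Captive. A rough sketch from the live CD's root shell; the device name is an assumption, so check the partition table first:

```shell
# /dev/hda1 is an assumed device name -- run fdisk -l to find the
# actual Windows partition before mounting anything.
mkdir -p /mnt/windows
mount -t ntfs -o ro /dev/hda1 /mnt/windows
# The config files in question live under the Windows system directory:
ls /mnt/windows/WINDOWS/system32/config
```

Read-only is the point here: the old kernel NTFS driver was safe for reading, while writing is what needed Captive's trickery.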
Then it struck me: what if I didn't have Knoppix? What options did I have? Quite a few. A few solutions involved booting from floppies to get to a command prompt. Nothing beats the command line when it comes to fixing a broken OS. Although this was the best way, it involved a purchase. Booting from the XP CD allowed me to go into a "repair mode" which was essentially a command line. But that required the Administrator's password. Despite my friend's assurance, the password he gave me didn't let me in. I didn't say it was wrong. Since the problem involved corruption at the login process, it was possible that access to the password database itself was corrupted.
As I sat and thought about the problem, I realised how different this would be if I were looking at Linux. I don't want to alarm people, but in the Linux world (or even the Unix world), physical access security is everything. The logic is probably that there is no use for all the fancy network security filtering masquerading proxying thingamajig if someone can run away with your hard disk. Or CPU unit. The security experts will tell you that no matter what encryption you put on that hard disk, with enough time and money it will be cracked. So the moral is: don't let someone steal your server or hard disk. Unless you are one of those people who Lojack your hard disk. Hmm... there's an idea.
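To illustrate how much physical access buys you on a typical Linux box of that era: anyone at the console could type extra options like `single` at the LILO prompt and land in a root shell. LILO's countermeasure is its `password` and `restricted` keywords; here is a sketch of an /etc/lilo.conf excerpt with placeholder label, device and password values:

```shell
# /etc/lilo.conf excerpt (illustrative values only):
image=/boot/vmlinuz
    label=linux
    root=/dev/hda1
    read-only
    password=changeme   # placeholder; keep this file readable by root only
    restricted          # only ask for the password when extra options are typed
# Rerun /sbin/lilo after editing so the change is written out.
```

Of course, none of that stops someone from booting their own Knoppix CD, which is exactly the point of this post.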
Windows XP Pro, though, makes it so hard for a technician (me) sitting right there in front of it to crack through the system to fix it. In fact, it was probably easier to break into it remotely than to break into it right there. Where is the logic in that? Locking the doors to your house and forgetting the key will keep you out of it, but not the people who can stream in through the subway entrance in your house.
With Linux, there is a lot of emphasis on security with regard to network access, and for good reason. My friend's problem? It still isn't fixed, because I had to go do other things, but I'm going back to the Knoppix solution.
Labels:
Commentary,
Linux
Sunday, August 22, 2004
Turning back the clock
I like to use Linux with the mindset of a user. I am very wary when I realise that I am drawing on my 10-odd years of using Linux on and off to solve problems that may be faced by a user. This is a line that I clearly mark.
I am saying this because, as Linux goes mainstream, this has to become more important. I believe it is the responsibility of entities that are making money out of Linux to put in the effort to ensure that it can be used by the intended users without constant resort to professional help. This is even more so for Linux distribution companies like RedHat and Novell: when more users can use the final distribution, it creates more sales opportunities for them. Let's not put aside the fact that the ease of configuration in Windows is what made it most attractive to businesses. You can shoot yourself in the foot with it (misconfigure until you lose data), but at least you did it with ease and style.
Which brings me to my tale this time. Updating (and all its ills) is handled with relative ease in Linux. With Mandrake, running Mandrake Update and several clicks later will update the system. Emphasis on backwards compatibility and on always maintaining inter-dependencies usually makes regular updates painless. The most crucial part of regular updates is kernel updates. If you are a business user on an enterprise version of Linux, chances are you're running with the dinosaurs, kernel-wise. For most of us using Fedora or Mandrake, kernel updates are not that frequent but still regular. Most experienced users will point out that there is no race to use the latest kernel; most of the time they are fixes and security updates rather than additional functionality. You can skip an update release or two, but stray not too far.
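For the console-inclined, the Mandrake Update clicks boil down to roughly two urpmi commands (the configured media names vary per setup):

```shell
# Refresh all configured update media, then pull in every pending update:
urpmi.update -a
urpmi --auto-select
```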
Mandrake recently released an update called kernel 2.6.3-15. I was using kernel 2.6.3-13. I've updated many times before without a hitch. I have an Nvidia card, and its driver is tied to a specific kernel. So after a kernel update, I need to generate a new driver, which means I need the updated kernel sources too. So after updating the kernel, I downloaded the kernel sources. Having a broadband connection is a blessing.
The Nvidia installer uses the kernel sources to generate a driver. However, the installer complained that my sources were not clean and suggested I do something with mrproper. I followed the instructions without success. Since I was running the new kernel, I switched back to the old one to get to my graphical interface. This was easy because Mandrake keeps an entry in the boot menu (using lilo) for older versions of the kernel. Very thoughtful. After following suggestions from the web, the problem was still not solved: I couldn't generate the Nvidia driver and I was still stuck using the older kernel. So I decided to make a permanent switch back to the older kernel. Now, this was not common, but it was very easy and over in a few minutes.
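The permanent switch amounts to one line in /etc/lilo.conf plus a rerun of lilo. A sketch, with the label name assumed; use whichever labels are already present in the file:

```shell
# /etc/lilo.conf excerpt (label names are assumptions):
default=linux-2.6.3-13     # point the default at the older kernel's entry
# The image sections for both kernels stay in place, so either can still
# be chosen at the boot menu. Then write the change out:
# /sbin/lilo
```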
Later I wanted to try VMware. This is a cop-out, I'll admit, but sites that use Windows Media Player tend to let me have the multimedia stuff much faster, and I wanted to use Yahoo Launch. What these sites do is possible with Linux-friendly players like RealPlayer, but the developers tend to layer it with so many pages, and RealPlayer 10 for Linux (based on Helix) still balks regularly at sites that play ads before showing the good stuff. Yes, I have heard of CrossOver, and that is next on my list.
VMware says during the installation process that it needs to recompile something using the kernel sources. However, I had installed the kernel source for kernel 2.6.3-15 while I was running kernel 2.6.3-13. As part of its installation, the newer kernel source package had removed files from the older kernel source. "So, I'll just remove this version of the kernel source and install the old one," I thought. But after an hour of searching the Internet, I could find only a single source for the old kernel source package: a server in South Africa. Apparently, as the new kernel source is pushed to mirrors worldwide, the old one is simply lost and never archived. And that site was horrendously slow. Fearing that some other program might need the sources, I hunkered down for a long download and waited. Two rpm commands later, one to remove the kernel source package and another to install the newly downloaded kernel source, I was back in business.
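The two rpm commands in question looked roughly like this; the exact package names are assumptions based on Mandrake's naming of the day:

```shell
# Remove the newer kernel source package, then install the downloaded older one:
rpm -e kernel-source-2.6.3-15mdk
rpm -ivh kernel-source-2.6.3-13mdk.i586.rpm
```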
To be frank, the problem with the newest kernel-source could have been fixed fast enough if I were familiar with building a kernel and the associated processes. But then it probably would have taken more time and crossed the line between the Linux user and the Linux professional.
The good thing about Linux is that everything is exact. How parts interact with each other is tightly regulated, by convention or by virtue of packages being built against a specific distribution. Although it took some time, rolling back from a semi-faulty package was not too hard, even for something as complex as a kernel update.
Wednesday, August 11, 2004
The Pleasant Surprise
I am determined that my children learn the benefits of choice, especially computing choices. Already at school kids are learning to use computers, and more often than not the computers are MSWindows PCs. If you think MSWindows' dominance in the business world is strong, it is even stronger in the education sector. While Apple is still there and a good choice for the classroom, they are pricing themselves out of it. Apple computers are easy to use and don't exhibit the interface problems that MSWindows computers do. When it came to Apple Macs, you just used them, instead of struggling with the MSWindows GUI and praying it doesn't hang. I have taught kids and I have taught adults how to use computers. It is hard enough with adults, having to deal with hung PCs while teaching. Add to that the cries of helpless children, which only distract them from the lesson being taught. Compounding that are the other children who are distracted by the cries. Headache Central.
So I decided that since I am using Linux at home, the children will too. The first main obstacle is quality educational software for kids on Linux. With due respect to all the efforts on the Internet, the quality is not even close to that of software written 5 years ago. Like it or not, I had to run MSWindows programs on Linux. The problem was that the majority of the software I looked at was developed using Shockwave or Director. I have yet to find a native solution, although everything from Wine, CrossOver Office and VMware to Bochs has crossed my mind.
So I've settled on Web-based educational sites that normally run Flash. I created an account for each of the kids and put their faces on the KDM icons. Next, I created shortcuts on the desktop for Mozilla that open each of the sites. The sites I find attract my kids' attention the most are PBS Kids, Sesame Street and Playhouse Disney Channel. OK, so my kids are not that old yet. All the more reason to get them to know Linux. Then I had to repeat this for the younger sibling on his desktop.
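Each of those shortcuts is just a small .desktop file. A hypothetical one for the first site, using the standard desktop entry format; the Exec and Icon values are assumptions:

```shell
# ~/Desktop/pbskids.desktop -- hypothetical example
[Desktop Entry]
Type=Application
Name=PBS Kids
Exec=mozilla http://pbskids.org/
Icon=mozilla
Terminal=false
```

Copy one of these per site into each child's Desktop directory and the shortcut appears on their next login.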
(Which got me thinking that this kind of user environment setup in Linux has not been explored thoroughly yet. Think of Novell's ZENworks for Desktops for Linux, or ZENworks for the Linux Desktop.)
The next problem was that I had set the screen resolution to 1024x768. Now, this is OK for me, but the sites above came up quite small, and when the Flash activities came on, they became even smaller. My mind was racing at the thought of setting up individual X config files for each of the kids; I haven't tweaked X config files in ages. Being the lazy type, I began mucking around and, lo and behold, you can set up individual screen resolutions in GNOME 2.4. What a pleasant surprise! I thought I had gone over GNOME setups over and over again, and yet I never found this. So, like my kids, I am learning something new every day. Maybe I should rename this blog 'The Linux Adventure'.
I don't know why that config option affected me so much. On one level, I felt that someone had read my mind and made that feature. On another level, I felt beholden to the person who decided to put it in. I felt so grateful to the people who put their own time and effort into making Linux and free software great. I guess I can honor them by making sure the next generation starts using Linux and making choices.
Monday, August 09, 2004
LinuxFormat: Required Reading Material
There is only one Linux magazine I buy religiously: Linux Format.
Linux Format should be required reading for people who are interested in Linux on the desktop. For once, a magazine addresses the issues of an average user (and not the user with an average IQ of 250). If you wanted to see what magazines looked like in the early PC days, this magazine has a few similarities. But more importantly, it embodies the same attitudes the PC magazines had in those days. The days when PC user groups were the places to get the latest info and swap war stories (read: trial and error, or My PC blew up and I survived to tell the tale). The magazine even lists (primarily UK) Linux user groups at the back. That brought flashbacks.
This magazine is also easy on the eyes. Unlike other mags of its ilk, where every possible space is crammed with words, Linux Format uses space and graphics well. And I don't mean diagrams, or pictures of product boxes, or pictures of users and people of note caught in the most unflattering flash photos. There are actual graphics that have no relation to the article other than to make it look nice. Imagine that!
(Note to other Linux magazines: it's OK to look nice. Just because your readers spend all day looking at code doesn't mean they'll appreciate a magazine formatted to look like more code.)
I don't care that much about the CDs or DVDs that come with it, because I'm on broadband and I can get at the programs or links mentioned in the magazine much faster. But if you aren't, they are a great resource, especially when they come with distros.
There is also the companion TuxRadar website and the wonderful TuxRadar podcast. Here you get to hear the writers from Linux Format talk about the Linux issues of the day, sprinkled with a very UK-centric view. Which is refreshing after the US-centric view I usually get from other sources.
I read other Linux magazines like Linux Journal and Linux Magazine (the US version, not the UK version that seems to be machine-translated from German). But I tend to pick them off the discount rack (they are that expensive), and they tend to still be there when I do. With Linux Format, though, I have to be on my toes when the month rolls over to grab my copy or I'll miss out. I have staked out the quality bookshops that carry it, and I even have the phone number of the local distributor so that I can bug them about when the next issue is going to be out.
Why don't I subscribe? Where's the fun in that? :)
Linux Format should be required reading for people who are interested in Linux on the desktop. For once, a magazine addresses the issues of an average user (and not the user with an average IQ of 250). If you wanted to see what magazines looked like in the early PC days, this magazine has a few similarities. But more importantly, it embodies the same attitudes the PC magazines had in those days. The days when PC user groups were places to get the latest info and swap war stories (read: trial and error, or My PC blew up and I survived to tell the tale). The magazine even lists (primarily UK) Linux user groups at the back. That brought flashbacks.
This magazine is also easy on the eyes. Unlike other mags of its ilk, where every possible space is crammed with words, Linux Format uses space and graphics well. And I don't mean diagrams, pictures of product boxes, or pictures of other users and people of note caught in most unflattering flashed photos. There are actual graphics that have no relation to the article other than to make it look nice. Imagine that!
(Note to other Linux magazines: it's OK to look nice. Just because your readers spend all day looking at code doesn't mean they'll appreciate a magazine formatted to look like more code.)
I don't care that much about the CDs or DVDs that come with it because I'm on broadband and I can get at the programs or links mentioned in the magazine much faster. But if you aren't, they are a great resource, especially when they include distros.
There is also the companion TuxRadar website and the wonderful TuxRadar podcast. Here, you get to hear the writers from Linux Format talk about Linux issues of the day, sprinkled with a very UK-centric view, which is a refreshing change from the US-centric view I usually get from other sources.
I read other Linux magazines like Linux Journal and Linux Magazine (the US version, not the UK version that seems to be machine translated from German). But I tend to pick them off the discount rack (they are that expensive), and they tend to still be there when I do. With Linux Format, though, I have to be on my toes when the month rolls over to grab my copy or I'll miss out. I have staked out the quality bookshops that carry it and even have the phone number of the local distributor so I can bug them about when the next issue will be out.
Why don't I subscribe? Where's the fun in that? :)
Labels:
Linux
Friday, August 06, 2004
Splitting Linux
Before those of you who know better howl at me, let me get this straight: I understand that a Linux distribution is not Linux.
But the reality is that people tend to think of it that way: that Linux is a Linux distribution. For a long time people never understood why I wasn't using RedHat like everybody else. I wasn't using it because I understood that I could go to any Linux distribution and switch back later, as long as I was on the Intel platform. For a time, RedHat was Linux. I still think that some of the business they get comes simply from users who don't know about the other distributions or haven't bothered to take a look. From here onwards, the term Linux will largely mean Linux distributions.
Today I installed Fedora Core 2 (FC2). Now, I use Mandrake on my desktops and a mix of other distributions on the servers. My RH8 servers were looking long in the tooth and I figured I needed to refresh them. So I got the set of FC2 disks that came with this month's Linux Format and burned disks 3 and 4 from the Internet. As I was configuring them, I became acutely aware of how different it was to configure Linux under FC2 than under Mandrake. As with all of my machines, I installed Webmin to help me administer them. That got me fiddling with the package managers, which demonstrated how different the approaches of Mandrake and FC2 were when it came to package management. Basically, I was finding it hard to locate the configuration programs I was used to on Mandrake, and some programs that I assumed were part of KDE were also missing or quite hard to find.
It then dawned on me that I was not only using a different distribution but a distribution meant for someone else. I was so used to Mandrake, a distribution clearly gunning for the desktop, that I was upset with Fedora for not making things as easily configurable as Mandrake does. In the end I was OK with it, because I realized that RedHat/Fedora were really about servers. Both distributions were catering to their constituency and its unique requirements and environment. For example, with a server, configuration is a one-off thing and the people using them are expected to know what they are doing. On a desktop, the configuration can change quite often as USB devices are plugged and pulled. Plus, most of the time people don't know or care what they are doing to the computer as long as they can use it.
It is important that Linux distributions be adjusted for the environment they will live in. This leverages Linux's natural flexibility through the choice of programs included in the distribution, configuration tools among them. By adapting to the environment, Linux addresses the unique setting it will function in, be it the server or desktop of today or the embedded device of tomorrow. It is very much a version of tolerance within the space of Linux technology.
Thursday, August 05, 2004
Planning as part of life
I'm switching the ISP connection over to higher bandwidth, which also means I'll be switching IP addresses. This also means the slightly tricky business of moving the DNS server. I'm hosting my own DNS for some strange reason, and I have to inform the NIC involved that my domain's DNS server has changed IP addresses. The tricky part is that I have only one DNS server, but I have to move it while keeping the older DNS settings alive for a while. Thank goodness Linux is so 'not demanding'.
I took out an old Celeron PC with 32MB of RAM(!), installed Linux on it, and installed the bind package. Within an hour or so I had another server ready to do some light work. I configured the Celeron DNS to host slave zones of the main DNS server for a while; essentially, it gets the settings and information about each zone from the main DNS server. After a while, I converted the slave zones to normal, or master, zones. I then configured the Celeron PC to take over the IP address from the main DNS server, and the main DNS server was moved to the new IPs. Finally, I informed the NIC and they'll update their records accordingly.
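The slave-then-master shuffle can be sketched in BIND's named.conf. The zone name, file paths and the 192.168.1.10 address here are illustrative stand-ins, not my actual setup, and the two zone blocks are alternatives over time, not simultaneous entries:

```
// Phase 1, on the temporary Celeron box: act as a slave of the main server.
zone "example.com" {
    type slave;
    masters { 192.168.1.10; };      // hypothetical IP of the main DNS server
    file "slaves/example.com.zone"; // zone data arrives via zone transfer
};

// Phase 2, once the zone data has transferred: promote the zone to master
// by replacing the block above with this one and reloading bind.
zone "example.com" {
    type master;
    file "master/example.com.zone";
};
```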
What surprised me was not that I was able to set up another DNS server in record time, but that I spent more time planning than actually doing the job.
That is something I credit to using Linux: the tendency to plan things out ahead. Linux is extremely powerful and offers 1001 variations of everything, so choosing what to do is very important. It's not enough to decide to do something and just do it. Ensuring that disruption is minimal, and that users notice as little of it as possible, is as important as achieving what I set out to achieve. Making sure that things can be undone, if the goals achieved are not what they seemed to be, is also very important.
The overall concern shifts from "Can it be done, and what tools do I need to buy to do it?" to "How can I do it using what I have or can download from somewhere? How can I do it quietly, without people feeling a thing?" That may not be a paradigm shift, but it sure is a shift for the better.
Friday, July 30, 2004
The path treaded once
I was shopping when a sales girl stopped me to talk about the selling points of a Macintosh. I politely told her that I used to use a Blueberry iMac running MacOS 8.something but, since changing jobs, have not used one regularly. She ignored that and continued with her pitch. While I was nodding to be polite, I couldn't help noticing the parallels between my Mac experience and my Linux experience on the desktop.
I was new to the Mac when one was plunked on my desk. A VIP in the company I was working for wisely and stubbornly refused to move from his aging Mac to a Windows PC. So it finally came to the point where he got an upgrade for his machine and I was entrusted to support him directly. Coming from a Windows desktop background, I tried applying what I had learned with Windows while using the Mac. I learned that the two most pointless things to do on a Mac back then were:
1. Check for updates - to the OS, new versions of programs, etc.
2. Find cool new tools to do what you want to do.
.. which was what a lot of Windows users were doing then (and still do).
First, Mac updates weren't released that often. Whatever bugs there were in the OS didn't bother me often, and those that did were probably caused by free or shareware programs that were not checked thoroughly.
Second, there was no need for those additional tools. The tools that came with the Mac, and the other popular downloadable tools, not only did what they advertised but did it well. There was no point in getting another. People tried, of course, and sometimes they succeeded.
Suddenly I found myself with more free time and began to be more productive with the Mac. Since I used it to support at most 10 Macs, and I was still sharing files with my other colleagues, I still had to use Windows for most things. But the Mac became my primary web browsing device for its sheer stability. Since then, I have appreciated Macs for what they are: well designed machines.
The Point
The point I am trying to make is that Linux on the desktop will follow pretty closely in the footsteps of the Mac. It'll suffer from a lack of hardware support, or have limited support from vendors, especially the smaller ones taking 'development funds' from MS. These are the companies that make cheap peripherals and hardware, products that help define what the total PC costs and how much bang for the buck it brings. What new Mac users find out very quickly is that not all hardware is Mac-compatible, and what is compatible is more expensive. That in turn limits the market and slows consumer adoption. Exactly what MS wants.
When it comes to updates, I think Linux users are guilty of updating as often as their Windows counterparts, at least if they want to update everything in a distribution. Since desktop Linux is usually a distribution-based install, the distribution usually has a way of keeping up to date with the latest files, and they are pretty smart about updating only what is installed. Updating files is a way of life under Linux because of its dynamic nature. Updates come either from progress in development or from responses to security threats. Those threats are usually not yet exploited, but efforts are made to make sure no one does.
And updates usually don't break unrelated software.
With the Mac, updates were not that frequent, and checking for them weekly became an exercise in futility.
Another thing about Linux is that it has many interdependent components. Often a utility builds on another program, or on a foundation of components that already does something similar. Take ftp, for example. You can use the ftp client interactively, old-school style, or use wget. Or use a tool that does ftp among other things, like curl. But then again, you may need more help and opt for Downloader for X or gFTP. This interdependency is good because it helps improve the tools and reduces recurring work, which may lead to just a few products becoming important to the community. Me-the-Windows-user used to say that this reduces choice. But in the end, why choose something else when the one already in use has all that is needed? It was the same with the Mac. I wanted to try new stuff, but ultimately the tools for the Mac, while few, did their jobs very well.
Despite all that, Mac use is not what it was and may be shrinking, proportionately. I have to admit Steve Jobs saved Apple by focusing on what makes a good product sell. But the odds are stacked against him if he decides to add features to the Mac to reach even more users and make it more popular. What is different is that while Mac users have to wait for Apple to do something, with Linux anybody and everybody can make a difference and do something about it. Whether it is for more frequent or less frequent updates, or for more tools, or tools with more features, Linux users can make an impact. That is what the community is about.
Linux users should look at the issues the Mac faced in its effort to gain wide acceptance, see where they are today, and prepare. The Linux desktop should be there soon.
Labels:
Commentary,
Linux
Monday, July 26, 2004
Moving on.
I tried logging into my workstation today and GNOME refused to start up properly. It said the program in charge of storing (and restoring) my settings was not starting up properly. It also complained that it couldn't find my workstation's FQDN (fully qualified domain name, e.g. www.google.com) and warned me that certain things were going to go wrong.
I think I have an idea of what caused it. I tried changing the hostname of the PC recently. I do most of my administration using Webmin. I had changed hostnames in the past on RedHat machines and they turned out OK. But this was Mandrake. In their efforts to make it easy on the user, they took away some liberties, or so I thought. Things had to be done using their tools, or in a certain way, for the workstation to remain consistent. So when I changed the name of the workstation via Webmin, it changed in most places, but not all. One of those places was now telling GNOME my old hostname which, of course, did not match the current hostname. My guess is these were client/server things: the hostname was being used by the client to find the server.
Or maybe not. There is a big sin when using a distribution; it's called Distribution Mixing or Repository Mixing. A distribution is a distribution because the people behind it made decisions that affected a lot of Linux components, most of them core components. These were basically options that could be set when the components were compiled. Since these affected a whole lot of files, each additional component or program was compiled against these core Linux components. And it builds and builds and builds that way until you have a distribution.
The way a distribution adds and removes pre-compiled programs on a PC is by using a package format. For example, RPM is a package format, and so is deb for the Debian distribution. A collection of these pre-compiled packages, stored and shared, is called a repository.
Often, the way a program is packaged is shared between distributions. For example, both RedHat and Mandrake use RPM. Mixing repositories is essentially using a package from one repository with another distribution, for example using a RedHat RPM package file to install a program on a Mandrake distribution.
Most of the time the differences are minimal, but some are really critical. I have RedCarpet on my PC, and I had set it to download the latest updates automatically. RedCarpet, being part of Ximian, maintains a repository for GNOME and its components and programs. I was foolish enough to use that. My guess is that RedCarpet updated my GNOME with files from the Ximian repository. Those files worked all this while, but maybe some recent changes caused my system to break. Even when I switched to KDE, Evolution still wouldn't run.
The point is, I was not in the mood to hunt this down. I had faced it once in the past, and I think I am going to take the MS Windows fix-it-all solution: reinstall. The workstation was running Mandrake 9.1 and I was running version 10 on the laptop without a major hitch. What better reason to upgrade?
Which made me think about issues around workstation migration. How would this be done on a larger scale? What tool would I use? How much planning would it require?
My strategy was simply to dump everything in /home onto another hard disk, reformat the disk, install Mandrake 10 and copy everything back. Of course, there were several missing pieces. I'd have to take stock of which services running on the machine affected others, and plan for those services to be running on the new machine. This involves having a copy of /etc too. If I installed any services without using a package manager, I'd have to ensure I have a copy of the source, or whatever was used to install them. Installing Mandrake 10 also means a big fat update right after installation.
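As a sketch, the backup half of that strategy is only a few commands. The /tmp/spare-disk path is a hypothetical stand-in for wherever the second disk is mounted, and the rpm step is just my assumption of a reasonable way to snapshot the package list:

```shell
#!/bin/sh
# Pre-reinstall backup sketch; BACKUP is a hypothetical mount point
# for the spare disk.
BACKUP=/tmp/spare-disk
mkdir -p "$BACKUP"

# Keep user data and system configuration; /etc is small but invaluable
# when re-creating services on the fresh install.
tar -czf "$BACKUP/home-backup.tar.gz" /home 2>/dev/null
tar -czf "$BACKUP/etc-backup.tar.gz" /etc 2>/dev/null

# Snapshot the installed package list (if rpm is present) so the fresh
# install can be compared against it later.
if command -v rpm >/dev/null 2>&1; then
    rpm -qa | sort > "$BACKUP/package-list.txt"
fi

ls "$BACKUP"
```

Restoring is the same in reverse: untar /home wholesale, but cherry-pick from the /etc copy rather than overwriting the new distribution's configuration.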
Well, reinstalling once every two years isn't so bad. Upgrades? Don't get me started on those.
Monday, July 19, 2004
The power to be sure
I set up a new DSL connection and, for the life of me, I could not get the server to work. I wanted to host my own DNS server and had set up a firewall with a DMZ and everything. The connection gave me 5 IP addresses to use. I could browse, which meant the connection was up, but I couldn't access the DNS server from other machines on the Internet.
After ensuring that the routing and firewall rules did not interfere with what I wanted to do, it finally dawned on me that I needed to see the actual packets themselves. I needed to see the network.
The best program for this, by far, is ethereal. I had an old hub lying around, so I used it to create a primitive network tap between the DSL modem and the firewall. The hub would retransmit all the communication between the firewall and the DSL modem; ethereal, running on another PC connected to the hub, would capture and decipher it for me. Ethereal is wonderful at filtering. I found that although I could browse the web and see the packets travelling between the firewall and the DSL modem, whenever I tried to access the DNS server from another Internet connection there would be no traffic for that server or that protocol. Essentially, it was as if I were behind another firewall. Traceroute-ing led me to just one hop before it should have hit the server: the DSL router at the ISP. Definitely something was not right.
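For the curious, the display filters used in a hunt like this look something like the following; the 203.0.113.5 address is just a stand-in for the DNS server's public IP:

```
# Only DNS traffic (the protocol that never showed up)
udp.port == 53 || tcp.port == 53

# Only traffic to or from the server's public address
ip.addr == 203.0.113.5

# Both conditions at once
ip.addr == 203.0.113.5 && udp.port == 53
```

Typing these into ethereal's filter bar narrows the capture to exactly the conversation under suspicion.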
What amazed me most is the ability to use tools like ethereal at almost no cost. A few years ago this would have meant a lot of guessing, or coughing up a lot of money for a network analyzer.
Now, I can choke my ISP (for an answer) much faster.
Wednesday, July 14, 2004
Thinking outside the box.. Literally
I've just set up a squid cache server and configured the firewalls to do transparent proxying for web browsing. I added frox to do transparent ftp proxying. I figured that would cover the majority of the traffic going out of the office. Setting up frox was easy even though there were no packages of it for RedHat.
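A minimal sketch of what transparent web proxying involves, assuming squid listens on its default port 3128 and the LAN sits on eth1 (both assumptions, not my actual setup). On the firewall, outbound web traffic gets silently redirected:

```
iptables -t nat -A PREROUTING -i eth1 -p tcp --dport 80 \
         -j REDIRECT --to-port 3128
```

and squid.conf (squid 2.x-era directives) tells squid to accept requests that never asked for a proxy:

```
httpd_accel_host virtual
httpd_accel_port 80
httpd_accel_with_proxy on
httpd_accel_uses_host_header on
```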
Then, things got a bit slow. I could see that Mozilla would freeze up a moment after I pressed enter. After a few seconds the whole page would come at once. Now, I'd also changed the settings of the squid server to handle the additional traffic. I figured that my changes weren't enough to cope with the additional traffic. So I changed it and tried again. It didn't get better. So I changed it somewhere else and tried again. And again. But no matter what I tried, it still couldn't make the pages come up faster.
I began to think my eyes were fooling me. So I looked at the various information counters squid keeps and used those instead of my eyes. My changes did show some improvement as far as the counters were concerned, but to my eyes it still looked the same. I was about to set mrtg loose on the machine when I noticed how cfgmaker calculated the speed of the network interface card in the mrtg config file.
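cfgmaker gets that speed by walking the host over SNMP, and writes it into the config as MaxBytes. Roughly like this, with the community string and hostname as assumptions:

```shell
# Generate an mrtg config for the server's interfaces via SNMP.
cfgmaker public@proxyserver > /etc/mrtg/mrtg.cfg
# MaxBytes is the interface speed in bytes per second:
# 1250000 means a 10 Mbit link, 12500000 a 100 Mbit one.
grep MaxBytes /etc/mrtg/mrtg.cfg
```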
I then realised my attempts to find a solution were not addressing the problem. My eyes were telling me that the response from the squid cache server was being held up. The counters said that squid was doing its best. cfgmaker told me what speed the network was running at: 10Mbps! The server was connected to a 10BaseT hub, not the 10/100 switch. I moved the connection to the switch, and my eyes were happy indeed with the result.
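There are quicker ways to ask the kernel what speed a NIC negotiated, without going through mrtg at all. Assuming the interface is eth0 and either tool is installed:

```shell
# Older media-independent-interface tool; reports something like
# "eth0: 10 Mbit, half duplex, link ok".
mii-tool eth0
# Newer tool; reports the negotiated speed and duplex among other details.
ethtool eth0
```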
I was so caught up in trying to make things better Linux-wise that I forgot there were other things affecting my browsing experience outside of the Linux box: the network, my browser, my workstation configuration. All of these have an impact. I was also guilty of distrusting my eyes. Ultimately, it does not matter what the other indicators say; when it comes to the browsing experience, what you get is what you see.
Monday, July 12, 2004
RTFM
Often those four letters crop up, and for me they are a constant reminder of something true: if someone took the trouble to write a manual for it, it is probably worth the time to read it. Or at least skim it.
Well, today it wasn't a manual, but close enough. I got myself another USB disk, or thumb drive. One nice thing about Mandrake is that once you plug one in, it does the rest and puts a shortcut on the desktop. Once you're done, right-click on it, choose Unmount, the icon disappears, and you can take it out. I already have a thumb drive which works fine, so I expected the new one to work without a hitch. Of course, it didn't. dmesg at the console said something about the partitions not being aligned, out of whack or some such. I did manage to mount it manually without much hassle. Not much hassle once, but annoying enough when repeated several times a day.
I ran Mandrake Control Center but it didn't see the drive. Well, it was listed in the List of Hardware (Harddrake), but when I ran the config tool it kept trying to configure the hard disk. I finally took a look at the box to find out where it said "Linux Compatible". The box assured me it was, so I read on. And there it was:
If using Linux 2.4 kernel and above use 'fdisk' and 'format'.
Yup, clear as day. So I ran Mandrake Control Center again and used the repartition tool. It immediately offered to remove all the existing partitions because it couldn't read them, and I accepted. I set the drive to auto mount (supermount) and allowed the user to mount it (Advanced options). I formatted it with FAT32 (I still need to share data with Windows-using people, plus you never know what computer I might get stuck using in the future). I plugged it into my laptop (Mandrake 10) and 'removable disk' appeared on the desktop.
Lesson of the day: RTFM. Even if it is only the box.
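For anyone without the Mandrake tools at hand, the box's advice translates to something like the following. The device name is an assumption; check dmesg for the one your drive actually got, and note this wipes the drive:

```shell
# Repartition the thumb drive: delete the odd partitions, create a single
# primary partition, and set its type to 'b' (W95 FAT32).
fdisk /dev/sda
# The 'format' half of the advice, Linux style:
mkfs.vfat -F 32 /dev/sda1
```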
Sunday, July 11, 2004
About this
Around 2002, I moved to a new company. They have a very bizarre policy of providing almost no support to users. The response I got was, 'This is a computer company; everyone should be able to manage their own computer.' Which ranks up there as the laziest policy I have ever heard. So when I finally got my computer, it came with no OS. 'You should be able to run Win2k on this,' I was told. So where are the CDs? I asked.
'Some guy on some other floor. Get it from him. I only deliver this.'
Which was a big mistake, because I got so pissed that I decided to load Mandrake 8.2 on my computer. I wanted to see how far I could go before someone in the company stopped me. I have used Linux on and off since 1992, and in my previous jobs I put it on servers doing all sorts of things, mostly Internet related. I am not even OS shy; in my last job I had both MSWindows and MacOS 8 on my desk. But even at my skill level, I wasn't sure. Now, two years later and still on Mandrake 9.1, I am using Linux in the office full time. No Windows-made-in-Seattle on my machine.
Learning to use Linux full time made me confident enough to make the jump on my home machine too. This is the machine I share with my wife (also a computer professional) and my kids. It still has Windows, sort of (that's another story), but 99% of the time it's Linux.
Now I use mostly Mandrake on the desktop and RedHat/Fedora on the servers at the office. I share my experience here in the hope it'll help at least one more person make the jump.