Did Microsoft buy Skype for itself, or was this a clever way to move funds out of the US? If you don't know what the latter means, Skype is registered in Luxembourg, which has a more favorable tax rate than the US. So there is speculation that the high price for Skype is partly to save on taxes. It did seem odd that Microsoft bought it outright and made it a unit of Microsoft, instead of investing in it. Skype has a very strong brand to stand on its own. What will happen to its existing branding agreements? Will we start seeing Skype phones from the Microsoft Hardware division?
The alleged tax reason might just be a side benefit, though, given that Skype can play a prominent role in establishing an online office business suite. In the beginning, solutions like Google Docs teased the possibility of an online office suite. Microsoft responded with Office Web, basically providing the same functions you would get from the desktop suite. Adding Skype can push this further by adding communication. If Skype were tightly integrated with the other MS communication tools, Sharepoint and Exchange, it would allow businesses to share documents with their customers, bringing them closer and extending their reach at the same time. Who wouldn't want to be able to call their supplier for free? That customer whom you prefer to talk to but who costs too much in long-distance charges? Well, now he is a click away. How about a supply chain tool using Office Web? Not sure what that invoice is for? Click on the person who signed for the corresponding delivery and talk to him.
Consider a scenario. A supplier and its customers are running Sharepoint and Exchange. The two Exchange servers talk to each other over the Internet and exchange info on their users, including Skype accounts/numbers auto-generated by the Exchange server as part of the user creation process. Now when the need to talk arises, a click will not only send an e-mail but may include a live invitation to talk that shows whether the person sending it is online and accepting calls. A supplier adds the users to a customer community powered by Sharepoint, and they all can optionally share Skype numbers. Meetings can now be scheduled via Exchange and powered by Sharepoint and Skype. Now bring in Office365 and that offering moves to the cloud, lowering the barrier to entry with a pay-as-you-use model.
This puts Citrix, which runs GoToMeeting, in Microsoft's crosshairs. Both companies have a close relationship, primarily through cross-licensing for Windows Terminal Server and MetaFrame. Or, more accurately, MS strong-armed WinFrame from Citrix to become Windows Terminal Server. What would be worrying is if Microsoft built a unifying directory service (running on a cloud, of course) that tied their Exchange and Skype users together worldwide. How many companies would then prefer to Skype rather than pick up the phone?
Tuesday, June 28, 2011
Tuesday, May 03, 2011
Printing to file lets CUPS print from Flash
One great thing about Linux is that its components work well together. Even when they don't, you can always use how they work together to get what you want, at least in part.
My toddler was asking for a coloring picture of Elmo, the Sesame Street muppet. The official site at www.sesamestreet.org didn't have a picture, so I went to the Sesame Street section on the PBS site at kids.pbs.org. Both of them were basically Flash programs. Not pages with Flash elements, but a page with probably one big Flash element. They were linked to other pages with the same structure. I think this is sort of a workaround to get a Shockwave-like experience without actually using the memory drain that Shockwave is on Windows. There is no Shockwave for Linux, for whatever reason.
Anyway, I found what I was looking for and clicked on the Flash control to print the picture. A CUPS pop-up came up. Now here is another interesting component. Acquired by Apple, CUPS was the elixir that solved so many of the problems lpr had with printing to inkjet and non-PostScript, non-PCL printers. I credit CUPS and HPLIP with ending printer setup and printing issues on Linux, essentially taking the drama away. Really smart on HP's part: keeping old HP printers printing means more ink cartridges being sold. And Linux guys keep things running for a long time.
The thing about CUPS is that it prefers to work in the background. It lets the user-facing part be handled by the OS. So when I clicked on the icon to print the picture, Mandriva popped up a different print dialog than I would normally get. This dialog did not offer the "print to file" option. I wasn't concerned initially because I wanted to print to my inkjet. But after clicking on the inkjet and Print, the printer did not print out Elmo. I suspected there was a problem in the hand-off between Flash and CUPS/Mandriva. I looked at the print queue and there was a large job just waiting.
My go-to strategy for stuck jobs, which usually mean a driver problem, is to use the print-to-file option. This creates PostScript output in a text file, which I then convert to PDF. I can then print it again on another PC in the house or somewhere else. Since PostScript is a printer language and PDF is based on PostScript, all of the kinks related to printing will have been worked out, and the printer driver just has to focus on printing what is basically an image.
But the dialog didn't offer me the print-to-file option. I looked on the Internet and discovered that there was a separate print-to-file printer definition, called CUPS-PDF. It still used the PostScript printer driver on the back end; it just deposited the resulting file on the Desktop. I installed the driver from urpmi and printed Elmo again. I checked on the Desktop and the file was there. Or so I thought. True enough, it was Elmo in PostScript format. I converted it to PDF and printed it in no time. Total fault-to-solution time: 10 minutes.
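The whole recovery flow boils down to a few commands. This is a sketch: the file name and the "inkjet" queue name are hypothetical, the output location depends on how CUPS-PDF is configured, and ps2pdf ships with Ghostscript.

```shell
# Install the CUPS-PDF virtual printer on Mandriva
urpmi cups-pdf

# After printing from Flash to the CUPS-PDF queue, the PostScript output
# lands on the Desktop. Convert it to PDF with Ghostscript's ps2pdf:
ps2pdf ~/Desktop/elmo.ps ~/Desktop/elmo.pdf

# Re-submit the cleaned-up PDF to a working printer queue
lp -d inkjet ~/Desktop/elmo.pdf
```

Because the PDF has already been through a PostScript interpreter, any kinks in the original job have been worked out by the time the printer driver sees it.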
Upon further inspection, the file on the Desktop wasn't really finished. But it was enough for me to create the PDF and get what I wanted. It seems that the print code in Flash passed enough info to the printer driver to print the first page, but it never sent the code to say that the document was done. It assumed the printer driver would just take the end-of-page marker and send the job off to the printer. But CUPS, being a good print system, dutifully waited for the end-of-document marker in vain.
So not all things in Linux work well together. But even when they don't, Linux stuff offers other ways of getting what you want. And that is all I need.
Wednesday, April 20, 2011
Vista Flash Odyssey
I still maintain a Vista partition on my PC for my kids' stuff, or the stuff they bring back from school to run. These would be educational CDs and such. Also, my support contract says I have to keep Vista there to enjoy the three-year on-site support (which I have used, BTW: two motherboards replaced FOC). But we use Linux 99% of the time. I figure that if I expose them to it now, their perception of computers won't be limited to the Microsoft world.
But recognizing the possibility of needing Vista for whatever reason, I maintain the partition and I maintain Vista. This means periodically logging in and updating Windows, Flash, Java, OpenOffice and the cone, a.k.a. VLC media player. What prompted me this time was that I wanted to move from OpenOffice to LibreOffice. My other Windows PCs have it already, so it was more about leveling the playing field, making sure I have similar programs on all of the PCs in the house. It has been a while since I used Vista. So much so that I was also installing Chrome this time around.
Sometimes I wonder whether I am denying my children access to their educational software by defaulting to Mandriva. There is this great math tutorial program and an interactive language learning kit. If they ask for it, I'll boot up Vista and set things up for them. But they don't mind, and I seem to be getting better mileage from Flash demos and YouTube tutorial videos on the Internet anyway.
During the update, everything went well except for updating Flash on IE. I went to the Adobe website and clicked on the button to download the latest version of Flash. It downloaded the Adobe downloader, installed it and executed it. It then threw up an error window saying that it was unable to get the correct parameters. I figured the downloader was having problems with the Internet link. Checked that and it was OK. So I followed the troubleshooting link from the Adobe download page.
Basically, it recommended that I stop every single program I could think of that runs Flash and then run the uninstaller for Flash. Well, that's great. Even Adobe has little faith in my ability to figure out, by looking in the taskbar, which apps are using Adobe Flash and locking the Flash files. Why? Because it recommended that if it didn't work, I try again because I probably missed a program. I humored Adobe for a while, but uninstalling and reinstalling the downloader didn't work.
So off to the Internet we go. I found some highly rated advice that told me to download a file from the Windows Resource Toolkit, plus a command file for the toolkit to use, which removed or fixed Flash-related stuff. What is rich is that the command file is from Adobe. So I tried that, and yet the dreaded "unable to obtain correct parameter" error came out.
This was getting ridiculous, but it reminded me of how lucky I am to be using Linux. Even with Flash and its installation instructions, which divert you to the command line, that routine has worked OK for years (don't get me started on the similar Java installation). I realised that my problem wasn't with installing Flash; it was the downloader. It was acting as the gatekeeper to getting Flash, when in reality it was nothing but a billboard. So after looking around, I found a link to get the installers directly.
The brouhaha in recent years about Flash and Apple's refusal to use it (not for technical reasons, I'm sure) seemed to me like a case against progress. Not supporting Flash is a deal breaker, with all sorts of sites using Flash to get past the home page. But with Apple's clout and the popularity of iPads amongst senior management, a lot of sites have had to provide Flash-free alternatives. After this episode: good riddance to Flash, and let's move on to HTML5.
Thursday, April 14, 2011
Letting go of old programs
As you may know I am a Mandriva user. More hardcore than I thought, I discovered today.
I am lucky because I have a padawan now. Eager to learn, but patient enough not to bug me all day long.
So the need was to log onto the desktop from remote. Not just access it, but use the desktop. Mandriva has this tool called rfbdrake. It provides a one-stop interface for remote access, both connecting out and sharing your own. Basically it calls on rdesktop to connect to Windows boxes and VNC for Linux boxes, and uses rfb to share out the current desktop. Not to be confused with the brilliant remote access tool on SuSE which spawns vncserver to provide remote desktop access from the point of login. This is much more pedestrian: just share what I am seeing. Problem is, I couldn't find it on urpmi or in the Software Installer.

Now, I had procrastinated for some time on fixing a problem on that workstation which prevented some updates from being completed. Since both of my problems could be rpm-related, I finally set aside some time to do it. The update problem was simple enough. Apparently, the Fortigate firewall triggered some false positives on the files being downloaded. So amending the rules slightly to allow the updates to pass through did the trick. But in the process earlier, the various repositories were also messed up. So I removed them all and redownloaded a new set. For good measure, I plunked in PLF too.
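Rebuilding the media set goes roughly like this. A sketch from memory, not taken from the post: the literal '$MIRRORLIST' string is the usual urpmi idiom (urpmi expands it itself), and the PLF mirror-list URL with RELEASE/ARCH placeholders is an assumption based on how easy-urpmi configured it.

```shell
# Drop every configured repository, then re-add the official set from a mirror list
urpmi.removemedia -a
urpmi.addmedia --distrib --mirrorlist '$MIRRORLIST'

# Plunk in PLF too (replace RELEASE and ARCH, e.g. 2010.1 and i586)
urpmi.addmedia --distrib --mirrorlist http://plf.zarb.org/mirrors/RELEASE.ARCH.list
```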
But after all the updates, I still couldn't get rfbdrake. Time to hunt RPMs on the net, then. But horrors, rpm.pbone.net was down. RPMFind was no good either; I had given up on it for finding Mandriva RPMs a long time ago. So off hunting on Google we go. I finally found it on (of all places) SUNET. Nostalgia engulfed me as I remembered the old days of going through SUNET looking for free/shareware software. Then followed the ensuing dependency hell. I was missing rfb itself. Hunt as I might, I could only find one from 2008. Security-wise, not good.
Then it dawned on me. I was asking the wrong question. Why was I hung up on rfbdrake? The question should be: what would give me desktop access? If rfb is gone, what are its replacements? I should have learned to let go of old programs. The new guys were Vino and krfb. Turns out they worked fine. I miss the unified interface, but if it is for the better, why not.
P/S - I am still haunted by my failure to keep a copy of a DOS IVR program (that fit on a floppy!) that ran together with a voice modem (a 33.6kbps with voice capabilities). I am that old. *sigh*
Thursday, March 10, 2011
My WDTV and I
There has to be some Open Source / Linux goodness going on, because the WDTV even behaves like an Open Source app. For example, the firmware update process is so simple it's down to pressing Yes or No. But that's not it. The fact that it asks to update the firmware and waits for your approval, as opposed to downloading it and forcing you to restart the set-top box, says a lot about the people who built this.
The cost is relatively low compared to setting up your own MythTV box. MythTV may do more, but for most people the WDTV is more than enough. Read the post here.
Wednesday, February 09, 2011
I suck at this
I haven't been a good blogger. My last post was some time ago and not a good one at that.
Truth is, I have a few other topical blogs and I am spending more time on them. Don't get me wrong, I love Linux and I use it every day in almost every possible way. But usually I don't have anything interesting to say about it. I could write about what I am thinking most of the time when it comes to Linux: I thank God I am not a Windows user. But that gets boring after some time. Plus, I can't write too much about work and my projects for legal reasons.
Linux is so ubiquitous in my life, it really is boring. It used to be a real challenge just to install, then to configure the graphics card in X. But now everything is so well done and tested by so many other users that my experiences are shifting away from the technicalities I love toward more of... management. So much of what I do right now is management that I even blog about it.
Moreover I need to update this blog's template. I promise to write about something as soon as I have something interesting to say.
Monday, December 20, 2010
Anything that can fail will fail... and at the worst possible moment
I am facing a crisis of sorts, both in the personal realm and at work. Fortunately, only one of them is Linux-related. Unfortunately, that problem is at work and is creeping into the personal.
I have been spending a lot of time recently grappling with mail. Mail in the sense that mail is clogging the queue every morning. All sorts of mail are being held up. It seems that one of the mail servers I oversee, running Scalix, is causing problems with a mail server running sendmail. The Scalix SMTP daemon is dying, and sendmail keeps on hammering the Scalix server, locking the message. Problem is, the Scalix smtpd is not dead but dying. It responds just enough to keep the sendmail server interested and not return an immediate error. A few mails like that and it starts to tie up resources.
But that is not the story. Mail is the no. 1 app at work. So, everything else is second. I am now spending my dawns watching the queue, and soon enough mail starts to pile up. I have a VPN into the server, so in theory I can work over a wireless connection. I have a 3G phone which links up nicely to my HP Mini running Mandriva. It links to the phone via Bluetooth.
The morning started OK enough. Then the mails started piling up in the queue. No problem; I started flushing the queue. Then I went where I needed to be. Once there, there was a long line. So no worries, I thought. Let's fire up the HP and link to the phone to check on work.
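Checking and flushing a sendmail queue is a two-command affair; the exact flags for my setup aren't in the post, but the classic invocations look like this:

```shell
# List what is sitting in the mail queue
mailq

# Force an immediate queue run, verbosely, so you can watch each delivery attempt
sendmail -q -v
```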
Except Bluetooth wasn't there. I panicked. It wasn't on the task bar. It was there just a few days ago! Usually it's there saying it's disabled. I looked at the dmesg info. I looked for bluez. I used the radio on-off button to force it off and back on again. Restarted Mandriva. Still no luck. In desperation I tried WiFi but found no open system.
Finally, I remembered what worked and started doing that: start Windows and turn on Bluetooth from there. The agony. Waiting for Windows to start up. Waiting to log in. Waiting for it to show me the icon on the taskbar. Even more waiting for it to shut down. Curse you, Windows-specific hardware!
Restarted Mandriva and it was there. That is the problem with my hardware: sometimes it just doesn't work. Network card plug and play doesn't work. Mine has a Marvell chipset which can only be detected if it is plugged into the network at bootup. Plugging in the cable later has no effect; even dmesg shows no change when I plug a cable in.
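The checks I ran amount to something like the following. This is a hedged sketch: the tool set (rfkill, hciconfig) varies by distribution and era, and hci0 as the adapter name is an assumption.

```shell
# Did the kernel notice the Bluetooth adapter at all?
dmesg | grep -i bluetooth

# Is the radio soft- or hard-blocked? Unblock the soft block if so.
rfkill list bluetooth
rfkill unblock bluetooth

# Bring the adapter up if it exists (hci0 is the usual first adapter)
hciconfig hci0 up
```

A hard block (set by firmware, as Windows-specific hardware tends to do) can't be cleared from here, which matches the symptom of having to boot Windows to flip the radio on.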
Lesson: Don't rely on having things run during an emergency
Tuesday, December 14, 2010
OpenOffice and Java: A case of VBRUNs?
There was a time when most software on Windows was written in Visual Basic. Some programs or development houses tried to hide it or make the lineage obscure. And why not? The image they wanted to project was that they devoted quality programmers and time to crafting complex but useful code, not that they were using the same tools as the casual developer or the one-man operation who sometimes wrote shareware. But one look under the hood and the evidence was clear. Look for a file called VBRUN.dll and there was the smoking gun pointing to Visual Basic. This is a jack-of-all-trades DLL. It was so tied in to VB that developers packaged it along even when they compiled the libraries into the final program instead of linking them, just in case one of those libraries called it.
Now VBRUN had a peculiar thing about it: it was incompatible across different versions of VB. So MS named the VBRUN file according to the version of VB. Soon, it was common to have 2 or 3 versions of VBRUN.dll lying around the hard disk somewhere, used by 2 or 3 programs each. Some poorly written and compiled programs couldn't even tell the difference between versions. Remember those who tried to hide the fact they were using VB? They tried renaming VBRUN.DLL and installing it the first time, but the program reverted to looking for VBRUN.DLL after an update, causing all sorts of havoc.
And you even had to put it in a particular directory, or else all those programs couldn't find it.
OpenOffice has, or is trying to build, a similar relationship with Java. The problems I have when Java or some libraries that OO uses are updated are sort of a deja vu, but not exactly. Something breaks after the update. I used an OO extension called OpenCards to create flashcards from OO Impress files. After an update, it broke. I could use up to a certain number of cards and then it would crash. Fortunately, OO has a solution. Go to Tools --> Options --> Java. You can select the version of Java OO uses based on what is installed. If there are no entries, press Add and then Cancel, and the list of installed, detected Java versions will appear. Choose an older one and you are good to go.
Saturday, November 06, 2010
Mandriva 2010 Spring Rocks!
I am purposely making the title a bit showy in the hope that it gets ranked better by the search engines. SEO this ain't. But I feel a responsibility to do so because I have been using Mandrake, and later Mandriva, for so many years. I've lived with its idiosyncrasies and sometimes severe limitations, but more recent versions have repaid my faith. But just as the most recent version, Mandriva 2010 Spring, came out, I read of the discord amongst its developers and the desire to fork it. I won't comment on the politics, but as a distribution, Mandriva 2010 delivers.
I am writing this on an HP Mini which, as many others have written, shouldn't be able to take a standard distribution, yet it is running Mandriva 2010.
It is sad that most reviewers out there take the easy way out, install the Mandriva Live version and use it to pass judgement. Installing Mandriva 2010 from the DVD (officially called Mandriva Free) is a different experience. It does have its bumps, due to its commitment to going open source all the way, notably with Flash and Java. However, that can be worked around.
...upgrading finally works.
I think I will have to dedicate the next few entries to how I've installed Mandriva (for the last several versions) with the hope that someone reading this may find it of some use. This is above and beyond my other post on moving or upgrading distributions. I am overjoyed to announce that choosing 'upgrade the distribution' finally works. Everything transitioned ok from Mandriva 2010 to 2010 Spring on all my machines. No loss of shortcuts or even non-standard applications (i.e. installed from non-Mandriva RPMs or sources not added yet (yes PLF, I am talking about you :) )). I've even tried it from Mandriva 2009 Spring and it works ok. This is a big deal because I tend to lose so much time with the transition (backing up, installing fresh, updating and restoring files). Yes, upgrading does take a long time, on average 2-3 hours. But that is 2-3 hours of unattended installation.
Friday, August 28, 2009
Spun by Ntop
I have seen ntop way back when it was character based. It may not have been the same program, but I recently went looking for a program that would let me figure out what sort of traffic was going through my network, and ntop came back to mind. I didn't have the stomach to run a full-blown Snort setup, nor was I interested in watching packets fly by in Wireshark. I figured I'd use ntop to get a general feel and then zoom in on particular hosts with Wireshark.
I tried a test setup of ntop running on Mandriva and quickly realised its potential. I could see as far as the packets my NIC could catch. Problem was that I was on a separate switch, quite some ways off from the core switch.
So I found the hardware, set up a clean CentOS 5 box and plugged it into a port that was mirrored to the port where the firewall was connected. I installed ntop from an RPM (from rpmforge, I think) and immediately hit a brick wall. The install didn't tell me to run ntop from the command line at least once to set the admin password. And after that, the init scripts spat out errors. I could not understand the errors until I realised that there was nothing wrong with the script file, but there was a bug within ntop itself.
The man file explained that specifying a conf filename would expand the file into parameters on the command line. However, the RPM I had was probably from some transitional stage, because the file expansion would result in the parameters being delimited with a comma and a space on the command line, while the version of ntop that was running wanted them delimited with a space only. So I took the 3 parameters in the conf file, put them on the command line in the init script and said goodbye to the conf file.
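The mismatch is easy to picture (the options below are made up for illustration, not the actual ntop conf I had): the expansion produced ", "-separated options while the running ntop wanted plain spaces, and normalising between the two is a one-liner.

```shell
# Hypothetical example of the delimiter bug: options came out
# ", "-delimited; ntop wanted them space-delimited.
opts="-i eth0, -u ntop, -d"
fixed=$(printf '%s' "$opts" | sed 's/, / /g')
echo "$fixed"    # -i eth0 -u ntop -d
```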
I said this ntop was in a transitional stage because its settings were also being kept in files set up by the web interface. These were created and updated after ntop was running. For some bizarre reason, while ntop could and wanted to run as user ntop, the files could not be created as user ntop no matter what I did with directory permissions. (I think I stopped short of using sticky bits.) So I removed the parameter that specified the user. But then the graphs would not show. RRD, which is used by ntop, wanted to write as user ntop, and having the directories created (and thus owned) by root prevented that. I was getting upset, so I just changed the owner of the RRD directory from root to ntop. And then fireworks. Ntop provided quite a lot of insight into what people were doing on the network. For all of 15 minutes. I could not get past that magic number. Ntop would run at most 15 minutes, usually less. No clue in the logs. All they said was that the network interface stopped being promiscuous.
I gave up. I set up a cron job to restart ntop every 15 minutes and went back to the real task at hand, the reason why ntop was set up: trying to get a handle on my network.
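The stopgap amounts to one crontab entry; the service path below is an assumption based on typical CentOS 5 init scripts, not taken from my actual crontab:

```shell
# Restart ntop every 15 minutes (service path assumed; adjust to
# wherever your init script lives):
# */15 * * * * /sbin/service ntop restart >/dev/null 2>&1
```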
After a few weeks ntop just refused to start.
I removed all of the ntop-related RPMs, updated everything else and began from scratch. From another job, I figured out that SELinux was messing with some sensitive system calls. I had been running it under Permissive, with the illusion that I would come back and do what was needed to get it to run under Enforcing. Fat chance. I disabled SELinux and ntop is singing.
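For anyone retracing this, a sketch of what "disabled SELinux" means on a CentOS 5 box; the mode names are standard, but whether your ntop build actually needs this is something to verify on your own system:

```shell
# Permanent change: one line in /etc/selinux/config, effective on
# reboot:
#   SELINUX=disabled
# Permissive (what I had been running) only logs denials instead of
# blocking them, which is why ntop worked there but not under
# Enforcing.
```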
Now I got to figure out what all this data means.
Friday, June 20, 2008
Standing on the ledge - Part 2
Continued from this article. I apologize for the long time between postings.
I am still on the issue of making the jump to Linux and why you should do it for the right reasons. If you are just trying out Linux, this series may not be for you yet. By all means, try it out and discover the world of Linux and Open Source software. Or rediscover the joy of discovery. That alone could be worth your effort. What I am trying to lay out here are some things that are slightly different, things that affect people who are making the jump to Linux full-time.
Ask yourself, 'what is it that I do now on my computer?' Make a list of activities and the software you use for them. Be honest (because no one else is watching) and mark out the programs that you use every day, those you use occasionally/regularly, and those that you think you need and have installed but have not touched since then. The list is important because it could be a deal breaker in your jump to Linux. If you are doing this for your company, think about and talk to others, have them make their own lists, and combine their answers with yours.
Most of the time, we just do stuff and have very little time to plan. This is a great opportunity to think about what you have been doing and how you have been doing it. Is there a better way? Or do you want to stick with what you have now for a bit more? What is it that you wish you could do? If you have the time, try to improve things, because if you can improve the way things are done, the move becomes about more than just an OS switch. If you don't have the time, or are doing this for a number of people, try it the old way first, because people accept change differently. You are lucky if you can convince them not only to change their OS but also to change the way they work.
Now that you know the activities that you do and their frequency, plus the software that you use now, you can find Linux programs or Open Source tools that match them. There is another article dedicated just to OpenOffice because it is the most likely replacement for MsOffice. But if you are willing to spend a bit more, Crossover Office makes the transition easier by allowing you to run MsOffice on Linux. Trust me, some users only care about MsOffice. If there were a version for the dishwasher OS, the users wouldn't blink an eye if you gave them a dishwasher. That runs MsOffice.
If you are dealing with a number of users, you may need to set aside time for training. There is always a hump, as I call it, when it comes to Linux. More on that later. In the meantime, keep your goals in sight.
Labels:
Linux,
Thinking aloud
Friday, June 22, 2007
Die Ribbon Die
I am currently on contract and the organisation that I work with gave me a slightly used laptop that had Office 2007 installed. For what reason, beats me. I have Office 2003 on the desktop and I am fine with that. I would have reinstalled the desktop with Linux, but my stint here is too short for that. After using Office 2007 for 30 minutes, I was ready to throw the laptop out of the window.
No other user interface I have ever used is more frustrating than the tab and Ribbon interface. And that includes the old Digital Research file manager that was included in DR-DOS.
Come on, MS. People are not that stupid. The Office icon is just another Start button. The tabs are just modifiable menu items. And the Ribbon is an over-sized toolbar. But here is the kicker: MS only gives you one toolbar in Office 2007, while older versions of Office had the option to set up more than one, on both the top and the bottom. Here is what the UI change is really about: MS is creating a digital cage to force you to use the programs only the way they want you to. You can do anything you want, as long as you do it the MS way.
Another trap like this is MS Internet Explorer. People are building applications that work only on it and take advantage of proprietary functions like embedding MS Word within it. What they don't realise is that MS IE is a moving target and you are at the mercy of MS. There is nothing to prevent them from breaking your applications through forced updates. Heck, normal OS updates are enough to break custom applications. A standard MS response to suggestions of making an application both MSIE and Firefox compatible is that there is no guarantee that Firefox will be installed on a PC, but they'll guarantee MSIE is there. This typical misdirection moves the focus from compatibility (which is the real issue here) to availability. My answer to this: when you make an application work on both IE and Firefox, it will work regardless of what is installed. I have PCs at the office where MS IE is installed that can't access some websites because it is an old version of IE. Why should the old version fail when the standard build is for MSIE? What guarantee do you have that some feature you are using right now will still be there after the next auto-update?
Choice. Choose Linux
Labels:
Commentary
Monday, May 21, 2007
Standing on the ledge - Part 3 - Kicking the habit
You can find Part 1 here and Part 2 here.
Two things you must remember about what the Windows world is about. One: instant gratification. You can do stuff immediately after the install. Or the install itself has been dumbed down to a wizard. Two: continuous tweaking. There is always something new to try or to patch. Something is always changing or being changed. And guess what, you have little control over it (you can if you have the bucks). It is so bad that many people have been conditioned to want change or updates or something new to tweak every so often. I've had users who moved to the Mac during the heady colored-iMac days ask me "why are there no updates?". I asked them whether something was wrong. The answer was negative. Somehow these users feel that if there are no regular / weekly updates then there must be something wrong. Think about it for a moment. These Windows users are assuming that the computer will always go wrong if not taken care of regularly. Really, they expect that if there are no updates, Windows will blow up.
This is the second hardest thing to teach users who are no longer using Windows. Relax, no updates means that everything is ok. Sometimes calm waters are just that, calm waters. Stop focusing on fixing the computer and just use it. Confidently.
What is the hardest thing to teach ex-Windows users? It gets worse before it gets better... and stays that way. This is the opposite of the instant gratification thing. No, you won't get everything running in 10 minutes. But yes, once it is up it'll stay up and you won't have to do anything major on it. In fact, as a user, they'll do even less, because the model is that there is another person whose job is to take care of everything else. Unfortunately, that could also be you. And users, given the choice between something done in 5 minutes that always requires tinkering versus something done in a day that never causes problems again, will probably opt for the quick fix. The remedy? Make a big fuss about it. Call a meeting to discuss the steps to be taken, the impact on the users, and so on. If it is a big fuss, users tend to accept that it'll take time. And if you are calling that meeting, why don't you actually do it properly? Who knows, you can even try to implement Change Management... oooo, that's a big word.
In the meeting, identify, define and clearly mark the goal. Then plan how to get from here to there. Make the transition gradual and plan for it to be so. Start with what makes the least impact: computers that offer limited functions or services to users. Then back-room / supporting services. This is the area where Linux was born and shines. Finally, deal with the desktops.
Once users have made the transition and are still a bit sore about the whole change, don't end your plan there. Think "Now what?" What else can be done to make the experience of having moved better? Find ways of making things better for the users so that they can see why the journey was made in the first place. Point out open source / Linux projects that will help them or that they are interested in. Start with Gimp and move from there.
Labels:
Linux,
Thinking aloud
Wednesday, January 03, 2007
Living with a Distribution: Personal Preference
I use Mandriva. I have used it almost exclusively for 'serious work' and on my home PC. That doesn't mean I don't use anything else. I have OpenSUSE on my second hard disk, which I have not booted up since I-don't-know-when. I used it to run Jahshaka, a multimedia composing tool. Cool stuff; maybe I'll post some of the things I've done with it. But when I had no use for Jahshaka, I went back to my aging Mandrake 10.1.
I just upgraded from Mandrake 10.1 to Mandriva 2007. The most painless upgrade I've had. I took the usual precautions:
- Made a list of all applications I had installed and prioritised them according to what I use most often.
- Backed up everything.
- Renamed my home directories.
- Made sure the installation only touched the non-home partition.
After the install, I made sure again it didn't mess up my home partition. I created new user accounts using the old names. This will make my other family members feel at home as they try out the new stuff Mandriva has to offer. I moved the user files but not the config files so that old settings don't interfere. And it was up and running.
Labels:
Linux
Thursday, July 27, 2006
Living with a Distribution: Taking sides
This is not about recommending a distribution. Rather it's about how important the choice is and how to make it.
A distribution is a collection of programs that will make up what you will be installing on your PC besides the Linux kernel. A distribution is like an accent: a lot of people are saying the same thing, they are just saying it differently. Some more so than others. More importantly, choosing a distribution is like choosing a path. It will lead you to the same place, just through a different entrance into the city. The road can be winding but dotted with friendly towns, or it can be 5 lanes wide, chock-full of people, leading directly to the location. Ok, enough with the analogies.
Choosing one distribution over another will affect you in the following ways:
- How regularly the software is updated. This may translate into how long you will be exposed to a known vulnerability after it's made known. A more active distribution will have updates and patches available as soon as possible. Most of them do, because the nature of a distribution is that it is created by people who also use it. More likely, it'll translate into how long before you will be able to use that latest program or software you heard about.
- Selection of pre-packaged software. Some distributions are general purpose, others are catered for a specific group of people. There are also distributions that you don't have to install. They run directly from CDs or DVDs. In this case, your choice is about use rather than having it sit on your hard disk. You might choose to use something off the CD for a while until you decide it is worthy to sit on your hard disk.
- How long it will be supported after the version number rolls over. This is usually short, maybe about a year or so for most non-commercial distributions. This is one reason why some new features can't be included in an older version of the distribution.
Labels:
Linux
Monday, May 08, 2006
Minding It - Part 2
I am determined to make this work. 'This' meaning Zoneminder (www.zoneminder.com). I had it working, sort of, on my VMWare Server-ed Mandriva 2006. But even though it ran, there were numerous errors about shmget and shared memory. Zoneminder consists of several applications, with some controlling others that do specific tasks, and each of them exchanging information. My guess is that some information exchange happens through shared memory: program A loads stuff into memory and then passes a pointer to program B that does something with that stuff (although other inter-process communication could be involved too). I can see why the author does it. Multiple instances of program A can load the same kind of stuff based on different configurations, while program B doesn't care where the stuff came from... it'll take it and process it. Sorta like program A makes batter and program B cuts the batter into square shapes to be made into cookies.
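For scale, a back-of-envelope sketch of why shmget can start complaining. The formula (frame width x height x bytes-per-pixel x ring-buffer length) and the numbers are my assumptions for illustration, not from the Zoneminder docs:

```shell
# Rough shared-memory appetite of one capture ring buffer
# (all figures are illustrative assumptions).
width=640; height=480; bytes_per_pixel=3; ring_frames=40
need=$((width * height * bytes_per_pixel * ring_frames))
echo "$need"    # 36864000 bytes -- over a 32 MB (33554432) shmmax
```

A single camera at those settings already exceeds the kind of default kernel shared-memory ceiling that was common on distributions of that era, which would fit the shmget errors.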
Ok, off the food and onto business. My guess is that I am facing problem with shared memory because I am running it on a VM, VMWare Server Beta to be exact. So the next option would be to run on a real physical memory. (actually, the next option would be to it run on something similar like VMWare Workstation, which I-Will-Do-Soon(R)). After the couple of installs I've done, I think it is safe enough to try it on my main machine at home because I don't use apache or MySQL extensively there and the dependent libraries won't probably mess up my main apps there. I am severely tempted to use an older version of a Zoneminder (zm) rpm to weed out dependencies. This is faster than the 'configure-->find out missing lib file --> find rpm for lib file --> get and install lib file --> configure' treadmill. But I've decided to take the long way just in to find out what libraries are actually needed if I was any other person doing this on any other distribution. Experiencing this as any other guy is important for the overall user experience. I am not even going to install the rpm for the perl serial and x10 stuff which is not even on the usual Mandriva repositories to see whether zm really needs it.
I already know from previous experiences that the latest version 1.22.1 complies fast and less of a hassle from the previous version I was using. Somehow, the mysterious "unable to create executables" error had magically disappeared. And I found the correct manual this time for this version.
To cut a long story short, compile went ok and everything went exactly like it did previously.. that is it broke in the same place. Same shared memory problem. A glance through the sources did no good. The forum has someone one with the same problem and it was full of debug messages. Out of desperation, I started trying all sorts of things. Like I always tell people who have problems with computers and don't know what to do: "This is not a nuclear bomb. Push all the buttons." Changing the configuration somehow yielded a better error message. I told it to that the image fed to it were 8-bit grayscale (liar) instead of the other only option 24-bit color. I think the image I was sending was 8/16-bit color. But it had only two options. Now the error message was something like "Image captured was of wrong size, height, width or color". Image captured! It got something. Hey, it may be a Celocanth but it's still a fish. Maybe if I set the camera to 8-bit grayscale, it'll capture it correctly! Great! Problem is.. I can't remember the password for the camera. No time to do a reset and sacrifice a lamb or whatever it is I need to do to reset the password. Got to pay the bills. Later.
Ok, off the food and onto business. My guess is that I am facing problem with shared memory because I am running it on a VM, VMWare Server Beta to be exact. So the next option would be to run on a real physical memory. (actually, the next option would be to it run on something similar like VMWare Workstation, which I-Will-Do-Soon(R)). After the couple of installs I've done, I think it is safe enough to try it on my main machine at home because I don't use apache or MySQL extensively there and the dependent libraries won't probably mess up my main apps there. I am severely tempted to use an older version of a Zoneminder (zm) rpm to weed out dependencies. This is faster than the 'configure-->find out missing lib file --> find rpm for lib file --> get and install lib file --> configure' treadmill. But I've decided to take the long way just in to find out what libraries are actually needed if I was any other person doing this on any other distribution. Experiencing this as any other guy is important for the overall user experience. I am not even going to install the rpm for the perl serial and x10 stuff which is not even on the usual Mandriva repositories to see whether zm really needs it.
I already know from previous experience that the latest version, 1.22.1, compiles fast and with less hassle than the previous version I was using. Somehow, the mysterious "unable to create executables" error had magically disappeared. And I found the correct manual this time for this version.
Apparently they had already fixed some of my previous rants. My apologies to the developers. No bizarre config program, and defaults are set at compile time. Thank you.
To cut a long story short, the compile went ok and everything went exactly like it did previously.. that is, it broke in the same place. Same shared memory problem. A glance through the sources did no good. The forum had a thread from someone with the same problem, and it was full of debug messages. Out of desperation, I started trying all sorts of things. Like I always tell people who have problems with computers and don't know what to do: "This is not a nuclear bomb. Push all the buttons." Changing the configuration somehow yielded a better error message. I told it that the images fed to it were 8-bit grayscale (liar) instead of the only other option, 24-bit color. I think the image I was sending was 8/16-bit color. But it had only two options. Now the error message was something like "Image captured was of wrong size, height, width or color". Image captured! It got something. Hey, it may be a Coelacanth, but it's still a fish. Maybe if I set the camera to 8-bit grayscale, it'll capture correctly! Great! Problem is.. I can't remember the password for the camera. No time to do a reset and sacrifice a lamb or whatever it is I need to do to reset the password. Got to pay the bills. Later.
Saturday, May 06, 2006
Minding it
I am experimenting with Zoneminder. It looks like a promising digital video recorder (DVR) for closed-circuit cameras. My interest is primarily in IP-based cameras. In theory, this should be easy: IP-based cameras usually have a URL from which you can pick up frames or even video streams. So building DVR software should not be that difficult.
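The "grab frames from a URL" idea really is small enough to sketch in shell. The camera address and snapshot path below are assumptions for illustration; real cameras document their own:

```shell
# Poll a hypothetical IP camera's snapshot URL -- the core loop of a naive DVR.
URL="http://192.168.1.50/snapshot.jpg"   # assumed address; cameras vary
OUT="frames"
mkdir -p "$OUT"
# One poll shown here; a real recorder loops with a delay and prunes old files.
wget -q -T 2 -t 1 -O "$OUT/frame-$(date +%s).jpg" "$URL" \
    || echo "camera unreachable (expected if nothing answers at $URL)"
```

Everything a package like Zoneminder adds on top of this (motion detection, ring buffers, a web console) is where the actual difficulty lives.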
I always start with an RPM because I use Mandriva and this helps keep updates in check. Also, I respect the fact that someone invested time in building a package, so why reinvent the wheel. There were RPMs for Mandriva and I tried those. I read the instructions in the manual and, strangely enough, there were instructions on what to do before installing. Wait a minute, isn't an RPM supposed to handle this? That is one of the points of using an RPM. But the request wasn't demanding, just that apache, mysql and php be installed. So I did that. Installing it using urpmi should have resolved dependencies, but I wanted to choose which version to install anyway and the GUI tool was better at that. When I finally used urpmi to install it, it complained of some missing perl libraries. Great, I thought, a trek to CPAN to download the libraries and compile them by hand. I later found out that the libraries were for an optional component and not really necessary. But even after the RPM was installed, the instructions asked me to manually edit a file and run a configuration program.. which was broken. Apparently the package author decided to use MySQL-Max and put in code to detect the version of MySQL that was running. At least he made it easier by clearly marking the offending section in the script file with #FIX ME. Finally, I managed to get the components running and configured the program to capture images from my IP camera.
Or so I thought. I couldn't figure out why my settings would not work when I had read the manuals (two of them) and followed almost every instruction there is. I checked the logs and found out that the program that was picking up the images was crashing. Apparently it requested memory space incorrectly. Wait a minute, this was serious. This is a beta-level error in a 1.2x version. Problems with memory allocation should have been dealt with a long time ago. This only means one thing.. recompile.
I got the latest STABLE sources and tried to compile them. It got worse. The configure command still required parameters to be passed in.. for the default web directory and the location of mysql? If it can't detect that, what is the configure script actually doing? The more I fed it CLI parameters, the more it asked for. In the end there were about 10-odd CLI parameters. And it still wouldn't compile, complaining that the C compiler was unable to create executables! What?!
A little more reading led me to a more stable version, a couple of revisions back. No more C compiler complaints. Now it was the normal missing library files that had to be installed, specifically libjpeg.a (libjpeg), libz.a (zlib-devel) and libmysqlclient.a (libmysql-devel). Configure exited and asked me to run a configuration program, which did a lot of what configure is supposed to do, especially when it came to component detection. It asked a lot of questions, nearly 30 in all. Most of them should have just been set to defaults. Too late already. Will compile tomorrow.
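For anyone stuck on the same treadmill, those "missing lib file" stops boil down to configure doing link tests. A hedged sketch of what each check amounts to, using libjpeg as the example (the "found" branch of course requires the compiler and the devel package to be installed):

```shell
# Emulate a single configure-style library check: try to link a trivial
# program against -ljpeg and report which way it went.
cat > conftest.c <<'EOF'
int main(void) { return 0; }
EOF
if cc conftest.c -ljpeg -o conftest 2>/dev/null; then
    result="libjpeg: found"
else
    result="libjpeg: missing -- install the libjpeg devel package"
fi
echo "$result"
rm -f conftest.c conftest
```

Knowing this is all configure does makes the error messages a lot less mysterious: a failure means the `.a`/`.so` and headers aren't where the linker looks.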
Lessons Learnt.
- Good practice when building packages is to separate the core and optional components.
- When building a package, make sure all dependencies are included.
- Set up DEFAULTS!!
Wednesday, August 24, 2005
Standing on the ledge - to Office or not to Office
As you move from Windows to Linux on the desktop, you will have to cross the most difficult bridge of all: office applications, or productivity tools. In short, you have to ask yourself, "To MSOffice or not to MSOffice, that is the question." It is entirely possible to keep MSOffice and use Linux, despite what purists say. I, for one, am still married to MSOffice, and I'll explain why at the end.
Coming back to the question at hand, your options are:
- OpenOffice - an alternative application suite. It can read and save files in MSOffice file formats. The best part is that it also runs on Windows, so as you move people across, you can have them using OpenOffice on Windows and later on Linux. However, this endeavor is so large, it is in itself as daunting a task as moving people to Linux. The true reason you can move people across is that most people do not use all of MSOffice's features all the time. They simply can't. If they do clerical work, moving across is a cinch. But if they are advanced users, it will be as painful as a root canal minus the painkillers. Heck, I have some problems with alignment when moving from OpenOffice on Windows to OpenOffice on Linux. If you share files outside the company, it gets really troublesome.
- CrossOver Office - a commercial tool that allows you to install and use MSOffice (plus some other Windows applications) on Linux. Like driving in the other lane when the road is empty. It simply works. Well, almost. You see, what they don't tell you is that there is a reason why MSOffice is on Windows only: it uses low-level software calls. I have heard that one of the reasons Windows on Alpha was dropped was that it couldn't run MSOffice very well. Some versions of Office actually replaced OS files during installation. What other application would do that? The result is that the major MSOffice applications work fine, but some fringe and not-so-fringe components can be tripped up (e.g. the Clipart Manager).
- Like the above, Office over Wine - Wine, which is Not a Windows Emulator, is designed to run Windows applications on Linux by fooling the application into thinking that it is on Windows. In fact, CrossOver Office is partly Wine. So why use CrossOver when you can get Wine for free? Let's just say that I like my hair too much at this age of my life.
To decide, ask yourself three questions:
1. Do you use macros? Do things pop up and ask you stuff when you open a template or document? If not, then your answer is most likely no.
2. Do you use outlines or the outlining feature in MSWord? If you are asking, "Wha-?", then your answer would be no.
3. Do you have MS Access databases that you use regularly? The thing about conversion is that MSAccess files are not part of the deal.
If you answered yes to any of the above, go CrossOver Office. If not, then you are a prime candidate for switching from MSOffice to OpenOffice. You will save a ton of money later, especially as you grow and add PCs and realize you don't have to pay for another MSOffice license.
Oh, BTW, I don't use Macros but I love the outlining feature so much, it is a deal breaker.
Labels:
Linux,
Thinking aloud
Tuesday, August 23, 2005
Standing on the ledge - Part 1
If you are thinking about making the transition from whatever to Linux, read on.
A lot of people have asked me two questions since I made the switch: 'Is it hard?' and 'Can you do everything you want to do in Windows?'
The answer to the last one is a resounding yes. In fact, after switching from Windows, whenever I have to use a Windows machine, I find it very restrictive and most of my tools gone. Linux gives you so many choices and options that you can hardly make up your mind and stick to one set. I find myself switching from KDE to GNOME and back every few months, without losing access to the core programs I use.
The answer to the first one is 'Hell, yes. It was very hard.' But I was on my own, and in retrospect, I could have avoided a lot of heartache if there had been someone to tell me what to do or what to avoid. This series is dedicated to those thinking about making the switch, or the jump. Something to think about and do before making the leap. Most of it will sound like me talking to you as a network administrator, but even if you are switching alone, everything still applies. Think of yourself as your own administrator.
First, why are you making the jump, or at least thinking about it? The reasons have to be sound because you have to do it for the right reasons. If not, you will be disappointed, or you will find it does not suit you and switch back. Time lost once will never be regained.
If you are switching for ideological reasons (i.e. not wanting to pay the Microsoft Tax), then you are a Believer. Nothing I say will discourage you, and all pain is worth suffering. Just make sure other people involved believe it too. Note to Believers: all proponents of ideologies (prophets, do-gooders) face lynch mobs. Sort of a Darwinian thing about ideologies: those that survive lynch mobs are most likely superior.
If you are thinking about saving costs, I will tell you right now it will be some time before you see significant cost savings. Unless, of course, you include licence costs for a large number of people; this is where the most savings will be. But for every cost factor you take away, you will be replacing it with another one. Training or retraining will cost. Reinstallation or upgrades of older PCs will cost. Sure, the PCs won't crash as often, but people who switch to Linux forget that while Linux may not be hard on CPU speed, it does require some amount of memory before things really fly. My suggestion is to hit 256MB as soon as you can. If you are looking at older PCs, 128MB will work. While on this issue, sometimes it's not even the RAM; getting a new video card with more memory works wonders too. Coming back to cost factors, live with the fact that cost factors are just going to be replaced, not eliminated. But if you are smart about it, it just won't cost as much. That is, each cost factor replaced will likely be less in value.
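The memory rule of thumb above is easy to check on a candidate machine before committing to an install. A quick sketch using the standard Linux /proc path; the 256MB threshold is just my suggestion, not a hard requirement from any distribution:

```shell
# Read installed RAM from /proc/meminfo and compare against the 256MB mark.
mem_mb=$(awk '/^MemTotal:/ {print int($2/1024)}' /proc/meminfo)
echo "Installed RAM: ${mem_mb} MB"
if [ "$mem_mb" -ge 256 ]; then
    echo "comfortable for a desktop install"
else
    echo "workable at 128MB, but expect some swapping"
fi
```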
That said, hunker down for some productivity loss and doubts (or doubting people) nagging you. Remember, it has to get worse before it gets better.
Update: Part 2 and Part 3
Labels:
Linux,
Thinking aloud
Wednesday, August 17, 2005
Buying reality
Your reality is your own perception. If it walks like a duck and quacks like a duck, then.. you know. But what if all you see are ducks? Do you think you'd know a chicken if it walked by?
There is a point to this. I make a living from computers (big surprise). And I work with people fresh out of college starting their first career jobs, and people whose businesses are starting to break out of the local market. They all need computers and they all want to use the best at the least possible cost. Recommendations are a big thing for me, and my clients (and lately, even my suppliers) bring in people they know who can use my expertise. I use my own office setup to demonstrate some of the uses you can get out of open source solutions and Linux in particular. The thing I am getting used to is the response, "You can do that with a computer?" or "It can work like that?"
That's what bothers me. It used to be the whiz-bang stuff that got them, then the free but high quality stuff (Mozilla, Gimp). But now the stuff that draws these responses is downright trivial.
I pointed out to a potential client that he could set up a print queue and log all print jobs and the information of each job. He looked at me and asked: wasn't everybody just printing directly to the printer? If everyone could see the printer, couldn't they just bypass the queue? I walked over to the printer and turned off SMB-based sharing via the control panel. The printer disappeared from the network, but I demonstrated that I could still print via the queue. He was bowled over. It seems he had a problem with his workers printing on the expensive color laser printer after hours. At first he would disconnect the printer at about 5, but stopped that after salespeople complained of not being able to get color brochures printed for clients after hours. The growth in his company was directly the result of his sales staff being able to come in at odd hours and do work, so he couldn't deny their request. The notion of a print queue and the ability to turn off access to the printer (selectively, by network protocol) had never crossed his mind.
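For the curious, the "turn off SMB sharing but keep the queue" trick is roughly this on a CUPS-plus-Samba setup; the share name, queue name and log path below are hypothetical examples, not from the client's actual config:

```ini
; /etc/samba/smb.conf -- hypothetical excerpt
[global]
   ; stop advertising CUPS printers over SMB; Windows browsing loses the share
   load printers = no

; The CUPS queue itself keeps accepting jobs, so anyone pointed at the queue
; can still print:
;   lpr -P color_laser brochure.ps
; and every job shows up in CUPS's page_log for accounting.
```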
Do you see that? The solution had little to do with open source or Linux or anything new for that matter. Print queues have been around for ages. But what surprised me more was that when I mentioned this to a younger co-worker, he said that compared to what he saw at college (a local community college), the stuff at the office was downright revolutionary. The free-flow mess of networks and services on the Windows networks at college was a stark contrast to the controlled environment at the office, where everything just worked, and if something failed, something else was waiting to back it up.
Which brings me back to the ducks. One of the problems with computing right now is the dominance of Windows and MS. All people see is Windows. Their sheer ubiquity has blinded a lot of people. They simply don't know any other way. And if it means having to live with unoptimised working environments that often are not productive, so be it. A recent report said that Gartner research says desktop Linux won't be taking off (I have issues with that, but to a certain degree agree that Linux has problems on office desktops). Maybe the case is that people simply don't know better. Maybe it's time to look over and think, "Fried chicken sounds good."
Labels:
Commentary
Thursday, July 21, 2005
Linux Mobile: Running wirelessly
I got the Mandrake-powered notebook to work over the wireless network with the AP at home. But no luck at the office. This vexed me more than normal because I had a hand in setting up the office wireless AP and was pretty sure of what the settings were. Normally when you build two things that are alike, you get better the second time, not worse. But since the first time worked flawlessly, I learned nothing from the experience. That is why I don't see problems as obstacles. They are opportunities to learn.
Basically my problem boiled down to the fact that my notebook's wireless card couldn't connect to the office AP using WEP encryption. Without it, no problem. But the kicker was that I was using WEP with the home AP and it worked out-of-the-box. No option I tried could get it done. This is the time to take a step back. The thing to do at a time like this is not to go through the things I got wrong, but rather the things I thought I got right. What was it that I did differently at the office than at home?
And there the solution was. The wireless card needed the WEP key to be in hex. It would not use the ASCII key. I had found that out at home, but it was fixed easily because the home AP showed the ASCII key I entered as hex, and vice versa, when I switched between ASCII and hex input. The office AP didn't have that feature. You either entered it in ASCII or hex, and switching between the two just blanked out any key previously entered. So I used an ASCII to hex converter at the command line. Apparently these things are case-sensitive. No wonder it wouldn't work. It was just the wrong key! I found that out because I finally decided to change the WEP key at the office AP. I just entered it in hex and did the same on the notebook. It worked straight away. I didn't do this earlier because other people were also using the AP. After changing back the key and more fiddling around, I learned that the office AP apparently converts the ASCII key entered to UPPER CASE before converting it to its hex value and using it. The AP vendor committed one of the Great Sins of Equipment Manufacturers: not telling the user of the assumptions you made for them (and, in a way, about them). I was thankful, though, that they didn't do something boneheaded like configuring the AP to use two keys for every ASCII key entered (that is, converting the ASCII key into both upper and lower case, converting each into hex and using them both). It would have made my setup work immediately, but it would be Not The Right Way.
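The ASCII-to-hex conversion, and the case-sensitivity that bit me, can be done at the command line. A sketch using od; the iwconfig line in the comment assumes a hypothetical interface name:

```shell
# Convert an ASCII WEP key to the hex digits the wireless card expects.
ascii_to_hex() { printf '%s' "$1" | od -An -tx1 | tr -d ' \n'; }
ascii_to_hex "SECRET"; echo    # 534543524554
ascii_to_hex "secret"; echo    # 736563726574 -- case changes every byte
# The hex form is what you'd feed the card, e.g. (interface name assumed):
#   iwconfig eth1 key 534543524554
```

Comparing the two outputs makes it obvious why an AP that silently upper-cases the key produces a completely different hex key from the one you typed.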
Linux Mobile: An out-of-the-box experience
Recap on the installation
- LAN network card: ok. Didn't expect any problems, but who knows.
- Graphics display: VESA only. It bombed using the i910 drivers for XFree. I heard Intel is posting its driver. Will try that. But VESA is not really bad.
- Power management: ACPI ok; APIC crashes the system for some reason.
Linux Mobile: Introduction
Finally, I got a new notebook at work. I was a bit apprehensive about which distro to put on it. SuSE Pro was a big pull. Ubuntu even crossed my mind. But realising that this was a notebook that would not have all the pieces working with Linux, I needed all of my experience to get it up and running, and an unfamiliar distribution would leave me groping in the dark. Mandrake/Mandriva it was.
Over the next few posts, I'll try to document as much as possible of what I did right and what I did wrong, in the hope it'll help someone out there.
First things first, the notebook is a MSI Megabook, rebranded as a local brand here. Centrino chips, 512MB RAM, DVD-CDRW, 40GB HDD, 3 USB, 1 Firewire, 1 VGA, 1 PCMCIA with integrated card reader (Ricoh), built in Wifi, network and modem. All in a nice 1.8 kg package costing slightly under 1k dollars.
The good news is that I am writing this on the notebook.
Friday, July 08, 2005
Waiting for nothing
I'm looking at the Mandriva CD that came with Linux Format, the best Linux magazine for the less uppity or the pocket-protector-less. I wonder when I will get to install it. To be truthful, I have had the downloaded CDs longer, but if you have read the past few posts, upgrades are something I dread.
It's just that I use the PC so much, I am apprehensive of all the time I will lose reinstalling most of what I had already installed on the machine being upgraded.
Labels:
Linux
Shouting obscenities and Error Messages
Mandriva is greatly enhanced by urpmi, and more so when combined with the repositories listed on EasyUrpmi. If you haven't got the plf repositories listed, you are definitely missing a lot. A side benefit of using EasyUrpmi is that you can point the main repository at an online mirror, eliminating the need to have the CDs or DVD around whenever you install stuff. And as a desktop OS, you will install a lot of stuff.
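For reference, the media-registration commands EasyUrpmi generates look roughly like this. These only make sense on a Mandrake/Mandriva box, and the mirror URLs and version paths below are placeholders; generate the real commands on the EasyUrpmi page for your release and architecture.

```shell
# Register an online main repository and a plf medium so urpmi can
# fetch packages over the network instead of asking for the CDs.
# URLs are examples only.
urpmi.addmedia main \
  ftp://a.mirror.example/mandriva/10.2/main/i586 \
  with media_info/hdlist.cz
urpmi.addmedia plf-free \
  ftp://plf.mirror.example/mandriva/free/10.2 \
  with hdlist.cz
# After that, installing something pulls it straight off the mirror:
urpmi mplayer
```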
Then suddenly, installs got slower, often failing. There were no clues other than messages saying that some packages could not be installed due to missing keys and that some packages were corrupted. Checked the name of the package. Correct. Checked rpm.pbone. Correct. Tried restoring the missing keys. Trouble is, they weren't missing to begin with.
And it went on like that for some time. Sometimes I got to install. Other times I didn't. So I tried updating the repository indexes. Some were successful; other times, it just hung, requiring a kill.
Fortunately, I have another desktop at home. Faster. I am the only one using the Internet connection. Everything I tried installing worked. Even stuff that didn't work... at work. Then I realised that the PC at work kept hanging when I tried updating the indexes. So I did what I normally do when trying to figure stuff out. I broke it down.
Tried a few indexes at first. Ultimately they all worked. All the indexes from repositories whose contents get updated, that is. So I tried updating the index of the repository that is not supposed to change, main. It failed spectacularly. So that was the problem. The repository was no longer where it used to be. A quick visit to EasyUrpmi fixed that. I deleted the main repository, found another mirror and added it back with the command generated by EasyUrpmi.
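Sketched as commands, the fix looks something like this (again, Mandrake/Mandriva-specific, and the mirror URL stands in for whatever EasyUrpmi actually gives you):

```shell
# Drop the stale main repository...
urpmi.removemedia main
# ...re-add it from a mirror that still exists...
urpmi.addmedia main \
  ftp://new.mirror.example/mandriva/10.2/main/i586 \
  with media_info/hdlist.cz
# ...and rebuild its index to confirm the new location works.
urpmi.update main
```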
Which says a lot about error messages. Error messages are a must. They tell us when things are wrong. More importantly, they should tell us what is wrong so that it can be fixed. When error messages don't tell me what is wrong, they might as well be shouting obscenities.
Tuesday, June 28, 2005
Do not be afraid
I don't understand people fearing choice. Maybe it is the fear of choosing. Or at least the fear of being wrong. Is that our problem? Is that why Linux on the desktop is slow in starting, because the technical people are fearful? Fear of call-center overload?
I carry Knoppix around in my bag. Recently, I enquired around for a new laptop. I fell in love with these low-cost 12" monitor basic laptops. But the smaller it is, the more customised components it'll have. Which is bad news for Linux. But I wasn't deterred. I asked the sales guy about it and popped in the Knoppix CD. He was making a comment about how RedHat still requires command-line installation. But when Knoppix came on, he was blown away. So much so that he asked for a copy of the CD to kick around. I was tempted to give away my outdated version, but I still had a few shops to go to and their laptops to test. So I waited while he copied my Knoppix 3.7.
The moral of the story: always keep 2 copies of Knoppix around.
Labels:
Linux
Friday, May 27, 2005
Moving On Part 2
The last time, I was apprehensive about upgrading. In the past when I've done it, several things happened that I didn't like.
1. Old settings were carried over literally. There was no way to use the new defaults if you kept the previous home directories. This happened during an upgrade. I am quite surprised that the newer version of the software, especially for something as important as GNOME, did not notice that the config file was from a previous version and at least offer to replace it with a new one. I understand the concern to keep all the user's settings, but shouldn't there be an updater program for that?
2. Certain software that was there previously was not replaced but simply disappeared. This happens especially to software that is not in vogue or not part of the core distribution. I understand that leaving the program in place risks a certain incompatibility, but if it was there before and there is no replacement during the upgrade, at least offer to leave it there. It happened to Nagios, a network monitoring package I use. It just disappeared. I had to reinstall and reconfigure it every time I upgraded. It is only in the contrib section.
An upgrade is an upgrade, not a re-installation and certainly not a fresh installation. Distribution packagers should respect that or lose their user base.
Saturday, April 30, 2005
Surviving Mandrake (and NVidia Drivers)
One of the easiest things to do on Mandrake is updates. Even kernel updates. But as my previous experience has shown, kernel updates with Nvidia drivers are not to be taken lightly. So the first step is to plan.
Do not enter a room without knowing how to get out.
Or something like that; DeNiro's character says it at the beginning of the movie Ronin. I have to do something similar. In order to compile the Nvidia driver, you need the kernel source. So, if you update the kernel, you need the matching kernel-source at the same time. Problem is, if you update with urpmi or Mandrake Update, you lose the kernel source.
What we want to avoid.
We update the kernel and the kernel source. Then either we fail to compile the nvidia driver or the nvidia driver fails to work. We need to undo. Mandrake has kept the previous version of the kernel but not the source. If you failed to compile the nvidia driver, then the old one is probably still there. But if the driver was a dud, you have to recompile. If you don't have the previous version of the kernel source, you are screwed, because most sites don't keep older versions of the kernel-source package. Or you can use the non-Nvidia nvidia driver (just change the XF86Config or equivalent file). Yep, drive your Ferrari only in first gear.
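That fallback amounts to switching X from the proprietary `nvidia` module to the stock `nv` one. A minimal sketch, run here against a sample file standing in for /etc/X11/XF86Config-4 (edit the real file as root; the section layout on your system may differ):

```shell
# A sample Device section like the one the Nvidia installer writes.
cat > XF86Config.sample <<'EOF'
Section "Device"
    Identifier "device1"
    Driver "nvidia"
EndSection
EOF

# Swap in the unaccelerated open-source driver.
sed -i 's/Driver "nvidia"/Driver "nv"/' XF86Config.sample
grep 'Driver' XF86Config.sample
```

Restart X after the edit; 2D will work, 3D won't (first gear).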
Moral of the story: download the kernel-source rpm and then install it by hand after you have updated the kernel but before you recompile the Nvidia driver.
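As a sketch, the order of operations looks like this. The commands are Mandrake-specific and the version numbers and installer filename are examples; match them to the kernel you actually installed.

```shell
# 1. Update the kernel through the packaging system.
urpmi kernel-2.6.3.15mdk
# 2. Install the matching kernel-source rpm by hand,
#    from a copy you downloaded beforehand.
rpm -ivh kernel-source-2.6.3-15mdk.i586.rpm
# 3. Reboot into the new kernel, then rebuild the proprietary driver
#    from a console with X shut down (filename is illustrative).
sh NVIDIA-Linux-x86-1.0-XXXX.run
```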
Before I get feedback on "how stupid it is to need to recompile in this day and age" or "Run for the hills! Recompilin's here!", let me point out that the Nvidia driver recompile process is menu-driven. Push-button.
I am so anxious that I haven't updated my kernel in a long time. But I need to. Sigh. Wish me luck.
Tuesday, April 05, 2005
Surviving Mandrake 2 - Post-installation blues
I have a lot to bitch about: an OpenOffice bug keeps me from saving my files correctly. Firefox is still not at 1.0 in the repositories and, judging from the comments in the forums, it won't be for some time. I downloaded the latest version and its installer put it inside the user's home directory. All the existing shortcuts still pointed to the old version, so it created a new one on the desktop and I used that instead.
Wednesday, March 30, 2005
Surviving Mandrake 1 - After the install..
After running out of excuses, I upgraded my home PC to Mandrake 10.1. I took advantage of the fact that my memory burnt out and needed to be replaced. I took out the 80GB hard disk that was gathering dust and plugged it in. I had bought it a couple of weeks ago with the intention of using it as an excuse to upgrade. Well, good intentions don't always pan out. It wasn't that I was afraid of Mandrake 10.1. I have been using it on the office PC for some time now and have worked out most of the kinks. But at home, it had grown so comfortable that I felt no great need to move up. It was doing what I wanted it to do. And it was doing it well. The agony of using Linux. Once things work, they just keep working.
But what can you do? You have to move up. For whatever reason. Mine was that eventually most of the new stuff will require libraries no longer compiled for Mandrake 10. Ok, not entirely true. I could still hunt down those libraries, but Mandrake Update has been spoiling me silly. If they don't have it, I usually don't get it. Bizarrely, this could be one of the great conundrums of Linux on the desktop. The shelf life or stability or longevity or whatever it is that makes it great for building systems and locking them down for use in the server room isn't so great for the desktop. Desktop or end-user software moves much faster and changes more often. So the distributions have to keep up.
But can it do so at the expense of the stability needed in the server rooms?
Now there is an argument for desktop-centric distributions. New or stable? Do you want to drive the latest or the one with the best mileage? Or do you have both?
Back to the story.
The installation was Mandrake-standard: quick, simple and clean. At the end of 20 minutes, I had a newly installed Mandrake 10.1 system, all ready to go and be productive. But of course, there was still a long way to go. A couple of pointers:
- Don't worry if the system doesn't detect the Internet connection during installation. It has never done that for my broadband setup. And I have plain vanilla ADSL (PPPoE to be exact). Just configure it afterwards and it'll get it after a reboot.
- Long ago, actually a few distributions ago, I would restart the system immediately after it first started Linux after the installation, just to make sure everything would come up just like after a normal startup. Nowadays, that is no longer needed. When Mandrake comes up after installation, it really starts afresh, not just continuing from the installation session.
- If you do have the time, when partitioning the hard disk, go into expert mode and have it run the extended test on the hard disk. It checks the hard disk thoroughly for bad sectors and the like. You'd think we'd have left that behind by now. It is optional, but if you have the time, you might as well find the errors now rather than while you are working. Linux does a fairly good job of handling them, but on the off chance an error could be fatal, it doesn't hurt. Especially if you are not the type that makes backups.
- Don't run the update immediately during installation. Run it after you boot up the first time, once it has run all those post-installation and run-once-on-first-boot scripts. Those scripts might not get updated during the update or might fail to work with the updated packages. Besides, most of the time you can't get the Internet up at that point anyway.
The trick to MandrakeUpdate is to find a mirror that not only has a fast line but also not too many users. It is the equivalent of gazing into a crystal ball. Short of those of you who actually have the statistics for the usage of these sites and their bandwidth, it is trial and error. If the site you choose is too slow, just go back to the Software Media Manager and remove the update_source repository. The next time you start MandrakeUpdate it'll give you the list and you can choose another one.
Other stuff you have to install.
- Flash player - because the internet would be less pretty without it. Get here.
- Java - It's getting prettier and more useful. The best part is that developers are finally understanding that you can use it just in the background. Get it here.
- mplayer - because kaffeine is great but its audio is a bit too soft. Get it on plf.
- decss - hmmm? like above
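With the plf media registered, the last two items on the list are a one-liner each; something like the following (package names are approximate for Mandrake 10.1):

```shell
# mplayer and the DVD CSS library live on the plf repositories.
urpmi mplayer
urpmi libdvdcss
# Flash and Java are downloads from their vendors' sites,
# so those two stay manual installs rather than urpmi packages.
```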
Friday, January 14, 2005
Still Alive
In case anyone is reading this, I'm just dropping a note to say I am still here and back from the holiday season. I have some things I wrote, but they never reached a point where I'd want to post them. So, if you like my blog, thanks, and I'll be starting back up soon.
Thursday, November 25, 2004
Living with a Distribution
If Linux and Open Source were a religion (to some people it actually is), then a distribution would be a sect. Or, to a lesser extent, a denomination. Not taking everything, but picking and choosing. Exclusive membership. Devotion required and deviation frowned upon. Or at least deviation will get you in a lot of trouble. Let me explain over the next few posts with this title.
Labels:
Linux,
Thinking aloud
Thursday, October 07, 2004
Linux Analogies: Linux as A Property
I have been using the "Linux is like a car" analogy for years. I'll write about it some other time, but essentially it is about the importance of knowing how to take care of your car instead of just driving it.
In my talks with business people who want to use Linux in their business, I found that equating Linux to property helps bring about a better understanding. The analogy goes that if you are building a business, you must have a place to do business. This place is the 'property'. So, if you are building applications to sell to your customers and you build them using Windows as your operating system, then Windows is the 'property' you do your business in. If you choose to build that application using Linux too, it would be like setting up another business in another property. The major difference between the Windows property and the Linux property is that one is a lot cheaper than the other to set up and run. You also have better control over one property than the other, akin to buying one and renting or leasing the other.
Each property brings with it its own requirements. Maintenance in relation to the OS is an example: you need one team for your Windows property and another for your Linux property. Add in support, R&D and pre-sales tech, and the costs of maintaining your business in both properties add up quickly. Unless business in each property is sufficient to justify its own existence, most businesses would close one shop in favour of the other. How a business reaches that decision is another matter. What would you look at? The revenue each place of business generates? Or the actual profit? Today's business or tomorrow's? Profits achieved yesterday or projected profits of tomorrow?
How about upkeep of the property? You do need to refresh your place of business, out of necessity or just for the heck of it. More space for the increased workforce, new wiring for a faster network or a fresh coat of paint to cover the aging process. Which place would you sink more money into, the place that you rent and eventually need to move out of, or the one that you bought? With Windows, MS expects you to move out of that property to another, at a cost. Or else they will abandon you in it. Linux offers the chance to stay until you are ready to move, at your own pace but eventually at your own peril.
Once these businessmen started thinking in those terms, the value of Linux and of moving their business and applications to Linux began to show. You can stretch this even further by pointing out that one property is in a neighbourhood full of people trying to break in, and your landlord is reluctant to put in the grilles, better locks and reinforced doors. As master of your own property, you fully understand that the responsibility is yours. And you don't feel too bad, because it is an investment in your own property regardless of what neighbourhood it is in. (I like to think it is in a friendly community :) )
Labels:
Commentary,
Linux
Saturday, September 18, 2004
Through the looking glass
A friend asked me whether I had Knoppix. I was happy to hand over an ever-present copy. Knoppix is something I keep around in case someone wants to try Linux out. But that was not the case this time. Apparently, someone told him that a problem he had could be fixed using Knoppix. A problem with Windows XP Pro.
Intrigued, I sat down to look at it. After a while I figured that he probably got hit by the Sasser virus and it was now preventing him from logging in and rebooting every time he pressed OK at an error message box. Not good. One of the fixes mentioned on the Internet involved replacing corrupted config files with a copy made automatically by Windows.
So I had this laptop physically in front of me and all I had to do was go in and replace a couple of files. Piece of cake. Cake on my face was more like it, as the hours passed. The first problem was getting full access to the hard disk. Knoppix gave me that capability via Captive. But Captive needed to access the Internet, and Knoppix didn't detect the laptop's network card. At this point I could have slaved away to get Knoppix to detect the card, because I had Mandrake 10 running on the exact same model elsewhere and it detected the network card. It would just be a matter of figuring out which module to load.
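For the record, the mount side of the job is the easy half. A sketch, assuming the Windows installation sits on /dev/hda1 (the device name and paths will vary), and assuming, as I understand it, that on a 2004-era Knoppix a plain NTFS mount is read-only and Captive is what adds write support:

```shell
# Read-only look at the Windows partition, using the stock kernel driver.
mount -t ntfs -o ro /dev/hda1 /mnt/hda1
# The automatic backup copies of the config files live here:
ls /mnt/hda1/WINDOWS/repair
# Copying them back over WINDOWS/system32/config needs a read-write
# mount, which is where Captive-NTFS comes in.
```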
Then it struck me: what if I didn't have Knoppix? What options would I have? Quite a few. A few solutions involved booting from floppies to get to a command prompt. Nothing beats the command line when it comes to fixing a broken OS. Although this was the best way, it involved a purchase. Booting from the XP CD allowed me to go into a "repair mode" which was essentially a command line. But that required the Administrator's password. Despite my friend's assurance, the password he gave me didn't let me in. I didn't say it was wrong. Since the problem involved corruption in the login process, it was possible that access to the password database was itself corrupted.
As I sat and thought about the problem, I realised how different this would be if I were looking at Linux. I don't want to alarm people, but in the Linux world (or even the Unix world), physical access security is everything. The logic is probably that there is no use for all the fancy network security filtering, masquerading, proxying thingamajigs if someone can run away with your hard disk. Or the whole CPU unit. The security experts will tell you that no matter what encryption you put on that hard disk, with enough time and money it will be cracked. So the moral is: don't let someone steal your server or hard disk. Unless you are one of those people who Lojack their hard disk. Hmm... there's an idea.
Windows XP Pro, on the other hand, makes it so hard for a technician (me) sitting right in front of it to crack through the system to fix it. In fact, it was probably easier to break in from remote than to break in right there. Where is the logic in that? Locking the doors to your house and forgetting the key will keep you out of it, but not the people who can stream in through the subway entrance in your house.
With Linux, there is a lot of emphasis on security with regards to network access. For a good reason. My friend's problem? It still isn't fixed because I had to go do other things but I'm going back to the Knoppix solution.
Labels:
Commentary,
Linux
Sunday, August 22, 2004
Turning back the clock
I like to use Linux with the mindset of a user. I am very wary when I realise that I am drawing on my 10-odd years of using Linux on and off to solve problems that would be faced by an ordinary user. This is a line that I mark clearly.
I am saying this because, as Linux goes mainstream, this has to become more important. I believe it is the responsibility of the entities making money out of Linux to put in the effort to ensure that it can be used by the intended users without constant resort to professional help. This is even more so for Linux distribution companies like RedHat and Novell. When more users can use the final distribution, it creates more sales opportunities for them. Let's not put aside the fact that the ease of configuration in Windows is what made it most attractive to businesses. You can shoot yourself in the foot with it (misconfigure until you lose data) but at least you did it with ease and style.
Which brings me to my tale this time. Updates (and all their ills) are handled with relative ease in Linux. With Mandrake, running Mandrake Update and several clicks will update the system. The emphasis on backwards compatibility and on maintaining inter-dependencies usually makes regular updates painless. The most crucial of the regular updates are kernel updates. If you are a business user and use an enterprise version of Linux, chances are you're running with the dinosaurs, kernel-wise. For most of us using Fedora or Mandrake, kernel updates are not that frequent, but still regular. Most experienced users will point out that there is no race to use the latest kernel. Most of the time they are fixes and security updates rather than additional functionality. You can skip an update release or two, but stray not too far.
Mandrake recently released an update called kernel 2.6.3-15. I was using kernel 2.6.3-13. I've updated many times before without a hitch. I have an Nvidia card and its driver is tied to a specific kernel. So after a kernel update, I need to generate a new driver, which means I need the updated kernel sources, too. So after updating the kernel, I downloaded the kernel sources. Having a broadband connection is a blessing.
The Nvidia installer uses the kernel sources to generate a driver. However, the Nvidia installer complained that my sources were not clean. It suggested I do something with mrproper. I followed the instructions without success. Since I was using the new kernel, I switched back to the old one to get to my graphical interface. This was easy because Mandrake keeps a link during boot-up (using lilo) to older versions of the kernel. Very thoughtful. After following suggestions from the web, the problem was still not solved: I couldn't generate the Nvidia driver and I was still stuck using the older kernel. So I decided to make a permanent switch back to the older kernel. Now, this was not common, but it was very easy and over in a few minutes.
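That fallback works because lilo keeps a boot stanza for each installed kernel. A lilo.conf along these lines is roughly what makes the old kernel one menu choice away; the image paths, labels and root device here are illustrative, not copied from my actual setup:

```
# /etc/lilo.conf (sketch; file names and devices are hypothetical)
default=linux            # point this at "old" to make the rollback permanent

image=/boot/vmlinuz-2.6.3-15mdk
        label=linux
        root=/dev/hda1
        initrd=/boot/initrd-2.6.3-15mdk.img

image=/boot/vmlinuz-2.6.3-13mdk
        label=old
        root=/dev/hda1
        initrd=/boot/initrd-2.6.3-13mdk.img
```

After editing the file, you have to run /sbin/lilo to rewrite the boot map, or the change won't take effect at the next reboot.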
Later I wanted to try VMware. This is a cop-out, I'll admit, but sites that use Windows Media Player tend to give me the multimedia stuff much faster, and I wanted to use Yahoo Launch. What these sites do is possible with Linux-friendly players like Real Player, but the developers tend to layer it with so many pages, and Real Player 10 for Linux (based on Helix) still balks regularly at sites that play ads before showing the good stuff. Yes, I have heard of Crossover, and that is next on my list.
VMware says during the installation process that it needs to recompile something using the kernel sources. However, I had installed the kernel source for kernel 2.6.3-15 but I was using kernel 2.6.3-13. As part of its installation, the newer kernel source package removed files from the older kernel source. "So, I'll just remove this version of the kernel source and install the old one," I thought. But after an hour of searching the Internet, I could find only a single source for the kernel source package, a server in South Africa. Apparently, as the new kernel source is updated on mirrors worldwide, the old one is simply lost and never archived. And this site was horrendously slow. Fearing that some other program might need the sources, I hunkered down for a long download and waited. Two rpm commands later, one to remove the kernel source package and another to install the newly downloaded kernel source, I was back in business.
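For the record, those two commands were along these lines; the exact Mandrake package names and file names are from memory and may differ slightly:

```
# Remove the newer kernel source package that clobbered the old files,
# then reinstall the older one from the freshly downloaded rpm.
rpm -e kernel-source-2.6.3-15mdk
rpm -ivh kernel-source-2.6.3-13mdk.i586.rpm
```

The nice part is that rpm refuses the removal if another installed package declares a dependency on the sources, so the worst case is an error message rather than a broken system.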
To be frank, the problem with the newest kernel source could have been fixed fast enough if I were familiar with building a kernel and the associated processes. But then it probably would have taken more time and crossed the line between the Linux user and the Linux professional.
The good thing about Linux is that everything is exact. How parts interact with each other is well regulated, by convention or by virtue of creating packages or packaging against a specific distribution. Although it took some time, rolling back from a semi-faulty package was not too hard, even for something as complex as a kernel update.
Wednesday, August 11, 2004
The Pleasant Surprise
So I've settled on Web-based educational sites that normally run Flash. I created an account for each of my kids and put their faces as the KDM icon. Next, I created shortcuts on the desktop for Mozilla that open each of the sites. The sites I find attract my kids' attention the most are PBS Kids, Sesame Street and Playhouse Disney Channel. OK, so my kids are not that old yet. All the more reason to get them to know Linux. Then I had to repeat this for the younger sibling on his desktop.
(Which got me thinking that this kind of setting up of user environments in Linux is not explored thoroughly yet. Think of Novell's ZenWorks for Desktop for Linux, or ZenWorks for Linux Desktop.)
The next problem was that I had set the screen resolution to 1024x768. Now, this is OK for me, but the sites above came up quite small. And when the Flash activities came on, they became even smaller. My mind was racing at the thought of setting up individual X config files for each of them. I haven't tweaked X config files in ages. Being the lazy type, I began mucking around and, lo and behold, you can set up individual screen resolutions per user in GNOME 2.4. What a pleasant surprise! I thought I had gone over GNOME setups over and over again, and yet I never found this. So, like my kids, I am learning something new every day. Maybe I should rename this blog 'The Linux Adventure'.
I don't know why that config option affected me so much. On one level, I felt that someone had read my mind and made that feature. On another level, I felt beholden to the person who decided to put in that feature. I felt so grateful to the people who put their own time and effort into making Linux and free software great. I guess I can honor them by making sure the next generation starts using Linux and making choices.
Monday, August 09, 2004
LinuxFormat: Required Reading Material
There is only one Linux magazine I buy religiously: Linux Format.
Linux Format should be required reading for people who are interested in Linux on the desktop. For once, a magazine addresses the issues of an average user (and not the user with an average IQ of 250). If you want to see what magazines looked like in the early PC days, this magazine has a few similarities. But more importantly, this magazine embodies the same attitude the PC magazines had in those days. The days when PC user groups were places to get the latest info and swap war stories (read: trial and error, or "my PC blew up and I survived to tell the tale"). The magazine even lists (primarily UK) Linux user groups at the back. That brought flashbacks.
This magazine is also easy on the eyes. Unlike other mags of its ilk, where every possible space is crammed with words, Linux Format uses space and graphics well. And I don't mean diagrams, or pictures of product boxes, or pictures of other users and people of note caught in most unflattering flash photos. There are actual graphics that have no relation to the article other than to make it look nice. Imagine that!
(Note to other Linux magazines: it's OK to look nice. Just because your readers spend all day looking at code doesn't mean they'll appreciate a magazine formatted to look like more code.)
I don't care that much about the CDs or DVDs that come with it, because I'm on broadband and can get at the programs or links mentioned in the magazine much faster. But if you aren't, they are a great resource, especially when they include distros.
There is also the companion TuxRadar website and the wonderful TuxRadar podcast. Here, you get to hear the writers from Linux Format talk about the Linux issues of the day, sprinkled with a very UK-centric view, which is a refreshing change from the US-centric view I usually get from other sources.
I read other Linux magazines like Linux Journal and Linux Magazine (the US version, not the UK version that seems to be machine-translated from German). But I tend to pick them off the discount rack (they are that expensive). And they tend to still be there when I pick them up. But with Linux Format, I have to be on my toes when the month rolls over to grab my copy or I'll miss out. I have staked out the quality bookshops that carry it and even have the phone number of the local distributor so that I can bug them about when the next issue is going to be out.
Why don't I subscribe? Where's the fun in that? :)
Labels:
Linux
Friday, August 06, 2004
Splitting Linux
Before those of you who know better howl at me, let me get this straight: I understand that a Linux distribution is not Linux.
But the reality is that people tend to think of it that way: that Linux is a Linux distribution. For a long time people never understood why I wasn't using RedHat like everybody else. I wasn't using it because I understood that I could go to any Linux distribution and switch back later, as long as I was on the Intel platform. For a time, RedHat was Linux. I still think that some of the business they get is simply because the users don't know about the other distributions or haven't bothered to take a look. From now onwards, the term Linux will largely mean Linux distributions.
Today I installed Fedora Core 2 (FC2). Now, I use Mandrake on my desktops and a mix of other distributions on the servers. My RH8 servers were looking long in the tooth and I figured I needed to refresh them. So I got the set of FC2 disks that came with this month's Linux Format and burned disks 3 and 4 from the Internet. As I was configuring the servers, I became acutely aware of how different it was to configure Linux under FC2 than under Mandrake. As with all of my machines, I installed Webmin to help me administer them. That got me fiddling with the package managers, which demonstrated how different the approaches of Mandrake and FC2 were when it came to package management resources. Basically, I was finding it hard to find any of the programs that helped me configure some of the stuff, most of the time the programs I was used to on Mandrake, but not entirely. Some programs that I assumed were part of KDE were also missing or quite hard to find.
It then dawned on me that I was not only using a different distribution, but a distribution meant for someone else. I was so used to Mandrake, a distribution clearly gunning for the desktop, that I was upset with Fedora for not doing enough to make it as easily configurable as Mandrake. It was finally OK with me because I realized that RedHat/Fedora were really about servers. Both distributions were catering to their constituency and its unique requirements and environment. For example, with a server, configuration is a one-off thing and the people using them are expected to know what they are doing. On a desktop, the configuration can change quite often as USB devices are plugged and pulled. Plus, most of the time people don't know or care what they are doing to the computer as long as they can use it.
It is important that Linux distributions become adjusted to the environment they will live in. This leverages Linux's flexible nature through the choice of programs included as part of the distribution, which includes configuration tools. By adapting to the environment, Linux addresses the unique environment it will function in, be it the server or desktop of today or the embedded device of tomorrow. It is very much a version of tolerance within the space of Linux technology.
Thursday, August 05, 2004
Planning as part of life
I'm switching over the ISP connection to higher bandwidth, which also means that I'll be switching IP addresses. This also means the slightly tricky business of moving the DNS server. I'm hosting my own DNS for some strange reason, and I have to inform the NIC involved that my domain's DNS server has changed IP addresses. The tricky part comes in because I have only one DNS server, but I have to move it and keep the older DNS settings alive for a while. Thank goodness Linux is so 'not demanding'.
I took out an old Celeron PC with 32MB of RAM(!), installed Linux on it and installed the bind package. Within an hour or so I had another server ready to do some light work. I configured it so that the Celeron DNS would host slave zones of the main DNS server for a while. Essentially, it gets the settings and information about the zones from the main DNS server. After a while, I converted the slave zones to normal or master zones. I configured the Celeron PC so that it took over the IP address of the main DNS server. The main DNS server was then moved to the newer IPs. Finally, I informed the NIC and they'll update their records accordingly.
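In bind terms, the handover boils down to flipping a zone stanza in named.conf from slave to master once the zone data has transferred. The domain name, file paths and address below are made up for illustration:

```
// Stage 1: the Celeron box pulls the zone from the main server
zone "example.com" {
        type slave;
        file "slaves/example.com.zone";
        masters { 192.168.1.10; };   // hypothetical address of the main DNS server
};

// Stage 2: once transferred, promote the zone to master
zone "example.com" {
        type master;
        file "slaves/example.com.zone";   // reuse the transferred zone file
};
```

Because the slave writes the transferred zone to disk, the same file can serve as the master copy after the switch; only the stanza changes.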
What surprised me was not that I was able to set up another DNS server in record time, but that I spent more time planning than actually doing the job.
That is something I credit to using Linux: the tendency to plan stuff out ahead. Linux is extremely powerful and offers 1001 variations of everything, so choosing what to do is very important. It's not enough to decide to do something and just do it. Ensuring that minimal disruption occurs, and that little of it is noticed by the users, is as important as achieving what I set out to achieve. Making sure that things can be undone, if the goals achieved are not actually what they seem to be, is also very important.
The overall concern shifts from "Can it be done, and what tools do I need to buy to do it?" to "How can I do it using what I have or can download from somewhere? How can I do it quietly, without people feeling a thing?" That may not be a paradigm shift, but it sure is a shift for the better.