I've been putting this off long enough. I am a long-time Mandriva user from the days of Mandrake. Not exclusively, of course. It's the distro I use at home and on my laptop at work. I promote it to novices and seasoned Linux users alike. I think I use it because it appeals to the lazy part of me. I get things done with little or no hassle. No fireworks. Not too much bling. Not many surprises.
Ever since it was forked into Mageia, I knew I would soon have to choose. But since there was nothing concrete from the fork, I waited. Then Mageia 1 came out. I waited some more. Now the new Mandriva is out. The updates are getting fewer and farther between for my Mandriva 2010 Spring. Normally, this is the time when I turn on the backports repo and feed off that while I read the forums about possible show-stoppers. There weren't many in the last few releases, so I don't expect much in the way this time.
I guess the real question is: how do I choose? I use Gnome on Mandriva, so there is the Gnome 3 choice to make as well. After see-sawing back and forth, I decided to burn LiveCDs of both to kick the tires a bit. Only Mandriva doesn't have those anymore. And they only support KDE, with a new tablet-like interface.
The Mandriva upgrade instructions look frightening, largely because the English is confusing.
"When you use --download-all option urpmi will download all the packages first and then begins to install all of them. It is strongly recommended option for migration to a new release with urpmi. It is used to provide reliable update, you need to download and update a lot of packages. If you do not use this option and during update process you face Internet connection problems, you will get a very bad situation when only part of system will be updated, that will result in problems with correct system working."
I'm having nightmares about my former Russian math tutor repeating that to me again and again. Signs of influence from the Russian investors?
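Deciphered, the procedure itself is fairly mundane. A sketch of a typical urpmi release migration, assuming the official mirrorlist shortcut resolves for your installed release (it usually does on a stock setup):

```shell
# Point urpmi at the new release's repositories. urpmi expands
# '$MIRRORLIST' itself, so keep the single quotes.
urpmi.removemedia -a
urpmi.addmedia --distrib --mirrorlist '$MIRRORLIST'

# Download every package first, then install, so a dropped
# connection can't leave the system half-upgraded.
urpmi --auto-update --auto --download-all
```

The --download-all flag is exactly the safeguard that paragraph is trying to describe: with it, a failed download aborts before anything gets installed.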
Not everything is going against Mandriva. There is no PLF in Mageia, so I'll need to figure out how that works out for me.
This is one of those times I want to shout, "I HAVE A GREAT SYSTEM. I CUSTOMIZED IT AND IT WORKS FOR ME. WHY DO I HAVE TO KEEP REINSTALLING AND STARTING OVER?" Especially with things like Wine setups lying around, an upgrade means reconfiguring those all over again.
I find it funny that I'm complaining about change, especially as someone who has probed monitor refresh rates to configure X (look it up, kiddies; there is no more spectacular way to turn your monitor into a paperweight). Change is why I don't have to do that anymore. But I think we have reached a plateau. Hopefully my choices won't lead me off the cliff at the edge of that plateau.
Steve Jobs and Bill Gates at the fifth D: All Things Digital conference (D5) in 2007 (Photo credit: Wikipedia)
That is the question on a lot of tech pundits' minds. I've followed Apple news since the 80s, having started on the Apple ][e. The best way to imagine what Apple could look like is to look at how other companies have moved beyond their original or influential founders. The stories vary greatly.
There is that other company that Jobs left, and it worked out OK: Pixar. He believed in having good people around in a company. His hiring of Sculley from Pepsi in Apple's early days is an example of that. He refined this belief further at Pixar, where he built a team that took it to great successes, stayed hungry, and welcomed (and looked for) change. I remember John Lasseter's comment in the mid-90s on how they at Pixar loved the fact that Jobs was getting busy back at Apple. They now effectively control Disney, with John Lasseter heading Disney Animation Studios and Jobs the largest single shareholder via Disney's purchase of Pixar. Lesson: making a company great is a team sport.
Jobs has left Apple before. Contrary to the popular belief that he was fired, Jobs left of his own accord. Jobs wanted control over the direction and results of his vision; the board was worried about how much it would cost. We all know what Apple's leadership went through after that. Guy Kawasaki puts it best in the documentary Welcome to Macintosh, where he says "everybody wanted to be somebody else". He probably meant each wanted the others' success and tried emulating them. One CEO, Michael Spindler, wanted to sell Apple to Sun or IBM. All that while, I believed that the only way to fix Apple was to bring Steve Jobs back. Not that the Apple faithful didn't dream of his return in those times. All of us were right. Lesson: Apple is an expression of Steve Jobs's vision of looking at and creating the future, instead of just looking at the balance sheet.
Another company with iconic founders, Hewlett-Packard, began life in a garage, much like Apple. In fact, they were the original home-garage computer company. Bill Hewlett and Dave Packard built it into a multi-billion dollar company. They enshrined their beliefs in management as "The HP Way". However, I feel it was largely abandoned after the tech bubble burst in the late 90s, and HP began to lose its way as a technology leader. It became yet another computer company with many interests, with no distinctive features other than its corporate maneuverings, sales forecasts and stock price movements. The exception is probably printing, where the brand is strong and the products are respected. HP is now at a crossroads, with a CEO clearly looking for a buyer for its 'low-profit' division, despite saying otherwise. HP is a company lost not because of itself but because of leaders who have decided to let the numbers do the leading. Their most recent move, looking to sell the PC division, is purely financial. And their customers can see right through it. They are worried about their support contracts, their investments in technology and, more importantly, the people at HP. The question on their minds is: "Will the HP of tomorrow be the same HP I'm talking to today?" Lesson No 1: anybody can have a vision and be visionary, but the real question is: what is that vision? Lesson No 2: in the pursuit of profits, don't leave your customers behind.
Apple's current management is well suited to continue Steve Jobs' legacies. But soon it will need a new visionary leader. Someone who is committed to the values of Apple and is surrounded by a team willing to follow.
You see, Jobs's deal is that he wants to change the world. He wants to change the world by changing how people feel. He affects how people feel by changing or controlling how they interact with their world, whether the experience is visual or through touch. He believes that by making the experience of interacting with Apple products "revolutionary" and "magical", he makes people feel good, which positively affects their world and the world in general. He understood that while the computer can do useful things, it is also a useful tool to impress people. Those who are impressed will go and buy the same computers. So computers need to be useful and impressive. Now that's vision. Making a buck along the way is not bad, either.
Apple will survive after Jobs, it's just not going to be this Apple.
You know you have been in the game too long when you see things happen twice. Or more.
When I heard about what HP did with the WebOS tablets and their future 'direction' for it, I was amused and upset at the same time. Amused because the way it was announced speaks volumes about the decision makers themselves rather than WebOS itself. You can bet it was no knee-jerk reaction. This was planned for some time by those who opposed HP buying WebOS or did not see its value. They were just waiting for an excuse. How else to explain the suddenness of the decision? Why else would you talk smack about something you wanted to sell? "My car is crap, would you like to buy it?" Normally you would talk up its good points and try to get the best value from the sale. When you say bad things about something you want to sell, you want it valued low enough that the reluctant buyer sees it as a bargain instead. When the value is low enough, people who would not normally buy something like it may be tempted to do so.
I am upset because WebOS represents good technology. When the tablets came out, WebOS got favorable reviews. Some reviewers did complain about some rough edges but forgave them because the tablet was a first model and bound to have growing pains. They expected that HP would work out the kinks in the next model. I was looking forward to picking one up.
But history is littered with good intentions and great technologies. The most analogous example I can think of is PC/GEOS. For the briefest of times, PC/GEOS and DR-DOS represented a strong challenge to Windows 3.0. PC/GEOS, later GeoWorks, was a graphical desktop environment that was advanced for its day. GeoWorks came with a word processor, graphics editor and communications software. It had a Motif-like UI with advanced features such as scalable fonts and Postscript support. It provided multi-tasking, tear-away menus and an advanced API. The API provided services for almost all of the basic functions of desktop software. The word processor was about 25KB because almost all of its functions were system calls. And since it was the early 1990s, it worked well with only 640KB of RAM. Yes, the early 1990s. Windows 3.0 still had bit-mapped fonts. Some credit it with pushing Microsoft to release Windows 3.1 just a year after 3.0, mainly to add scalable font support (it still used bit-mapped fonts for the OS).
GeoWorks was a lost opportunity to put forward an easy-to-use, technically superior system that worked on existing computers. It was also a lost opportunity to make using computers less about knowing computers and more about getting work done. The two biggest gripes users had about GeoWorks then were that the word processor didn't do tables and there was no spreadsheet. People had little problem using it because it was very stable. In short, it was also a lost opportunity to put applications before operating systems.
WebOS, by most accounts, is a system that could offer a choice other than Apple and Android for tablets. Competition is the key to keeping innovation humming. Apple has already chosen to litigate while it innovates. The iPads are also still not considered enterprise-ready, while Apple doesn't care about the enterprise. Android will always be struggling to keep a balance between openness and security. It also has to balance between apps running locally and depending on the cloud to deliver productivity. WebOS could be that middle ground between flexibility and security, offering fewer apps, but apps that just work out of the box and aren't afraid to live on the box. All the while continuing to push the OS into the background. Which Microsoft can't and won't do.
GeoWorks was a great product that didn't gain prominence because it couldn't compete against Microsoft's business practices back then (which effectively made PC makers pay for Windows on every machine shipped, regardless of whether Windows was bundled or not). This was a time when people still ran other graphical operating systems on PCs and Windows was still version 3.0. GeoWorks didn't do disk operations, so it still needed MS-DOS or DR-DOS; it wasn't as if it was cutting into existing DOS sales. It also failed because it was hard and expensive to develop for. Sales were so bad that the company behind it later looked to revenue from sales of the SDK to keep itself running, a move we now know is suicidal. This created a catch-22: people won't use it because there are no apps, and developers won't develop for it because there are few users.
At first I ran GeoWorks on my PC, but I eventually moved on to Linux 0.99pre12(?) and bought an old 640KB laptop (with a lead-acid battery!) to run GeoWorks. Printing was a snap because I printed to a file using the Postscript printer driver. I would then pipe the file to a Postscript printer for output. Sweet.
In the end, GeoWorks became an ultra-niche product and a promise of better computing unfulfilled. Don't believe me? Try it yourself: guess when you think the OS came out, then tell yourself it came out in 1991.
Webmin is probably one of the best-kept secrets of sysadmins everywhere. Everybody uses it but rarely talks about it. Fewer still admit to using it. Why? Because it makes the difficult config jobs point-and-click easy. It turns what seems to take countless command line invocations into a few clicks of the mouse. That is probably why it's an open secret: it takes away some of the mystique of being a sysadmin. Managers, if they knew, would demand faster turnarounds. But it still needs you to know what you are doing.
Basically, Webmin is a web-based config front-end for your system. I recommend it all around. Even if you run your own personal Linux desktop, I recommend installing it. Even if you are Mr. Security Conscious, install it and configure it so that you can only access it from localhost. Because it provides something more valuable beyond what it does on the surface. I'll get to that in a minute.
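For the security-conscious, locking Webmin down to localhost is a one-line change. A sketch, assuming a default install that keeps its web server config in /etc/webmin/miniserv.conf (the path can vary by distro):

```shell
# Allow connections to Webmin only from the local machine.
# 'allow=' in miniserv.conf is a space-separated list of permitted hosts.
echo "allow=127.0.0.1" >> /etc/webmin/miniserv.conf

# Restart Webmin so the new access rule takes effect.
/etc/init.d/webmin restart
```

After that, you browse to https://localhost:10000 as usual, and connections from any other address are refused.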
Webmin is a collection of server-side scripts, separated into modules, that run local commands to configure your system. It throws up webpages that recast the various command line options of the commands that configure one component of your system. Each module corresponds to a particular component or service. Some offer interactive tools, like access to a Java-based file manager. It covers everything from boot-up and boot services to server services like Samba and DNS.
It hides the nitty-gritty and allows you to focus on the decisions, both technical and managerial. I have used Webmin for a long time, I think over a decade. I've seen its growing pains. Its epic battle over configuration control with SuSE (one of the reasons I stopped using SuSE regularly) was an example of how much respect developers should place upon users. SuSE, at boot time, kept overwriting standard configuration files (which Webmin modifies) with values from its own config file. It chose to favor its own config files over what the user had chosen. It was the first step towards a registry-like model, and users voted otherwise.
Some things still don't work great, like Samba. But other than that, it is rock solid. It hides the nitty-gritty so well that I used it briefly to manage a Sun server. The responsibility was thrust upon me when someone foolishly bought a Sun server because "it was what the vendor uses". The big deal was that it was to run a database (for which there was a Linux version available). To Linux users, Sun is different and the same. It has different device naming conventions and a slightly different service startup mechanism, to name a few. But it is the same because it is Unix. So I installed Webmin for Solaris and was able to manage the machine even though I almost never went to the command line. Manage users, assign resources: Webmin did all I needed.
But the truly valuable service Webmin gives sysadmins is the time to plan and think. When you are pressed by a deadline or users are breathing down your neck to fix a service, Webmin offers an overall view of the command options and simplifies them to clicks, freeing you to come up with solutions and make decisions. Rather than focusing on, and getting tripped up by, command line options, you can focus on what is possible and choose what is best, knowing that Webmin won't let you send the wrong options because of typos. Less time worrying about that means more time to think. And contrary to what some people think, thinking is a good thing.
When did Apple become the Man?
Apple has always been protective of its copyright. Some of us remember the Interface Wars with HP and Microsoft in the early 1990s. But they have been smart about it. They have been protective of their copyright but generally shared their innovation. They popularized 3.5-inch floppy drives, Ethernet, laser printers and Firewire, to name a few.
But their lawsuit against Motorola and their injunction against Samsung's Galaxy Tab smack of fear of competition. Is Apple so insecure about the post-Jobs era that it will do anything to milk the most out of what it has now? Apple is respected for its style and design, quality and innovation. It can always manufacture the first two. But is it running out of the third?
Just to follow up on the last post where I touched on the growing prominence of Linux on the Consumer Computing devices:
The news that Nokia is not going to sell the N9 in the US yet should not be surprising. What is surprising is the decision to stop selling phones there altogether. The situation is likely this: they don't sell a lot of smartphones (Symbian phones) in the US. The market segment they are huge in everywhere else, feature phones and basic phones, is not selling in the US and is being eroded by smaller phone makers who sell their phones cheaper. Maybe they looked at what IBM faced with the PC and decided to take that critical step earlier.
IBM sold the original PCs but was later outsold by the "PC clone" makers. They eventually lost money on the business but kept it around to foster brand recognition. Nothing says your business is successful like having an IBM PC on your desk. It also kept them visible to the decision makers who would decide on the more profitable server and services business. Don't get me wrong, those machines were no pushovers. IBM means quality and it shows. These decision makers can't see or touch the servers they bought, but using a quality IBM laptop makes them feel connected somehow.
It seems that Nokia is not waiting for that. They are losing money already. But making their brand disappear feels like putting all their eggs in the Windows Phone basket. Microsoft loses nothing either way. If Nokia wins and starts selling loads of Windows Phones, Microsoft makes money. If Nokia keeps losing money for another year despite the Windows phone, MS can pick them up for a song, positioning them squarely against Apple. Why not?
A possible success of the MeeGo-powered N9 could derail this. They probably had to release the N9 because it was so far down the production pipeline. It's not as if its appeal would be a surprise. Their previous Linux-based non-phones were a hit among the tech crowd. The N900 showed promise. But if the N9 improves on that and the result is a more polished, consumer-friendly experience, it would be trouble not only for Android and the iPhone but also for other Windows Phones. Would they continue making a popular phone, or would they compete against themselves by having both the N9 and the Windows phone? You have to wonder how much Nokia is listening to MS (remember, this is not like MS "helping" Apple). Samsung manages that balance just fine. They have phones for every segment, Android, Bada OS, basic phones, and they are making money. Even review units of the N9 are not available to the press. Nokia says they are reviewing market by market. They were surprised that instead of the focus being on the hardware platform, which they wanted to highlight with the N9, the entire phone caused a stir. Missing marketplace or not, Flash in a phone solves a lot.
My guess is that it'll be released after the Windows phone, to little press or in markets that cannot afford it, and be killed off quietly. The march to MSNokia continues...
Like it or not, Linux as we know it is changing. With the rise of the iPad, the face of the interaction between a user and their computer has changed significantly. The interface is simpler: touch interaction, full screen and instant-on. The interface between people and their machines has always been evolving. This most recent wave of change is significant for Linux because it sees the goal of getting Linux everywhere being realized, but at the cost of the Linux desktop.

For years, the battle over the Linux desktop has stalled when it comes to MSOffice. The central role it plays at the office makes it a key target of any effort to implement Linux on the desktop. While OpenOffice is a choice, I have seen many efforts fail because OpenOffice was either too buggy or too MSOffice95. The counter to this was the rallying cry of "focus on what you want to do, not the applications". To make it more palatable in the workplace, it was modified to "focus on producing work, not the tools". Even though users mostly use the office computer as a super-typewriter, touching only a minority of the functions in MSOffice, they still demand it be installed, if for nothing else, familiarity. That, coupled with office politics and brand-consciousness, together with OpenOffice's own failings noted previously, halted most conversion efforts. Without weaning users off MSOffice, Linux desktop efforts have either been forced to take the "tech users only" road or to give business to Codeweavers for their brilliant CrossOver Office. This has been made worse by offices switching to web-based office systems such as Google Apps. MSOffice's resource appetite has driven users to these kinds of solutions, bypassing OpenOffice, the junction to desktop Linux.
I believe fundamentally that the relationship between the majority of users and their computers has changed. I am not old enough to be from the generation that had to build their first computers from kits, but I have built and restored machines of that age to appreciate what personal computing might have been like then. If you trace the evolution of computers in the home, you could draw a line from the first home computers for hobbyists, to the personal home computers, to the home office computers and laptops, to the iPads and tablets (and in some cases, smartphones too). At each step of this evolution, the user is still someone who wants to use a computer at home. But that person is no longer the tinkerer of yesteryear. They are not interested in how the computer works. That person now just uses the computer at home without even thinking about it. They don't think about using the computer; they think in terms of reading e-mail, using Facebook and watching YouTube. Call it consumer computing. And if the past is any indication, those tools and concepts will start appearing at offices as users demand tools familiar to them to be productive.
In short, the place of the OS in our conscious thinking about computers is almost gone. Think about it. Windows was about hiding the command line. X Windows, KDE and Gnome were about that too. Browsers, HTML, Java and Flash gave us information and interaction within a window, obscuring the OS further. Now not only do we not see the window, we don't even see the OS. It all falls away as we focus on using the computer for whatever we want to use it for.
Will the choice of OS no longer be relevant? Will that work in Linux's favor? I think yes and yes. Look at the Linux underneath Android. Google touts the Linux connection to get the tech guys' buy-in, but soon enough it won't matter and Google won't mention it anymore. iPad users don't care that it runs iOS. They care that it runs.
When it comes to running stable for longer, Linux is already there. There is an opportunity here. The 'fall' of the OS's importance is an opportunity for consumer computing solutions running Linux. Key to any success is apps, and Linux has quality apps in spades. Nokia (soon to be MSNokia) bailed on MeeGo, but don't count it out on tablets yet. Intel is hungry for the tablet market and may pull off another netbook-like push with tablet reference machines running MeeGo (followed by hordes of clones from China). Ubuntu is there with its Unity interface. All it needs is compatible hardware. Not to mention other efforts to make Linux work with multi-touch interfaces. Each effort represents potentially more Linux everywhere.
Did Microsoft buy Skype for itself, or was this a clever way to move funds out of the US? If you don't know what the latter means: Skype is registered in Luxembourg, which has a more favorable tax rate than the US. So there is speculation that the high price for Skype is partly to save on taxes. It did seem odd that Microsoft bought it outright and made it a unit of Microsoft, instead of investing in it. Skype has a very strong brand and could stand on its own. What will happen to its existing branding agreements? Will we start seeing Skype phones from the Microsoft Hardware division?
The alleged tax reason might be a side benefit, though, given that Skype can play a prominent role in establishing an on-line office business suite. In the beginning, solutions like Google Docs teased the possibility of an on-line office suite. Microsoft responded with Office Web, basically providing the same functions you would get from the desktop suite. Adding Skype can push this further by adding communication. If Skype were tightly integrated with the other MS communication tools, Sharepoint and Exchange, it would allow businesses to communicate documents with their customers, bringing them closer and extending their reach at the same time. Who wouldn't want to be able to call their supplier for free? That customer whom you prefer to talk to but who costs more in long-distance charges; well, now he is a click away. How about a supply chain tool using Office Web? Not sure what that invoice is for? Click on the person who signed for the corresponding delivery and talk to him.
Let's view a scenario. A supplier and its customers are running Sharepoint and Exchange. The two Exchange servers talk to each other over the Internet and exchange info on their users, including Skype accounts/numbers auto-generated by the Exchange server as part of the user creation process. Now, when the need to talk arises, a click will not only send an e-mail but may include a live invitation to talk that shows whether the sender is online and accepting calls. A supplier adds the users to a customer community powered by Sharepoint, and they can all optionally share Skype numbers. Meetings can now be scheduled via Exchange and powered by Sharepoint and Skype. Now bring in Office365, and that offering goes over the cloud, lowering the barrier to entry with a pay-as-you-use model.
This puts Citrix, which runs GotoMeeting, in Microsoft's crosshairs. Both companies have a close relationship, primarily through cross-licensing for Windows Terminal Server and MetaFrame. Or, more accurately, MS strong-armed WinFrame from Citrix to become Windows Terminal Server. What would be worrying is if Microsoft built a unifying directory service (that runs on a cloud, of course) tying their Exchange and Skype users worldwide. How many companies would then prefer to Skype rather than pick up the phone?
Update 16 March 2014: This no longer happens to me when using Mageia 4. But having the option to print to file is always nice to have.
One great thing about Linux is that its components work well together. Even when they don't, you can always use how they work together to get what you want, at least in part.
My toddler was asking for a coloring picture of Elmo, the Sesame Street muppet. The official site at www.sesamestreet.org didn't have one, so I went to the Sesame Street section on the PBS site at kids.pbs.org. Both of them were basically Flash programs. Not pages with Flash elements, but pages with probably one big Flash element each, linked to other pages with the same structure. I think this is a sort of workaround to get a Shockwave-like experience without the memory drain that Shockwave is on Windows. There is no Shockwave for Linux, for whatever reason.
Anyway, I found what I was looking for and clicked on the Flash control to print the picture. A CUPS pop-up came up. Now here is another interesting component. Acquired by Apple, CUPS was the elixir that solved so many of the problems lpr had with inkjet and non-Postscript, non-PCL printers. I credit CUPS and HPLIP with ending printer setup and printing issues on Linux, essentially taking the drama away. Really smart on HP's part: keeping old HP printers printing means more ink cartridges being sold. And Linux guys keep things running for a long time.
The thing about CUPS is that it prefers to work in the background. It lets the user-facing part be handled by the OS. So when I clicked on the icon to print the picture, Mandriva popped up a different print dialog than I would normally get. This dialog did not offer the "print to file" option. I wasn't concerned initially because I wanted to print to my inkjet. But after clicking on the inkjet and Print, the printer did not print out Elmo. I suspected there was a problem in the hand-off between Flash and CUPS/Mandriva. I looked at the print queue and there was a large job just waiting.
My go-to strategy for stuck jobs, which are usually a driver problem, is to use the print to file option. This creates Postscript output in a text file, which I then convert to PDF. I can then print it again on another PC in the house or somewhere else. Since Postscript is a printer language and PDF is based on Postscript, all of the kinks related to printing will have been worked out, and the printer driver basically just has to print an image.
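The convert-and-reprint steps are one-liners. A sketch using Ghostscript's ps2pdf and the CUPS lp command (the file names here are illustrative):

```shell
# Convert the print-to-file Postscript output into a PDF.
ps2pdf elmo.ps elmo.pdf

# Send the PDF to the default CUPS printer (use -d to pick another queue).
lp elmo.pdf
```

Because Ghostscript does the heavy lifting of interpreting the Postscript, the printer on the other end only ever sees a clean, already-rendered document.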
But the dialog didn't offer me the print to file option. I looked on the Internet and discovered that there was a separate print-to-file printer definition, called CUPS-PDF. It still uses the Postscript printer driver on the back-end; it just deposits the resulting file on the Desktop. I installed the driver from urpmi and printed Elmo again. I checked the Desktop and the file was there. Or so I thought. It was indeed Elmo in Postscript format. I converted it to PDF and printed it in no time. Total fault-to-solution time: 10 minutes.
Upon further inspection, the file on the Desktop wasn't actually finished. But it was enough for me to create the PDF and get what I wanted. It seems the Flash code to print the picture passed enough info to the printer driver to print the first page, but it never sent the code to say the document was done. It assumed the printer driver would just take the end-of-page marker and send the job off to the printer. But CUPS, being a good print system, dutifully waited for the end-of-document marker in vain.
So not all things in Linux work well together. But even when they don't, Linux stuff offers other ways of getting what you want. And that is all I need.
I still maintain a Vista partition on my PC for my kids' stuff, or the stuff they bring back from school to run. These would be educational CDs and the like. Also, my support contract says I have to keep Vista there to enjoy the 3-year on-site support (which I have used, BTW: 2 motherboards replaced FOC). But we use Linux 99% of the time. I figure that if I expose them to it now, their perception of computers won't be limited to the Microsoft world.
But recognizing the possibility of needing Vista for whatever reason, I maintain the partition and I maintain Vista. This means periodically logging in and updating Windows, Flash, Java, OpenOffice and the VLC media player. What prompted me this time was that I wanted to move from OpenOffice to LibreOffice. My other Windows PCs have it already, so it was more about leveling the playing field, making sure I have similar programs on all the PCs in the house. It had been a while since I used Vista. So much so that I was also installing Chrome this time around.
Sometimes I wonder whether I am denying my children access to their educational software by defaulting to Mandriva. There is this great math tutorial program and an interactive language learning kit. If they ask for them, I'll boot up Vista and set things up. But they don't mind, and I seem to be getting better mileage from Flash demos and YouTube tutorial videos on the Internet anyway.
During the update everything went well except for updating Flash on IE. I went to the Adobe website and clicked on the button to download the latest version of Flash. It downloaded the Adobe downloader, installed it and executed it. It then threw up an error window saying it was unable to get the correct parameters. I figured the downloader was having problems with the Internet link. I checked, and the link was fine. So I followed the troubleshooting link from the Adobe download page.
Basically, it recommended that I stop every single program I could think of that runs Flash and then run the uninstaller for Flash. Well, that's great. Even Adobe has little faith in my ability to figure out, by looking in the taskbar, which apps are using Adobe Flash and locking the Flash files. Why? Because it recommended that if it didn't work, I should try again because I probably missed a program. I humored Adobe for a while, but uninstalling and reinstalling the downloader didn't work.
So off to the Internet we go. I found some highly rated advice which told me to download a file from the Windows Resource Toolkit, plus a command file for the toolkit to use, which removes or fixes Flash-related stuff. What is rich is that the command file is from Adobe. So I tried that, and yet the dreaded "unable to obtain correct parameters" error came out.
This was getting ridiculous but reminded me of how lucky I am to be using Linux. Even with Flash and its installation instructions, which divert you to the command line, that routine has worked OK for years (don't get me started on the similar Java installation). I realised that my problem wasn't with installing Flash, it was the downloader. It was acting as the gatekeeper to getting Flash. In reality it was nothing but a billboard. So after looking around I found a link to get the installers directly.
The brouhaha in recent years about Flash and Apple's refusal to use it (not for technical reasons, I'm sure) seemed to me like a case against progress. Not supporting Flash is a deal breaker when all sorts of sites use Flash just to get past the home page. But with Apple's clout and the popularity of iPads amongst senior management, a lot of sites have had to provide Flash-free alternatives. After this episode: good riddance to Flash, and let's move on to HTML5.
As you may know I am a Mandriva user. More hardcore than I thought, I discovered today.
I am lucky because I have a padawan now. Eager to learn but patient enough not to bug me all day long.
So the need was to log onto the desktop remotely. Not just access it but use the desktop. Mandriva has this tool called rfbdrake. It provides a one-stop interface for remote access, both going out and setting it up. Basically it calls on rdesktop to connect to Windows boxes, VNC for Linux boxes, and uses RFB to share out the current desktop. Not to be confused with the brilliant remote access tool on SuSE which spawns vncserver to provide remote desktop access from the point of login. This is much more pedestrian. Just share what I am seeing. Problem is, I couldn't find it with urpmi or in the Software Installer.

Now I had procrastinated for some time on fixing a problem on that workstation which prevented some updates from being completed. Since both of my problems could be RPM related, I finally set aside some time to do it. The update problem was simple enough. Apparently, the Fortigate firewall triggered some false positives on the files that were being downloaded. So amending the rules slightly to allow the updates to pass through did the trick. But in the process earlier, the various repositories had also been messed up. So I removed them all and redownloaded a new set. For good measure, I plunked in PLF too.
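For anyone in the same boat, resetting the urpmi repositories (media, in urpmi-speak) boils down to a couple of commands. This is a sketch from memory, not the exact commands I ran; the `'$MIRRORLIST'` variable is expanded by urpmi itself, and you would add PLF separately from its own mirror:

```shell
# Wipe all configured media...
urpmi.removemedia -a

# ...then re-add the official distribution media from a mirror list.
# Keep the single quotes: '$MIRRORLIST' is a urpmi-internal variable,
# not a shell one.
urpmi.addmedia --distrib --mirrorlist '$MIRRORLIST'

# Refresh the package index and apply any pending updates.
urpmi.update -a
urpmi --auto-update
```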
But after all the updates, I still couldn't get rfbdrake. Time to hunt RPMs on the net, then. But horrors, rpm.pbone.net was down. RPMFind was no good either; I had given up on it for finding Mandriva RPMs a long time ago. So a-hunting on Google we go. I finally found it on (of all places) SUNET. Nostalgia engulfed me as I remembered the old days of going through SUNET looking for free/shareware software. Then came the ensuing dependency hell. I was missing rfb itself. Hunt as I might, I could only find one from 2008. Security-wise, not good.
Then it dawned on me. I was asking the wrong question. Why was I hung up on rfbdrake? The question should have been: what would give me desktop access? If rfb is gone, what are its replacements? I should have learned by now to let go of old programs. The new guys were Vino and krfb. Turns out they work fine. I miss the unified interface, but if it is for the better, why not.
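Vino and krfb are the desktop-integrated options. For the record, there is also a scriptable equivalent, x11vnc, which shares out the existing X session much like rfb did; this is a hedged sketch (assuming x11vnc is in your repos, and the hostname is made up):

```shell
# One-time: store a VNC password in ~/.vnc/passwd.
x11vnc -storepasswd

# Share the current X session (display :0), keep listening after
# the first client disconnects.
x11vnc -display :0 -rfbauth ~/.vnc/passwd -forever

# From the remote machine, point any VNC viewer at the workstation:
vncviewer workstation.example.lan:0
```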
P/S - I am still haunted by my failure to keep a copy of a DOS IVR program (that fit on a floppy!) that ran together with a voice modem (a 33.6kbps with voice capabilities). I am that old. *sigh*
I've said before that I have other topical blogs and they have been taking my time away from this one. Now I am spending even less time writing since I got the WDTV, a box that connects, amongst other things, the Internet to my TV. That's right: YouTube and podcasts (via MediaFly and Flingo) on the big screen.
There has to be some Open Source / Linux goodness going on, because the WDTV even behaves like an Open Source app. For example, the firmware update process is so simple it's down to pressing Yes or No. But that's not it. The fact that it asks to update the firmware and waits for your approval, as opposed to downloading it and forcing you to restart the set top box, says a lot about the people who built this.
The cost is relatively low compared to setting up your own MythTV box. MythTV may do more, but for most people the WDTV is more than enough. Read the post here.
I haven't been a good blogger. My last post was some time ago and not a good one at that.
Truth is, I have a few other topical blogs and I am spending more time on them. Don't get me wrong, I love Linux and I use it every day in almost every possible way. But usually I don't have anything interesting to say about it. I could write about what I am thinking most of the time when it comes to Linux: I thank God I am not a Windows user. But that gets boring after some time. Plus I can't write too much about work and my projects because of legal reasons.
Linux is so ubiquitous in my life, it really is boring. It used to be a real challenge just to install. Then configuring the graphics card in X. But now everything is so well done and tested by so many other users that my experience is shifting away from the technicalities that I love towards more of... management. So much of what I do right now is management that I even blog about it.
Moreover I need to update this blog's template. I promise to write about something as soon as I have something interesting to say.
I am facing a crisis of sorts, both in the personal realm and at work. Fortunately, only one of them is Linux-related. Unfortunately, that problem is at work and is creeping into the personal.
I have been spending a lot of time recently grappling with mail. Mail in the sense that mail is clogging the queue every morning. All sorts of mail are being held up. It seems that one of the mail servers I oversee, running Scalix, is causing problems with a mail server running sendmail. The Scalix SMTP daemon is dying, and sendmail keeps on hammering the Scalix server, locking the messages. Problem is, the Scalix smtpd is not dead but dying. It responds just enough to keep the sendmail server interested and not return an immediate error. A few mails like that and it starts to tie up resources.
But that is not the story. Mail is the no. 1 app at work, so everything else is second. I am now spending my dawns watching the queue, and soon enough mail starts to pile up. I have a VPN into the server, so in theory I can work over a wireless connection. I have a 3G phone which links up nicely to my HP Mini running Mandriva. It links to the phone via Bluetooth.
The morning started OK enough. Then the mails started piling up in the queue. No problem, I started flushing the queue. Then I went where I needed to be. Once there, there was a long line. No worries, I thought. Let's fire up the HP and link to the phone to check on work.
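For the curious, the queue-watching itself needs nothing fancy, just the stock sendmail tools (no Scalix specifics assumed here):

```shell
# List what is stuck in the queue, with the reason each message is
# deferred (e.g. a connection timeout against the sick SMTP host).
mailq

# Force a queue run, verbosely, so you can watch each delivery
# attempt as sendmail retries the deferred messages.
sendmail -q -v
```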
Except Bluetooth wasn't there. I panicked. It wasn't on the taskbar, and it was there just a few days ago! Usually it's there, saying it's disabled. I looked at the dmesg info. I looked for bluez. Used the radio on-off button to force it off and back on again. Restarted Mandriva. Still no luck. In desperation I tried Wifi but found no open network.
Finally, I remembered what had worked before and started doing that: start Windows and turn on Bluetooth from there. The agony. Waiting for Windows to start up. Waiting to log in. Waiting for it to show me the icon on the taskbar. Even more waiting for it to shut down. Curse you, Windows-specific hardware!
Restarted Mandriva and it was there. That is the problem with my hardware: sometimes it just doesn't work. Network card plug and play doesn't work. Mine has a Marvell chipset which can only be detected if it is plugged into the network at bootup. Plugging in the cable later has no effect. Even dmesg shows no change when I plug a cable in.
Lesson: don't rely on things running during an emergency.
There was a time when most software on Windows was written in Visual Basic. Some programs or development houses tried to hide it or make the lineage obscure. And why not? The image they wanted to project was that they devoted quality programmers and time to crafting complex but useful code. Not that they were using the same tools as the casual developer or the one-man operation who sometimes wrote shareware. But one look under the hood and the evidence was clear. Look for a file called VBRUN.dll and there was the smoking gun pointing to Visual Basic. This is a jack-of-all-trades DLL. It was so tied in to VB that developers packaged it along even though they compiled the libraries into the final program instead of linking them, just in case one of those libraries called it.
Now VBRUN had a peculiar thing about it. It was incompatible across different versions of VB. So MS named the VBRUN file according to the version of VB. Soon it was common to have 2 or 3 versions of VBRUN.dll lying around on the hard disk somewhere, used by 2 or 3 programs each. Some poorly written and compiled programs couldn't even tell the difference between versions. Remember those who tried to hide the fact they were using VB? They tried renaming VBRUN.DLL and installing it the first time around, but the program reverted to looking for VBRUN.DLL after an update, causing all sorts of havoc.
And you even had to put it in a particular directory or else all those programs couldn't find it.
OpenOffice has, or is trying to build, a similar relationship with Java. The problems I have when Java or some library that OO uses is updated are a sort of deja vu, but not exactly. Something breaks after the update. I used an OO extension called OpenCards to create flashcards from OO Impress files. After an update, it broke. I could use up to a certain number of cards and then it would crash. Fortunately, OO has a solution. Go to Tools --> Options --> Java. You can select the version of Java OO uses based on what is installed. If there are no entries, press Add and then Cancel, and the list of installed, detected Java versions will appear. Choose an older one and you are good to go.
I am purposefully making the title a bit more showy in the hope that it gets ranked better by the search engines. SEO this ain't. But I feel a responsibility to do so because I have been using Mandrake, and later Mandriva, for so many years. I've lived with its idiosyncrasies and sometimes severe limitations, but more recent versions have repaid my faith. But just as the most recent version, Mandriva 2010 Spring, came out, I read of the discord amongst its developers and the desire to fork it. I won't comment on the politics, but as a distribution Mandriva 2010 delivers.
I am writing this on an HP Mini which, as many others have written, shouldn't be able to run a standard distribution, yet it is running Mandriva 2010.
It is sad that most reviewers out there take the easy way out, install the Mandriva Live version and use it to pass judgement. Installing Mandriva 2010 from the DVD (officially called Mandriva Free) is a different experience. It does have its bumps due to its commitment to go open source all the way, in the way of Flash and Java. However, that can be worked around.
..upgrading finally works.
I think I will have to dedicate the next few entries to how I've installed Mandriva (for the last several versions) in the hope that someone reading this may find it of some use. This is above and beyond my other post on moving or upgrading distributions. I am overjoyed to announce that choosing 'upgrade the distribution' finally works. Everything transitioned OK from Mandriva 2010 to 2010 Spring on all my machines. No loss of shortcuts or even non-standard applications (i.e. installed from non-Mandriva RPMs or from sources not added yet (yes PLF, I am talking about you :) )). I've even tried it from Mandriva 2009 Spring and it works OK. This is a big deal because I tend to lose sooo much time with the transition (backing up, installing fresh, updating and restoring files). Yes, upgrading does take a long time, on average 2-3 hours. But that is 2-3 hours of unattended installation.
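For those who prefer the command line over the graphical upgrade, the urpmi route amounts to repointing the media and updating everything; a sketch, not a transcript of my exact session (`'$MIRRORLIST'` is a urpmi-internal variable):

```shell
# Drop the old release's media and point urpmi at the new release.
urpmi.removemedia -a
urpmi.addmedia --distrib --mirrorlist '$MIRRORLIST'

# Upgrade everything. --download-all fetches every package before
# installing anything, so a dropped connection mid-upgrade cannot
# leave you with a half-updated system.
urpmi --auto-update --auto --download-all
```

That `--download-all` flag is the point of the frightening paragraph in the official upgrade instructions: download first, install second.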
I had seen ntop way back when it was just character based. It may not have been the same program, but I recently came back looking for a program that would let me figure out what sort of traffic was going through my network, and ntop came to mind. I didn't have the stomach to run a full-blown Snort setup, nor was I interested in watching packets fly by in Wireshark. I figured I'd use ntop to get a general feel and then zoom in on particular hosts with Wireshark.
I tried a test setup of ntop running on Mandriva and quickly realised its potential. I could see as far as the packets my NIC could catch. Problem was that I was on a separate switch quite some ways off from the core switch.
So I found the hardware, set up a clean CentOS 5 install and plugged it into a port that was mirrored to the port where the firewall was connected. I installed ntop from an RPM (from rpmforge, I think) and immediately hit a brick wall. The install didn't tell me to run ntop from the command line at least once to set the admin password. And after that the init scripts spat out errors. I could not understand the errors until I realised that there was nothing wrong with the script file; there was a bug within ntop itself.
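The missing first step, for anyone else tripping over it, looks roughly like this (CentOS 5 style service invocation assumed):

```shell
# ntop will not start from the init script until an admin password
# for its web interface has been stored; -A prompts for one and exits.
ntop -A

# Only then can the service be started normally.
service ntop start
```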
The man page explained that specifying a conf filename would expand the file into parameters on the command line. However, the RPM I had was probably from some transitional stage, because the file expansion resulted in the parameters being delimited with a comma and a space on the command line, while the version of ntop that was running wanted them delimited with a space only. So I took the 3 parameters in the conf file, put them directly into the command line in the init script, and said goodbye to the conf file.
I say this ntop was in a transitional stage because its settings were also being kept in files set up by the web interface. These were created and updated after ntop was running. For some bizarre reason, while ntop could and wanted to run as user ntop, the files could not be created as user ntop no matter what I did with directory permissions (I think I stopped short of using sticky bits). So I removed the parameter that specified the user. But then the graphs would not show. RRD, which is used by ntop, wanted to write as user ntop, and having the directories created (and thus owned) by root prevented that. I was getting upset, so I just changed the owner of the RRD directory from root to ntop. And then fireworks. Ntop provided quite a lot of insight into what people were doing on the network. For all of 15 minutes. I could not get past that magic number. Ntop would run at most 15 minutes, mostly less. No clue in the logs. All it said was that the network interface stopped being promiscuous.
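The ownership fix itself was a one-liner; the data directory path below is the default I remember from that RPM, so verify yours before copying:

```shell
# Let the RRD graph files be created and written by the ntop user
# instead of root.
chown -R ntop:ntop /var/lib/ntop/rrd
```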
I gave up. Set up a cron job to restart ntop every 15 minutes and went back to the real task at hand, the reason ntop was set up in the first place: trying to get a handle on my network.
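The crude workaround was a single crontab entry; this is the general shape of it (dropped into root's crontab via `crontab -e`), not my exact line:

```shell
# Every 15 minutes: if ntop has died, start it again.
*/15 * * * * /sbin/service ntop status >/dev/null 2>&1 || /sbin/service ntop start
```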
After a few weeks ntop just refused to start.
I removed all of the ntop-related RPMs, updated everything else and began from scratch. From another job, I had figured out that SELinux was messing with some sensitive system calls. I had been running it under Permissive, with the illusion that I would come back and do what was needed to get it to run under Enforcing. Fat chance. I disabled SELinux and ntop is singing.
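Disabling SELinux on CentOS 5 takes two steps, one for the running session and one to survive reboots; a sketch (and note that Permissive would also have been enough to stop the interference while still logging denials):

```shell
# Turn SELinux enforcement off for the running session.
setenforce 0

# Keep it off across reboots by setting SELINUX=disabled
# in /etc/selinux/config.
sed -i 's/^SELINUX=.*/SELINUX=disabled/' /etc/selinux/config
```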
Now I got to figure out what all this data means.
Continued from this article. I apologize for the long time between postings.
I am still on the issue of making the jump to Linux and why you should do it for the right reasons. If you are just trying Linux, this series may not be for you yet. By all means, try it out and discover the world of Linux and Open Source software. Or rediscover the joy of discovery. That alone could be worth your effort. What I am trying to lay out here are some things that are slightly different, things that affect people who are making the jump to Linux full-time.
Ask yourself, 'what is it that I do now on my computer?' Make a list of activities and the software you use for them. Be honest (because no one else is watching) and mark out programs you use every day, programs you use occasionally or regularly, and programs that you think you need and have installed but have not touched since. The list is important because it could be a deal breaker in your jump to Linux. If you are doing this for your company, think about and talk to others, have them make their own lists, and combine their answers with yours.
Most of the time, we just do stuff and have very little time to plan. This is a great opportunity to think about what you have been doing and how you have been doing it. Is there a better way? Or do you want to stick with what you have now for a bit longer? What is it that you wish you could do? If you have the time, try to improve things, because if you can improve the way things are done, the move becomes about more than just an OS switch. If you don't have the time, or are doing this for a number of people, try it the old way first, because people accept change differently. You are lucky if you can convince them not only to change their OS but also to change the way they work.
Now that you know the activities you do and their frequency, plus the software you use now, you can find Linux programs or Open Source tools that match them. There is another article dedicated just to OpenOffice because it is the most likely replacement for MSOffice. But if you are willing to spend a bit more, Crossover Office makes the transition easier by allowing you to run MSOffice on Linux. Trust me, some users only care about MSOffice. If there were a version for the dishwasher OS, the users wouldn't blink an eye if you gave them a dishwasher. That runs MSOffice.
If you are dealing with a number of users, you may need to set aside time for training. There is always a hump, as I call it, when it comes to Linux. More on that later. In the meantime, keep your goals in sight.
I am currently on contract, and the organisation I work with gave me a slightly used laptop that had Office 2007 installed. For what reason, beats me. I have Office 2003 on the desktop and I am fine with that. I would have reinstalled the desktop with Linux, but my stint here is too short for that. After using Office 2007 for 30 minutes, I was ready to throw the laptop out of the window.
No other user interface I have ever used is more frustrating than the tab and Ribbon interface. And that includes the old Digital Research file manager that was included in DR-DOS.
Come on, MS. People are not that stupid. The Office icon is just another Start button. The tabs are just modifiable menu items. And the Ribbon is an over-sized toolbar. But here is the kicker: MS only gives you one toolbar in Office 2007, while older versions of Office had the option to set up more than one, both on top and on the bottom. Here is what the UI change is really about: MS is creating a digital cage to force you to use the programs only the way they want you to. You can do anything you want, as long as you do it the MS way.
Another trap like this is MS Internet Explorer. People are building applications that work only on it and take advantage of proprietary functions like embedding MS Word within it. What they don't realise is that MSIE is a moving target and you are at the mercy of MS. There is nothing to prevent them from breaking your applications through forced updates. Heck, normal OS updates are enough to break custom applications. A standard MS response to suggestions of making an application both MSIE and Firefox compatible is that there is no guarantee Firefox will be installed on a PC, but they'll guarantee MSIE is there. This typical misdirection moves the focus from compatibility (which is the real issue here) to availability. My answer to this: when you make an application work on both IE and Firefox, regardless of which is installed, it will work. I have PCs at the office where MSIE is installed but can't access some websites because it is an old version of IE. Why should the old version fail when the standard build is for MSIE? What guarantee do you have that some feature you are using right now will still be there after the next auto-upgrade?
Choice. Choose Linux
You can find Part 1 here and Part 2 here.
Two things you must remember about what the Windows world is about. One: instant gratification. You can do stuff immediately after the install. Or the install itself has been dumbed down to a wizard. Two: continuous tweaking. Always something new to try or to patch. Something is always changing or being changed. And guess what, you have little control over it (you can if you have the bucks). It is so bad that many people have been conditioned to want change or updates or something new to tweak every so often. I've had users who moved to the Mac during the heady colored iMac days ask me "why are there no updates?". I asked them whether something was wrong. The answer was negative. Somehow these users feel that if there are no regular / weekly updates then there must be something wrong. Think about it for a moment. These Windows users assume that the computer will always go wrong if not taken care of regularly. Really, they expect that if there are no updates, Windows will blow up.
This is the second hardest thing to teach users who are no longer using Windows. Relax, no updates means that everything is ok. Sometimes calm waters are just that, calm waters. Stop focusing on fixing the computer and just use it. Confidently.
What is the hardest thing to teach ex-Windows users? It gets worse before it gets better... and then stays that way. This is the opposite of the instant gratification thing. No, you won't get everything running in 10 minutes. But yes, once it is up it'll stay up, and we won't have to do anything major to it. In fact, as users, they'll do even less, because the model is that there is another person whose job is to take care of everything else. Unfortunately, that could also be you. And users, given the choice between something done in 5 minutes that always requires tinkering and something done in a day that never causes problems again, will probably opt for the quick fix. The remedy? Make a big fuss about it. Call a meeting to discuss the steps to be taken, the impact on the users, etc. If it is a big fuss, users tend to accept that it'll take time. And if you are calling that meeting, why don't you actually do it properly? Who knows, you could even try to implement Change Management... oooh, that's a big word.
In the meeting, identify, define and clearly mark the goal. Then plan how to get from here to there. Make the transition gradual and plan for it to be so. Start with what makes the least impact: computers that offer limited functions or services to users. Then back-room / supporting services. This is the area where Linux was born and shines. Finally, deal with the desktops.
Once users have made the transition and are still a bit sore about the whole change, don't end your plan there. Think "Now what?" What else can be done to make the experience of having moved better? Find ways of making things better for the users so that they can see why the journey was made in the first place. Point out open source / Linux projects that will help them or that they are interested in. Start with Gimp and move on from there.
I use Mandriva. I have used it almost exclusively for 'serious work' and on my home PC. That doesn't mean I don't use anything else. I have OpenSUSE on my second hard disk, which I have not booted since I-don't-know-when. I used it to run Jahshaka, a multimedia composing tool. Cool stuff; maybe I'll post some of the things I've done with it. But when I had no use for Jahshaka, I went back to my aging Mandrake 10.1. I just upgraded from Mandrake 10.1 to Mandriva 2007. The most painless upgrade I've had. I took the usual precautions:
Made a list of all applications I had installed and prioritised them according to what I use most often.
Backed up everything.
Renamed my home directories.
Made sure the installation only touched the non-home partition.
And within a reasonable amount of time, I had everything running. It detected almost everything I have, except for my Motorola-HSP-based winmodem (no tears there because I am using broadband). I updated the distribution, added the required repositories (more on that here), and made sure again that it didn't mess up my home partition. I created new user accounts using the old names. This will make my other family members feel at home as they try out the new stuff Mandriva has to offer. I moved the user files but not the config files, so that old settings don't interfere. And it was up and running.
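The "user files but not config files" trick can be sketched in a few commands; the username and backup path here are purely illustrative:

```shell
# Recreate a user under the old name on the fresh install.
useradd -m alice

# Copy their documents back, but skip the dotfiles (the hidden
# config files) so stale settings from the old release can't
# interfere with the new desktop's defaults.
rsync -a --exclude='.*' /backup/home/alice/ /home/alice/

# Make sure the restored files belong to the new account.
chown -R alice:alice /home/alice
```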
This is not about recommending a distribution. Rather, it's about how important the choice is and how to make it. A distribution is the collection of programs that will make up what you install on your PC besides the Linux kernel. A distribution is like an accent: a lot of people are saying the same thing, they are just saying it differently, some more so than others. More importantly, choosing a distribution is like choosing a path. It will lead you to the same place, just through a different entrance into the city. The road can be winding but dotted with friendly towns, or it can be 5 lanes wide, chock-full of people, leading directly to the location. OK, enough with the analogies.
Choosing one distribution over another will affect you in the following ways:
How regularly the software is updated. This may translate into how long you will be exposed to a known vulnerability after it's made public. A more active distribution will have updates and patches available as soon as possible. Most of them do, because the nature of a distribution is that it is created by people who also use it. More likely, it'll translate into how long before you can use that latest program or software you heard about.
Selection of pre-packaged software. Some distributions are general purpose; others cater to a specific group of people. There are also distributions you don't have to install: they run directly from CDs or DVDs. In this case, your choice is about use rather than having it sit on your hard disk. You might run something off the CD for a while until you decide it is worthy of sitting on your hard disk.
How long it will be supported after the version number rolls over. This is usually short, maybe about a year or so for most non-commercial distributions. This is one reason some new features can't be included in the older version of a distribution.
You have to choose. But a good rule of thumb is: if you are not sure which distribution, stick to one of the 'mainstream' distributions - Ubuntu, RedHat, Mandriva - in no particular order.
I am determined to make this work. "This" meaning Zoneminder (www.zoneminder.com). I had it working, sort of, on my VMWare Server-ed Mandriva 2006. But even though it ran, there were numerous errors about shmget and shared memory. Zoneminder consists of several applications, with some controlling others that do specific tasks, and each of them exchanging information. My guess is that some information exchange happens through shared memory: program A loads stuff into memory and then passes a pointer to program B that does something to that stuff (although inter-process communication could be involved too). I can see why the author does it. Multiple instances of program A can load the same kind of stuff based on different configurations, while program B doesn't care where the stuff came from: it'll take it and process it. Sorta like program A makes batter and program B cuts the batter into square shapes to be made into cookies. OK, off the food and onto business. My guess is that I am facing problems with shared memory because I am running it in a VM, VMWare Server Beta to be exact. So the next option would be to run it on a real physical machine. (Actually, the next option would be to run it on something similar like VMWare Workstation, which I-Will-Do-Soon(R).) After the couple of installs I've done, I think it is safe enough to try it on my main machine at home, because I don't use Apache or MySQL extensively there and the dependent libraries probably won't mess up my main apps. I am severely tempted to use an older version of a Zoneminder (zm) RPM to weed out dependencies. This is faster than the 'configure --> find out missing lib file --> find rpm for lib file --> get and install lib file --> configure' treadmill. But I've decided to take the long way, just to find out what libraries are actually needed if I were any other person doing this on any other distribution. Experiencing this as any other guy is important for the overall user experience.
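For what it's worth, shmget failures with Zoneminder are classically about the kernel's shared memory ceiling being smaller than a captured frame buffer. A hedged sketch of checking and raising the limit (the 128 MB value is only an example; size it to your camera resolution and buffer count):

```shell
# Show the current maximum size of a single shared memory segment.
sysctl kernel.shmmax

# Raise it for the running kernel (example value: 128 MB)...
sysctl -w kernel.shmmax=134217728

# ...and persist the setting across reboots.
echo 'kernel.shmmax = 134217728' >> /etc/sysctl.conf
```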
I am not even going to install the RPMs for the Perl serial and X10 stuff, which are not even in the usual Mandriva repositories, to see whether zm really needs them. I already know from previous experience that the latest version, 1.22.1, compiles fast and with less hassle than the previous version I was using. Somehow, the mysterious "unable to create executables" error had magically disappeared. And I found the correct manual this time for this version.
Apparently they had already fixed some of my previous rants.
My apologies to the developers. No bizarre config program and no defaults being set at compile time. Thank you.
To cut a long story short, the compile went OK and everything went exactly like it did previously... that is, it broke in the same place. Same shared memory problem. A glance through the sources did no good. The forum had someone with the same problem, and the thread was full of debug messages. Out of desperation, I started trying all sorts of things. Like I always tell people who have problems with computers and don't know what to do: "This is not a nuclear bomb. Push all the buttons." Changing the configuration somehow yielded a better error message. I told it that the images fed to it were 8-bit grayscale (liar) instead of the only other option, 24-bit color. I think the images I was sending were 8/16-bit color, but it had only two options. Now the error message was something like "Image captured was of wrong size, height, width or color". Image captured! It got something. Hey, it may be a coelacanth, but it's still a fish. Maybe if I set the camera to 8-bit grayscale, it'll capture correctly! Great! Problem is... I can't remember the password for the camera. No time to do a reset and sacrifice a lamb or whatever it is I need to do to reset the password. Got to pay the bills. Later.
I am experimenting with Zoneminder. It looks like a promising digital video recorder (DVR) for closed circuit cameras. My interest is primarily in IP based cameras. In theory, this should be easy. IP based cameras usually have a URL from which you can pick up frames or even video streams, so building DVR software should not be that difficult. I always start with an RPM because I use Mandriva and this helps keep updates in check. Also, I respect the fact that someone invested time in building a package, so why reinvent the wheel. There were RPMs for Mandriva and I tried those. I read the instructions in the manual and, strangely enough, there were instructions on what to do before installing. Wait a minute, isn't an RPM supposed to handle this? That is one of the points of using an RPM. But the request wasn't demanding: just that Apache, MySQL and PHP be installed. So I did that. Installing it using urpmi should have resolved dependencies, but I wanted to choose which versions to install anyway, and the GUI tool was better at that. When I finally used urpmi to install it, it complained of some missing Perl libraries. Great, I thought, a trek to CPAN to download the libraries and compile them by hand. I later found out that the libraries were for an optional component and not really necessary. But even after the RPM was installed, the instructions asked me to manually edit a file and run a configuration program... which was broken. Apparently the package author decided to use MySQL-Max and put in code to detect the version of MySQL that was running. At least he made it easier by clearly marking the offending section in the script file with #FIX ME. Finally, I managed to get the components running and configured the program to capture images from my IP camera. Or so I thought. I couldn't figure out why my settings would not work when I had read the manuals (two of them) and followed almost every instruction there is.
I checked the logs and found that the program picking up the images was crashing. Apparently it requested memory space incorrectly. Wait a minute, this was serious. This is a beta-level error in a 1.2++ version. Problems with memory allocation should have been dealt with a long time ago. This only means one thing.. recompile. I got the latest STABLE sources and tried to compile them. It got worse. The configure command still required parameters to be passed in.. for the default web directory and the location of mysql? If it can't detect that, what is the script actually doing? The more I fed it CLI parameters, the more it asked for. In the end there were about 10-odd CLI parameters. And it still wouldn't compile, complaining that the C compiler was unable to create executables! What?! A little more reading led me to a more stable version, a couple of revisions back. No more C compiler complaints. Now it was the normal missing library files that had to be installed, specifically libjpeg.a (libjpeg), libz.a (zlib-devel) and libmysqlclient.a (libmysql-devel). Configure exited and asked me to run a configuration program, which did a lot of what configure is supposed to do, especially when it came to component detection. It asked a lot of questions, nearly 30 in all. Most of them should have just been set to defaults. Too late already. Will compile tomorrow. Lessons learnt:
Good practice for building packages is to separate the core and optional components.
When building a package, make sure all dependencies are included.
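For anyone retracing this, the dependency round-trip looked roughly like the sketch below. The package names are the ones the configure run complained about on my box; the configure flag names and paths are illustrative guesses, not gospel, so check your own version's ./configure --help first.

```shell
# Install the static libraries configure complained about
# (Mandriva package names, as reported in the configure output)
urpmi libjpeg zlib-devel libmysql-devel

# Then the configure run, spoon-feeding it the paths it refused to detect.
# Flag names below are illustrative; the real list ran to 10-odd parameters.
./configure --with-webdir=/var/www/html/zm \
            --with-cgidir=/var/www/cgi-bin \
            --with-mysql=/usr
```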
As you move from Windows to Linux on the desktop, you will have to cross the most difficult bridge of all: office applications, or productivity tools. In short, you have to ask yourself, "To MSOffice or not to MSOffice, that is the question." It is entirely possible to keep MSOffice and use Linux, despite what purists say. I, for one, am still married to MSOffice, and I'll explain why at the end. Coming back to the question at hand, your options are:
OpenOffice - an alternative application suite. It can read and save files in MSOffice file formats. The best part is that it also runs on Windows, so as you are moving people across, you can have them using OpenOffice on Windows and later on Linux. However, this endeavor is so large that it is in itself as daunting a task as moving people to Linux. The real reason you can move people across is that most people do not use all of MSOffice's features all the time. They simply can't. If they do clerical work, moving across is a cinch. But if they are advanced users, it will be as painful as a root canal minus the painkillers. Heck, I have some problems with alignment when moving from OpenOffice on Windows to OpenOffice on Linux. If you share files outside the company, it gets really troublesome.
CrossOver Office - a commercial tool that allows you to install and use MSOffice (plus some other Windows applications) on Linux. Like driving in the other lane when the road is empty. It simply works. Well, almost. You see, what they didn't tell you is that there is a reason why MSOffice is on Windows only: it uses low-level software calls. I have heard that one of the reasons Windows on Alpha was dropped was that it couldn't run MSOffice very well. Some versions of Office actually replaced OS files during installation. What other application would do that? The result is that the major MSOffice applications work fine but some fringe and not-so-fringe applications can be tripped up (e.g. Clipart Manager).
Like the above, Office over Wine - Wine, which is not a Windows emulator, is designed to run Windows applications on Linux by fooling the application into thinking that it is on Windows. In fact, CrossOver Office is partly Wine. So why use CrossOver when you can get Wine for free? Let's just say that I like my hair too much at this stage of my life.
So to sum up, there are three questions you need to ask, the acid test:
1. Do you use macros? Do things pop up and ask you stuff when you open a template or document? If not, then your answer is most likely no.
2. Do you use outlines or the outlining feature in MSWord? If you are asking, "Wha-?", then your answer would be no.
3. Do you have MS Access databases that you use regularly? The thing about conversion is that MSAccess files are not part of the deal.
If you answered yes to any of the above, go CrossOver Office. If not, then you are a prime candidate for switching from MSOffice to OpenOffice. You will save a ton of money later, especially as you grow, add PCs and realize you don't have to pay for another MSOffice license. Oh, BTW, I don't use macros, but I love the outlining feature so much that it is a deal breaker.
If you are thinking about making the transition from whatever to Linux, read on.
A lot of people have asked me two questions since I made the switch: 'Is it hard?' and 'Can you do everything you could do in Windows?'
The answer to the last one is a resounding yes. In fact, after switching from Windows, whenever I have to use a Windows machine, I find it very restrictive and most of my tools are gone. Linux gives you so many choices and options that you can hardly make up your mind and stick to one set. I find myself switching from KDE to GNOME and back every few months, without losing access to the core programs I use.
The answer to the first one is 'Hell, yes. It was very hard.' But I was on my own and in retrospect, could have avoided a lot of heartache if there were someone to tell me what to do or what to avoid. This series is dedicated to those thinking about making the switch or the jump. Something to think about and do before making the leap. Most of it will sound like me talking to you as a network administrator but even if you are switching alone, everything still applies. Think of yourself as your own administrator.
First, why are you making the jump, or at least thinking about it? The reasons have to be sound because you have to do it for the right reasons. If not, you will be disappointed, or you will find it not suited to you and switch back. Time lost will never be regained.
If you are switching for ideological reasons (i.e. not wanting to pay the Microsoft Tax), then you are a Believer. Nothing I say will discourage you and all pain is worth suffering. Just make sure other people involved believe it too. Note to Believers: all proponents of ideologies (prophets, do-gooders) face lynch mobs. Sort of a Darwinian thing about ideologies: those that survive lynch mobs are most likely superior.
Remember, it has to get worse before it gets better.
If you are thinking about saving costs, I will tell you right now it will be some time before you see significant savings. Unless, of course, you include licence costs for a large number of people. That is where most of the savings will be. But for every cost factor you take away, you will be replacing it with another. Training or retraining will cost. Reinstallation or upgrades of older PCs will cost. Sure, the PCs won't crash as often, but people who switch to Linux forget that while Linux may not be hard on CPU speed, it does require some amount of memory before things really fly. My suggestion is to hit 256MB as soon as you can. If you are looking at older PCs, 128MB will work. While on this issue, sometimes it's not even the RAM; getting a new video card with more memory works wonders too. Coming back to cost factors, live with the fact that they are going to be replaced, not eliminated. But if you are smart about it, it just won't cost as much. That is, each cost factor replaced will likely be less in value.
That said, hunker down for some productivity loss and doubts (or doubting people) nagging you. Remember, it has to get worse before it gets better.
Your reality is your own perception. If it walks like a duck and quacks like a duck then.. you know. But what if all you see are ducks? Do you think you'd know a chicken if it walked by?
There is a point to this. I make a living from computers (big surprise). I work with people fresh out of college starting their first career jobs, and with people whose businesses are starting to break out of the local market. They all need computers and they all want to use the best at the least possible cost. Recommendations are a big thing for me, and my clients (and lately, even my suppliers) bring in people they know who can use my expertise. I use my own office setup to demonstrate some of the uses you can get from open source solutions and Linux in particular. The thing I am getting used to is the response, "You can do that with a computer?" or "It can work like that?"
That's what bothers me. It used to be the whiz-bang stuff that got them, then the free but high quality stuff (Mozilla, Gimp). But now the stuff that draws these responses is downright trivial.
I pointed out to a potential client that he could set up a print queue and log all print jobs along with the information about each job. He looked at me and pointed out that everybody was just printing directly to the printer; if everyone could see the printer, couldn't they just bypass the queue? I walked to the printer and turned off SMB-based sharing via the control panel. The printer disappeared from the network, but I demonstrated that I could still print via the queue. He was bowled over. It seems that he had a problem with his workers printing on the expensive color laser printer after hours. At first he would disconnect the printer at about 5, but stopped that after salespeople complained of not being able to get color brochures printed for clients after hours. The growth in his company was directly the result of his sales staff being able to come in at odd hours and do work, so he couldn't deny their request. The notion of a print queue and the ability to turn off access to the printer (selectively, by network protocol) never crossed his mind.
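What I flipped in the control panel amounts to one switch in the Samba configuration. A hypothetical smb.conf stanza (share name and path invented for illustration) would look like this:

```
# /etc/samba/smb.conf -- hypothetical printer share
[colorlaser]
   printable = yes
   path = /var/spool/samba
   available = no    ; take the share off the network
```

With "available = no", the printer vanishes from Network Neighborhood, but jobs submitted to the local print queue still go through, and every one of them gets logged by the spooler.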
Do you see that? The solution had little to do with open source or Linux or anything new for that matter. Print queues have been around for ages. But what surprised me more was that when I mentioned this to a younger co-worker, he said that compared to what he saw at college (a local community college), the stuff at the office was downright revolutionary. The free-flow mess of networks and services on the Windows networks at college was a stark contrast to the controlled environment at the office, where everything just worked or, if something failed, something else was waiting to back it up.
Which brings me back to the ducks. One of the problems with computing right now is the dominance of Windows and MS. All people see is Windows. Their sheer ubiquity has blinded a lot of people. They simply don't know any other way. And if it means having to live with unoptimised working environments that often are not productive, so be it. A recent report said that Gartner research says desktop Linux won't be taking off (I have issues with that, though to a certain degree I agree that Linux has problems on office desktops). Maybe the cause of that is that people don't know better. Maybe it's time to look over and think, "Fried chicken sounds good."
I got the Mandrake-powered notebook to work over the wireless network with the AP at home. But no luck at the office. This vexed me more than normal because I had a hand in setting up the office wireless AP and was pretty sure of what the settings were. Normally when you build two things that are alike, you get better the second time, not worse. But since the first time worked flawlessly, I learned nothing from the experience. That is why I don't see problems as obstacles. They are opportunities to learn. Basically my problem boiled down to the fact that my notebook's wireless card couldn't connect to the office AP using WEP encryption. Without it, no problem. But the kicker was that I was using WEP at the home AP and it worked out of the box. No option I tried could get it done. This is the time to take a step back. The thing to do at a time like this is to go through not the things I got wrong, but rather the things I thought I got right. What was it that I did differently at the office than at home? And there the solution was. The wireless card needed the WEP key to be in hex. It would not use the ASCII key. I had found that out at home, but it was fixed easily because the home AP showed the ASCII key I entered as hex and vice versa when I switched between ASCII and hex input. The office AP didn't have that feature. You either entered the key in ASCII or hex, and switching between the two just blanked out any key previously entered. So I used an ASCII to hex converter at the command line. Apparently these things are case-sensitive. No wonder it wouldn't work. It was just the wrong key! I found that out because I finally decided to change the WEP key on the office AP. I just entered it in hex and did the same on the notebook. It worked straight away. I didn't do this earlier because other people were also using the AP.
After changing back the key and more fiddling around, I learned that the office AP apparently converts the ASCII key entered into UPPER CASE before converting it to a hex value and using it. The AP vendor committed one of the Great Sins of Equipment Manufacturers: not telling the user of the assumptions you made for them (and, in a way, about them). I was thankful, though, that they didn't do something boneheaded like configuring the AP to use two keys for every ASCII key entered (that is, converting the ASCII key into both upper and lower case, converting each into hex and using them both). It would have made my setup work immediately but it would be Not The Right Way.
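If you ever need the same ASCII to hex conversion, it's a one-liner at the shell. Here's a sketch using od (the key value is made up, obviously not my real one):

```shell
# Turn an ASCII WEP key into the hex string the card expects
key="Secret"
printf '%s' "$key" | od -An -tx1 | tr -d ' \n'   # → 536563726574
echo

# What the office AP was silently doing: upper-casing the key first,
# which yields a completely different hex key
printf '%s' "$key" | tr 'a-z' 'A-Z' | od -An -tx1 | tr -d ' \n'   # → 534543524554
echo
```

Comparing the two outputs makes the case-sensitivity problem obvious: one character of different case and the hex keys no longer match.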
LAN network card: OK. Didn't expect any problems, but who knows.
Graphics display: VESA only. It bombed using the i910 drivers for XFree. I heard Intel is posting its driver. Will try that. But VESA isn't really bad.
Power management: ACPI OK; APIC crashes the system for some reason.
I mucked up the NTFS XP partition. It went from OK to just gone. I have a backup of the XP, but not since the major updates. I think the partition was corrupted as I was trying to resize the ext partitions. However, using a rescue CD, I managed to repair it using parted. Or parted managed to repair it; all I did was make sure it showed up and press a few keys. It is amazing what is automated nowadays. My wireless network card loaded OK, but it still required the ipw2200 firmware package. It only detected it after a urpmi makeover. I made sure all of the other repositories were visible before I tried again. After the nail-biting wait for the dependency resolution, downloading and installing, it worked like a charm.
Finally, I got a new notebook at work. I was a bit apprehensive about what distro to put on it. SuSE Pro is a big pull. Ubuntu even crossed my mind. But realising that this was a notebook that would not have all the pieces working with Linux, I needed all of my experience to get it up and running, and an unfamiliar distribution would make me grope in the dark. Mandrake/Mandriva it was. In the next course of blogs, I'll try to document as much as possible of what I did right and what I did wrong, with the hope it'll help someone out there. First things first, the notebook is an MSI Megabook, rebranded as a local brand here. Centrino chips, 512MB RAM, DVD-CDRW, 40GB HDD, 3 USB, 1 Firewire, 1 VGA, 1 PCMCIA with integrated card reader (Ricoh), built-in Wifi, network and modem. All in a nice 1.8 kg package costing slightly under 1k dollars. The good news is that I am writing this on the notebook.
I am looking at the Mandriva CD that came with Linux Format, the best Linux magazine for the less uppity or the pocket-protector-less. I wonder when I will get to install it. To be truthful, I have had the downloaded CDs longer, but if you have read the past few posts, upgrades are something I dread. It's just that I use the PC so much, I am apprehensive of all the time lost to installing most of what I had already installed on the upgraded machine.
Mandriva is greatly enhanced by urpmi, and more so when combined with the repositories listed on EasyUrpmi. If you haven't got the plf repositories listed, you are definitely missing a lot. A side benefit of using EasyUrpmi is that you can set the main repository, thereby eliminating the need to have the CDs or DVD around whenever you install stuff. As a desktop OS, you will install a lot of stuff.
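For the record, the kind of command EasyUrpmi spits out looks roughly like this. The mirror URL and directory paths below are stand-ins; let the site generate the real ones for your version and architecture:

```shell
# Point urpmi at a network mirror so the CDs can stay in the drawer
# (mirror.example.org and the paths are hypothetical)
urpmi.addmedia main \
  ftp://mirror.example.org/Mandrakelinux/official/10.1/i586/media/main \
  with media_info/hdlist.cz

# And the plf goodies
urpmi.addmedia plf \
  ftp://mirror.example.org/plf/mandrake/10.1 \
  with hdlist.cz
```

Once added, anything you urpmi gets fetched from the mirror and dependency resolution pulls from all listed media.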
Suddenly, things got slower during installs, often failing. There are no clues other than messages saying that some packages cannot be installed due to missing keys and that some packages are corrupted. Checked the name of the package. Correct. Checked rpm.pbone. Correct. Tried restoring missing keys. Trouble is, they weren't missing to begin with.
And it goes on for some time. Sometimes I get to install. Other times I don't. So I tried updating the repository indexes. Some are successful, other times, it just hangs, requiring a kill.
Fortunately, I have another desktop at home. Faster. I am the only one using the Internet connection. Everything I tried installing, worked. Even stuff that didn't work.. at work. Then I realised that the PC at work kept hanging when I tried updating the indexes. So I do what I normally do when trying to figure stuff out. I break it down.
Tried a few indexes at first. But ultimately it all worked. All indexes from repositories that have their contents updated, that is. So I tried updating the index from the repository that is not supposed to change, main. It failed spectacularly. So that was the problem. The repository was no longer where it was. A quick visit to EasyUrpmi fixed that. Deleted the main repository, found another one and added it back with the command generated by EasyUrpmi.
Which says a lot about error messages. Error messages are a must. They tell us when things are wrong. More importantly, they tell us what is wrong so that it can be fixed. When error messages don't tell me what is wrong, they might as well be shouting obscenities.
I don't understand people fearing choice. Maybe it is the fear of choosing. Or at least the fear of being wrong. Is that our problem? Is that why Linux on the desktop is slow starting, because the technical people are fearful? Fear of call-center overload?
I carry Knoppix around in my bag. Recently, I enquired around for a new laptop. I fell in love with these low-cost 12" monitor basic laptops. But the smaller it is, the more customised components it'll have, which is bad news for Linux. But I wasn't deterred. I asked the sales guy about it and popped in the Knoppix CD. He was making a comment about how RedHat still requires command-line installation. But when Knoppix came on, he was blown away. So much so that he asked for a copy of the CD to kick around. I was tempted to give away my outdated version, but I still had a few shops to go to and their laptops to test. So I waited while he copied my Knoppix 3.7. The moral of the story is always keep 2 copies of Knoppix around.
The last time, I was apprehensive about upgrading. In the past when I've done it, several things happened that I didn't like.
1. Old settings were carried over literally. There was no way to use the new settings if you used the previous home directories. This happened during an upgrade. I am quite surprised that the newer version of the software, especially for something as important as GNOME, did not notice that the config file was from a previous version and at least offer to replace it with a new one. I understand the concern to keep all the user's settings, but shouldn't there be an updater program for that?
2. Certain software that was there previously was not replaced but simply disappeared. This happens especially with software that is not in vogue or part of the core distribution. I understand that leaving the program there risks a certain incompatibility, but if it was there before and there is no replacement during the upgrade, at least offer to leave it there. It happened to Nagios, a network monitoring program I use. It just disappeared. I had to reinstall and reconfigure it every time I upgraded. It is only in the contrib section.
An upgrade is an upgrade, not a re-installation and certainly not a fresh installation. Distribution packagers should respect that or lose their user base.
One of the easiest things to do on Mandrake is updates. Even kernel updates. But as my previous experience has shown, kernel updates with Nvidia drivers are not to be taken lightly. So the first step is to plan.
Do not enter a room without knowing how to get out.
Or something like that.. DeNiro's character said it at the beginning of the movie Ronin. I have to do something similar. In order to compile the Nvidia drivers, you need the kernel source. So, if you update the kernel, you need the related kernel-source at the same time. Problem is, if you update with urpmi or Mandrake update, you lose the old kernel source. Here is what we want to avoid: we update the kernel and the kernel source, then either we fail to compile the nvidia driver or the nvidia driver fails to work. Need to undo. Mandrake has kept the previous version of the kernel, but not the source. If you failed to compile the nvidia driver, then the old driver is probably still there. But if the new driver was a dud, you have to recompile, and if you don't have the previous version of the kernel source, you are screwed, because most sites don't keep older versions of the kernel source. Or you can use the non-Nvidia nvidia drivers (just change the XF86Config or equivalent file). Yep, drive your Ferrari only in first gear. Moral of the story: download the kernel-source rpm and install it by hand after you have updated the kernel, but before you recompile the Nvidia driver. Before I get feedback on "how stupid it is to need to recompile in this day and age" or "Run for the hills! Recompilin's here!", let me point out that the Nvidia driver recompile process is menu driven. Push button. I am so anxious that I haven't updated my kernel in ages. But I need to. Sigh. Wish me luck.
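The order of operations I settled on, sketched below with made-up version numbers and a hypothetical mirror path (substitute your own):

```shell
# 1. Grab the kernel-source rpm that matches the new kernel and keep a
#    local copy before anything gets upgraded (path is hypothetical)
wget http://mirror.example.org/media/main/kernel-source-2.6.8-1mdk.i586.rpm

# 2. Update the kernel itself
urpmi kernel

# 3. Install the saved source by hand, then rebuild the driver
#    (driver file name below is a placeholder for whatever Nvidia ships)
rpm -ivh kernel-source-2.6.8-1mdk.i586.rpm
sh NVIDIA-Linux-x86.run
```

This way, even if the new driver is a dud, the source rpm is sitting on disk and you can recompile against whichever kernel you boot.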
I have a lot to bitch about: an OpenOffice bug keeps me from saving my files correctly. Firefox is still not 1.0 in the repositories and, from the comments in the forums, it won't be for some time. I downloaded the latest version and it had me install itself within the user's directory structure. All the old shortcuts still pointed to the old version, but the installer created one on the desktop and I used that instead.